deepbox/ml

Ensemble Methods

Combine multiple models for better predictive performance: bagging reduces variance, while boosting reduces bias, relative to the individual models.

GradientBoostingClassifier

Sequential ensemble in which each new tree is fit to the errors of the ensemble built so far. Typically achieves higher accuracy than random forests but requires more careful tuning.

GradientBoostingRegressor

Gradient boosting for regression. Builds trees sequentially to minimize a differentiable loss function.
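
To make "builds trees sequentially to minimize a differentiable loss function" concrete, the following is a minimal, library-free sketch of the boosting loop for squared loss, where the negative gradient is simply the residual. Depth-1 stumps stand in for real trees, and the names (fitStump, boost) are illustrative; this is not how deepbox implements GradientBoostingRegressor.

boosting-sketch.ts
// Conceptual gradient boosting with squared loss.
// Each stump is fit to the residuals (the negative gradient) of the current model.

type Stump = { threshold: number; left: number; right: number };

// Fit a depth-1 regression stump to (x, residual) by brute-force threshold search.
function fitStump(x: number[], r: number[]): Stump {
  let best: Stump = { threshold: x[0], left: 0, right: 0 };
  let bestErr = Infinity;
  for (const t of x) {
    const leftIdx = x.map((_, i) => i).filter((i) => x[i] <= t);
    const rightIdx = x.map((_, i) => i).filter((i) => x[i] > t);
    const mean = (idx: number[]) =>
      idx.length > 0 ? idx.reduce((s, i) => s + r[i], 0) / idx.length : 0;
    const left = mean(leftIdx);
    const right = mean(rightIdx);
    const err = x.reduce((s, v, i) => s + (r[i] - (v <= t ? left : right)) ** 2, 0);
    if (err < bestErr) {
      bestErr = err;
      best = { threshold: t, left, right };
    }
  }
  return best;
}

function boost(x: number[], y: number[], rounds: number, learningRate: number) {
  const stumps: Stump[] = [];
  // Prediction = learning-rate-scaled sum of all stumps fitted so far.
  const predict = (v: number): number =>
    stumps.reduce((s, st) => s + learningRate * (v <= st.threshold ? st.left : st.right), 0);
  for (let m = 0; m < rounds; m++) {
    // Residuals of the current ensemble become the target for the next stump.
    const residuals = y.map((yi, i) => yi - predict(x[i]));
    stumps.push(fitStump(x, residuals));
  }
  return predict;
}

const model = boost([1, 2, 3, 4, 5], [1.2, 1.9, 3.1, 3.9, 5.2], 50, 0.1);
console.log(model(2.5)); // piecewise-constant fit, roughly between 1.9 and 3.1

Each round lowers the training loss by a small step (controlled by learningRate), which is why more rounds usually need a smaller learning rate.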

Key Parameters

  • nEstimators: number — Number of trees in the forest or number of boosting rounds (default: 100).
  • maxDepth: number — Maximum depth of each tree.
  • learningRate: number — Step size for boosting (GradientBoosting/AdaBoost only).
  • subsample: number — Fraction of samples per tree (GradientBoosting only).
  • maxFeatures: 'sqrt' | 'log2' | number — Features considered per split.
ensemble-models.ts
import { RandomForestClassifier, GradientBoostingClassifier } from "deepbox/ml";
import { tensor } from "deepbox/ndarray";

const X = tensor([[1, 2], [2, 3], [3, 1], [4, 3], [5, 2]]);
const y = tensor([0, 0, 1, 1, 1]);

// Random Forest
const rf = new RandomForestClassifier({ nEstimators: 100, maxDepth: 10 });
rf.fit(X, y);
rf.predict(tensor([[3, 2]])); // [1]

// Gradient Boosting
const gb = new GradientBoostingClassifier({
  nEstimators: 50,
  learningRate: 0.1,
  maxDepth: 3,
});
gb.fit(X, y);
gb.predict(tensor([[3, 2]]));
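
Two options from the Key Parameters list that ensemble-models.ts does not show are subsample and maxFeatures. The snippet below is a sketch of how they might be passed, assuming GradientBoostingRegressor exposes the same constructor-options/fit/predict API as the classifier above and that RandomForestClassifier accepts maxFeatures as listed; check the deepbox/ml typings for the exact option names.

ensemble-tuning.ts
import { RandomForestClassifier, GradientBoostingRegressor } from "deepbox/ml";
import { tensor } from "deepbox/ndarray";

const X = tensor([[1, 2], [2, 3], [3, 1], [4, 3], [5, 2]]);
const yCls = tensor([0, 0, 1, 1, 1]);
const yReg = tensor([1.5, 2.1, 2.9, 4.2, 5.0]);

// Random forest with a cap on the features considered at each split.
const rf = new RandomForestClassifier({
  nEstimators: 200,
  maxDepth: 8,
  maxFeatures: "sqrt",
});
rf.fit(X, yCls);

// Gradient-boosted regressor with row subsampling (stochastic gradient boosting).
// NOTE: assumes the regressor mirrors the classifier's API shown above.
const gbr = new GradientBoostingRegressor({
  nEstimators: 100,
  learningRate: 0.05,
  maxDepth: 3,
  subsample: 0.8,
});
gbr.fit(X, yReg);
gbr.predict(tensor([[3, 2]]));

Subsampling rows and limiting features per split both add randomness that tends to reduce overfitting, at the cost of slightly noisier individual trees.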