deepbox/ml

Tree-Based Models

Decision tree models learn hierarchical if/then rules from data. They are non-parametric, capture non-linear relationships, and require no feature scaling.

DecisionTreeClassifier

Classification tree. Recursively splits the data on the feature/threshold pair that maximizes information gain, i.e. the reduction in impurity (Gini impurity or entropy). Interpretable, but prone to overfitting without depth limits.
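
A minimal standalone sketch (not part of the deepbox API) of how Gini impurity and the impurity decrease ("gain") of a candidate split can be computed; the function names are illustrative only.

gini-gain-sketch.ts

function gini(labels: number[]): number {
  // Gini impurity: 1 - sum of squared class proportions.
  const counts = new Map<number, number>();
  for (const l of labels) counts.set(l, (counts.get(l) ?? 0) + 1);
  let sumSq = 0;
  for (const c of counts.values()) sumSq += (c / labels.length) ** 2;
  return 1 - sumSq; // 0 = pure node, 0.5 = maximally mixed (binary case)
}

function splitGain(left: number[], right: number[]): number {
  // Information gain = parent impurity minus the weighted child impurity.
  const n = left.length + right.length;
  const parent = gini([...left, ...right]);
  const children = (left.length / n) * gini(left) + (right.length / n) * gini(right);
  return parent - children;
}

splitGain([0, 0], [1, 1, 1]); // 0.48: a perfect split of the labels [0, 0, 1, 1, 1]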

DecisionTreeRegressor

Regression tree. Chooses splits that minimize the mean squared error (MSE) within the resulting leaves and predicts the mean target value of the training samples in each leaf node.
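
A standalone sketch (not the library's internals) of one such split: for a single feature, pick the threshold that minimizes the weighted MSE of the two resulting leaves, where each leaf predicts the mean of its training targets.

mse-split-sketch.ts

function mse(ys: number[]): number {
  const mean = ys.reduce((a, b) => a + b, 0) / ys.length;
  return ys.reduce((a, y) => a + (y - mean) ** 2, 0) / ys.length;
}

function bestThreshold(xs: number[], ys: number[]): { threshold: number; cost: number } {
  let best = { threshold: NaN, cost: Infinity };
  for (const t of xs) {
    const left = ys.filter((_, i) => xs[i] <= t);
    const right = ys.filter((_, i) => xs[i] > t);
    if (left.length === 0 || right.length === 0) continue;
    // Weighted MSE of the two leaves; the tree keeps the threshold with the lowest cost.
    const cost = (left.length * mse(left) + right.length * mse(right)) / ys.length;
    if (cost < best.cost) best = { threshold: t, cost };
  }
  return best;
}

bestThreshold([1, 2, 3, 4, 5], [1.5, 2.3, 3.1, 4.0, 5.2]); // { threshold: 3, cost: 0.4 }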

RandomForestClassifier

Ensemble of decision trees trained on random subsets of data and features (bagging). Reduces overfitting compared to a single tree. Robust, parallelizable, and requires minimal tuning.

RandomForestRegressor

Random forest for regression tasks. Averages predictions from multiple decision trees.
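
A minimal usage sketch for the forest classes, assuming they share the fit/predict interface of the single-tree classes shown in tree-models.ts below and accept the options documented under Constructor Options; any additional forest-specific options (e.g. a tree-count setting) are not documented here and are not assumed.

random-forest-usage.ts

import { RandomForestClassifier, RandomForestRegressor } from "deepbox/ml";
import { tensor } from "deepbox/ndarray";

const X = tensor([[1, 2], [2, 3], [3, 1], [4, 3], [5, 2]]);
const y = tensor([0, 0, 1, 1, 1]);

const forest = new RandomForestClassifier({ maxDepth: 10, maxFeatures: "sqrt", criterion: "gini" });
forest.fit(X, y);
forest.predict(tensor([[3, 2]])); // majority vote across the ensemble's trees

const yReg = tensor([1.5, 2.3, 3.1, 4.0, 5.2]);
const forestReg = new RandomForestRegressor({ maxDepth: 10, maxFeatures: "sqrt" });
forestReg.fit(X, yReg);
forestReg.predict(tensor([[3.5, 2.5]])); // average of the trees' predictions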

Constructor Options

  • maxDepth: number — Maximum tree depth (default: unlimited). Lower values prevent overfitting.
  • minSamplesSplit: number — Minimum samples required to split a node (default: 2).
  • minSamplesLeaf: number — Minimum samples in a leaf node (default: 1).
  • maxFeatures: number | 'sqrt' | 'log2' — Number of features to consider at each split.
  • criterion: 'gini' | 'entropy' — Split quality criterion (classifier only).
tree-models.ts
import { DecisionTreeClassifier, DecisionTreeRegressor } from "deepbox/ml";
import { tensor } from "deepbox/ndarray";

const X = tensor([[1, 2], [2, 3], [3, 1], [4, 3], [5, 2]]);
const y = tensor([0, 0, 1, 1, 1]);

const tree = new DecisionTreeClassifier({ maxDepth: 5, minSamplesSplit: 2 });
tree.fit(X, y);
const pred = tree.predict(tensor([[3, 2]])); // [1]

// Regression tree
const yReg = tensor([1.5, 2.3, 3.1, 4.0, 5.2]);
const regTree = new DecisionTreeRegressor({ maxDepth: 3 });
regTree.fit(X, yReg);
regTree.predict(tensor([[3.5, 2.5]]));