tune
Hyperparameter search over a parameter grid, with each trial scored by internal cross-validation. Returns the best model and the full trial history.
Signature
```python
ml.tune(data, target, *, algorithm=None, params=None, method="random", n_trials=20, cv_folds=3, seed)
```

```r
ml_tune(data, target, algorithm = NULL, n_trials = 20L, cv_folds = 3L, method = "random", seed = NULL, params = NULL)
```
Parameters
| Parameter | Type | Default | Description |
|---|---|---|---|
| data | DataFrame | — | Training data |
| target | str | — | Target column |
| algorithm | str or None | None | Algorithm to tune. None uses the model's algorithm or "auto". |
| params | dict or None | None | Parameter ranges. None uses sensible defaults. |
| method | str | "random" | "random" or "grid" |
| n_trials | int | 20 | Number of search trials |
| cv_folds | int | 3 | Cross-validation folds per trial |
| seed | int | — | Random seed |
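For random search, n_trials controls how many parameter combinations are sampled and cv_folds how thoroughly each one is scored. A minimal sketch that widens the search beyond the defaults (the specific values here are illustrative, not recommendations):

```python
# Sketch: sample more combinations and use more CV folds per trial.
# Values are illustrative; larger settings cost proportionally more time.
result = ml.tune(
    s.train, "target",
    algorithm="xgboost",
    method="random",
    n_trials=50,
    cv_folds=5,
    seed=42,
)
```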
Returns
TuningResult with:
- .best_model – the model with the best CV score
- .best_params_ – winning hyperparameters
- .trials – DataFrame of all trials and scores
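All three attributes are available on the returned object. A minimal sketch of inspecting the trial history, assuming .trials behaves like a pandas DataFrame and exposes a score column (the column name is an assumption; check result.trials.columns):

```python
# Sketch: rank trials by CV score. "score" is an assumed column name;
# inspect result.trials.columns for the actual schema.
result = ml.tune(s.train, "target", algorithm="xgboost", seed=42)
ranked = result.trials.sort_values("score", ascending=False)
print(ranked.head())        # best-scoring hyperparameter combinations first
```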
Examples
Default tuning
```python
result = ml.tune(s.train, "target", algorithm="xgboost", seed=42)
print(result.best_params_)
ml.evaluate(result.best_model, s.valid)
```

```r
result <- ml_tune(s$train, "target", algorithm = "xgboost", seed = 42)
result$best_params_
ml_evaluate(result$best_model, s$valid)
```

Custom parameter grid
```python
result = ml.tune(
    s.train, "target",
    algorithm="random_forest",
    params={
        "n_estimators": [100, 200, 500],
        "max_depth": [5, 10, 20, None],
    },
    method="grid",
    seed=42,
)
```

```r
result <- ml_tune(
  s$train, "target",
  algorithm = "random_forest",
  params = list(
    n_estimators = c(100, 200, 500),
    max_depth = c(5, 10, 20)
  ),
  method = "grid",
  seed = 42
)
```
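For reference, the Python grid above spans 3 × 4 = 12 parameter combinations (the R grid spans 3 × 3 = 9, since NULL cannot be stored in a numeric vector). A short follow-up, assuming each evaluated combination appears as one row of .trials — an assumption about behavior rather than something this page guarantees:

```python
# Sketch: with method="grid", the search is assumed to enumerate all
# 3 x 4 = 12 combinations, one row per trial in result.trials.
print(len(result.trials))                # expected 12 under that assumption
print(result.best_params_)               # the winning combination
ml.evaluate(result.best_model, s.valid)  # score it on the validation split
```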