explain

Feature importance via tree-based importances, permutation, or SHAP. Returns an Explanation with ranked features and importance scores.

Signature

ml.explain(model, *, data=None, method="auto", seed=None)
# R: use ml_plot(model, kind = "importance")

Parameters

Parameter  Type              Default   Description
model      Model             required  A fitted model
data       DataFrame | None  None      Data for permutation importance. If None, uses tree-based feature importance.
method     str               "auto"    One of "auto", "permutation", or "shap"
seed       int | None        None      Random seed for permutation shuffling
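
For instance, data and method combine as follows (a hedged sketch reusing model and s.valid from the example below):

exp_tree = ml.explain(model)                   # data=None: tree-based importances
exp_perm = ml.explain(model, data=s.valid,     # held-out data: permutation importance
                      method="permutation", seed=42)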

Returns

Explanation — a DataFrame with feature and importance columns, sorted by importance in descending order.
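
Because the result is a DataFrame, ordinary column access applies (assuming pandas-style indexing, which this page does not spell out):

exp = ml.explain(model, data=s.valid, seed=42)
top_feature = exp["feature"].iloc[0]                   # highest-importance feature
scores = dict(zip(exp["feature"], exp["importance"]))  # name -> score mapping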

Examples

exp = ml.explain(model, data=s.valid, seed=42)
print(exp)

  feature    importance
  fare       0.423
  age        0.376
  embarked   0.051
  pclass     0.051
  sex        0.040
  sibsp      0.031
  parch      0.028

# Visualize
ml.plot(model, kind="importance")
# R: ml_plot(model, kind = "importance")
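
When the shap package is installed (see Notes), SHAP attributions can be requested explicitly; this uses only the documented signature:

exp_shap = ml.explain(model, data=s.valid, method="shap")
print(exp_shap)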

Notes

  • method="auto" uses tree-based importance when available (fast), permutation otherwise.
  • Permutation importance is model-agnostic and measures actual predictive contribution, not just split frequency (see the sketch after these notes).
  • SHAP values are available when the shap package is installed.
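
For intuition about the permutation method, here is a minimal sketch of the computation, assuming a scikit-learn-style model with a score(X, y) method and a pandas DataFrame X; ml.explain's actual internals may differ:

import numpy as np

def permutation_importance_sketch(model, X, y, seed=None):
    # Score drop when each column is shuffled, largest drop first.
    rng = np.random.default_rng(seed)
    baseline = model.score(X, y)
    drops = {}
    for col in X.columns:
        X_perm = X.copy()
        X_perm[col] = rng.permutation(X_perm[col].to_numpy())
        # A feature the model relies on loses score once its values are shuffled
        drops[col] = baseline - model.score(X_perm, y)
    return sorted(drops.items(), key=lambda kv: kv[1], reverse=True)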