ElixirConf 2023 - Andrés Alejos - Nx Powered Decision Trees

Explore the power of Nx-powered decision trees in machine learning, covering classification, regression, feature selection, and more, as presented at ElixirConf 2023 by Andrés Alejos.

Key takeaways
  • Decision trees can be used for classification problems, where the target is a discrete class label.
  • Decision trees can be grown to a specified maximum depth or halted earlier by other stopping criteria, such as a minimum number of samples per split.
  • The optimal splits of a decision tree are determined by criteria such as Gini impurity or information gain (see the Nx sketch of Gini impurity after this list).
  • Ensembles of decision trees can be trained with boosting algorithms such as gradient boosting.
  • XGBoost is a popular, efficient implementation of gradient-boosted decision trees (a hedged training sketch appears after this list).
  • Decision trees can be used for feature selection and dimensionality reduction.
  • Decision trees can be used for regression problems where the target is a continuous value.
  • Decision trees can support clustering-related tasks, for example by predicting cluster assignments produced by another algorithm.
  • Decision trees are useful for data preprocessing and feature engineering, and their structure makes them interpretable, with built-in measures of variable importance.
  • Decision trees serve as building blocks for model ensembling and stacking.
  • Further applications include time series forecasting, recommender systems and content-based filtering, information retrieval and text classification, and natural language processing and speech recognition.
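
To make the split criteria concrete, here is a minimal sketch of Gini impurity computed with Nx. The module name, the pinned Nx version, and the example class counts are illustrative choices, not code from the talk; the formula is the standard 1 - sum(p_i^2) over the class probabilities p_i at a node.

    # Minimal Gini impurity sketch using Nx (illustrative; not the talk's code).
    Mix.install([{:nx, "~> 0.6"}])

    defmodule GiniExample do
      import Nx.Defn

      # counts: a 1-D tensor of class counts at a tree node.
      # Gini impurity = 1 - sum(p_i^2), where p_i is the fraction of samples in class i.
      defn gini(counts) do
        probs = counts / Nx.sum(counts)
        1 - Nx.sum(probs * probs)
      end
    end

    # A node holding 30 samples of one class and 10 of another:
    GiniExample.gini(Nx.tensor([30.0, 10.0])) |> IO.inspect()
    # => about 0.375 (a perfectly pure node scores 0.0)

A lower Gini value means a purer node, so a split is chosen to minimize the weighted impurity of the resulting children.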
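
The XGBoost takeaway can also be sketched from Elixir. The snippet below assumes the speaker's EXGBoost library with its train/2 and predict/2 functions; the dependency versions and the synthetic random data are assumptions made for illustration, not code shown in the talk.

    # Hedged sketch: gradient-boosted trees via EXGBoost (assumed API and versions).
    Mix.install([{:nx, "~> 0.6"}, {:exgboost, "~> 0.5"}])

    key = Nx.Random.key(42)

    # Synthetic data: 100 rows, 5 features, and a continuous target (a regression setup).
    {x, key} = Nx.Random.normal(key, 0.0, 1.0, shape: {100, 5})
    {y, _key} = Nx.Random.normal(key, 0.0, 1.0, shape: {100})

    # Train a boosted tree ensemble and predict on the training features.
    model = EXGBoost.train(x, y)
    EXGBoost.predict(model, x) |> IO.inspect()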