The rapid growth of machine learning (ML) has produced an almost overwhelmingly large number of modelling techniques, demanding better elucidation of their strengths and weaknesses in applied contexts. Tree-based methods such as Random Forests (RF) and Boosted Regression Trees (BRT) are powerful ML approaches that make no assumptions about the functional form of the relationship between response and predictors, are flexible in handling missing data, and can readily capture complex, non-linear interactions. As with many ML methods, however, RF and BRT are potentially vulnerable to overfitting and a consequent loss of generalizability.
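A minimal sketch of this overfitting concern, assuming scikit-learn (the dataset, model settings, and train/test split below are illustrative choices, not from the original text): both ensembles are fit to a synthetic non-linear regression problem, and their fit to the training data is compared against performance on held-out data.

```python
from sklearn.datasets import make_friedman1
from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor
from sklearn.model_selection import train_test_split

# Synthetic non-linear regression data (illustrative choice of dataset).
X, y = make_friedman1(n_samples=300, noise=1.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

models = {
    "Random Forest": RandomForestRegressor(n_estimators=500, random_state=0),
    # GradientBoostingRegressor is scikit-learn's BRT implementation.
    "Boosted Regression Trees": GradientBoostingRegressor(
        n_estimators=500, random_state=0
    ),
}

scores = {}
for name, model in models.items():
    model.fit(X_train, y_train)
    # R^2 on training vs. held-out data; a large gap signals overfitting.
    scores[name] = (model.score(X_train, y_train), model.score(X_test, y_test))
    print(f"{name}: train R^2 = {scores[name][0]:.2f}, "
          f"test R^2 = {scores[name][1]:.2f}")
```

Typically the training R^2 substantially exceeds the held-out R^2 for both methods, which is why out-of-sample evaluation (or cross-validation) is essential when applying them.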