We are extremely excited to announce the release of a foundational piece of our ML pipeline: hyperparameter tuning. This is one in a series of updates designed to ensure our predictive models deliver the highest-quality predictions.
When a new model is fit, the training process selects the features and learns their contributions; in addition, the model's configuration settings (also known as hyperparameters) can be tuned to boost performance. We tune hyperparameters extensively in AmpID as well as AmpIQ. For predictive customer lifetime value (or pCLV), some of the hyperparameters we consider include the number of trees in the random forest, the outlier definition, and the selection of features that go into the model.
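To make this concrete, here is a minimal sketch of what tuning those knobs can look like, using scikit-learn's RandomizedSearchCV on toy data. The dataset, the cap_outliers helper, and the search ranges are illustrative assumptions for this example only, not our actual pCLV implementation.

```python
# Hypothetical sketch of hyperparameter search for a random forest using
# scikit-learn's RandomizedSearchCV. The data, the cap_outliers helper, and
# the search ranges are illustrative only; this is not our pCLV pipeline.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import RandomizedSearchCV

rng = np.random.default_rng(0)

# Toy data standing in for per-user behavioral features and observed spend.
X = rng.normal(size=(1000, 8))
y = np.abs(rng.normal(size=1000)) * 100.0


def cap_outliers(target, percentile):
    """Treat the outlier definition as a tunable choice: cap extreme spend
    values at a chosen percentile before fitting (hypothetical helper)."""
    return np.minimum(target, np.percentile(target, percentile))


search_space = {
    "n_estimators": [100, 200, 500],     # number of trees in the forest
    "max_depth": [5, 10, None],          # how deep each tree may grow
    "max_features": ["sqrt", 0.5, 1.0],  # share of features tried per split
}

best = None
for pct in (95, 99, 100):  # candidate outlier cutoffs (100 = no capping)
    search = RandomizedSearchCV(
        RandomForestRegressor(random_state=0),
        search_space,
        n_iter=5,
        cv=3,
        scoring="neg_mean_absolute_error",
        random_state=0,
    )
    # Simplification: a real pipeline would hold the evaluation target fixed
    # rather than comparing scores across differently capped targets.
    search.fit(X, cap_outliers(y, pct))
    if best is None or search.best_score_ > best[0]:
        best = (search.best_score_, pct, search.best_params_)

print(f"best MAE={-best[0]:.2f}, outlier pct={best[1]}, params={best[2]}")
```

Even in this simplified form, the search surfaces the same trade-offs described above: more trees and deeper trees cost training time, while the outlier cutoff and feature subset shape what the model learns.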
In this latest update, we've added automated hyperparameter tuning to our pCLV model. What does this mean? Simply that the model delivers better predictions with less training time, which means more value for you.
Hyperparameter tuning is the heart and soul of an Automated ML pipeline. In fact, the latest research suggests that hyperparameter selection can be even more important than algorithm selection. Over the coming months, we will further invest in our hyperparameter tuning capabilities and provide more insights about the resulting improvements in our models.