Webinar 2021 Scaling Hyperparameter Tuning with Dask-ML on Clusters


Hyperparameter tuning is an essential step in building a machine learning model, particularly when good performance is required. Scikit-learn usually works well when the search space is relatively small and the dataset fits in the memory of a single computer. Scaling up hyperparameter tuning can be challenging, however, because the computations become much more intensive. Some problems are "memory constrained", e.g. when a model developed locally must be tuned on a much larger dataset; others are "compute constrained", e.g. when the search space is large and tuning many hyperparameters takes days or weeks. Dask-ML provides a framework and tools to help with both situations. This webinar offers some suggestions and demonstrates examples on the Graham cluster.
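
The sketch below illustrates one common "compute constrained" pattern, not taken from the webinar itself: a standard scikit-learn search whose cross-validation fits are dispatched to Dask workers through the joblib backend. The classifier, parameter grid, and toy dataset are illustrative choices, and the bare Client() call assumes a locally started scheduler; on a cluster such as Graham you would instead connect to a scheduler launched through the job system (e.g. with dask-jobqueue).

<syntaxhighlight lang="python">
import joblib
import numpy as np
from dask.distributed import Client
from sklearn.datasets import make_classification
from sklearn.linear_model import SGDClassifier
from sklearn.model_selection import RandomizedSearchCV

# Connect to a Dask scheduler; with no arguments this starts a local cluster.
client = Client()

# Toy data standing in for a real training set.
X, y = make_classification(n_samples=10_000, n_features=20, random_state=0)

# Hypothetical search space over SGDClassifier hyperparameters.
params = {
    "alpha": np.logspace(-5, -1, 20),
    "penalty": ["l1", "l2", "elasticnet"],
}

search = RandomizedSearchCV(SGDClassifier(), params, n_iter=30, cv=3)

# Run the cross-validation fits on Dask workers instead of local processes.
with joblib.parallel_backend("dask"):
    search.fit(X, y)

print(search.best_params_, search.best_score_)
</syntaxhighlight>

For the "memory constrained" case, Dask-ML also offers its own searchers (e.g. HyperbandSearchCV) that work with estimators supporting partial_fit and with data held in Dask arrays, so the full dataset never needs to fit on a single machine.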