Sherpa is a free open-source hyperparameter optimization library for machine learning models. It is designed for problems with computationally expensive iterative function evaluations, such as the hyperparameter tuning of deep neural networks. With Sherpa, scientists can quickly optimize hyperparameters using a variety of powerful and interchangeable algorithms. Additionally, the framework makes it easy to implement custom algorithms. Sherpa can be run on either a single machine or a cluster via a grid scheduler with minimal configuration. Finally, an interactive dashboard enables users to view the progress of models as they are trained, cancel trials, and explore which hyperparameter combinations are working best. Sherpa empowers machine learning researchers by automating the tedious aspects of model tuning and providing an extensible framework for developing automated hyperparameter-tuning strategies. Its source code and documentation are available at https://github.com/LarsHH/sherpa and https://parameter-sherpa.readthedocs.io/, respectively. A demo can be found at https://youtu.be/L95sasMLgP4.
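The workflow the abstract describes is an iterative ask/tell loop: the optimizer proposes a hyperparameter configuration, the user trains a model, and the observed objective is fed back. The sketch below mimics that loop in plain Python with a simple random-search strategy; the parameter names, ranges, and toy objective are illustrative assumptions, not Sherpa's actual API (see the documentation linked above for the real interface).

```python
import random

# Toy search space: learning rate (log-uniform) and hidden-layer size (choice).
# These names and ranges are illustrative, not taken from the paper.
def sample_hyperparameters(rng):
    return {
        "lr": 10 ** rng.uniform(-4, -1),
        "num_units": rng.choice([32, 64, 128]),
    }

def evaluate(params):
    # Stand-in for an expensive training run; lower objective is better.
    return abs(params["lr"] - 0.01) + 1.0 / params["num_units"]

def random_search(num_trials=20, seed=0):
    rng = random.Random(seed)
    best_params, best_obj = None, float("inf")
    for _ in range(num_trials):
        params = sample_hyperparameters(rng)  # "ask": propose a trial
        obj = evaluate(params)                # train and score the model
        if obj < best_obj:                    # "tell": record the result
            best_params, best_obj = params, obj
    return best_params, best_obj
```

In Sherpa itself, the random-search strategy above is one of several interchangeable algorithms, and the training step would be a real model fit whose intermediate results also stream to the dashboard.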
Author Information
Peter Sadowski (University of Hawai‘i)
More from the Same Authors
- 2020: Hip Fracture Risk Modeling Using DXA and Deep Learning (Peter Sadowski)
- 2022: Self-supervised detection of atmospheric phenomena from remotely sensed synthetic aperture radar imagery (Yannik Glaser · Peter Sadowski · Justin Stopa)
- 2020: Nowcasting Solar Irradiance Over Oahu (Peter Sadowski)
- 2014 Poster: Searching for Higgs Boson Decay Modes with Deep Learning (Peter Sadowski · Daniel Whiteson · Pierre Baldi)
- 2014 Spotlight: Searching for Higgs Boson Decay Modes with Deep Learning (Peter Sadowski · Daniel Whiteson · Pierre Baldi)
- 2013 Poster: Understanding Dropout (Pierre Baldi · Peter Sadowski)
- 2013 Oral: Understanding Dropout (Pierre Baldi · Peter Sadowski)