

Keynote Talk in Workshop: Efficient Natural Language and Speech Processing (Models, Training, and Inference)

Benchmarks for Multi-objective Hyperparameter Optimization

Kevin Duh


Abstract:

The speed, size, and accuracy of deep neural networks often depend on hyperparameters such as network depth and architecture type. Hyperparameter optimization and neural architecture search are promising techniques that help developers build the best possible network under budget constraints. I will discuss the importance of building benchmarks to evaluate these techniques in a multi-objective way. By incorporating multiple objectives such as training time, inference speed, and model size into hyperparameter optimization, we can evaluate the entire model development and deployment process more holistically.
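
As a rough illustration of the kind of comparison such a multi-objective benchmark supports, the sketch below computes a Pareto front over a few hyperparameter configurations scored on accuracy, inference latency, and model size. The configuration names and numbers are invented for illustration and are not taken from the talk; this is a minimal sketch of Pareto filtering, not the speaker's benchmark code.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class Config:
    """One evaluated hyperparameter configuration (hypothetical fields)."""
    name: str
    accuracy: float      # higher is better
    inference_ms: float  # lower is better
    size_mb: float       # lower is better


def dominates(a: Config, b: Config) -> bool:
    """True if `a` is at least as good as `b` on every objective and strictly better on one."""
    at_least_as_good = (a.accuracy >= b.accuracy
                        and a.inference_ms <= b.inference_ms
                        and a.size_mb <= b.size_mb)
    strictly_better = (a.accuracy > b.accuracy
                       or a.inference_ms < b.inference_ms
                       or a.size_mb < b.size_mb)
    return at_least_as_good and strictly_better


def pareto_front(configs: List[Config]) -> List[Config]:
    """Keep only configurations that no other configuration dominates."""
    return [c for c in configs
            if not any(dominates(other, c) for other in configs)]


# Hypothetical benchmark entries (values invented for illustration).
candidates = [
    Config("transformer-6L",  accuracy=0.92, inference_ms=35.0, size_mb=240.0),
    Config("transformer-12L", accuracy=0.94, inference_ms=70.0, size_mb=480.0),
    Config("lstm-2L",         accuracy=0.88, inference_ms=20.0, size_mb=90.0),
    Config("lstm-4L",         accuracy=0.89, inference_ms=45.0, size_mb=300.0),  # dominated by transformer-6L
]

for c in pareto_front(candidates):
    print(c.name, c.accuracy, c.inference_ms, c.size_mb)
```

Under these made-up numbers, the slower and larger lstm-4L is dominated by transformer-6L and drops out, while the remaining configurations each trade accuracy against latency or size and so stay on the front.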