Multi-Objective Optimization of Hyperparameter Tuning

Advisor(s)

Dr. Ian Kropp

Confirmation

1

Document Type

Poster

Location

ONU McIntosh Center; Activities Room

Start Date

11-4-2025 12:00 PM

End Date

11-4-2025 12:50 PM

Abstract

Hyperparameter tuning is crucial in optimizing deep learning models, often requiring a balance between computational efficiency and model performance. This research explores multi-objective optimization for hyperparameter tuning, focusing on the trade-off between training time and the resulting model accuracy. We used Pymoo, a Python library for multi-objective optimization, adapting example problems from its documentation to fit our needs. A custom dataset was built by taking a default configuration of a simple CNN image classification model, systematically altering its hyperparameters across runs, and recording the training time and resulting accuracy of each run. This dataset was fed into our formulated problem statement and constraints, and the multi-objective model converged to a set of trade-off solutions. Because the problem does not converge on any single result, an outside client chooses the best-fit model for their needs. This research aims to identify multiple optimal configurations that maximize accuracy while minimizing computational cost, providing insights into efficient model training strategies. Results highlight the benefits of adaptive tuning approaches in achieving an optimal balance between performance and resource consumption.
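The selection step described above — converging to multiple trade-off configurations rather than a single winner — can be sketched in plain Python. Given recorded (training time, accuracy) pairs, the non-dominated configurations form the set a client would choose from. The hyperparameter names and numbers below are hypothetical illustrations, not results from this research.

```python
def pareto_front(configs):
    """Return the configurations not dominated by any other.

    Each entry is (name, training_time_s, accuracy). One config dominates
    another if it is no slower AND no less accurate, and strictly better
    in at least one of the two objectives.
    """
    front = []
    for name, t, acc in configs:
        dominated = any(
            (t2 <= t and a2 >= acc) and (t2 < t or a2 > acc)
            for _, t2, a2 in configs
        )
        if not dominated:
            front.append((name, t, acc))
    return front


# Hypothetical recorded runs of a small CNN under different hyperparameters.
runs = [
    ("lr=0.01, bs=32",  120.0, 0.88),
    ("lr=0.001, bs=32", 150.0, 0.91),
    ("lr=0.01, bs=128",  90.0, 0.85),
    ("lr=0.1, bs=64",   100.0, 0.80),  # slower AND less accurate than bs=128: dominated
]
for name, t, acc in pareto_front(runs):
    print(f"{name}: {t:.0f}s, acc={acc:.2f}")
```

Here the first three runs survive because each is fastest for its accuracy level; a client who values speed picks the 90-second model, one who values accuracy picks the 150-second model. In the research itself this role is played by Pymoo's multi-objective algorithms over the full dataset.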


Restricted

Available to ONU community via local IP address and ONU login.
