Nature-based Hyperparameter Tuning of a Multilayer Perceptron Algorithm in Task Classification: A Case Study on Fear of Failure in Entrepreneurship
Abstract
Entrepreneurship plays a key role in generating economic growth, encouraging innovation, and creating job opportunities. Understanding which demographic, psychological, and socio-economic factors contribute to fear of failure in entrepreneurship is essential to developing proper standards in entrepreneurship education and policy. However, it remains challenging to classify these factors accurately, especially when balancing model performance against model complexity in a multilayer perceptron algorithm. An effective model requires the correct parameter settings, obtained through a hyperparameter tuning process. Adjusting each hyperparameter by hand requires significant effort and expertise, as there are frequently many combinations to consider. Furthermore, manual tuning is prone to human error and may overlook optimal configurations, resulting in inferior model performance and prediction accuracy. This study evaluates nature-inspired optimization techniques, including particle swarm optimization (PSO), the genetic algorithm (GA), and grey wolf optimization (GWO). Several parameters of the multilayer perceptron model are tuned, including the number of hidden layers, the number of nodes in each hidden layer, the learning rate, and the activation functions. The dataset used consists of 39 features from 333 samples capturing individual fear of failure. Evaluation considered classification accuracy, loss score, and computational efficiency, measured as the time required to find the best parameter combination. Model accuracy scores are 45.16%, 53.76%, and 58.61% for GA, PSO, and GWO, respectively, while execution times are 10 minutes, 27 minutes, and 23 minutes, respectively. The experimental results further reveal that each optimization algorithm has distinct advantages: GA excels at speedy convergence, PSO provides robust exploration of the hyperparameter space, and GWO offers remarkable adaptability to complicated parameter interdependencies.
This study provides empirical evidence for the efficacy of nature-inspired hyperparameter tuning in improving multilayer perceptron performance for fear of failure classification tasks.
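To make the tuning procedure concrete, the following is a minimal sketch of PSO-based hyperparameter tuning for a multilayer perceptron, using scikit-learn's MLPClassifier. The synthetic dataset, search bounds, swarm size, and iteration counts are illustrative assumptions, not the study's actual data or settings; here only one hidden layer's width and the learning rate are searched.

```python
# Hedged sketch: PSO tuning of an MLP's hidden-layer width and learning rate.
# Assumptions: synthetic data via make_classification; small swarm and few
# iterations so the example runs quickly. Not the study's configuration.
import warnings
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=300, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Search space per particle: [hidden units, log10(learning rate)]
lo, hi = np.array([4.0, -4.0]), np.array([64.0, -1.0])

def fitness(p):
    """Held-out accuracy of an MLP built from a particle position."""
    units, log_lr = int(round(p[0])), p[1]
    with warnings.catch_warnings():
        warnings.simplefilter("ignore")  # small max_iter may not converge
        clf = MLPClassifier(hidden_layer_sizes=(units,),
                            learning_rate_init=10.0 ** log_lr,
                            max_iter=50, random_state=0).fit(X_tr, y_tr)
    return clf.score(X_te, y_te)

# Standard PSO update: velocities pulled toward personal and global bests.
n, iters, w, c1, c2 = 6, 5, 0.7, 1.5, 1.5
pos = rng.uniform(lo, hi, size=(n, 2))
vel = np.zeros_like(pos)
pbest, pbest_fit = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[pbest_fit.argmax()].copy()

for _ in range(iters):
    r1, r2 = rng.random((n, 2)), rng.random((n, 2))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)  # keep particles inside the bounds
    fit = np.array([fitness(p) for p in pos])
    improved = fit > pbest_fit
    pbest[improved], pbest_fit[improved] = pos[improved], fit[improved]
    gbest = pbest[pbest_fit.argmax()].copy()

best_units, best_lr = int(round(gbest[0])), 10.0 ** gbest[1]
print(f"best: {best_units} units, lr={best_lr:.4g}, acc={pbest_fit.max():.3f}")
```

GA and GWO variants would keep the same fitness function and only swap the position-update rule (crossover and mutation for GA; alpha, beta, delta wolf guidance for GWO), which is why the three methods are directly comparable on accuracy and wall-clock time.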
Journal of Applied Data Sciences
ISSN: 2723-6471 (Online)
Organized by: Computer Science and Systems Information Technology, King Abdulaziz University, Kingdom of Saudi Arabia.
Website: http://bright-journal.org/JADS
Email: taqwa@amikompurwokerto.ac.id (principal contact); support@bright-journal.org (technical issues)
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.