Show simple item record

Feature Selection in Classification by means of Optimization and Multi-Objective Optimization

dc.contributor.author: Pirouz, Behzad
dc.contributor.author: Fortino, Giancarlo
dc.contributor.author: Gaudioso, Manlio
dc.date.accessioned: 2025-06-20T10:38:03Z
dc.date.available: 2025-06-20T10:38:03Z
dc.date.issued: 2023-05-10
dc.identifier.uri: https://hdl.handle.net/10955/5626
dc.description: Università della Calabria, Dipartimento di Ingegneria Informatica, Modellistica, Elettronica e Sistemistica (DIMES). PhD programme (Dottorato di Ricerca) in Information and Communication Technologies (ICT), Cycle XXXV.
dc.description.abstract: The thesis is in the area of mathematical optimization with application to Machine Learning. The focus is on Feature Selection (FS) in the framework of binary classification via the Support Vector Machine (SVM) paradigm. We concentrate on sparse optimization techniques, which are widely considered the tool of choice for tackling FS. We study the problem both in single-objective and in multi-objective terms. We first propose a novel Mixed-Integer Nonlinear Programming (MINLP) model for sparse optimization based on the polyhedral k-norm, introducing a new way to accommodate the k-norm through a formulation based on fractional programming (FP). We then address the continuous relaxation of the problem, which is reformulated via a DC (Difference of Convex) decomposition. On the other hand, designing supervised learning systems is, in general, a multi-objective problem: it requires appropriate trade-offs between several objectives, for example between the classification error on the training data (minimizing the squared error) and the sparsity of the separating hyperplane (minimizing the number of nonzero components of its normal vector). A multi-objective optimization problem does not, in general, admit a single solution that is best for all objectives simultaneously; instead of one solution there is a set of solutions, known as the Pareto-optimal solutions. We overview SVM models and the related Feature Selection in terms of multi-objective optimization. Our multi-objective approach considers two simultaneous objectives: minimizing the squared error and minimizing the number of nonzero elements of the normal vector of the separating hyperplane. In this thesis we propose a multi-objective model for sparse optimization; our primary purpose is to demonstrate the advantages of treating SVM models as multi-objective optimization problems.
In the multi-objective case we obtain a set of Pareto-optimal solutions instead of the single solution of the single-objective case. Our main contribution is therefore twofold: first, we propose a new model for sparse optimization based on the polyhedral k-norm for SVM classification; second, we address this new model via multi-objective optimization. The results of several numerical experiments on classification datasets are reported; every dataset was used for both the single-objective and the multi-objective models.
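As background for the polyhedral k-norm device mentioned in the abstract (a standard identity from the sparse-optimization literature, sketched here, not the thesis's exact formulation): for $x \in \mathbb{R}^n$, the k-norm $\|x\|_{[k]}$ is the sum of the $k$ largest components in absolute value, and it captures the cardinality constraint exactly:

```latex
\[
  \|x\|_{[k]} \;=\; \max_{\substack{I \subseteq \{1,\dots,n\} \\ |I| = k}} \; \sum_{i \in I} |x_i| ,
  \qquad
  \|x\|_0 \le k \;\Longleftrightarrow\; \|x\|_1 - \|x\|_{[k]} = 0 .
\]
```

Since both $\|\cdot\|_1$ and $\|\cdot\|_{[k]}$ are convex (the k-norm is a pointwise maximum of linear functions, hence polyhedral), the penalty $\|x\|_1 - \|x\|_{[k]}$ is a difference of convex functions, which is what makes a DC reformulation of the continuous relaxation possible.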
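To illustrate the Pareto-optimality notion used in the abstract, here is a minimal sketch (not the thesis's method) of filtering nondominated points when both objectives, e.g. squared training error and number of nonzero components of the hyperplane's normal vector, are to be minimized. The candidate pairs below are hypothetical:

```python
def pareto_front(points):
    """Return the sorted nondominated subset of (obj1, obj2) pairs,
    assuming both objectives are minimized."""
    def dominates(q, p):
        # q dominates p: no worse in every objective, strictly better in one.
        return all(a <= b for a, b in zip(q, p)) and any(a < b for a, b in zip(q, p))
    return sorted(p for p in set(points) if not any(dominates(q, p) for q in points))

# Hypothetical (squared error, nonzero count) pairs for candidate hyperplanes:
candidates = [(0.10, 9), (0.12, 5), (0.12, 7), (0.30, 2), (0.05, 12)]
print(pareto_front(candidates))
# → [(0.05, 12), (0.1, 9), (0.12, 5), (0.3, 2)]
```

Note that (0.12, 7) is discarded because (0.12, 5) is no worse in error and strictly sparser; the surviving points form the error/sparsity trade-off curve a decision maker would choose from.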
dc.description.sponsorship: The PhD scholarship was co-funded with resources from the Programma Operativo Nazionale Ricerca e Innovazione 2014-2020 (CCI 2014IT16M2OP005), Fondo Sociale Europeo, Azione I.1 "Dottorati Innovativi con caratterizzazione Industriale".
dc.language.iso: en
dc.publisher: Università della Calabria
dc.relation.ispartofseries: MAT/09
dc.subject: Feature Selection
dc.subject: Classification
dc.subject: Sparse Optimization
dc.subject: K-norm
dc.subject: Multi-Objective Optimization
dc.title: Feature Selection in Classification by means of Optimization and Multi-Objective Optimization
dc.type: Thesis


Files in this item

This item appears in the following collections
