Feature Selection in Classification by means of Optimization and Multi-Objective Optimization
dc.contributor.author | Pirouz, Behzad | |
dc.contributor.author | Fortino, Giancarlo | |
dc.contributor.author | Gaudioso, Manlio | |
dc.date.accessioned | 2025-06-20T10:38:03Z | |
dc.date.available | 2025-06-20T10:38:03Z | |
dc.date.issued | 2023-05-10 | |
dc.identifier.uri | https://hdl.handle.net/10955/5626 | |
dc.description | Università della Calabria, Dipartimento di Ingegneria Informatica, Modellistica, Elettronica e Sistemistica (DIMES). Doctoral Programme (Dottorato di Ricerca) in Information and Communication Technologies (ICT), Cycle XXXV | en_US |
dc.description.abstract | The thesis is in the area of mathematical optimization with application to Machine Learning. The focus is on Feature Selection (FS) in the framework of binary classification via the Support Vector Machine (SVM) paradigm. We concentrate on sparse optimization techniques, which are widely considered the tool of choice for tackling FS. We study the problem both in terms of single-objective and multi-objective optimization. We first propose a novel Mixed-Integer Nonlinear Programming (MINLP) model for sparse optimization based on the polyhedral k-norm. We introduce a new way to take the k-norm into account for sparse optimization by setting up a model based on fractional programming (FP). We then address the continuous relaxation of the problem, which is reformulated via a DC (Difference of Convex) decomposition. On the other hand, designing supervised learning systems is, in general, a multi-objective problem. It requires finding appropriate trade-offs between several objectives, for example, between the number of misclassified training data (minimizing the squared error) and the sparsity of the separating hyperplane (minimizing the number of nonzero elements of its normal vector). In multi-objective optimization problems, there is in general no single solution that is best for all objectives simultaneously. Consequently, instead of a single solution there is a set of solutions, known as the Pareto-optimal solutions. We review SVM models and the related Feature Selection problem in terms of multi-objective optimization. Our multi-objective approach considers two simultaneous objectives: minimizing the squared error and minimizing the number of nonzero elements of the normal vector of the separating hyperplane. In this thesis, we propose a multi-objective model for sparse optimization. Our primary purpose is to demonstrate the advantages of treating SVM models as multi-objective optimization problems.
In the multi-objective case, we obtain a set of Pareto-optimal solutions rather than the single solution of the single-objective case. The main contribution of this thesis is therefore twofold: first, we propose a new model for sparse optimization based on the polyhedral k-norm for SVM classification; second, we address this new model through multi-objective optimization. The results of several numerical experiments on classification datasets are reported; all datasets were used for both the single-objective and the multi-objective models. | en_US |
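Two notions from the abstract can be made concrete with a short illustrative sketch (not the thesis code): the polyhedral k-norm, i.e. the sum of the k largest absolute components of a vector, and the extraction of the Pareto-optimal set from candidate (squared error, nonzero count) pairs when both objectives are minimized. Function names `k_norm` and `pareto_front` are chosen here for illustration; NumPy is assumed.

```python
import numpy as np

def k_norm(x, k):
    """Polyhedral k-norm: sum of the k largest absolute components of x.
    For k = 1 it is the infinity norm; for k = len(x) it is the 1-norm."""
    a = np.sort(np.abs(np.asarray(x, dtype=float)))[::-1]  # descending |x_i|
    return float(a[:k].sum())

def pareto_front(points):
    """Non-dominated (Areto-optimal) subset of (f1, f2) pairs, both minimized,
    e.g. f1 = squared error, f2 = number of nonzero elements of the normal vector.
    A point is kept iff no other point is <= in both objectives and < in one."""
    front, best_f2 = [], float("inf")
    for p in sorted(points):          # ascending in f1, ties broken by f2
        if p[1] < best_f2:            # strictly improves the second objective
            front.append(p)
            best_f2 = p[1]
    return front
```

For example, `k_norm([3, -1, 2], 2)` sums the two largest magnitudes (3 and 2), and `pareto_front` drops every candidate classifier that another candidate beats on both error and sparsity, leaving the trade-off curve discussed in the abstract.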
dc.description.sponsorship | The doctoral scholarship was co-financed with resources from the Programma Operativo Nazionale Ricerca e Innovazione 2014-2020 (CCI 2014IT16M2OP005), Fondo Sociale Europeo, Azione I.1 "Dottorati Innovativi con caratterizzazione Industriale" | en_US |
dc.language.iso | en | en_US |
dc.publisher | Università della Calabria | en_US |
dc.relation.ispartofseries | MAT/09; | |
dc.subject | Feature Selection | en_US |
dc.subject | Classification | en_US |
dc.subject | Sparse Optimization | en_US |
dc.subject | K-norm | en_US |
dc.subject | Multi-Objective Optimization | en_US |
dc.title | Feature Selection in Classification by means of Optimization and Multi-Objective Optimization | en_US |
dc.type | Thesis | en_US |