Amplification d'arbres de régression compatibles avec l'encodage de la sortie, application à la reconnaissance des images de chiffres manuscrits

Authors: Ben Fadhel, Khalil
Advisors: Marchand, Mario; Laviolette, François
Abstract: Boosting is a widely used approach for solving classification and regression problems. Its strength lies in its ability to combine individual weak classifiers into a strong one. The theory of boosting is well established as gradient descent in functional space; however, the design of a boostable weak learner remains an open issue. Inspired by the AdaBoost-MH and XGBoost algorithms, we propose a new family of weak learners called confidence-rated multi-class Hamming trees, where a tree supports output coding, performs a single disjoint partitioning of the input space, and outputs a real-valued vector in order to better approximate the negative functional gradient of the cost function. We also propose a joint boosting algorithm, called QuadBoost-MHCR (Quadratic-loss Boosting with Multi-class Hamming output encoding and Confidence-Rated predictions). The algorithm minimizes a multi-class L2 loss function and is easy to extend, in an XGBoost fashion, to minimize any twice-differentiable loss function.
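The general recipe the abstract describes (encode labels as ±1 output vectors, then boost regression trees on the negative gradient of a multi-class L2 loss) can be sketched as follows. This is a minimal illustration, not the thesis's QuadBoost-MHCR algorithm: it substitutes scikit-learn's off-the-shelf multi-output `DecisionTreeRegressor` for the proposed confidence-rated multi-class Hamming trees, and all function names and hyperparameter values here are illustrative assumptions.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def quadboost_l2_sketch(X, y, n_classes, n_rounds=20, lr=0.5, depth=3):
    """Boost multi-output regression trees on the L2 residual (sketch)."""
    # One-vs-all +/-1 "Hamming" output coding of the class labels.
    Y = -np.ones((len(y), n_classes))
    Y[np.arange(len(y)), y] = 1.0

    F = np.zeros_like(Y)  # current additive ensemble prediction
    trees = []
    for _ in range(n_rounds):
        # For the quadratic loss 1/2 * ||Y - F||^2, the negative
        # functional gradient is simply the residual Y - F.
        residual = Y - F
        # Each tree performs one disjoint partitioning of the input
        # space and emits a real-valued vector per leaf (here via a
        # standard multi-output regression tree, as an approximation).
        tree = DecisionTreeRegressor(max_depth=depth, random_state=0)
        tree.fit(X, residual)
        F += lr * tree.predict(X)
        trees.append(tree)
    return trees

def quadboost_predict(trees, X, lr=0.5):
    # Sum the (shrunken) tree outputs, then decode: argmax over the
    # output-coded coordinates recovers the predicted class.
    F = sum(lr * t.predict(X) for t in trees)
    return F.argmax(axis=1)
```

Extending this sketch to an arbitrary twice-differentiable loss, in the XGBoost fashion mentioned in the abstract, would amount to fitting each tree against first- and second-order gradient statistics instead of the plain residual.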
Document Type: Master's thesis (mémoire de maîtrise)
Issue Date: 2019
Open Access Date: 2 August 2019
Grantor: Université Laval
Collection: Thèses et mémoires

Files in this item:
35311.pdf — 2.22 MB — Adobe PDF
All documents in CorpusUL are protected by the Copyright Act of Canada.