
High-dimensional classification by sparse logistic regression

Virtual conference
Authors : Abramovich, Felix (Author of the conference)
CIRM (Publisher)


Abstract : In this talk we consider high-dimensional classification. We first discuss high-dimensional binary classification by sparse logistic regression, propose a model/feature selection procedure based on penalized maximum likelihood with a complexity penalty on the model size, and derive non-asymptotic bounds for the resulting misclassification excess risk. Implementation of any complexity-penalty-based criterion, however, requires a combinatorial search over all possible models. To obtain a model selection procedure that is computationally feasible for high-dimensional data, we consider logistic Lasso and Slope classifiers and show that they also achieve the optimal rate. We further extend the proposed approach to multiclass classification by sparse multinomial logistic regression.

This is joint work with Vadim Grinshtein and Tomer Levy.
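To make the approach concrete: the computationally feasible classifiers mentioned in the abstract replace the combinatorial complexity penalty on the model size by a convex l1 penalty on the coefficients (Lasso) or a sorted-l1 penalty (Slope). The following minimal sketch is not taken from the talk; it uses scikit-learn's l1-penalized logistic regression purely as an illustration, and all data, parameter values, and library choices are assumptions:

    # Hypothetical illustration (not from the talk): a logistic Lasso classifier
    # for sparse high-dimensional binary classification, with the l1 penalty
    # acting as a convex relaxation of the complexity penalty on model size.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    # Simulated high-dimensional data: d features, only s of them truly relevant.
    n, d, s = 200, 1000, 5
    X = rng.standard_normal((n, d))
    beta = np.zeros(d)
    beta[:s] = 2.0                       # sparse true coefficient vector
    p = 1.0 / (1.0 + np.exp(-X @ beta))  # logistic model for P(Y = 1 | X)
    y = rng.binomial(1, p)

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # l1-penalized maximum likelihood (logistic Lasso); C is the inverse
    # regularization strength and would normally be tuned, e.g. by cross-validation.
    clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
    clf.fit(X_train, y_train)

    selected = np.flatnonzero(clf.coef_)          # features retained by the sparse fit
    test_error = 1.0 - clf.score(X_test, y_test)  # empirical misclassification risk
    print(f"selected features: {selected.size}, test misclassification: {test_error:.3f}")

In practice the regularization level would be tuned from the data, and the Slope variant discussed in the talk would replace the single l1 penalty level with a decreasing sequence of levels applied to the sorted coefficient magnitudes.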

Keywords : Complexity penalty; convex relaxation; feature selection; high-dimensionality; minimaxity; misclassification excess risk; sparsity

MSC Codes :
62C20 - Minimax procedures
62H30 - Classification and discrimination; cluster analysis

Additional resources :
https://www.cirm-math.fr/RepOrga/2146/Slides/ABRAMOVICH_Talk.pdf

    Information on the Video

    Film maker : Hennenfent, Guillaume
    Language : English
    Available date : 15/06/2020
    Conference Date : 03/06/2020
    Subseries : Research talks
    arXiv category : Statistics Theory ; Machine Learning ; Methodology
    Mathematical Area(s) : Probability & Statistics
    Format : MP4 (.mp4) - HD
    Video Time : 00:39:06
    Targeted Audience : Researchers
    Download : https://videos.cirm-math.fr/2020-06-03_Abramovitch.mp4

Information on the Event

Event Title : Mathematical Methods of Modern Statistics 2 / Méthodes mathématiques en statistiques modernes 2
Event Organizers : Bogdan, Malgorzata ; Graczyk, Piotr ; Panloup, Fabien ; Proïa, Frédéric ; Roquain, Etienne
Dates : 15/06/2020 - 19/06/2020
Event Year : 2020
Event URL : https://www.cirm-math.com/cirm-virtual-...

Citation Data

DOI : 10.24350/CIRM.V.19640203
Cite this video as: Abramovich, Felix (2020). High-dimensional classification by sparse logistic regression. CIRM. Audiovisual resource. doi:10.24350/CIRM.V.19640203
URI : http://dx.doi.org/10.24350/CIRM.V.19640203

Bibliography

  • ABRAMOVICH, Felix, GRINSHTEIN, Vadim, and LEVY, Tomer. Multiclass classification by sparse multinomial logistic regression. arXiv preprint arXiv:2003.01951, 2020. - https://arxiv.org/abs/2003.01951

  • ABRAMOVICH, Felix and GRINSHTEIN, Vadim. High-dimensional classification by sparse logistic regression. IEEE Transactions on Information Theory, 2018, vol. 65, no 5, p. 3068-3079. - https://doi.org/10.1109/TIT.2018.2884963

  • ALQUIER, Pierre, COTTET, Vincent, LECUÉ, Guillaume, et al. Estimation bounds and sharp oracle inequalities of regularized procedures with Lipschitz loss functions. The Annals of Statistics, 2019, vol. 47, no 4, p. 2117-2144. - http://dx.doi.org/10.1214/18-AOS1742

  • BARTLETT, Peter L., JORDAN, Michael I., and MCAULIFFE, Jon D. Convexity, classification, and risk bounds. Journal of the American Statistical Association, 2006, vol. 101, no 473, p. 138-156. - https://www.jstor.org/stable/30047445

  • BELLEC, Pierre C., LECUÉ, Guillaume, TSYBAKOV, Alexandre B., et al. Slope meets Lasso: improved oracle bounds and optimality. The Annals of Statistics, 2018, vol. 46, no 6B, p. 3603-3642. - http://dx.doi.org/10.1214/17-AOS1670

  • DANIELY, Amit, SABATO, Sivan, BEN-DAVID, Shai, et al. Multiclass learnability and the ERM principle. The Journal of Machine Learning Research, 2015, vol. 16, no 1, p. 2377-2404. - http://jmlr.org/papers/volume16/daniely15a/daniely15a.pdf


