
The price of competition: effect size heterogeneity matters in high dimensions!

Virtual conference
Authors: Wang, Hua (Conference speaker)
CIRM (Publisher)


Abstract: In high-dimensional regression, the number of explanatory variables with nonzero effects - often referred to as sparsity - is an important measure of the difficulty of the variable selection problem. As a complement to sparsity, this paper introduces a new measure termed effect size heterogeneity for a finer-grained understanding of the trade-off between type I and type II errors or, equivalently, false and true positive rates using the Lasso. Roughly speaking, a regression coefficient vector has higher effect size heterogeneity than another vector (of the same sparsity) if the nonzero entries of the former are more heterogeneous than those of the latter in terms of magnitudes. From the perspective of this new measure, we prove that in a regime of linear sparsity, false and true positive rates achieve the optimal trade-off uniformly along the Lasso path when this measure is maximal, in the sense that all nonzero effect sizes have very different magnitudes, and the worst-case trade-off is achieved when it is minimal, in the sense that all nonzero effect sizes are about equal. Moreover, we demonstrate that the Lasso path produces an optimal ranking of explanatory variables in terms of the rank of the first false variable when the effect size heterogeneity is maximal, and vice versa. Metaphorically, these two findings suggest that variables with comparable effect sizes, no matter how large they are, compete with each other along the Lasso path, making the variable selection problem harder. Our proofs use techniques from approximate message passing theory as well as a novel argument for estimating the rank of the first false variable.
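
The following is a minimal simulation sketch, in Python with numpy and scikit-learn's lasso_path, illustrating the phenomenon described in the abstract: it contrasts a signal with (about) equal nonzero effects against one with very heterogeneous effects, and records how many variables enter the Lasso path before the first truly null one. The Gaussian design, the particular effect size choices, and the rank_of_first_false helper are illustrative assumptions, not the experimental setup of the talk or paper.

# Minimal, illustrative sketch (assumptions: i.i.d. Gaussian design, unit noise,
# scikit-learn's lasso_path; not the paper's exact experimental setup).
import numpy as np
from sklearn.linear_model import lasso_path

rng = np.random.default_rng(0)
n, p, k = 400, 1000, 50  # linear sparsity: k grows proportionally with p

def rank_of_first_false(beta_true, X, y):
    # Rank variables by the order in which they enter the Lasso path
    # (i.e., the largest penalty at which their coefficient becomes nonzero),
    # then return the 1-based rank of the first truly null variable.
    _, coefs, _ = lasso_path(X, y, n_alphas=200)  # coefs: (p, n_alphas), alphas decreasing
    entry = np.full(p, np.inf)
    for j in range(p):
        nonzero = np.flatnonzero(coefs[j])
        if nonzero.size:
            entry[j] = nonzero[0]  # smaller index = enters at a larger penalty (earlier)
    order = np.argsort(entry)
    for rank, j in enumerate(order, start=1):
        if np.isinf(entry[j]):
            return None  # no null variable ever entered the path
        if beta_true[j] == 0:
            return rank
    return None

X = rng.standard_normal((n, p)) / np.sqrt(n)

# Minimal heterogeneity: all nonzero effects equal.
beta_equal = np.zeros(p)
beta_equal[:k] = 5.0

# Maximal heterogeneity: nonzero effects with very different magnitudes.
beta_hetero = np.zeros(p)
beta_hetero[:k] = np.geomspace(0.5, 500.0, k)

for label, beta in [("equal effect sizes", beta_equal),
                    ("heterogeneous effect sizes", beta_hetero)]:
    y = X @ beta + rng.standard_normal(n)
    print(label, "-> rank of first false variable:", rank_of_first_false(beta, X, y))

Under the results described in the abstract, one would expect the heterogeneous signal to admit more true variables before the first false one; the exact numbers depend on the illustrative choices above.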

Keywords: Lasso; Lasso path; false variable; false discovery rate; true positive rate; approximate message passing

MSC codes:
62F03 - Hypothesis testing
62J05 - Linear regression
62J07 - "Ridge regression; James-Stein estimators"

Additional resources:
https://www.cirm-math.fr/RepOrga/2146/Slides/Wang.pdf

    Video information

    Director: Hennenfent, Guillaume
    Language: English
    Publication date: 15/06/2020
    Recording date: 02/06/2020
    Subcollection: Research talks
    arXiv category: Statistics Theory; Machine Learning
    Domain: Probability & Statistics
    Format: MP4 (.mp4) - HD
    Duration: 00:34:07
    Audience: Researchers
    Download: https://videos.cirm-math.fr/2020-06-02_Wang.mp4

Meeting information

Meeting name: Mathematical Methods of Modern Statistics 2 / Méthodes mathématiques en statistiques modernes 2
Meeting organizers: Bogdan, Malgorzata; Graczyk, Piotr; Panloup, Fabien; Proïa, Frédéric; Roquain, Etienne
Dates: 15/06/2020 - 19/06/2020
Meeting year: 2020
Conference URL: https://www.cirm-math.com/cirm-virtual-e...

Citation data

DOI: 10.24350/CIRM.V.19644303
Cite this video: Wang, Hua (2020). The price of competition: effect size heterogeneity matters in high dimensions! CIRM. Audiovisual resource. doi:10.24350/CIRM.V.19644303
URI : http://dx.doi.org/10.24350/CIRM.V.19644303

See also

Bibliography

  • ANDERSON, Theodore W. An introduction to multivariate statistical analysis. 1958.

  • BARANIUK, Richard, DAVENPORT, Mark, DEVORE, Ronald, et al. A simple proof of the restricted isometry property for random matrices. Constructive Approximation, 2008, vol. 28, no 3, p. 253-263. - https://doi.org/10.1007/s00365-007-9003-x

  • BAYATI, Mohsen et MONTANARI, Andrea. The dynamics of message passing on dense graphs, with applications to compressed sensing. IEEE Transactions on Information Theory, 2011, vol. 57, no 2, p. 764-785. - https://doi.org/10.1109/TIT.2010.2094817

  • BAYATI, Mohsen et MONTANARI, Andrea. The LASSO risk for Gaussian matrices. IEEE Transactions on Information Theory, 2011, vol. 58, no 4, p. 1997-2017. - https://doi.org/10.1109/TIT.2011.2174612

  • BICKEL, Peter J., RITOV, Ya'acov, TSYBAKOV, Alexandre B., et al. Simultaneous analysis of Lasso and Dantzig selector. The Annals of Statistics, 2009, vol. 37, no 4, p. 1705-1732. - http://dx.doi.org/10.1214/08-AOS620

  • BÜHLMANN, Peter. Invited discussion on "Regression shrinkage and selection via the lasso: a retrospective" (R. Tibshirani). Journal of the Royal Statistical Society: Series B, 2011, vol. 73, p. 277-279. - https://doi.org/10.1111/j.1467-9868.2011.00771.x

  • BÜHLMANN, Peter et VAN DE GEER, Sara. Statistics for high-dimensional data: methods, theory and applications. Springer Science & Business Media, 2011. - http://dx.doi.org/10.1007/978-3-642-20192-9

  • CANDES, Emmanuel J. et TAO, Terence. Decoding by linear programming. IEEE Transactions on Information Theory, 2005, vol. 51, no 12, p. 4203-4215. - https://doi.org/10.1109/TIT.2005.858979

  • DONOHO, David et MONTANARI, Andrea. High dimensional robust m-estimation: Asymptotic variance via approximate message passing. Probability Theory and Related Fields, 2016, vol. 166, no 3-4, p. 935-969. - http://dx.doi.org/10.1007/s00440-015-0675-z

  • DONOHO, David et TANNER, Jared. Observed universality of phase transitions in high-dimensional geometry, with implications for modern data analysis and signal processing. Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences, 2009, vol. 367, no 1906, p. 4273-4293. - https://doi.org/10.1098/rsta.2009.0152

  • DONOHO, David L., MALEKI, Arian, et MONTANARI, Andrea. Message-passing algorithms for compressed sensing. Proceedings of the National Academy of Sciences, 2009, vol. 106, no 45, p. 18914-18919. - https://doi.org/10.1073/pnas.0909892106

  • DONOHO, David L., JAVANMARD, Adel, et MONTANARI, Andrea. Information-theoretically optimal compressed sensing via spatial coupling and approximate message passing. IEEE Transactions on Information Theory, 2013, vol. 59, no 11, p. 7434-7464. - https://doi.org/10.1109/TIT.2013.2274513

  • EFRON, Bradley, HASTIE, Trevor, JOHNSTONE, Iain, et al. Least angle regression. The Annals of Statistics, 2004, vol. 32, no 2, p. 407-499. - http://dx.doi.org/10.1214/009053604000000067

  • FAN, Jianqing, SONG, Rui, et al. Sure independence screening in generalized linear models with NP-dimensionality. The Annals of Statistics, 2010, vol. 38, no 6, p. 3567-3604. - http://dx.doi.org/10.1214/10-AOS798

  • BARBER, Rina Foygel, CANDÈS, Emmanuel J., et al. A knockoff filter for high-dimensional selective inference. The Annals of Statistics, 2019, vol. 47, no 5, p. 2504-2537. - http://dx.doi.org/10.1214/18-AOS1755

  • G'SELL, Max Grazier, WAGER, Stefan, CHOULDECHOVA, Alexandra, et al. Sequential selection procedures and false discovery rate control. Journal of the royal statistical society: series B (statistical methodology), 2016, vol. 78, no 2, p. 423-444. - https://doi.org/10.1111/rssb.12122

  • JANSON, Lucas, SU, Weijie, et al. Familywise error rate control via knockoffs. Electronic Journal of Statistics, 2016, vol. 10, no 1, p. 960-975. - http://dx.doi.org/10.1214/16-EJS1129

  • MONTANARI, Andrea et RICHARD, Emile. Non-negative principal component analysis: Message passing algorithms and sharp asymptotics. IEEE Transactions on Information Theory, 2015, vol. 62, no 3, p. 1458-1484. - https://doi.org/10.1109/TIT.2015.2457942

  • MOUSAVI, Ali, MALEKI, Arian, BARANIUK, Richard G., et al. Consistent parameter estimation for LASSO and approximate message passing. The Annals of Statistics, 2018, vol. 46, no 1, p. 119-148. - http://dx.doi.org/10.1214/17-AOS1544

  • POKAROWSKI, Piotr et MIELNICZUK, Jan. Combined l1 and greedy l0 penalized least squares for linear model selection. Journal of Machine Learning Research, 2015, vol. 16, no 5, p. 961-992. - http://www.jmlr.org/papers/volume16/pokarowski15a/pokarowski15a.pdf

  • REEVES, Galen et GASTPAR, Michael C. Approximate sparsity pattern recovery: Information-theoretic lower bounds. IEEE Transactions on Information Theory, 2013, vol. 59, no 6, p. 3451-3465. - https://doi.org/10.1109/TIT.2013.2253852

  • RHEE, Soo-Yon, TAYLOR, Jonathan, WADHERA, Gauhar, et al. Genotypic predictors of human immunodeficiency virus type 1 drug resistance. Proceedings of the National Academy of Sciences, 2006, vol. 103, no 46, p. 17355-17360. - https://doi.org/10.1073/pnas.0607274103

  • SISKIND, Victor. Second moments of inverse Wishart-matrix elements. Biometrika, 1972, vol. 59, no 3, p. 690-691. - https://doi.org/10.1093/biomet/59.3.690

  • SU, Weijie, BOGDAN, Małgorzata, CANDES, Emmanuel, et al. False discoveries occur early on the lasso path. The Annals of Statistics, 2017, vol. 45, no 5, p. 2133-2150. - http://dx.doi.org/10.1214/16-AOS1521

  • SU, Weijie J. When is the first spurious variable selected by sequential regression procedures?. Biometrika, 2018, vol. 105, no 3, p. 517-527. - https://doi.org/10.1093/biomet/asy032

  • TIBSHIRANI, Robert. Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society: Series B (Methodological), 1996, vol. 58, no 1, p. 267-288. - https://doi.org/10.1111/j.2517-6161.1996.tb02080.x

  • WAINWRIGHT, Martin J. Sharp thresholds for high-dimensional and noisy sparsity recovery using ℓ1-constrained quadratic programming (Lasso). IEEE Transactions on Information Theory, 2009, vol. 55, no 5, p. 2183-2202. - https://doi.org/10.1109/TIT.2009.2016018

  • WAINWRIGHT, Martin J. Information-theoretic limits on sparsity recovery in the high-dimensional and noisy setting. IEEE Transactions on Information Theory, 2009, vol. 55, no 12, p. 5728-5741. - https://doi.org/10.1109/TIT.2009.2032816

  • WAINWRIGHT, Martin J. High-dimensional statistics: A non-asymptotic viewpoint. Cambridge University Press, 2019. - https://doi.org/10.1017/9781108627771


