
Documents 62E20: 2 results

Modern machine learning architectures often embed their inputs into a lower-dimensional latent space before generating a final output. A vast set of empirical results---and some emerging theory---predicts that these lower-dimensional codes are often highly structured, capturing lower-dimensional variation in the data. Based on this observation, in this talk I will describe efforts in my group to develop lightweight algorithms that navigate, restructure, and reshape learned latent spaces. Along the way, I will consider a variety of practical problems in machine learning, including low-rank adaptation of large models, regularization to promote local latent structure, and efficient training/evaluation of generative models.
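The abstract mentions low-rank adaptation of large models. A minimal NumPy sketch of the general idea, with illustrative names and dimensions not taken from the talk: a frozen weight matrix W is adapted by a trainable rank-r correction B @ A, so the number of trainable parameters is r(d_out + d_in) rather than d_out * d_in.

```python
import numpy as np

# Low-rank adaptation sketch: instead of updating a full weight matrix W
# (d_out x d_in), learn a rank-r correction B @ A with r << min(d_out, d_in).
rng = np.random.default_rng(0)
d_out, d_in, r = 512, 512, 8

W = rng.standard_normal((d_out, d_in))      # frozen pretrained weights
A = rng.standard_normal((r, d_in)) * 0.01   # trainable down-projection
B = np.zeros((d_out, r))                    # trainable up-projection (zero init)

def adapted_forward(x):
    # Effective weight is W + B @ A, applied without materializing it.
    return W @ x + B @ (A @ x)

x = rng.standard_normal(d_in)
full_params = d_out * d_in          # 262144 parameters in a full update
lora_params = r * (d_out + d_in)    # 8192 parameters in the adapter
print(lora_params / full_params)    # 0.03125: the adapter is ~3% of a full update
```

With B initialized to zero, the adapted model starts out identical to the frozen one, and only the small factors A and B are trained.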

62E20 ; 62F99 ; 62G07 ; 62P30 ; 65C50 ; 68T99

In this talk I will introduce the multilinear empirical copula for discrete or mixed data and study its asymptotic behavior. This result will then be used to construct inference procedures for multivariate data. Applications to testing independence will be presented.

62E20 ; 62G10 ; 62G20 ; 62H15
