Polynomial optimization methods often face major scalability issues in practice. Fortunately, for many real-world problems, we can exploit the inherent structure arising from the input cost and constraints. The first part of my lecture will focus on the notion of 'correlative sparsity', which occurs when there are few correlations between the variables of the input problem. The second part will present a complementary framework showing how to exploit a distinct notion of sparsity, called 'term sparsity', which occurs when the input problem involves a small number of terms compared with the fully dense case. Last but not least, I will present a very recently developed type of sparsity that we call 'ideal sparsity', which exploits the presence of equality constraints. Several illustrations will be provided on important applications arising from various fields, including computer arithmetic, robustness of deep networks, quantum entanglement, optimal power flow, and matrix factorization ranks.
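To make the correlative-sparsity idea concrete, here is a minimal schematic illustration (my own notation, not part of the abstract): when the cost splits into terms that each involve only a small clique of variables, the sum-of-squares certificate can be searched for block by block rather than over all variables at once.

\[
  f(x) = f_1(x_1, x_2) + f_2(x_2, x_3),
  \qquad
  f - \lambda \;=\; \sigma_1 + \sigma_2,
  \quad \sigma_1 \in \Sigma[x_1, x_2], \ \sigma_2 \in \Sigma[x_2, x_3],
\]

where \(\Sigma[x_I]\) denotes sums of squares of polynomials in the variables indexed by \(I\). At relaxation order \(r\), each sparse block involves a Gram matrix of size \(\binom{|I_k| + r}{r}\) instead of the dense \(\binom{n + r}{r}\), which is the source of the scalability gain.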
65F50 ; 90C22 ; 90C23