A new Central Limit Theorem (CLT) is developed for random variables of the form ξ = z⊤f(z) − div f(z), where z ∼ N(0, I_n).
The normal approximation is proved to hold when the squared norm of f(z) dominates the squared Frobenius norm of ∇f(z) in expectation.
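To make the condition concrete, the following Monte Carlo sketch (not taken from the abstract) draws z ∼ N(0, I_n), evaluates ξ = z⊤f(z) − div f(z) for an illustrative map f chosen so that E‖f(z)‖² dominates E‖∇f(z)‖_F², and checks that the standardized statistic is close to N(0, 1). The particular map f(z) = u + ε tanh(z), the constants, and the empirical standardization are assumptions of this demo, not part of the result.

```python
# Monte Carlo sketch of the CLT for xi = z^T f(z) - div f(z) with z ~ N(0, I_n).
# The map f below is an illustrative assumption: it is chosen so that
# E||f(z)||^2 dominates E||grad f(z)||_F^2, the condition required by the CLT.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, reps, eps = 500, 20_000, 0.2
u = np.ones(n)  # fixed vector with ||u||^2 = n

xi = np.empty(reps)
f_norm2 = np.empty(reps)
jac_frob2 = np.empty(reps)
for k in range(reps):
    z = rng.standard_normal(n)
    fz = u + eps * np.tanh(z)                 # f(z) = u + eps * tanh(z), componentwise
    jac_diag = eps * (1.0 - np.tanh(z) ** 2)  # Jacobian of f is diagonal here
    xi[k] = z @ fz - jac_diag.sum()           # z^T f(z) - div f(z)
    f_norm2[k] = fz @ fz                      # ||f(z)||^2
    jac_frob2[k] = jac_diag @ jac_diag        # ||grad f(z)||_F^2

# Dominance condition of the theorem: this ratio should be small (about 0.02 here).
print("E||grad f||_F^2 / E||f||^2 :", jac_frob2.mean() / f_norm2.mean())

# E||f(z)||^2 acts as the natural variance proxy and is close to the empirical
# variance of xi; the standardized statistic should be close to N(0, 1).
print("sqrt(E||f||^2) vs std(xi)  :", np.sqrt(f_norm2.mean()), xi.std())
t = (xi - xi.mean()) / xi.std()
print(stats.kstest(t, "norm"))  # small KS statistic => close to standard normal
```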
Applications of this CLT are given for the asymptotic normality of de-biased estimators in linear regression with correlated design and convex penalty in the regime p/n → γ ∈ (0, ∞). For the estimation of linear functions ⟨a, β⟩ of the unknown coefficient vector β, this analysis leads to asymptotic normality of the de-biased estimate for most normalized directions a_0, where "most" is quantified in a precise sense. This asymptotic normality holds for any coercive convex penalty if γ < 1 and for any strongly convex penalty if γ ≥ 1. In particular, the penalty need not be separable or permutation invariant.
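As a rough illustration of the de-biasing application (a sketch only, not the paper's exact construction), the snippet below fits a Lasso pilot estimate on a correlated Gaussian design with p/n = 0.5 and forms a de-biased estimate of ⟨a, β⟩ with a degrees-of-freedom-adjusted correction. The Lasso penalty, the assumption of known covariance Σ, the use of the support size as the degrees-of-freedom proxy, and the correction formula θ̂ = ⟨a, β̂⟩ + z_a⊤(y − Xβ̂)/(n − d̂f) with z_a = XΣ⁻¹a are all illustrative choices.

```python
# Rough sketch of a de-biased estimate of <a, beta> after a convex-penalized fit.
# Assumptions (illustrative, not the paper's exact estimator): Lasso penalty,
# known design covariance Sigma, Lasso support size as the degrees-of-freedom
# proxy, and the degrees-of-freedom-adjusted correction formula used below.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(1)
n, p = 600, 300                       # p/n -> gamma = 0.5 regime
beta = np.zeros(p); beta[:10] = 1.0   # sparse ground truth
Sigma = 0.5 ** np.abs(np.subtract.outer(np.arange(p), np.arange(p)))  # AR(1) correlated design
X = rng.standard_normal((n, p)) @ np.linalg.cholesky(Sigma).T
y = X @ beta + rng.standard_normal(n)

# Convex-penalized pilot estimate (Lasso here; the abstract allows any coercive
# convex penalty when gamma < 1).
lasso = Lasso(alpha=0.1, fit_intercept=False).fit(X, y)
beta_hat = lasso.coef_
resid = y - X @ beta_hat
df_hat = np.count_nonzero(beta_hat)   # crude degrees-of-freedom proxy for the Lasso

# Direction a (normalized); the de-biased estimate corrects <a, beta_hat> using
# the score z_a = X Sigma^{-1} a and a degrees-of-freedom adjustment.
a = rng.standard_normal(p); a /= np.linalg.norm(a)
z_a = X @ np.linalg.solve(Sigma, a)
theta_debiased = a @ beta_hat + (z_a @ resid) / (n - df_hat)

print("target <a, beta>      :", a @ beta)
print("plug-in <a, beta_hat> :", a @ beta_hat)
print("de-biased estimate    :", theta_debiased)
```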