Sampling can be faster than optimization
Optimization algorithms and Monte Carlo sampling algorithms have provided the
computational foundations for the rapid growth in applications of statistical machine learning …
[BOOK][B] Statistical foundations of actuarial learning and its applications
MV Wüthrich, M Merz - 2023 - library.oapen.org
This open access book discusses the statistical modeling of insurance problems, a process
which comprises data collection, data analysis and statistical model building to forecast …
Sliced Wasserstein distance for learning Gaussian mixture models
Gaussian mixture models (GMM) are powerful parametric tools with many applications in
machine learning and computer vision. Expectation maximization (EM) is the most popular …
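The snippet above names expectation maximization as the standard fitting procedure for GMMs. As a rough illustration only, a minimal pure-Python EM loop for a two-component univariate mixture might look like this (all function and variable names are hypothetical, not taken from the paper):

```python
import math

def em_fit_1d_gmm(data, iters=50):
    """Fit a two-component univariate GMM by plain EM (toy sketch)."""
    # Crude initialization: two means straddling the sample mean.
    m = sum(data) / len(data)
    mu = [m - 1.0, m + 1.0]
    var = [1.0, 1.0]
    pi = [0.5, 0.5]
    for _ in range(iters):
        # E-step: posterior responsibility of each component for each point.
        resp = []
        for x in data:
            p = [pi[k] * math.exp(-(x - mu[k]) ** 2 / (2 * var[k]))
                 / math.sqrt(2 * math.pi * var[k]) for k in range(2)]
            s = p[0] + p[1]
            resp.append([p[0] / s, p[1] / s])
        # M-step: re-estimate weights, means, and variances from responsibilities.
        for k in range(2):
            nk = sum(r[k] for r in resp)
            pi[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = max(sum(r[k] * (x - mu[k]) ** 2
                             for r, x in zip(resp, data)) / nk, 1e-6)
    return pi, mu, var
```

EM of this kind only finds a local maximum of the likelihood, which is precisely the weakness that motivates alternatives such as the sliced Wasserstein approach in the entry above.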
Local maxima in the likelihood of Gaussian mixture models: Structural results and algorithmic consequences
We provide two fundamental results on the population (infinite-sample) likelihood function of
Gaussian mixture models with $M \geq 3$ components. Our first main result shows that the …
Guaranteed bounds on information-theoretic measures of univariate mixtures using piecewise log-sum-exp inequalities
Information-theoretic measures, such as the entropy, the cross-entropy and the Kullback–
Leibler divergence between two mixture models, are core primitives in many signal …
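The entry above concerns bounds on quantities such as the Kullback–Leibler divergence between mixture models, which has no closed form. The usual baseline such bounds are compared against is plain Monte Carlo estimation, sketched below (a toy implementation under that assumption, not the paper's method; all names are hypothetical):

```python
import math
import random

def mixture_pdf(x, weights, means, sigmas):
    """Density of a univariate Gaussian mixture at x."""
    return sum(w * math.exp(-(x - m) ** 2 / (2 * s * s)) / (s * math.sqrt(2 * math.pi))
               for w, m, s in zip(weights, means, sigmas))

def mc_kl(p, q, n=50_000, seed=0):
    """Monte Carlo estimate of KL(p || q); p, q are (weights, means, sigmas)."""
    weights, means, sigmas = p
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        # Ancestral sampling from p: pick a component, then draw from it.
        k = rng.choices(range(len(weights)), weights=weights)[0]
        x = rng.gauss(means[k], sigmas[k])
        total += math.log(mixture_pdf(x, *p) / mixture_pdf(x, *q))
    return total / n
```

The estimate is unbiased but stochastic, which is why deterministic guaranteed bounds like the piecewise log-sum-exp inequalities of the entry above are of interest.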
Moment varieties of Gaussian mixtures
C Améndola, JC Faugere, B Sturmfels - arXiv preprint arXiv:1510.04654, 2015 - arxiv.org
The points of a moment variety are the vectors of all moments up to some order of a family of
probability distributions. We study this variety for mixtures of Gaussians. Following up on …
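For a univariate mixture, the moment vectors this entry refers to are simply weight-averaged component moments, since expectation is linear over the mixture. A small numerical sketch of the first three raw moments (illustrative only, not the paper's algebraic setup):

```python
def gmm_moments(weights, means, sigmas):
    """First three raw moments of a univariate Gaussian mixture.

    Component raw moments follow from E[X] = mu, E[X^2] = mu^2 + s^2,
    E[X^3] = mu^3 + 3*mu*s^2; the mixture averages them by weight.
    """
    m1 = sum(w * m for w, m in zip(weights, means))
    m2 = sum(w * (m * m + s * s) for w, m, s in zip(weights, means, sigmas))
    m3 = sum(w * (m ** 3 + 3 * m * s * s)
             for w, m, s in zip(weights, means, sigmas))
    return m1, m2, m3
```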
The Gaussian conditional independence inference problem
T Boege - 2022 - repo.bibliothek.uni-halle.de
The present thesis deals with Gaussian conditional independence structures and their
inference problem. Conditional independence (CI) is a notion from statistics and information …
Fast approximations of the Jeffreys divergence between univariate Gaussian mixtures via mixture conversions to exponential-polynomial distributions
F Nielsen - Entropy, 2021 - mdpi.com
The Jeffreys divergence is a renowned arithmetic symmetrization of the oriented Kullback–
Leibler divergence broadly used in information sciences. Since the Jeffreys divergence …
Maximum number of modes of Gaussian mixtures
C Améndola, A Engström… - Information and Inference …, 2020 - academic.oup.com
Gaussian mixture models are widely used in Statistics. A fundamental aspect of these
distributions is the study of the local maxima of the density or modes. In particular, it is not …
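The mode-counting question this entry studies can be probed numerically by scanning the mixture density for local maxima; a brute-force grid-scan sketch in one dimension (hypothetical names, unrelated to the paper's algebraic bounds, and only reliable for well-separated peaks within the scan range):

```python
import math

def count_modes_1d(weights, means, sigmas, lo=-10.0, hi=10.0, steps=10_000):
    """Count strict local maxima of a 1D Gaussian mixture density on a grid."""
    def pdf(x):
        return sum(w * math.exp(-(x - m) ** 2 / (2 * s * s)) / (s * math.sqrt(2 * math.pi))
                   for w, m, s in zip(weights, means, sigmas))
    xs = [lo + (hi - lo) * i / steps for i in range(steps + 1)]
    ys = [pdf(x) for x in xs]
    # A grid point counts as a mode if it strictly dominates both neighbors.
    return sum(1 for i in range(1, steps) if ys[i - 1] < ys[i] > ys[i + 1])
```

Such a scan illustrates, for instance, that two components whose means are closer than their standard deviations can merge into a single mode, whereas the paper's concern is the maximum number of modes attainable in general.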
[HTML] Diffusion model conditioning on Gaussian mixture model and negative Gaussian mixture gradient
Diffusion models (DMs) are a type of generative model that has had a significant impact on
image synthesis and beyond. They can incorporate a wide variety of conditioning inputs …