Hierarchical Bayesian nonparametric models with applications
Hierarchical modeling is a fundamental concept in Bayesian statistics. The basic idea is that
parameters are endowed with distributions which may themselves introduce new …
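To make the hierarchical idea concrete, here is a minimal sketch (my own notation, not the chapter's) of one canonical construction from this literature, the hierarchical Dirichlet process: a shared random measure is drawn from a Dirichlet process, each group then draws its own measure centered on the shared one, and the group-level parameters are tied together through that upper level.

    \begin{align*}
    G_0 \mid \gamma, H     &\sim \mathrm{DP}(\gamma, H) \\
    G_j \mid \alpha_0, G_0 &\sim \mathrm{DP}(\alpha_0, G_0) \quad \text{for each group } j \\
    \theta_{ji} \mid G_j   &\sim G_j, \qquad x_{ji} \mid \theta_{ji} \sim F(\theta_{ji})
    \end{align*}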
Unsupervised automatic speech recognition: A review
Abstract Automatic Speech Recognition (ASR) systems can be trained to achieve
remarkable performance given large amounts of manually transcribed speech, but large …
First language acquisition
EV Clark, M Casillas - The Routledge handbook of linguistics, 2015 - api.taylorfrancis.com
It is with renewed appreciation for the size of the task that I come to writing the present
chapter. The number of special issues of journals dedicated to im/politeness these days …
Between words and characters: A brief history of open-vocabulary modeling and tokenization in NLP
What are the units of text that we want to model? From bytes to multi-word expressions, text
can be analyzed and generated at many granularities. Until recently, most natural language …
The nested Chinese restaurant process and Bayesian nonparametric inference of topic hierarchies
We present the nested Chinese restaurant process (nCRP), a stochastic process that
assigns probability distributions to ensembles of infinitely deep, infinitely branching trees …
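As a rough illustration of the generative rule described here (a toy Python sketch under my own simplifications: the depth is truncated to a fixed value, whereas the nCRP allows unbounded depth, and the function names are hypothetical), a document picks a root-to-leaf path by running a Chinese restaurant process at every node, reusing a popular child branch with probability proportional to how often it has been chosen and opening a fresh branch with probability proportional to gamma.

    import random
    from collections import defaultdict

    def crp_choice(counts, gamma):
        # Reuse branch b with prob. counts[b]/(n+gamma); open a new one with prob. gamma/(n+gamma).
        r = random.uniform(0, sum(counts.values()) + gamma)
        for branch, n in counts.items():
            r -= n
            if r <= 0:
                return branch
        return max(counts, default=-1) + 1   # a previously unused branch index

    def ncrp_path(tree_counts, depth, gamma):
        # Draw one root-to-leaf path; tree_counts maps a path prefix to per-child counts.
        path = ()
        for _ in range(depth):
            child = crp_choice(tree_counts[path], gamma)
            tree_counts[path][child] += 1
            path += (child,)
        return path

    tree = defaultdict(lambda: defaultdict(int))
    print([ncrp_path(tree, depth=3, gamma=1.0) for _ in range(5)])

The "rich get richer" reuse at each level concentrates documents on a shared tree while always leaving some probability for new branches, which is what makes the tree both infinite and learnable from data.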
Care and feeding of topic models: Problems, diagnostics, and improvements
J Boyd-Graber, D Mimno… - Handbook of mixed …, 2014 - api.taylorfrancis.com
Topic models are statistical models for learning the latent structure in document collections,
and have gained much attention in the machine learning community over the last decade …
A Bayesian framework for word segmentation: Exploring the effects of context
Since the experiments of Saffran et al. [Saffran, J., Aslin, R., & Newport, E. (1996). Statistical
learning in 8-month-old infants. Science, 274, 1926–1928], there has been a great deal of …
Unsupervised pattern discovery in speech
AS Park, JR Glass - IEEE Transactions on Audio, Speech, and …, 2007 - ieeexplore.ieee.org
We present a novel approach to speech processing based on the principle of pattern
discovery. Our work represents a departure from traditional models of speech recognition …
Bayesian unsupervised word segmentation with nested Pitman-Yor language modeling
In this paper, we propose a new Bayesian model for fully unsupervised word segmentation
and an efficient blocked Gibbs sampler combined with dynamic programming for inference …
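For orientation (standard machinery rather than anything quoted from the paper), the Pitman-Yor process underlying such models is usually described through its Chinese-restaurant predictive rule, with discount d, strength theta, and base distribution P_0; in a nested setup the base distribution over word spellings is itself a character-level model.

    P(w \mid \text{seating}) \;=\; \frac{c_w - d\,t_w}{\theta + c_{\cdot}}
        \;+\; \frac{\theta + d\,t_{\cdot}}{\theta + c_{\cdot}}\, P_0(w)

Here c_w and t_w are the customer and table counts for word w, and c_. and t_. are their totals; the second term is the probability of backing off to the base distribution.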
Adaptor grammars: A framework for specifying compositional nonparametric Bayesian models
M Johnson, T Griffiths… - Advances in neural …, 2006 - proceedings.neurips.cc
This paper introduces adaptor grammars, a class of probabilistic models of language that
generalize probabilistic context-free grammars (PCFGs). Adaptor grammars augment the …
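The snippet cuts off mid-sentence; as a hedged illustration of the caching idea adaptor grammars are built on (a toy Dirichlet-process adaptor with discount 0, simpler than the Pitman-Yor adaptors typically used, and with hypothetical function names), an adapted nonterminal either reuses a whole previously generated subtree with probability proportional to its count or falls back to expanding the base grammar:

    import random

    def sample_adapted(cache, alpha, base_sample):
        # CRP over complete subtrees: reuse a cached one w.p. count/(n+alpha),
        # otherwise expand the base grammar anew and remember the result.
        r = random.uniform(0, sum(cache.values()) + alpha)
        for subtree in list(cache):
            r -= cache[subtree]
            if r <= 0:
                cache[subtree] += 1
                return subtree
        fresh = base_sample()
        cache[fresh] = cache.get(fresh, 0) + 1
        return fresh

    cache = {}
    # Toy base distribution: random three-letter "words" standing in for PCFG expansions.
    print([sample_adapted(cache, 1.0, lambda: "".join(random.choices("ab", k=3))) for _ in range(10)])

Because whole generated units are cached and reused as atoms, frequently regenerated substructures (for example word-like strings in a segmentation grammar) acquire their own probability mass, which is the nonparametric twist on the plain PCFG.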