ACL RD-TEC 1.0 Summarization of P06-1124
Paper Title:
A HIERARCHICAL BAYESIAN LANGUAGE MODEL BASED ON PITMAN-YOR PROCESSES
Primarily assigned technology terms:
- approximate inference
- approximation
- bayesian approach
- character recognition
- clustering
- computational linguistics
- computing
- cross-validation
- gibbs sampler
- gibbs sampling
- handwriting recognition
- language modelling
- language processing
- learning
- machine learning
- machine translation
- maximum-entropy
- maximum-likelihood
- model selection
- modelling
- natural language processing
- optical character recognition
- processing
- recognition
- sampling
- smoothing
- smoothing techniques
- speech recognition
- structure learning
- suffix tree
- tractable inference
- validation
Other assigned terms:
- approach
- association for computational linguistics
- bayesian model
- break
- conditional distribution
- cross entropy
- data sets
- dirichlet distribution
- distribution
- entropy
- experimental results
- fact
- handwriting
- hierarchical bayesian n-gram model
- index
- interpolation
- interpretation
- knowledge
- language model
- language models
- linguistic
- linguistics
- markov chain
- metaphor
- method
- n-gram
- n-gram language model
- n-gram model
- n-gram models
- n-grams
- natural language
- natural languages
- parameter values
- pitman-yor language model
- posterior
- posterior distribution
- priori
- probabilistic model
- probabilistic models
- probabilities
- probability
- probability estimates
- probability theory
- procedure
- process
- sentence
- sentences
- set size
- statistics
- suffix
- terms
- test data
- test set
- theory
- training
- training data
- training set
- training set size
- tree
- trigram
- unigram
- unigram language model
- vocabulary
- vocabulary size
- word
- word corpus
- words