Stochastic grammar

A stochastic grammar (statistical grammar) is a grammar framework with a probabilistic notion of grammaticality.
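For illustration, the following is a minimal sketch of how a stochastic context-free grammar attaches a probability to each production, so that the probability of a derivation, and with it a graded notion of grammaticality, is the product of the probabilities of the rules it uses. The grammar, rule probabilities, and example sentence are invented for this sketch and are not taken from any particular source.

```python
# Minimal sketch of a stochastic (probabilistic) context-free grammar.
# All rules and probabilities below are illustrative assumptions.

from math import prod

# Rules: left-hand side -> list of (right-hand side, probability).
# Probabilities for each left-hand side sum to 1.
PCFG = {
    "S":   [(("NP", "VP"), 1.0)],
    "NP":  [(("Det", "N"), 0.7), (("N",), 0.3)],
    "VP":  [(("V", "NP"), 0.6), (("V",), 0.4)],
    "Det": [(("the",), 1.0)],
    "N":   [(("dog",), 0.5), (("cat",), 0.5)],
    "V":   [(("chased",), 1.0)],
}

def rule_probability(lhs, rhs):
    """Look up the probability of the production lhs -> rhs."""
    for candidate_rhs, p in PCFG[lhs]:
        if candidate_rhs == rhs:
            return p
    return 0.0  # unknown production: ungrammatical under this grammar

def derivation_probability(rules_used):
    """Probability of a derivation = product of its rule probabilities."""
    return prod(rule_probability(lhs, rhs) for lhs, rhs in rules_used)

# Derivation of "the dog chased the cat" as a sequence of (lhs, rhs) rule uses.
derivation = [
    ("S", ("NP", "VP")),
    ("NP", ("Det", "N")), ("Det", ("the",)), ("N", ("dog",)),
    ("VP", ("V", "NP")), ("V", ("chased",)),
    ("NP", ("Det", "N")), ("Det", ("the",)), ("N", ("cat",)),
]

print(derivation_probability(derivation))  # 0.7 * 0.5 * 0.6 * 0.7 * 0.5 = 0.0735
```

Under such a grammar, an ill-formed derivation receives probability zero, while competing well-formed derivations can be ranked by their probabilities rather than treated as equally acceptable.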

Statistical natural language processing uses stochastic, probabilistic, and statistical methods, especially to resolve the difficulties that arise because longer sentences are highly ambiguous: when processed with realistic grammars, they can yield thousands or millions of possible analyses. Methods for disambiguation often involve the use of corpora and Markov models. "A probabilistic model consists of a non-probabilistic model plus some numerical quantities; it is not true that probabilistic models are inherently simpler or less structural than non-probabilistic models."[1]
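As a small illustration of corpus-based disambiguation with a Markov model, the sketch below estimates a bigram tag model from a tiny hand-tagged corpus and uses it to choose between two competing analyses of an ambiguous word. The corpus, tag set, and candidate analyses are invented for this sketch.

```python
# Minimal sketch of disambiguation with a bigram Markov model estimated from
# a toy tagged corpus. The corpus and tags are illustrative assumptions.

from collections import Counter

# Tiny hand-tagged corpus: sentences as (word, tag) pairs.
corpus = [
    [("I", "PRON"), ("saw", "VERB"), ("her", "PRON")],
    [("she", "PRON"), ("saw", "VERB"), ("me", "PRON")],
    [("the", "DET"), ("saw", "NOUN"), ("broke", "VERB")],
]

# Count tag bigrams, (tag, word) emissions, and tag totals.
bigrams, emissions, tags = Counter(), Counter(), Counter()
for sentence in corpus:
    prev = "<s>"
    for word, tag in sentence:
        bigrams[(prev, tag)] += 1
        emissions[(tag, word)] += 1
        tags[tag] += 1
        prev = tag
tags["<s>"] = len(corpus)

def sequence_probability(words, tag_sequence):
    """Joint probability of a tag sequence and word sequence under a bigram
    model with maximum-likelihood estimates (end-of-sentence handling omitted)."""
    p, prev = 1.0, "<s>"
    for word, tag in zip(words, tag_sequence):
        p *= bigrams[(prev, tag)] / tags[prev]   # transition P(tag | prev)
        p *= emissions[(tag, word)] / tags[tag]  # emission   P(word | tag)
        prev = tag
    return p

# Two competing analyses of the ambiguous word "saw": verb vs. noun.
words = ["I", "saw", "her"]
candidates = [("PRON", "VERB", "PRON"), ("PRON", "NOUN", "PRON")]
best = max(candidates, key=lambda seq: sequence_probability(words, seq))
print(best)  # ('PRON', 'VERB', 'PRON'): the noun reading of "saw" scores zero here
```

The same idea scales to full parsing: each candidate analysis of an ambiguous sentence is scored by a probabilistic model trained on a corpus, and the highest-scoring analysis is selected.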

The technology for statistical NLP comes mainly from machine learning and data mining, both of which are fields of artificial intelligence that involve learning from data.

References

  1. John Goldsmith. 2002. "Probabilistic Models of Grammar: Phonology as Information Minimization." Phonological Studies 5: 21–46.
