In electrical engineering, computer science, statistical computing and bioinformatics, the Baum–Welch algorithm is used to find the unknown parameters of a hidden Markov model (HMM). It makes use of the forward-backward algorithm and is named for Leonard E. Baum and Lloyd R. Welch.
The Baum–Welch algorithm is a particular case of a generalized expectation-maximization (GEM) algorithm. It can compute maximum likelihood estimates and posterior mode estimates for the parameters (transition and emission probabilities) of an HMM, when given only emissions as training data.
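The forward–backward algorithm that Baum–Welch relies on can be sketched in a few lines. The following is a minimal illustration for a two-state discrete HMM; the model (`A`, `B`, `pi`) and the observation sequence `obs` are made-up example values, not anything defined in this article.

```python
# Minimal forward-backward sketch for a hypothetical 2-state discrete HMM.
A  = [[0.7, 0.3], [0.4, 0.6]]   # transition probabilities A[i][j]
B  = [[0.9, 0.1], [0.2, 0.8]]   # emission probabilities B[i][k]
pi = [0.5, 0.5]                 # initial state distribution
obs = [0, 0, 1]                 # observed symbol indices

N, T = len(A), len(obs)

# Forward pass: alpha[t][i] = P(o_1..o_t, state at time t = i)
alpha = [[pi[i] * B[i][obs[0]] for i in range(N)]]
for t in range(1, T):
    alpha.append([sum(alpha[t - 1][i] * A[i][j] for i in range(N)) * B[j][obs[t]]
                  for j in range(N)])

# Backward pass: beta[t][i] = P(o_{t+1}..o_T | state at time t = i)
beta = [[1.0] * N for _ in range(T)]
for t in range(T - 2, -1, -1):
    for i in range(N):
        beta[t][i] = sum(A[i][j] * B[j][obs[t + 1]] * beta[t + 1][j]
                         for j in range(N))

# Both passes yield the same total likelihood of the observation sequence.
likelihood = sum(alpha[T - 1][i] for i in range(N))
check = sum(pi[i] * B[i][obs[0]] * beta[0][i] for i in range(N))
print(round(likelihood, 6), round(check, 6))
```

The agreement of the two likelihood computations is a convenient sanity check when implementing the two passes; real implementations also rescale `alpha` and `beta` at each step to avoid underflow on long sequences.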
For a given cell s_i in the transition matrix, all paths to that cell are summed. There is a link (transition) from that cell to a cell s_j. The joint probability of s_i, the link, and s_j can be calculated and normalized by the probability of the entire string. Call this ξ_t(i, j).
Now, calculate the probability of all paths with all links emanating from s_i. Normalize this by the probability of the entire string. Call this γ_t(i).
Now divide the sum of ξ_t(i, j) over all times t by the sum of γ_t(i) over all times t. This divides the expected number of transitions from s_i to s_j by the expected number of transitions out of s_i, giving the re-estimated transition probability. As the corpus grows and particular transitions are reinforced, they increase in value, and the estimates converge to a local maximum of the likelihood. No way of ascertaining a global maximum is known.
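The re-estimation step above can be sketched as follows. This is an illustrative, unscaled implementation for a hypothetical 2-state HMM; the names (`A`, `B`, `pi`, `obs`, `xi`, `gamma`, `A_new`) are example choices, not notation fixed by this article.

```python
# Hypothetical 2-state discrete HMM with made-up parameters.
A  = [[0.7, 0.3], [0.4, 0.6]]   # transition probabilities A[i][j]
B  = [[0.9, 0.1], [0.2, 0.8]]   # emission probabilities B[i][k]
pi = [0.5, 0.5]                 # initial state distribution
obs = [0, 0, 1, 0]              # observed symbol indices
N, T = len(A), len(obs)

# Forward and backward probabilities (unscaled; fine for short sequences).
alpha = [[pi[i] * B[i][obs[0]] for i in range(N)]]
for t in range(1, T):
    alpha.append([sum(alpha[t - 1][i] * A[i][j] for i in range(N)) * B[j][obs[t]]
                  for j in range(N)])
beta = [[1.0] * N for _ in range(T)]
for t in range(T - 2, -1, -1):
    for i in range(N):
        beta[t][i] = sum(A[i][j] * B[j][obs[t + 1]] * beta[t + 1][j]
                         for j in range(N))
total = sum(alpha[T - 1][i] for i in range(N))   # P(entire string)

# xi[t][i][j]: expected transition i -> j at time t, normalized by P(string).
xi = [[[alpha[t][i] * A[i][j] * B[j][obs[t + 1]] * beta[t + 1][j] / total
        for j in range(N)] for i in range(N)] for t in range(T - 1)]

# gamma[t][i]: probability of all links emanating from state i at time t.
gamma = [[sum(xi[t][i][j] for j in range(N)) for i in range(N)]
         for t in range(T - 1)]

# Re-estimate: expected transitions i -> j divided by expected visits to i.
A_new = [[sum(xi[t][i][j] for t in range(T - 1)) /
          sum(gamma[t][i] for t in range(T - 1))
          for j in range(N)] for i in range(N)]
for row in A_new:
    print([round(p, 4) for p in row])
```

By construction each row of `A_new` sums to 1, so the re-estimated matrix is again a valid transition matrix; iterating this update (together with the analogous emission update, omitted here) is one Baum–Welch pass.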
The α-HMM estimation algorithm by Yasuo Matsuyama is a generalized version of the Baum–Welch algorithm derived from the α-expectation-maximization (α-EM) algorithm. This algorithm utilizes past information so that convergence is sped up.
- jhmm implementation in Java.
- HMMFit function in the RHmm package for R.
- ghmm C library with Python bindings that supports both discrete and continuous emissions.
The algorithm was introduced in the paper:
- L. E. Baum, T. Petrie, G. Soules, and N. Weiss, "A maximization technique occurring in the statistical analysis of probabilistic functions of Markov chains", Ann. Math. Statist., vol. 41, no. 1, pp. 164–171, 1970.
The Shannon Lecture by Welch, which speaks to how the algorithm can be implemented efficiently:
- Hidden Markov Models and the Baum–Welch Algorithm, IEEE Information Theory Society Newsletter, Dec. 2003.
An alternative to the Baum–Welch algorithm, the Viterbi Path Counting algorithm:
- R. I. A. Davis, B. C. Lovell, "Comparing and evaluating HMM ensemble training algorithms using train and test and condition number criteria", Pattern Analysis and Applications, vol. 6, no. 4, pp. 327–336, 2003.
- Matsuyama, Yasuo (2011). "Hidden Markov model estimation based on alpha-EM algorithm: Discrete and continuous alpha-HMMs". International Joint Conference on Neural Networks: 808–816.
- Matsuyama, Yasuo (2003). "The α-EM algorithm: Surrogate likelihood maximization using α-logarithmic information measures". IEEE Transactions on Information Theory 49 (3): 692–706.
- An Interactive Spreadsheet for Teaching the Forward-Backward Algorithm (spreadsheet and article with step-by-step walkthrough)
- Formal derivation of the Baum–Welch algorithm
- Implementation of the Baum–Welch algorithm