Friday, 2020-02-28 BioInfo Pakistan
Posted: 2011-08-21, 11:51 AM
Forward-backward Algorithm

1. Description

The forward-backward algorithm is an inference algorithm for hidden Markov models. Given a sequence of observations (emissions) o1 o2 ... oK, it computes the posterior marginal distribution P(qk = si | o1 o2 ... oK) for every hidden state variable qk. This inference task is usually called smoothing. The algorithm uses the principle of dynamic programming to compute the values required for the posterior marginals in two passes: the first pass goes forward in time, the second backward in time; hence the name forward-backward algorithm.

2. Evaluation problem

Given the HMM M = (A, B, π) and the observation sequence O = o1 o2 ... oK, calculate the probability that model M generated sequence O. Computing P(O) by summing over all hidden state sequences (as was done in the example) is impractical: there are N^K such sequences, giving exponential complexity. The forward-backward recursions perform the same calculation efficiently.

Define the forward variable αk(i) as the joint probability of the partial observation sequence o1 o2 ... ok and the event that the hidden state at time k is si:

    αk(i) = P(o1 o2 ... ok, qk = si)

3. Trellis representation of an HMM

(The trellis figure is not reproduced here: one column of states s1 ... sN per time step k = 1 ... K, with the recursions below sweeping across the columns.)

4. Forward recursion

Initialization:

    α1(i) = P(o1, q1 = si) = πi bi(o1),   1 <= i <= N

Forward recursion:

    αk+1(j) = P(o1 o2 ... ok+1, qk+1 = sj)
            = Σi P(o1 o2 ... ok+1, qk = si, qk+1 = sj)
            = Σi P(o1 o2 ... ok, qk = si) aij bj(ok+1)
            = [Σi αk(i) aij] bj(ok+1),   1 <= j <= N, 1 <= k <= K-1

Termination:

    P(o1 o2 ... oK) = Σi P(o1 o2 ... oK, qK = si) = Σi αK(i)

Complexity: on the order of N^2 K operations, instead of the exponential cost of direct enumeration.

5. Backward recursion

Define the backward variable βk(i) as the conditional probability of the partial observation sequence ok+1 ok+2 ... oK given that the hidden state at time k is si:

    βk(i) = P(ok+1 ok+2 ... oK | qk = si)

Initialization:

    βK(i) = 1,   1 <= i <= N
Backward recursion:

    βk(j) = P(ok+1 ok+2 ... oK | qk = sj)
          = Σi P(ok+1 ok+2 ... oK, qk+1 = si | qk = sj)
          = Σi aji bi(ok+1) P(ok+2 ok+3 ... oK | qk+1 = si)
          = Σi aji bi(ok+1) βk+1(i),   1 <= j <= N, 1 <= k <= K-1

Termination:

    P(o1 o2 ... oK) = Σi P(o1 o2 ... oK, q1 = si)
                    = Σi P(o1 o2 ... oK | q1 = si) P(q1 = si)
                    = Σi πi bi(o1) β1(i)

Combining the two passes yields the smoothed posterior marginals described in section 1:

    P(qk = si | o1 o2 ... oK) = αk(i) βk(i) / P(o1 o2 ... oK)
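The recursions above translate almost line for line into NumPy. The sketch below is illustrative, not part of the original post: the function names (forward_backward, brute_force_likelihood) and the toy two-state, three-symbol model numbers are my own assumptions. It runs the forward pass, the backward pass, combines them into the smoothed posterior marginals, and cross-checks P(O) against the exponential brute-force sum over all N^K hidden state sequences from the evaluation problem.

```python
import itertools
import numpy as np

def forward_backward(pi, A, B, obs):
    """Smoothed posteriors for an HMM (names and shapes are illustrative).

    pi  : (N,)   initial distribution, pi[i] = P(q_1 = s_i)
    A   : (N, N) transitions, A[i, j] = a_ij = P(q_{k+1} = s_j | q_k = s_i)
    B   : (N, M) emissions,   B[i, o] = b_i(o)
    obs : list of K observation indices o_1 ... o_K
    """
    K, N = len(obs), len(pi)

    # Forward pass: alpha[k-1, i] holds alpha_k(i) (0-based rows).
    alpha = np.zeros((K, N))
    alpha[0] = pi * B[:, obs[0]]                      # alpha_1(i) = pi_i b_i(o_1)
    for k in range(1, K):
        alpha[k] = (alpha[k - 1] @ A) * B[:, obs[k]]  # [sum_i alpha_k(i) a_ij] b_j(o_{k+1})

    # Backward pass: beta[k-1, i] holds beta_k(i).
    beta = np.zeros((K, N))
    beta[-1] = 1.0                                    # beta_K(i) = 1
    for k in range(K - 2, -1, -1):
        beta[k] = A @ (B[:, obs[k + 1]] * beta[k + 1])

    likelihood = alpha[-1].sum()                      # P(O) = sum_i alpha_K(i)
    gamma = alpha * beta / likelihood                 # P(q_k = s_i | O): smoothing
    return gamma, likelihood

def brute_force_likelihood(pi, A, B, obs):
    """P(O) by enumerating all N^K hidden state sequences (exponential cost)."""
    N, K = len(pi), len(obs)
    total = 0.0
    for path in itertools.product(range(N), repeat=K):
        p = pi[path[0]] * B[path[0], obs[0]]
        for k in range(1, K):
            p *= A[path[k - 1], path[k]] * B[path[k], obs[k]]
        total += p
    return total

# Toy 2-state, 3-symbol HMM; all numbers are made up for illustration.
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3],
              [0.4, 0.6]])
B = np.array([[0.5, 0.4, 0.1],
              [0.1, 0.3, 0.6]])
obs = [0, 1, 2]

gamma, likelihood = forward_backward(pi, A, B, obs)
print(likelihood)            # agrees with brute_force_likelihood(pi, A, B, obs)
print(gamma.sum(axis=1))     # each row sums to 1: one distribution per time step
```

Since both terminations give the same P(O), comparing Σi αK(i) with the brute-force sum is a convenient sanity check when implementing the recursions. Note that for long sequences the raw products underflow, so practical implementations rescale α and β at each step or work in log space.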
Category: Algorithms | Views: 1770 | Added by: Ansari | Rating: 0.0/0