We are now ready to discuss the most difficult challenge in applications of Hidden Markov Models: how to find the parameters of the HMM. So far we have assumed that the transition and emission probabilities of the HMM are known. For example, when you walked into the crooked casino, you knew that the dealer changes coins with probability 0.1, and you also knew the biases of both the fair and the biased coin. However, imagine that you only know that the dealer is using two coins, and nothing else: nothing about the biases of the fair and biased coins, and nothing about the probability of switching coins. Suppose you observe this sequence. How can you figure out the biases of the coins and how often the dealer changes the coin? Can we develop an algorithm for parameter estimation for an arbitrary HMM?

Let's start answering this difficult question with the unlikely scenario in which the dealer reveals the hidden path. In this case we face the HMM Parameter Estimation Problem: find optimal parameters explaining the emitted string and the hidden path. The hidden path and the emitted string are given at the bottom of the slide. How would we derive the transition and emission probabilities? In the case when the hidden path is known, let's define T(l, k) as the number of transitions from state l to state k in the hidden path π. For example, in this case the number of transitions from B to F is equal to five, and common sense suggests that the transition probability from state l to state k can be defined as the number of transitions from state l to state k divided by the number of all transitions out of state l; in this case it is 5/9.

Now that we have estimated the probability of a transition from state l to state k, let's estimate the emission probability of symbol b from state k. To do this, let's define the parameter E_k(b) as the number of times symbol b is emitted when the path π is in state k. In our case, E_F(T) = 6.
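The counting scheme described above can be sketched in a few lines of code. This is a minimal illustration, not the course's reference implementation; the state names ("F", "B") and symbols ("H", "T") are placeholders for the coin example:

```python
def estimate_parameters(x, path, states, alphabet):
    """HMM parameter estimation from a KNOWN hidden path:
    transition prob = (# transitions l -> k) / (# transitions out of l),
    emission prob   = (# times k emits b)   / (# symbols emitted in k)."""
    trans = {l: {k: 0 for k in states} for l in states}
    emit = {k: {b: 0 for b in alphabet} for k in states}
    # Count transitions T(l, k) along the hidden path.
    for i in range(len(path) - 1):
        trans[path[i]][path[i + 1]] += 1
    # Count emissions E_k(b): which state emitted which symbol.
    for state, symbol in zip(path, x):
        emit[state][symbol] += 1
    # Normalize counts into probabilities (uniform if a state never occurs).
    for l in states:
        total = sum(trans[l].values())
        for k in states:
            trans[l][k] = trans[l][k] / total if total else 1 / len(states)
    for k in states:
        total = sum(emit[k].values())
        for b in alphabet:
            emit[k][b] = emit[k][b] / total if total else 1 / len(alphabet)
    return trans, emit
```

On the slide's example, the same counting would give a B-to-F transition probability of 5/9 and an emission probability E_F(T) of 6/11.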
And common sense again suggests defining the emission probability of symbol b from state k as the number of times symbol b is emitted in state k divided by the total number of symbols emitted in state k; in our case this is 6/11. It turns out that the transition and emission probabilities defined in this way give an optimal solution of the HMM Parameter Estimation Problem.

However, the real problem we are trying to solve is the one where the dealer reveals neither the hidden path nor the parameters, which results in the HMM Parameter Learning Problem, a more difficult problem than the HMM Parameter Estimation Problem. In the HMM Parameter Learning Problem, we need to estimate the parameters of an HMM explaining an emitted string, when only the emitted string is given and the hidden path is not.

So let's try the following approach to solving this problem. The only given is the emitted string, but let's start from an arbitrary wild guess of the parameters. As soon as we have guessed the parameters, we can derive a hidden path by solving the Decoding Problem. As soon as the hidden path is known, we can pretend that we don't know the parameters, and solve the HMM Parameter Estimation Problem, deriving a new set of parameters. Afterwards, we can once again forget about the hidden path we have constructed and use the emitted string and the new set of parameters to derive a new hidden path. And iterate. This algorithm is called Viterbi learning for HMMs.

The deficiency of Viterbi learning is that it is based on the Viterbi algorithm, which gives a strict yes-or-no answer to the question: was the HMM in state k at time i, given that it emitted string x? This question fails to account for how certain we are in the yes-or-no answer. In the next section we will try to design a soft answer to this question and design a new parameter learning algorithm that improves on Viterbi learning.