Hidden Markov Models: A Hidden Markov Model (HMM) is a Markov process that is split into two components: an observable component and an unobservable, or "hidden", component that follows a Markov process.
HMMs naturally describe settings in which a stochastic system is observed through noisy measurements; for instance, stock prices that are affected by an unobserved economic factor.
A basic HMM contains the following components:
▶ A set of $N$ states $S = \{s_1, \dots, s_N\}$
▶ A transition probability matrix $P$
▶ A sequence of $T$, possibly vector-valued, observations $Y_T = \{y_1, \dots, y_T\}$
▶ A set of observation likelihoods $f(y_t \mid s_t = i)$ for each $i = 1, \dots, N$
▶ An initial probability distribution $\pi = \{\pi_1, \dots, \pi_N\}$.
An important assumption is that the hidden Markov process is independent of past observations $Y_{t-1}$, i.e., $P\{s_t = j \mid s_{t-1} = i, Y_{t-1}\} = P\{s_t = j \mid s_{t-1} = i\} = p_{ij}$.
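As an illustration of these components, here is a minimal sketch in Python (using NumPy) of a two-state HMM, assuming Gaussian observation likelihoods with state-dependent means and a common standard deviation; all numerical values and names are hypothetical.

import numpy as np

# Illustrative two-state example (N = 2); all numbers are hypothetical.
N = 2                                    # number of hidden states
P = np.array([[0.95, 0.05],              # transition matrix, P[i, j] = p_ij
              [0.10, 0.90]])
pi = np.array([0.5, 0.5])                # initial state distribution
mu = np.array([0.05, -0.02])             # state-dependent observation means
sigma = 0.10                             # common observation standard deviation

def observation_likelihood(y, i):
    # f(y_t | s_t = i) under the assumed Gaussian observation density
    return np.exp(-0.5 * ((y - mu[i]) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))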
Estimation basics of a Hidden Markov Model
Our interest lies in making inferences about the probability of being in each state $s_i$ at each date $t$, as well as estimating the model parameters: the transition matrix $P$, the initial distribution $\pi$, and the vector
$(\mu_1, \dots, \mu_N, \sigma)$.
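To make this inference concrete, the following sketch of a forward (Hamilton-type) filtering recursion computes the filtered probabilities $P\{s_t = i \mid Y_t\}$ and the sample log-likelihood, assuming the Gaussian observation likelihoods from the sketch above; it is an illustrative sketch, not a full estimation routine.

import numpy as np

def hamilton_filter(y, P, pi, mu, sigma):
    # Forward (filtering) recursion for an N-state HMM with Gaussian
    # observation densities f(y_t | s_t = i); returns the filtered state
    # probabilities P(s_t = i | Y_t) and the log-likelihood of the sample.
    T, N = len(y), len(pi)
    filtered = np.zeros((T, N))
    log_lik = 0.0
    pred = np.asarray(pi, dtype=float)       # P(s_1 = i) = pi_i
    for t in range(T):
        dens = np.exp(-0.5 * ((y[t] - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))
        joint = pred * dens                  # P(s_t = i | Y_{t-1}) * f(y_t | s_t = i)
        marginal = joint.sum()               # f(y_t | Y_{t-1}; theta)
        log_lik += np.log(marginal)          # contribution to the log-likelihood
        filtered[t] = joint / marginal       # P(s_t = i | Y_t)
        pred = filtered[t] @ P               # P(s_{t+1} = j | Y_t)
    return filtered, log_lik

# Example usage with the illustrative parameters from the sketch above:
# filtered, log_lik = hamilton_filter(np.random.normal(0.0, 0.1, 250), P, pi, mu, sigma)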
Collecting all the parameters in a vector $\theta$, the log-likelihood function of the process becomes: