Matlab implementation of standard hidden Markov models (HMMs) with continuous emissions, and time dependent HMMs which allow the parameters to vary with time. The time varying hidden Markov model has been developed from the work of Huang et al. and applied in "Monitoring Depression in Bipolar Disorder using Circadian Measures from Smartphone Accelerometers".

References:
- Finkenstädt, "Hidden Markov models for monitoring circadian rhythmicity in telemetric activity data," J.
- De Vos, "Monitoring Depression in Bipolar Disorder using Circadian Measures from Smartphone Accelerometers".

Functions

- Baum-Welch algorithm - function to determine HMM parameters from an unsupervised set of observations.
- Time Varying Transition Probability Baum-Welch - Baum-Welch algorithm with time varying transition probabilities.
- Viterbi algorithm - function to determine the most likely sequence of hidden states from a set of observations and HMM parameters.
- Time Varying Viterbi Algorithm - function to determine the most likely sequence of hidden states from a set of observations and time varying HMM parameters.

Inputs

- State Sequences - a set of sequences of observed (non-hidden) states; each sequence may be of variable length.
- Observation Sequences - a corresponding set of observation sequences; each sequence contains O observations and may be of variable length. Observations may be continuous or discrete.
- A - transition matrix of probabilities: an N by N matrix of transitions between each state, or N by N by T for time varying transitions between each state.

Output

- Sequence of most likely hidden states from the Viterbi algorithm for each of the observation sequences for testing.

Examples

Example code which attempts to determine active and inactive periods from accelerometry data. First, a single continuous observation (total acceleration) is used to predict active and inactive periods. Second, three continuous observations (the three orthogonal components of acceleration) are used to predict active and inactive periods as well as a third intermediate activity period. Finally, a single continuous observation is used with a time varying covariate, which allows the transition probabilities to vary with time.

Example 1 - Estimation of sleep/wake (rest/active) from single total acceleration observations (N=2)

Figures:
- Total acceleration data and transformed data (log differences).
- Gaussian distributions for the two hidden states (active and inactive) after running the Baum-Welch algorithm.
- Viterbi sequence of hidden states with the observation data, and the probability of each state at each time point. The inactive state is marked by the blue area and the active state by the red area.

Example 2 - Estimation of sleep/wake (rest/active) from the three orthogonal acceleration components which make up Example 1 (N=2)

Figures:
- Acceleration data in each of the three orthogonal directions.
- Probability of the inactive state (red) and active state (blue) at each time point using multivariate acceleration observations.
- Three state HMM with three orthogonal acceleration observations.
- Probability of the inactive state (yellow), partially active state (red) and active state (blue) at each time point using multivariate acceleration observations.

Example 3 - Estimation of sleep/wake (rest/active) from single total acceleration observations (N=2) with a time varying covariate

Two time varying series are used as covariates: a sine wave and a cosine wave with 24 hour period, which together ensure no repeating covariate values within each 24 hour period.

Figures:
- Total acceleration data and transformed data (log differences) with Viterbi sequence.
- Probability of the inactive and active states at each time point, with the average weekly profile below.
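The examples transform the raw signal before fitting: total acceleration is reduced to log differences, and the time varying example adds sine and cosine covariates with a 24 hour period. A Python sketch of that preprocessing (the repository itself is MATLAB; the `eps` floor guarding against log of zero, and interpreting "log differences" as differences of the logged signal, are assumptions of this sketch):

```python
import math

def total_acceleration(ax, ay, az):
    """Euclidean magnitude of the three orthogonal acceleration components."""
    return [math.sqrt(x * x + y * y + z * z) for x, y, z in zip(ax, ay, az)]

def log_differences(a, eps=1e-6):
    """Differences of the logged signal; eps avoids log(0) on still periods."""
    return [math.log(a[t + 1] + eps) - math.log(a[t] + eps)
            for t in range(len(a) - 1)]

def circadian_covariates(times_hours):
    """Sine and cosine with 24 h period: the pair identifies each time of day
    uniquely, so covariate values do not repeat within a 24 hour period."""
    return [(math.sin(2 * math.pi * t / 24), math.cos(2 * math.pi * t / 24))
            for t in times_hours]
```

The sine/cosine pair is needed because either wave alone takes each value twice per day; together they map time of day one-to-one onto the unit circle.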
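The Baum-Welch function estimates HMM parameters from unlabelled observations, as in the two-state Gaussian fit of Example 1. A self-contained Python sketch with scaled forward-backward recursions (the quartile initialisation, variance floor, and fixed iteration count are choices of this sketch, not taken from the repository):

```python
import math

def gauss_pdf(x, mu, sd):
    """Gaussian density, floored so far-off observations never return 0."""
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi)) + 1e-300

def baum_welch_gaussian(obs, n_iter=50):
    """EM (Baum-Welch) for a 2-state HMM with Gaussian emissions.
    Returns (pi, A, means, sds)."""
    N, T = 2, len(obs)
    srt = sorted(obs)
    mu = [srt[T // 4], srt[3 * T // 4]]              # crude split of the data
    sd = [max((srt[-1] - srt[0]) / 4.0, 1e-2)] * 2
    A = [[0.9, 0.1], [0.1, 0.9]]
    pi = [0.5, 0.5]
    for _ in range(n_iter):
        B = [[gauss_pdf(obs[t], mu[i], sd[i]) for i in range(N)] for t in range(T)]
        # forward pass, normalising alpha at every step to avoid underflow
        alpha = [[pi[i] * B[0][i] for i in range(N)]]
        alpha[0] = [a / sum(alpha[0]) for a in alpha[0]]
        for t in range(1, T):
            a = [sum(alpha[t - 1][i] * A[i][j] for i in range(N)) * B[t][j] for j in range(N)]
            s = sum(a)
            alpha.append([x / s for x in a])
        # backward pass, normalised the same way
        beta = [[1.0] * N for _ in range(T)]
        for t in range(T - 2, -1, -1):
            b = [sum(A[i][j] * B[t + 1][j] * beta[t + 1][j] for j in range(N)) for i in range(N)]
            s = sum(b)
            beta[t] = [x / s for x in b]
        # state posteriors gamma and pairwise posteriors xi, normalised per t
        gamma = []
        for t in range(T):
            g = [alpha[t][i] * beta[t][i] for i in range(N)]
            s = sum(g)
            gamma.append([x / s for x in g])
        num = [[0.0] * N for _ in range(N)]
        den = [0.0] * N
        for t in range(T - 1):
            xi = [[alpha[t][i] * A[i][j] * B[t + 1][j] * beta[t + 1][j]
                   for j in range(N)] for i in range(N)]
            s = sum(sum(row) for row in xi)
            for i in range(N):
                den[i] += gamma[t][i]
                for j in range(N):
                    num[i][j] += xi[i][j] / s
        # M-step: re-estimate initial, transition and emission parameters
        pi = gamma[0][:]
        A = [[num[i][j] / den[i] for j in range(N)] for i in range(N)]
        w = [sum(gamma[t][i] for t in range(T)) for i in range(N)]
        mu = [sum(gamma[t][i] * obs[t] for t in range(T)) / w[i] for i in range(N)]
        sd = [max(math.sqrt(sum(gamma[t][i] * (obs[t] - mu[i]) ** 2
                                for t in range(T)) / w[i]), 1e-2) for i in range(N)]
    return pi, A, mu, sd
```

On well-separated data (e.g. transformed acceleration that clusters around distinct rest and active levels), the two Gaussian means converge to the cluster centres and the diagonal of A captures how sticky each state is.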
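The Viterbi function decodes the most likely hidden-state sequence given fitted parameters. A log-space Python sketch (log space avoids the underflow that products of probabilities cause on long accelerometry sequences; the Gaussian log-likelihood setup in the comment is illustrative, not the repository's interface):

```python
def viterbi(obs_loglik, log_A, log_pi):
    """Most likely hidden-state sequence.

    obs_loglik[t][i] - log-likelihood of observation t under state i
    log_A[i][j]      - log transition probability from state i to state j
    log_pi[i]        - log initial probability of state i
    """
    T, N = len(obs_loglik), len(log_pi)
    delta = [log_pi[i] + obs_loglik[0][i] for i in range(N)]
    back = []
    for t in range(1, T):
        new_delta, ptr = [], []
        for j in range(N):
            best = max(range(N), key=lambda i: delta[i] + log_A[i][j])
            ptr.append(best)
            new_delta.append(delta[best] + log_A[best][j] + obs_loglik[t][j])
        back.append(ptr)
        delta = new_delta
    # backtrack from the best final state through the stored pointers
    path = [max(range(N), key=lambda i: delta[i])]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    path.reverse()
    return path

# Example: two states (inactive, mean 0; active, mean 3) with sticky
# transitions decode observations [0.1, 0.2, 2.9, 3.1, 0.0] as [0, 0, 1, 1, 0].
```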
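In the time varying variants, A is N by N by T, so each step of the Baum-Welch and Viterbi recursions simply indexes the transition matrix for that time step. One common way to generate such matrices from the sine/cosine covariates is a multinomial-logistic (softmax) link per "from" state; this parameterisation is an assumption of the sketch, and the repository's exact link function may differ:

```python
import math

def softmax(v):
    """Numerically stable softmax of a list of scores."""
    m = max(v)
    e = [math.exp(x - m) for x in v]
    s = sum(e)
    return [x / s for x in e]

def time_varying_transitions(covariates, W, b):
    """Build one N x N transition matrix per time step from covariates.

    For each 'from' state i, row A[t][i] is a softmax over 'to' states of a
    linear function of the covariate vector at time t. W[i][j] holds one
    weight per covariate and b[i][j] a bias; both are hypothetical parameter
    layouts chosen for this sketch.
    """
    A_t = []
    for z in covariates:                           # z: covariates at time t
        A = []
        for i in range(len(b)):                    # from-state
            scores = [b[i][j] + sum(w * x for w, x in zip(W[i][j], z))
                      for j in range(len(b[i]))]   # score per to-state
            A.append(softmax(scores))
        A_t.append(A)
    return A_t
```

Because each row passes through a softmax, every per-step matrix is a valid stochastic matrix, while the 24 hour sine/cosine covariates make the transition probabilities trace a daily cycle.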