We consider the problem of sequential detection of changes in general time series, where the observations may be dependent and non-identically distributed, e.g., follow Markov, hidden Markov, or even more general stochastic models. It is assumed that the pre-change model is completely known, whereas the post-change model contains an unknown (possibly vector) parameter. Imposing a prior distribution on the unknown post-change parameter, we design a mixture Shiryaev-Roberts change detection procedure so that the maximal local probability of a false alarm (MLPFA) in a prespecified time window does not exceed a given level. We show that this procedure is nearly optimal as the MLPFA goes to zero, in the sense of minimizing the expected delay to detection uniformly over all change points, under very general conditions. These conditions are formulated in terms of the rate of convergence in the strong law of large numbers for the log-likelihood ratios between the “change” and “no-change” hypotheses. An example related to a multivariate Markov model where these conditions hold is given.
Asymptotic optimality | Change-point detection | Composite post-change hypothesis | Quickest detection | Weighted Shiryaev-Roberts procedure
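The mixture (weighted) Shiryaev-Roberts statistic mentioned above can be sketched as follows. This is a minimal illustration only, assuming an i.i.d. Gaussian mean-shift model with a discrete prior on the unknown post-change mean, not the general non-i.i.d. setting treated in the paper; the function name, the candidate parameter grid, and the threshold are hypothetical choices for the sketch:

```python
import math

def mixture_sr_stopping_time(obs, thetas, weights, threshold):
    """Stop at the first time the mixture Shiryaev-Roberts statistic
    crosses `threshold`; return None if it never does.

    Illustrative model (an assumption, not the paper's general setup):
    pre-change observations are N(0,1), post-change are N(theta,1)
    with theta unknown; `weights` is a discrete prior on `thetas`.
    """
    # Per-theta SR recursion: R_n(theta) = (1 + R_{n-1}(theta)) * LR_n(theta),
    # with Gaussian likelihood ratio LR_n(theta) = exp(theta*x_n - theta^2/2).
    R = [0.0 for _ in thetas]
    for n, x in enumerate(obs, start=1):
        R = [(1.0 + r) * math.exp(th * x - 0.5 * th * th)
             for r, th in zip(R, thetas)]
        # Mixture SR statistic: prior-weighted average over theta.
        R_mix = sum(w * r for w, r in zip(weights, R))
        if R_mix >= threshold:
            return n
    return None
```

For instance, on a deterministic stream of twenty pre-change values 0.0 followed by thirty post-change values 1.0, the statistic stays small before the change and grows geometrically after it, so the procedure stops shortly after time 20. The threshold would in practice be calibrated so that the MLPFA over the prespecified window does not exceed the given level.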