Improving De-noising Methods for Stochastic Differential Equations

Comparison of probability distributions of A and τ determined by least-square fit and maximum likelihood analysis. We plotted the estimated probability distributions from the samples using Gaussian kernel density estimation (KDE implementation in SciPy: scipy.stats.gaussian_kde). The least-square estimates for τ are significantly wider than the Bayesian estimates. (a) Histogram of A; (b) histogram of decay time τ.
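As a minimal sketch of how the plotted distributions can be produced, the snippet below applies scipy.stats.gaussian_kde to a set of parameter samples. The samples here are synthetic placeholders; in the actual analysis they would be the τ estimates obtained from the least-square or maximum likelihood procedure.

```python
import numpy as np
from scipy.stats import gaussian_kde

# Placeholder samples standing in for estimates of the decay time tau;
# in practice these come from repeated fits or from a posterior sampler.
rng = np.random.default_rng(0)
tau_samples = rng.normal(loc=1.0, scale=0.05, size=5000)

# Gaussian kernel density estimate of the sample distribution
kde = gaussian_kde(tau_samples)

# Evaluate the estimated density on a grid (e.g., for plotting)
grid = np.linspace(0.7, 1.3, 200)
density = kde(grid)
```

The resulting `density` array can be passed directly to a plotting routine to draw the smooth curves shown in the figure.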

One area of active inquiry by our group is the development of algorithms to ‘unmix’ the noise in stochastic signals, so as to clean the signal in situations where we have no known ground truth.

We previously developed a Bayesian approach to estimate parameters from time traces that originate from an overdamped Brownian particle in a harmonic potential, i.e., an Ornstein-Uhlenbeck (OU) process. We showed that least-square fitting of the autocorrelation function, which is the standard way of analyzing such data, significantly underestimates the confidence intervals of the fitted parameters. Here, we develop a rigorous maximum likelihood theory that properly captures the underlying statistics. From the analytic solution, we found that there exists an optimal measurement spacing (Δt = 0.7968τ) that maximizes the statistical accuracy of the estimate of the decay time τ for a fixed number of samples N; this spacing plays a role similar to that of the Nyquist-Shannon theorem for the OU process.

To support our claims, we simulated time series and then applied both least-square fitting and our maximum likelihood method. Our results suggest that applying least-squares to autocorrelation functions is quite dangerous, both in terms of systematic deviations from the true parameter values and an order-of-magnitude underestimation of confidence intervals. To see whether our findings apply to other methods in which autocorrelation functions are typically fitted by least-squares, we explored the analysis of membrane fluctuations and fluorescence correlation spectroscopy. In both cases, least-square fits exhibit systematic deviations from the true parameter values and significantly underestimate their confidence intervals, which emphasizes the need for proper maximum likelihood approaches for such methods. In summary, our results have strong implications for parameter estimation for any process whose autocorrelation function is a single exponential decay. Our analysis can be applied directly to single-component dynamic light scattering experiments or optical trap calibration experiments.
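To make the workflow above concrete, here is a minimal sketch of the simulation-and-fit pipeline: an OU trace is generated with an exact discrete update, sampled near the spacing Δt ≈ 0.7968τ, and its empirical autocorrelation is fit by least squares with A·exp(−t/τ). All parameter values and the helper names (`simulate_ou`, `autocorr`) are illustrative assumptions, not the paper's implementation; note that the least-square step shown here is exactly the procedure whose confidence intervals the analysis finds to be badly underestimated, because residuals of an autocorrelation estimate are strongly correlated.

```python
import numpy as np
from scipy.optimize import curve_fit

def simulate_ou(n, dt, tau, amp, seed=0):
    """Exact-update simulation of an OU process with stationary
    variance `amp` and decay time `tau`, sampled at spacing `dt`."""
    rng = np.random.default_rng(seed)
    rho = np.exp(-dt / tau)
    x = np.empty(n)
    x[0] = rng.normal(scale=np.sqrt(amp))
    for i in range(1, n):
        x[i] = rho * x[i - 1] + rng.normal(scale=np.sqrt(amp * (1 - rho**2)))
    return x

def autocorr(x, max_lag):
    """Biased estimator of the autocovariance function up to max_lag."""
    x = x - x.mean()
    n = len(x)
    return np.array([np.dot(x[: n - k], x[k:]) / n for k in range(max_lag)])

# Simulate near the optimal spacing dt = 0.7968 * tau
tau_true, amp_true, dt = 1.0, 1.0, 0.7968
trace = simulate_ou(20000, dt, tau_true, amp_true)

# Least-square fit of A * exp(-t / tau) to the empirical autocorrelation
lags = np.arange(20) * dt
C = autocorr(trace, 20)
popt, pcov = curve_fit(lambda t, A, tau: A * np.exp(-t / tau),
                       lags, C, p0=[1.0, 1.0])
A_fit, tau_fit = popt
```

The uncertainties reported in `pcov` assume independent residuals, which does not hold for autocorrelation data; this is precisely why a proper maximum likelihood treatment is needed.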

For more information, see:

Estimation of parameters from time traces originating from an Ornstein-Uhlenbeck process.

Helmut H. Strey. Physical Review E. 2019 Dec;100(6-1):062142. doi: 10.1103/PhysRevE.100.062142.

