Let us consider a time series x(t). We assume that it can be decomposed into a deterministic part η (trend) and a random part ε,

$$x_i = \eta_i + \varepsilon_i,$$
where i are the indices of the moments in time at which the function x(t) has been measured. Extracting the deterministic part of a time series is important for the analysis of the deterministic sources of a signal. For each value x_i one can compute the moving average over 2k+1 points
$$u_i = \frac{1}{2k+1} \sum_{j=i-k}^{i+k} x_j,$$
where the x_j are measured at the time moments t_j, j = i-k, ..., i+k.
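As an illustration, a minimal Python sketch of this centered moving average might look as follows; the function name moving_average and the use of NumPy are assumptions for the example, not part of the original text or necessarily of the OptFinderML implementation.

```python
import numpy as np

def moving_average(x, k):
    """Centered moving average over 2k+1 points.

    Returns u_i = (1/(2k+1)) * sum_{j=i-k}^{i+k} x_j for every index i
    at which the full window fits inside the series.
    """
    x = np.asarray(x, dtype=float)
    window = 2 * k + 1
    kernel = np.ones(window) / window
    # mode="valid" keeps only positions where the whole window is available
    return np.convolve(x, kernel, mode="valid")
```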
Within the above-defined range of length 2k+1, the trend η is assumed to be a polynomial of order l in the variable t. Indexing the points of the range by j = -k, ..., k relative to its center,

$$\eta_j = \sum_{m=1}^{l+1} a_m\, t_j^{\,m-1}.$$
The parameters l and k must fulfill the relation l < 2k+1. To find the set of coefficients {a_m}_{m=1}^{l+1}, the least squares method is applied. The vector of coefficients is given by

$$\mathbf{a}^* = \left(A^T A\right)^{-1} A^T \mathbf{x},$$
where A is a (2k+1)×(l+1) matrix,

$$A = \begin{pmatrix}
1 & t_{-k} & t_{-k}^{2} & \cdots & t_{-k}^{l} \\
1 & t_{-k+1} & t_{-k+1}^{2} & \cdots & t_{-k+1}^{l} \\
\vdots & \vdots & \vdots & & \vdots \\
1 & t_{k} & t_{k}^{2} & \cdots & t_{k}^{l}
\end{pmatrix}.$$
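A hedged sketch of how such a design matrix and the least-squares coefficients could be assembled in Python is given below; build_design_matrix, fit_window and the uniform time step dt are illustrative assumptions.

```python
import numpy as np

def build_design_matrix(k, l, dt=1.0):
    """Design matrix A of shape (2k+1, l+1).

    Row j contains the powers 1, t_j, t_j^2, ..., t_j^l, where the time
    points t_j = j*dt, j = -k, ..., k, are taken relative to the window center.
    """
    t = np.arange(-k, k + 1) * dt
    return np.vander(t, N=l + 1, increasing=True)

def fit_window(x_window, A):
    """Least-squares coefficient vector a* = (A^T A)^{-1} A^T x."""
    a_star, *_ = np.linalg.lstsq(A, x_window, rcond=None)
    return a_star
```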
For j = 0, i.e. in the center of the averaging range, η is estimated by

$$\eta^* = a^*_1 = \left[\left(A^T A\right)^{-1} A^T \mathbf{x}\right]_1,$$

where x is the column vector of the 2k+1 measurements in the range. By analogy, the η* value for each i = k+1, ..., n-k is given by

$$\eta^*_i = a^*_1(i),$$

where a*(i) denotes the coefficient vector found for the averaging range centered at the i-th time point.
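Because the time points are centered (t_0 = 0), the trend estimate at the window center is simply the first coefficient. A sketch of the sliding-window application, reusing the hypothetical helpers from the previous sketch and 0-based array indices, could look like this:

```python
import numpy as np

def trend_interior(x, k, l, dt=1.0):
    """Trend estimates eta*_i for the interior points i = k, ..., n-1-k (0-based).

    For each window of length 2k+1 the polynomial is fitted by least squares
    and evaluated at the center, i.e. eta*_i = a*_1.
    """
    x = np.asarray(x, dtype=float)
    A = build_design_matrix(k, l, dt)      # defined in the previous sketch
    n = len(x)
    eta = np.empty(n - 2 * k)
    for i in range(k, n - k):
        a_star = fit_window(x[i - k:i + k + 1], A)
        eta[i - k] = a_star[0]             # polynomial value at t = 0
    return eta
```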
Edge effects are present in the case of the first and the last k time points, so separate estimators are needed there. They are given by the expressions

$$\eta^*_i = \sum_{m=1}^{l+1} a^*_m(k+1)\, t_{i-k-1}^{\,m-1}, \qquad i = 1, \dots, k,$$

$$\eta^*_i = \sum_{m=1}^{l+1} a^*_m(n-k)\, t_{i+k-n}^{\,m-1}, \qquad i = n-k+1, \dots, n,$$
where the a*(k+1) coefficients are found for the first averaging range (with the center at the (k+1)-th time point), while the a*(n-k) coefficients are found for the last averaging range (with the center at the (n-k)-th time point).
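A sketch of this edge handling: the polynomials fitted in the first and last windows are evaluated at the off-center offsets. The helper names and the 0-based indexing (window centers at indices k and n-1-k, corresponding to the (k+1)-th and (n-k)-th points of the text) are assumptions of the example.

```python
import numpy as np

def trend_edges(x, k, l, dt=1.0):
    """Trend estimates for the first and the last k points.

    The polynomial fitted in the first window (centered at index k) is
    evaluated at t = (i-k)*dt for i = 0, ..., k-1; the one fitted in the
    last window (centered at index n-1-k) at t = (i-(n-1-k))*dt.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    A = build_design_matrix(k, l, dt)
    a_first = fit_window(x[:2 * k + 1], A)
    a_last = fit_window(x[n - 2 * k - 1:], A)

    def poly(a, t):
        # evaluate a_1 + a_2*t + ... + a_{l+1}*t^l
        return sum(a[m] * t ** m for m in range(len(a)))

    left = [poly(a_first, (i - k) * dt) for i in range(k)]
    right = [poly(a_last, (i - (n - 1 - k)) * dt) for i in range(n - k, n)]
    return np.array(left), np.array(right)
```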
To estimate the variance of the x_j measurements in a range of length 2k+1 one can use the expression

$$s_x^2 = \frac{1}{2k-l} \sum_{j=-k}^{k} \left(x_j - \eta^*_j\right)^2,$$
where η*_j is given by

$$\eta^*_j = \sum_{m=1}^{l+1} a^*_m\, t_j^{\,m-1}, \qquad j = -k, \dots, k.$$
This leads to the conclusion that, at the confidence level 1-α,

$$P\!\left(\left|\eta^*_i - \eta_i\right| \le t_{1-\alpha/2}\, s_x \sqrt{\left[\left(A^T A\right)^{-1}\right]_{11}}\right) = 1-\alpha,$$

and thus the limits of the confidence ranges are

$$\eta_{\pm}(i) = \eta^*_i \pm t_{1-\alpha/2}\, s_x \sqrt{\left[\left(A^T A\right)^{-1}\right]_{11}},$$

where t_{1-α/2} is the quantile of Student's t distribution with 2k-l degrees of freedom. The real trend value lies between η_+(i) and η_-(i).
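A sketch of the confidence limits at the center of one window, assuming the standard least-squares result that the variance of the central estimate a*_1 is s_x^2 [(A^T A)^{-1}]_{11}; scipy.stats supplies the Student quantile, and the function name and reuse of the earlier hypothetical helpers are assumptions of the example.

```python
import numpy as np
from scipy import stats

def confidence_limits(x_window, A, k, l, alpha=0.05):
    """Trend estimate and confidence limits at the center of one window.

    Uses s_x^2 with 2k - l degrees of freedom and the assumed variance
    s_x^2 * [(A^T A)^{-1}]_{11} for the central estimate a*_1.
    """
    a_star = fit_window(x_window, A)       # from the earlier sketch
    residuals = x_window - A @ a_star
    dof = 2 * k - l
    s2 = np.sum(residuals ** 2) / dof      # s_x^2
    var_center = s2 * np.linalg.inv(A.T @ A)[0, 0]
    t_quant = stats.t.ppf(1 - alpha / 2, dof)
    half_width = t_quant * np.sqrt(var_center)
    eta_star = a_star[0]
    return eta_star, eta_star - half_width, eta_star + half_width
```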
Limits of the confidence ranges for the first and the last averaging range (i.e. for the j = i-k-1 and j = i+k-n time points, respectively) are given by
where
and
Package for machine learning - OptFinderML.