Nonlinear
Misc
- Gaps in the time variable can be a problem if you’re trying to interpolate between those gaps (see bkmk, method = "reml" + s(x, m = 1))
- Resources
- Nonlinear Time Series Analysis, Kantz, Schreiber (See R >> Documents >> Time Series)
- Packages
- Time Series Task View “Nonlinear Time Series Analysis”
- {nonlinearTseries} (Vignette) - Facilitates the computation of the most-used nonlinear statistics/algorithms including generalized correlation dimension, information dimension, largest Lyapunov exponent, sample entropy and Recurrence Quantification Analysis (RQA), among others. Basic routines for surrogate data testing are also included.
- {tsDyn}
- AR: standard linear AR (auto-regressive)
- SETAR: self-exciting threshold AR
- LSTAR: Logistic smooth transition AR
- NNET: neural-network
- AAR: additive AR
- {probcast} - Has function wrappers around gams, gamlss, and boosted gamlss models from {mgcv}, {mboost}, {gamlss}, etc. for use in forecasting. Supports high-dimensional dependency modeling based on Gaussian Copulas (paper, use case)
- {EristroPy} (Paper) - Bayesian optimization of Sample Entropy’s hyperparameters
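Sample Entropy itself is short enough to sketch. A toy pure-Python version for intuition (m = template length, r = tolerance; an illustration, not {EristroPy}’s implementation, which also differs slightly in how it counts templates):

```python
import math

def sample_entropy(x, m=2, r=None):
    """Sample Entropy: -ln(A/B), where B counts pairs of length-m
    templates within tolerance r (Chebyshev distance) and A counts
    pairs of length-(m+1) templates within r."""
    n = len(x)
    if r is None:
        mean = sum(x) / n
        sd = math.sqrt(sum((v - mean) ** 2 for v in x) / n)
        r = 0.2 * sd  # common default tolerance: 20% of the std dev

    def count_matches(length):
        templates = [x[i:i + length] for i in range(n - length + 1)]
        count = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                    count += 1
        return count

    B = count_matches(m)
    A = count_matches(m + 1)
    return -math.log(A / B) if A > 0 and B > 0 else float("inf")
```

Lower values mean a more regular, self-similar series: a sine wave scores well below a chaotic logistic map.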
- Copulas
- Also see Association, Copulas
- TUTORIAL: julia, copulas + ARMA model, example w/ exponential distribution - ARMA Forecasting for Non-Gaussian Time-Series Data Using Copulas | by Sarem Seitz | Jun, 2022 | Towards Data Science
- Issues
- When the size of the observed time-series becomes very large.
- In that case, the unconditional covariance matrix will scale poorly and the model fitting step will likely become impossible.
- Potential Solution: Implicit Copulas which define a Copula density through a chain of conditional densities
- MLE for distributions where the derivatives of their CDFs become complex
- The Exponential distribution’s is simple (used in the article)
- See article for potential solutions
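The copula mechanics behind the tutorial can be sketched without any libraries: couple two uniform marginals through a Gaussian copula, then push them through whatever inverse CDF you like (exponential marginals here, as in the article). A minimal pure-Python sketch, not the tutorial’s Julia code:

```python
import math
import random
from statistics import NormalDist

def gaussian_copula_pairs(rho, n, seed=42):
    """Draw n pairs of uniforms coupled by a Gaussian copula with
    correlation rho: sample bivariate normals, then map each margin
    back to (0, 1) with the standard normal CDF."""
    rng = random.Random(seed)
    nd = NormalDist()
    out = []
    for _ in range(n):
        z1 = rng.gauss(0.0, 1.0)
        z2 = rho * z1 + math.sqrt(1.0 - rho ** 2) * rng.gauss(0.0, 1.0)
        out.append((nd.cdf(z1), nd.cdf(z2)))
    return out

# Any marginal plugs in through its inverse CDF, e.g. Exponential(1):
pairs = gaussian_copula_pairs(rho=0.8, n=5000)
exp_pairs = [(-math.log(1 - u), -math.log(1 - v)) for u, v in pairs]
```

The dependence lives entirely in the copula (rho); the marginals are free to be non-Gaussian, which is the whole point for non-Gaussian time series.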
STL-ELM
- Seasonal Trend Decomposition using LOESS (STL) - Extreme Learning Machine (ELM)
- Hybrid univariate forecasting model for complex series (non-stationary, non-linear)
- The univariate series is decomposed into subseries using STL and each subseries is forecast using ELM, then those subseries forecasts are ensembled for the final forecast
- Each subseries is simpler and stationary
VMD-TDNN
- Variational Mode Decomposition (VMD) Based Time Delay Neural Network Model (TDNN)
- Hybrid univariate forecasting model for complex series (non-stationary, non-linear)
- The univariate series is decomposed into “modes” using VMD and each mode is forecast using TDNN, then those mode forecasts are recombined for the final forecast
- The modes are generated by Intrinsic Mode Functions (IMFs)
- Orthogonal to each other, stationary, and non-linear
- Think the recombination method is simply to sum the forecasts
- Misc
- The number of modes you choose is very important
- Methods for choosing the number of modes
- Central Frequency Method (CFM)
- Signal Difference Average (SDA)
Takens’ Embedding
A dynamical systems model that transforms the time series into a phase space whose dimensions are determined by multiples of lags of the time series. This transformation removes the autocorrelation between the data points and allows the series to be forecast.
- Kind of like an SVM, which maps the data into a higher-dimensional space
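The delay reconstruction itself is only a few lines. A pure-Python sketch of the kind of matrix an embedding like buildTakens() returns:

```python
def takens_embedding(ts, m, tau):
    """Delay embedding: point i of the m-dimensional phase space is
    (ts[i], ts[i + tau], ..., ts[i + (m - 1) * tau])."""
    n = len(ts) - (m - 1) * tau
    if n <= 0:
        raise ValueError("series too short for this m and tau")
    return [[ts[i + j * tau] for j in range(m)] for i in range(n)]
```

For example, a sine embedded with m = 2 and tau equal to a quarter period gives points (sin t, cos t), so the trajectory traces out a circle - the oscillator’s attractor.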
Misc
- {nonlinearTseries}
Parameters
d or τ - Called the time delay, this will tell us how many time lags each axis of the phase space will represent
# tau (time delay) estimation based on the average mutual information function
tau.ami = timeLag(ts, technique = "ami", lag.max = 100, do.plot = T)
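The heuristic behind the AMI technique: take the first lag at which the average mutual information between the series and its lagged copy stops decreasing. A pure-Python histogram sketch of the idea (an illustration, not {nonlinearTseries}’s estimator):

```python
import math

def avg_mutual_information(ts, lag, bins=16):
    """Histogram estimate of the mutual information I(x_t ; x_{t+lag})."""
    lo, hi = min(ts), max(ts)
    width = (hi - lo) / bins or 1.0

    def idx(v):
        return min(int((v - lo) / width), bins - 1)

    pairs = list(zip(ts[:-lag], ts[lag:]))
    n = len(pairs)
    joint, px, py = {}, {}, {}
    for a, b in pairs:
        i, j = idx(a), idx(b)
        joint[(i, j)] = joint.get((i, j), 0) + 1
        px[i] = px.get(i, 0) + 1
        py[j] = py.get(j, 0) + 1
    # sum over bins of p(i,j) * log( p(i,j) / (p(i) p(j)) )
    return sum(c / n * math.log(c * n / (px[i] * py[j]))
               for (i, j), c in joint.items())

def first_minimum_lag(ts, max_lag=100):
    """Heuristic for tau: the first lag where AMI stops decreasing."""
    prev = avg_mutual_information(ts, 1)
    for lag in range(2, max_lag + 1):
        cur = avg_mutual_information(ts, lag)
        if cur > prev:
            return lag - 1
        prev = cur
    return max_lag
```

AMI is preferred over the autocorrelation function here because it also captures nonlinear dependence between the series and its lags.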
m - Called the embedding dimension, this parameter will tell us the dimension of the phase space
# m (embedding dimension) uses the tau estimation
emb.dim = estimateEmbeddingDim(ts, time.lag = tau.ami, max.embedding.dim = 15)
- Estimated using Cao’s algorithm
Phase Space Embedding Matrix
- Where f(t) is the univariate time series
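In standard delay-coordinate form (each row is one point of the m-dimensional phase space, with delay τ and series length N):

$$
X =
\begin{bmatrix}
f(1) & f(1+\tau) & \cdots & f(1+(m-1)\tau) \\
f(2) & f(2+\tau) & \cdots & f(2+(m-1)\tau) \\
\vdots & \vdots & \ddots & \vdots \\
f(N-(m-1)\tau) & f(N-(m-2)\tau) & \cdots & f(N)
\end{bmatrix}
$$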
Build the Takens model

tak = buildTakens(ts, embedding.dim = emb.dim, time.lag = tau.ami)