
Overview
The long-memory dynamic factor model (LMDFM) algorithm performs (1) analysis of observed multiple (vector) time-series, (2) multi-step forecasts of multivariate (vector) time-series, and (3) multi-step forecasts of multivariate volatility (the variance-covariance matrix) of vector time-series.
The LMDFM assumes that a large set of time-series is influenced by the evolution histories of a number of unobserved factors that commonly affect all or many of the time-series. The LMDFM is estimated by an implementation of dynamic principal components analysis (DPCA), reviewed by Doz and Fuleky (2020), using a 2-dimensional discrete Fourier transform (2D-DFT). The LMDFM algorithm can estimate the influences of longer histories of the common factors.
The LMDFM algorithm estimates (a) the dynamic factor loading matrices, (b) the vector autoregressive (VAR) coefficients of the dynamic factor scores, and (c) multi-step forecasts of the multivariate values and the variance-covariance matrices of both the factor scores and the observed time-series.
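To give a concrete picture of outputs (a)-(c), here is a minimal NumPy sketch of the general structure a dynamic factor model with lagged loadings and a VAR on factor scores implies. It is purely illustrative: every quantity (loadings, VAR coefficients, score history, covariances) is a hypothetical placeholder, and this is not i4cast's implementation.

```python
# A minimal, illustrative sketch: observations are sums of lagged factor loadings
# applied to unobserved factor scores; the scores follow a VAR(1); multi-step
# forecasts of values and covariances are propagated through the VAR.
import numpy as np

rng = np.random.default_rng(0)
n_series, n_factors, n_lags, T = 8, 2, 3, 200

# Hypothetical "estimated" quantities; the LMDFM algorithm would produce these.
loadings = rng.normal(size=(n_lags, n_series, n_factors))   # (a) dynamic factor loadings
var_coef = 0.5 * np.eye(n_factors)                          # (b) VAR(1) coefficients of factor scores
scores = rng.normal(size=(T, n_factors))                    # factor-score history
score_cov = 0.1 * np.eye(n_factors)                         # innovation covariance of the scores

def forecast(scores, loadings, var_coef, score_cov, horizon):
    """Multi-step forecasts of factor scores, observed series, and their covariances."""
    f_hist = list(scores[-n_lags:])
    cov = np.zeros_like(score_cov)
    out_vals, out_covs = [], []
    for _ in range(horizon):
        f_next = var_coef @ f_hist[-1]                       # (c) propagate scores through the VAR
        cov = var_coef @ cov @ var_coef.T + score_cov
        f_hist.append(f_next)
        # observed forecast: sum of lagged loadings applied to current and past scores
        x_next = sum(loadings[s] @ f_hist[-1 - s] for s in range(n_lags))
        x_cov = loadings[0] @ cov @ loadings[0].T            # leading-order covariance term only
        out_vals.append(x_next)
        out_covs.append(x_cov)
    return np.array(out_vals), np.array(out_covs)

values, covs = forecast(scores, loadings, var_coef, score_cov, horizon=5)
print(values.shape, covs.shape)   # (5, 8) value forecasts and (5, 8, 8) covariance matrices
```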
Highlights
- WHY DFM? Dynamic factor models (DFMs) can be used to analyze and forecast large sets of time-series, such as measurements and indicators of national or multinational economies, prices of products or instruments traded continuously in markets, measurements and observations of natural or engineering processes, social or political trends, and sports scores. The evolution of these time-series is influenced by the evolution of a number of unobserved dynamic factors that commonly affect all or many of the series.
- WHY Long-Memory? The LMDFM algorithm estimates the influence of longer histories of dynamic common factors using concepts from DPCA (dynamic principal components analysis) together with the 2D or 1D forward and inverse DFT (discrete Fourier transform); a rough frequency-domain sketch follows this list. Such longer-memory estimates allow the LMDFM to accommodate a wider range of values for its model learning parameters, and the wider ranges can further enhance the power of machine learning.
- WHAT next? Many real-world large sets of time-series are nonstationary. In general, a filtering approach can be best for analysis and forecasting of nonstationary time-series. Bayesian filters are among the more adaptive filters: more powerful because they impose fewer restrictive conditions. Variational Bayesian filtering is the fastest of these. We, i4cast LLC, are an advanced developer of variational Bayesian filtering: we have listed the VBfFA algorithm here on AWS. We are now developing a long-memory dynamic factor model estimated by variational Bayesian filtering for better forecasts.
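To illustrate the frequency-domain idea behind the long-memory highlight, the sketch below applies classic Brillinger-style dynamic PCA with a 1D DFT and inverse DFT: estimate the spectral density matrix, take leading eigenvectors at each frequency, and map them back to lag-indexed filters. This is a simplified stand-in under stated assumptions, not the vendor's 2D-DFT estimator, and the phase alignment of eigenvectors across frequencies is glossed over.

```python
# Rough, illustrative frequency-domain dynamic PCA (Brillinger-style), not i4cast's method.
import numpy as np

def dynamic_pca_filters(x, n_factors, bandwidth=5):
    """x: (T, N) matrix of time-series. Returns lag-indexed filter weights, shape (T, n_factors, N)."""
    T, N = x.shape
    xc = x - x.mean(axis=0)
    dft = np.fft.fft(xc, axis=0)                              # 1D DFT over time for each series
    # raw cross-periodogram at each Fourier frequency: (T, N, N)
    periodogram = np.einsum('ti,tj->tij', dft, dft.conj()) / T
    # smooth over neighbouring frequencies to estimate the spectral density matrix
    kernel = np.ones(2 * bandwidth + 1) / (2 * bandwidth + 1)
    spec = np.apply_along_axis(
        lambda v: np.convolve(v, kernel, mode='same'), 0,
        periodogram.reshape(T, -1)).reshape(T, N, N)
    # leading eigenvectors of the spectral density at each frequency
    eigvecs = np.empty((T, n_factors, N), dtype=complex)
    for t in range(T):
        w, v = np.linalg.eigh(spec[t])
        eigvecs[t] = v[:, np.argsort(w)[::-1][:n_factors]].T
    # inverse DFT over frequencies gives lag filters (the "long memory");
    # eigenvector phase alignment across frequencies is glossed over here.
    return np.fft.ifft(eigvecs, axis=0).real

# usage: filters[k] holds the weights applied to x at lag k when forming factor scores
x = np.random.default_rng(1).normal(size=(256, 10))
filters = dynamic_pca_filters(x, n_factors=2)
print(filters.shape)   # (256, 2, 10)
```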
Details
Pricing
Free trial
- ...
| Dimension | Description | Cost/host/hour |
|---|---|---|
| ml.m5.xlarge Inference (Batch) Recommended | Model inference on the ml.m5.xlarge instance type, batch mode | $0.10 |
| ml.m5.xlarge Inference (Real-Time) Recommended | Model inference on the ml.m5.xlarge instance type, real-time mode | $0.10 |
| ml.m5.xlarge Training Recommended | Algorithm training on the ml.m5.xlarge instance type | $0.10 |
| ml.m4.4xlarge Inference (Batch) | Model inference on the ml.m4.4xlarge instance type, batch mode | $0.10 |
| ml.g4dn.4xlarge Inference (Batch) | Model inference on the ml.g4dn.4xlarge instance type, batch mode | $0.10 |
| ml.m5.4xlarge Inference (Batch) | Model inference on the ml.m5.4xlarge instance type, batch mode | $0.10 |
| ml.m4.16xlarge Inference (Batch) | Model inference on the ml.m4.16xlarge instance type, batch mode | $0.10 |
| ml.p3.16xlarge Inference (Batch) | Model inference on the ml.p3.16xlarge instance type, batch mode | $0.10 |
| ml.m5.2xlarge Inference (Batch) | Model inference on the ml.m5.2xlarge instance type, batch mode | $0.10 |
| ml.g4dn.2xlarge Inference (Batch) | Model inference on the ml.g4dn.2xlarge instance type, batch mode | $0.10 |
Vendor refund policy
We offer a full refund for academic work. Other refunds are offered according to common practice.
Delivery details
Amazon SageMaker algorithm
An Amazon SageMaker algorithm is a machine learning model that requires your training data to make predictions. Use the included training algorithm to generate your unique model artifact. Then deploy the model on Amazon SageMaker for real-time inference or batch processing. Amazon SageMaker is a fully managed platform for building, training, and deploying machine learning models at scale.
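For orientation, here is a hedged sketch of the typical train-then-deploy flow with the SageMaker Python SDK. The algorithm ARN, IAM role, S3 paths, and the 'training' channel name are placeholders and assumptions; consult the algorithm's usage information for the exact channel name, content type, and hyperparameters.

```python
# Hedged sketch: subscribe to the Marketplace algorithm, train a model artifact,
# then deploy it for real-time inference or run batch transform. Placeholders only.
import sagemaker
from sagemaker.algorithm import AlgorithmEstimator

session = sagemaker.Session()
estimator = AlgorithmEstimator(
    algorithm_arn='arn:aws:sagemaker:...:algorithm/lmdfm-placeholder',  # from your subscription
    role='arn:aws:iam::123456789012:role/SageMakerExecutionRole',       # placeholder IAM role
    instance_count=1,
    instance_type='ml.m5.xlarge',      # recommended training instance per the pricing table
    sagemaker_session=session,
)

# Train on the CSV input described under "Inputs" (rows = series, columns = time stamps).
estimator.fit({'training': 's3://your-bucket/lmdfm/input.csv'})

# Real-time inference endpoint ...
predictor = estimator.deploy(initial_instance_count=1, instance_type='ml.m5.xlarge')

# ... or batch transform for offline forecasts.
transformer = estimator.transformer(instance_count=1, instance_type='ml.m5.xlarge')
transformer.transform('s3://your-bucket/lmdfm/batch-input/', content_type='text/csv')
```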
Version release notes
The long-memory dynamic factor model (LMDFM) algorithm performs (1) analysis of observed multiple (vector) time-series, (2) multi-step forecasts of multivariate (vector) time-series, and (3) multi-step forecasts of multivariate volatility (the variance-covariance matrix) of vector time-series.
The LMDFM algorithm estimates (a) the dynamic factor loading matrices, (b) the vector autoregressive (VAR) coefficients of the dynamic factor scores, and (c) multi-step forecasts of the multivariate values and the variance-covariance matrices of both the factor scores and the observed time-series.
Additional details
Inputs
- Summary
The LMDFM (long-memory dynamic factor model) algorithm takes as input multiple time-series contained in a CSV (comma-separated values) data table, provided either as a CSV text string or as a CSV text file.
Each row of the data table holds the values of an individual time-series (TS); the row header is the label or symbol of that series. Each column holds the values of all time-series at a specific moment in time; the column header is the time index or time stamp of that moment. A small formatting sketch follows the input data table below.
- Input MIME type
- text/csv
Input data descriptions
The following table describes supported input data fields for real-time inference and batch transform.
| Field name | Description | Constraints | Required |
|---|---|---|---|
| Values of time stamp | Each row of the data table holds the values of an individual time-series (TS); the row header is the label or symbol of that series. Each column holds the values of all time-series at a specific moment in time; the column header is the time index or time stamp of that moment. The first data column is for the earliest time and the last column for the most recent time. The current version of the LMDFM requires equally spaced time stamps. | Type: FreeText | Yes |
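As an illustration of the required layout, the sketch below builds a CSV with one row per series (labeled row headers) and one column per equally spaced time stamp running from earliest to most recent. The file name, series labels, and date range are made up.

```python
# Illustrative only: prepare input in the described layout
# (rows = time-series with label headers, columns = equally spaced time stamps).
import numpy as np
import pandas as pd

dates = pd.date_range('2024-01-01', periods=6, freq='D')       # equally spaced time stamps
data = pd.DataFrame(
    np.random.default_rng(2).normal(size=(3, 6)),
    index=['SERIES_A', 'SERIES_B', 'SERIES_C'],                 # row headers: series labels
    columns=dates.strftime('%Y-%m-%d'),                         # column headers: time stamps
)
# Columns run from the earliest date (left) to the most recent (right).
data.to_csv('lmdfm_input.csv')        # or csv_text = data.to_csv() for a CSV text string
```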
Support
Vendor support
For questions or to request a call back, please email i4cast LLC at prod.i4cast@gmail.com.
AWS infrastructure support
AWS Support is a one-on-one, fast-response support channel that is staffed 24x7x365 with experienced and technical support engineers. The service helps customers of all sizes and technical abilities to successfully utilize the products and features provided by Amazon Web Services.