Understanding Univariate Time Series Analysis

Time Series Analysis: Univariate Overview

Univariate Time Series: A univariate time series is a sequence of measurements of the same variable collected over time, often at regular intervals.

Unlike standard linear regression, the observations in a time series are neither necessarily independent nor necessarily identically distributed. One defining characteristic of a time series is that it is a list of observations in which the ordering matters. This ordering is crucial because observations depend on one another, and changing the order could alter the meaning of the data.

Basic Objectives of the Analysis

The primary goals of time series analysis are:

  1. Describing Patterns: To describe the important features of the time series pattern.
  2. Explaining Past and Future: To explain how the past affects the future or how two time series can interact.
  3. Forecasting: To predict future values of the series.
  4. Control Standard: To serve as a control standard for a variable that measures product quality in manufacturing.

Important Characteristics to Consider

When first analyzing a time series, consider the following questions:

  1. Trend: Is there a trend, meaning that, on average, the measurements tend to increase or decrease over time?
  2. Seasonality: Is there a regularly repeating pattern of highs and lows related to calendar time, such as seasons, quarters, months, or days of the week?
  3. Outliers: Are there outliers, that is, observations that lie far away from the rest of the data points?
  4. Long-Run Cycles: Is there a long-run cycle or period unrelated to seasonality factors?
  5. Variance: Is there constant variance over time, or is the variance non-constant?
  6. Abrupt Changes: Are there any abrupt changes in the level of the series or the variance?

Understanding these aspects helps in building a robust model that can accurately describe and predict the behavior of the time series data.
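To make the trend and variance questions concrete, here is a minimal sketch (the helper name `rolling_stats` and the simulated series are illustrative assumptions, not part of the original text) that computes rolling statistics of a series with an upward trend:

```python
import numpy as np

def rolling_stats(y, window):
    """Rolling mean and variance over sliding windows of a series."""
    steps = len(y) - window + 1
    means = np.array([y[i:i + window].mean() for i in range(steps)])
    variances = np.array([y[i:i + window].var() for i in range(steps)])
    return means, variances

# Simulated series with a linear upward trend plus noise (illustrative data).
rng = np.random.default_rng(0)
t = np.arange(300)
y = 0.05 * t + rng.standard_normal(300)

means, variances = rolling_stats(y, window=50)
# A steadily rising rolling mean suggests a trend; a rolling variance
# that drifts over time suggests non-constant variance.
```

A rolling mean that climbs from window to window is a quick hint of trend, while a rolling variance that changes markedly over time points to non-constant variance.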

As an example, consider a plot of the Apple stock price over time.

Autoregressive (AR) model

An AR(1) model is the simplest form, where only the immediately preceding value is used to predict the current value:

y_t = c + \phi_1 y_{t-1} + \epsilon_t

If \phi_1 is close to 1, the series will show high persistence, meaning past values have a strong influence on future values. If \phi_1 is close to 0, the series is less dependent on past values.
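As a rough illustration of persistence, the sketch below (function names, seeds, and parameter values are my own assumptions) simulates two AR(1) series, one with \phi_1 = 0.9 and one with \phi_1 = 0.1, and compares their lag-1 sample autocorrelations:

```python
import numpy as np

def simulate_ar1(phi, c=0.0, n=500, seed=1):
    """Simulate y_t = c + phi * y_{t-1} + eps_t with standard normal noise."""
    rng = np.random.default_rng(seed)
    eps = rng.standard_normal(n)
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = c + phi * y[t - 1] + eps[t]
    return y

def lag1_autocorr(y):
    """Sample correlation between consecutive observations."""
    return np.corrcoef(y[:-1], y[1:])[0, 1]

persistent = simulate_ar1(phi=0.9)   # high persistence
weak = simulate_ar1(phi=0.1)         # little dependence on the past
```

With a few hundred observations, the lag-1 autocorrelation of the \phi_1 = 0.9 series typically sits near 0.9, while that of the \phi_1 = 0.1 series stays close to 0.1, matching the intuition above.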

The general form of an Autoregressive model of order p, denoted AR(p), is:

y_t = c + \phi_1 y_{t-1} + \phi_2 y_{t-2} + \cdots + \phi_p y_{t-p} + \epsilon_t

where:

  • y_t is the value at time t.
  • c is a constant.
  • \phi_1, \phi_2, \ldots, \phi_p are the autoregressive coefficients.
  • y_{t-1}, y_{t-2}, \ldots, y_{t-p} are the past values of the series.
  • \epsilon_t is the error term at time t (also known as white noise).

Key Concepts:

  1. Order of the Model (p): This indicates the number of past values (lags) used to predict the current value. For example, an AR(1) model uses only the previous value, while an AR(2) model uses the last two values.
  2. Coefficients (\phi_i ): These coefficients measure the influence of the past values on the current value. They can be estimated using methods like least squares estimation.
  3. Stationarity: For the AR model to be valid, the time series data should be stationary, meaning its statistical properties like mean, variance, and autocorrelation are constant over time.
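Tying these concepts together, the following sketch estimates the constant and the autoregressive coefficients by least squares, as mentioned above; the function `fit_ar` and the simulated AR(2) example are illustrative assumptions rather than a prescribed implementation:

```python
import numpy as np

def fit_ar(y, p):
    """Estimate (c, phi_1, ..., phi_p) of an AR(p) model by least squares."""
    n = len(y)
    # Design matrix: a constant column plus the p lagged columns y_{t-1}..y_{t-p}.
    X = np.column_stack(
        [np.ones(n - p)] + [y[p - k : n - k] for k in range(1, p + 1)]
    )
    beta, *_ = np.linalg.lstsq(X, y[p:], rcond=None)
    return beta

# Simulate a stationary AR(2): y_t = 0.2 + 0.5 y_{t-1} + 0.3 y_{t-2} + eps_t.
rng = np.random.default_rng(7)
n = 2000
eps = rng.standard_normal(n)
y = np.zeros(n)
for t in range(2, n):
    y[t] = 0.2 + 0.5 * y[t - 1] + 0.3 * y[t - 2] + eps[t]

coeffs = fit_ar(y, p=2)  # roughly [0.2, 0.5, 0.3]
```

Because the simulated series is stationary, the least-squares estimates land close to the true coefficients; with a nonstationary series the same regression would not be a valid basis for inference.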
