Time Series Analysis: Univariate and Multivariate Methods (Classic Version), 2nd edition
Overview
Time Series Analysis, 2nd Edition is a thorough introduction to both time-domain and frequency-domain analyses of univariate and multivariate time series methods, with coverage of the most recently developed techniques in the field. Its broad methodological coverage makes it a useful reference for researchers in the applied sciences who analyze time series data. Its plentiful examples show the operational details and purpose of a variety of univariate and multivariate time series methods. Numerous figures, tables, and real-life time series data sets illustrate the models and methods useful for analyzing, modeling, and forecasting data collected sequentially in time. It offers a balanced treatment of theory and applications.
This title is part of the Pearson Modern Classics series. Pearson Modern Classics are acclaimed titles at a value price.
Published by Pearson (May 26, 2023). Copyright © 2023
ISBN-13: 9780137981465
Table of contents
 Overview
 1.1 Introduction
 1.2 Examples and Scope of This Book
 Fundamental Concepts
 2.1 Stochastic Processes
 2.2 The Autocovariance and Autocorrelation Functions
 2.3 The Partial Autocorrelation Function
 2.4 White Noise Processes
 2.5 Estimation of the Mean, Autocovariances, and Autocorrelations
 2.5.1 Sample Mean
 2.5.2 Sample Autocovariance Function
 2.5.3 Sample Autocorrelation Function
 2.5.4 Sample Partial Autocorrelation Function
 2.6 Moving Average and Autoregressive Representations of Time Series Processes
 2.7 Linear Difference Equations
 Stationary Time Series Models
 3.1 Autoregressive Processes
 3.1.1 The First-Order Autoregressive AR(1) Process
 3.1.2 The Second-Order Autoregressive AR(2) Process
 3.1.3 The General pth-Order Autoregressive AR(p) Process
 3.2 Moving Average Processes
 3.2.1 The First-Order Moving Average MA(1) Process
 3.2.2 The Second-Order Moving Average MA(2) Process
 3.2.3 The General qth-Order Moving Average MA(q) Process
 3.3 The Dual Relationship Between AR(p) and MA(q) Processes
 3.4 Autoregressive Moving Average ARMA(p, q) Processes
 3.4.1 The General Mixed ARMA(p, q) Process
 3.4.2 The ARMA(1, 1) Process
 Nonstationary Time Series Models
 4.1 Nonstationarity in the Mean
 4.1.1 Deterministic Trend Models
 4.1.2 Stochastic Trend Models and Differencing
 4.2 Autoregressive Integrated Moving Average (ARIMA) Models
 4.2.1 The General ARIMA Model
 4.2.2 The Random Walk Model
 4.2.3 The ARIMA(0, 1, 1) or IMA(1, 1) Model
 4.3 Nonstationarity in the Variance and the Autocovariance
 4.3.1 Variance and Autocovariance of the ARIMA Models
 4.3.2 Variance Stabilizing Transformations
 Forecasting
 5.1 Introduction
 5.2 Minimum Mean Square Error Forecasts
 5.2.1 Minimum Mean Square Error Forecasts for ARMA Models
 5.2.2 Minimum Mean Square Error Forecasts for ARIMA Models
 5.3 Computation of Forecasts
 5.4 The ARIMA Forecast as a Weighted Average of Previous Observations
 5.5 Updating Forecasts
 5.6 Eventual Forecast Functions
 5.7 A Numerical Example
 Model Identification
 6.1 Steps for Model Identification
 6.2 Empirical Examples
 6.3 The Inverse Autocorrelation Function (IACF)
 6.4 Extended Sample Autocorrelation Function and Other Identification Procedures
 6.4.1 The Extended Sample Autocorrelation Function (ESACF)
 6.4.2 Other Identification Procedures
 Parameter Estimation, Diagnostic Checking, and Model Selection
 7.1 The Method of Moments
 7.2 Maximum Likelihood Method
 7.2.1 Conditional Maximum Likelihood Estimation
 7.2.2 Unconditional Maximum Likelihood Estimation and Backcasting Method
 7.2.3 Exact Likelihood Functions
 7.3 Nonlinear Estimation
 7.4 Ordinary Least Squares (OLS) Estimation in Time Series Analysis
 7.5 Diagnostic Checking
 7.6 Empirical Examples for Series W1–W7
 7.7 Model Selection Criteria
 Seasonal Time Series Models
 8.1 General Concepts
 8.2 Traditional Methods
 8.2.1 Regression Method
 8.2.2 Moving Average Method
 8.3 Seasonal ARIMA Models
 8.4 Empirical Examples
 Testing for a Unit Root
 9.1 Introduction
 9.2 Some Useful Limiting Distributions
 9.3 Testing for a Unit Root in the AR(1) Model
 9.3.1 Testing the AR(1) Model without a Constant Term
 9.3.2 Testing the AR(1) Model with a Constant Term
 9.3.3 Testing the AR(1) Model with a Linear Time Trend
 9.4 Testing for a Unit Root in a More General Model
 9.5 Testing for a Unit Root in Seasonal Time Series Models
 9.5.1 Testing the Simple Zero Mean Seasonal Model
 9.5.2 Testing the General Multiplicative Zero Mean Seasonal Model
 Intervention Analysis and Outlier Detection
 10.1 Intervention Models
 10.2 Examples of Intervention Analysis
 10.3 Time Series Outliers
 10.3.1 Additive and Innovational Outliers
 10.3.2 Estimation of the Outlier Effect When the Timing of the Outlier Is Known
 10.3.3 Detection of Outliers Using an Iterative Procedure
 10.4 Examples of Outlier Analysis
 10.5 Model Identification in the Presence of Outliers
 Fourier Analysis
 11.1 General Concepts
 11.2 Orthogonal Functions
 11.3 Fourier Representation of Finite Sequences
 11.4 Fourier Representation of Periodic Sequences
 11.5 Fourier Representation of Nonperiodic Sequences: The Discrete-Time Fourier Transform
 11.6 Fourier Representation of Continuous-Time Functions
 11.6.1 Fourier Representation of Periodic Functions
 11.6.2 Fourier Representation of Nonperiodic Functions: The Continuous-Time Fourier Transform
 11.7 The Fast Fourier Transform
 Spectral Theory of Stationary Processes
 12.1 The Spectrum
 12.1.1 The Spectrum and Its Properties
 12.1.2 The Spectral Representation of Autocovariance Functions: The Spectral Distribution Function
 12.1.3 Wold’s Decomposition of a Stationary Process
 12.1.4 The Spectral Representation of Stationary Processes
 12.2 The Spectrum of Some Common Processes
 12.2.1 The Spectrum and the Autocovariance Generating Function
 12.2.2 The Spectrum of ARMA Models
 12.2.3 The Spectrum of the Sum of Two Independent Processes
 12.2.4 The Spectrum of Seasonal Models
 12.3 The Spectrum of Linear Filters
 12.3.1 The Filter Function
 12.3.2 Effect of Moving Average
 12.3.3 Effect of Differencing
 12.4 Aliasing
 Estimation of the Spectrum
 13.1 Periodogram Analysis
 13.1.1 The Periodogram
 13.1.2 Sampling Properties of the Periodogram
 13.1.3 Tests for Hidden Periodic Components
 13.2 The Sample Spectrum
 13.3 The Smoothed Spectrum
 13.3.1 Smoothing in the Frequency Domain: The Spectral Window
 13.3.2 Smoothing in the Time Domain: The Lag Window
 13.3.3 Some Commonly Used Windows
 13.3.4 Approximate Confidence Intervals for Spectral Ordinates
 13.4 ARMA Spectral Estimation
 Transfer Function Models
 14.1 Single-Input Transfer Function Models
 14.1.1 General Concepts
 14.1.2 Some Typical Impulse Response Functions
 14.2 The Cross-Correlation Function and Transfer Function Models
 14.2.1 The Cross-Correlation Function (CCF)
 14.2.2 The Relationship between the Cross-Correlation Function and the Transfer Function
 14.3 Construction of Transfer Function Models
 14.3.1 Sample Cross-Correlation Function
 14.3.2 Identification of Transfer Function Models
 14.3.3 Estimation of Transfer Function Models
 14.3.4 Diagnostic Checking of Transfer Function Models
 14.3.5 An Empirical Example
 14.4 Forecasting Using Transfer Function Models
 14.4.1 Minimum Mean Square Error Forecasts for Stationary Input and Output Series
 14.4.2 Minimum Mean Square Error Forecasts for Nonstationary Input and Output Series
 14.4.3 An Example
 14.5 Bivariate Frequency-Domain Analysis
 14.5.1 Cross-Covariance Generating Functions and the Cross-Spectrum
 14.5.2 Interpretation of the Cross-Spectral Functions
 14.5.3 Examples
 14.5.4 Estimation of the Cross-Spectrum
 14.6 The Cross-Spectrum and Transfer Function Models
 14.6.1 Construction of Transfer Function Models through Cross-Spectrum Analysis
 14.6.2 Cross-Spectral Functions of Transfer Function Models
 14.7 Multiple-Input Transfer Function Models
 Time Series Regression and GARCH Models
 15.1 Regression with Autocorrelated Errors
 15.2 ARCH and GARCH Models
 15.3 Estimation of GARCH Models
 15.3.1 Maximum Likelihood Estimation
 15.3.2 Iterative Estimation
 15.4 Computation of Forecast Error Variance
 15.5 Illustrative Examples
 Vector Time Series Models
 16.1 Covariance and Correlation Matrix Functions
 16.2 Moving Average and Autoregressive Representations of Vector Processes
 16.3 The Vector Autoregressive Moving Average Process
 16.3.1 Covariance Matrix Function for the Vector AR(1) Model
 16.3.2 Vector AR(p) Models
 16.3.3 Vector MA(1) Models
 16.3.4 Vector MA(q) Models
 16.3.5 Vector ARMA(1, 1) Models
 16.4 Nonstationary Vector Autoregressive Moving Average Models
 16.5 Identification of Vector Time Series Models
 16.5.1 Sample Correlation Matrix Function
 16.5.2 Partial Autoregression Matrices
 16.5.3 Partial Lag Correlation Matrix Function
 16.6 Model Fitting and Forecasting
 16.7 An Empirical Example
 16.7.1 Model Identification
 16.7.2 Parameter Estimation
 16.7.3 Diagnostic Checking
 16.7.4 Forecasting
 16.7.5 Further Remarks
 16.8 Spectral Properties of Vector Processes
 Supplement 16.A Multivariate Linear Regression Models
 More on Vector Time Series
 17.1 Unit Roots and Cointegration in Vector Processes
 17.1.1 Representations of Nonstationary Cointegrated Processes
 17.1.2 Decomposition of Zt
 17.1.3 Testing and Estimating Cointegration
 17.2 Partial Process and Partial Process Correlation Matrices
 17.2.1 Covariance Matrix Generating Function
 17.2.2 Partial Covariance Matrix Generating Function
 17.2.3 Partial Process Sample Correlation Matrix Functions
 17.2.4 An Empirical Example: The U.S. Hog Data
 17.3 Equivalent Representations of a Vector ARMA Model
 17.3.1 Finite-Order Representations of a Vector Time Series Process
 17.3.2 Some Implications
 State Space Models and the Kalman Filter
 18.1 State Space Representation
 18.2 The Relationship between State Space and ARMA Models
 18.3 State Space Model Fitting and Canonical Correlation Analysis
 18.4 Empirical Examples
 18.5 The Kalman Filter and Its Applications
 Supplement 18.A Canonical Correlations
 Long Memory and Nonlinear Processes
 19.1 Long Memory Processes and Fractional Differencing
 19.1.1 Fractionally Integrated ARMA Models and Their ACF
 19.1.2 Practical Implications of the ARFIMA Processes
 19.1.3 Estimation of the Fractional Difference
 19.2 Nonlinear Processes
 19.2.1 Cumulants, Polyspectrum, and Tests for Linearity and Normality
 19.2.2 Some Nonlinear Time Series Models
 19.3 Threshold Autoregressive Models
 19.3.1 Tests for TAR Models
 19.3.2 Modeling TAR Models
 Aggregation and Systematic Sampling in Time Series
 20.1 Temporal Aggregation of the ARIMA Process
 20.1.1 The Relationship of Autocovariances between the Nonaggregate and Aggregate Series
 20.1.2 Temporal Aggregation of the IMA(d, q) Process
 20.1.3 Temporal Aggregation of the AR(p) Process
 20.1.4 Temporal Aggregation of the ARIMA(p, d, q) Process
 20.1.5 The Limiting Behavior of Time Series Aggregates
 20.2 The Effects of Aggregation on Forecasting and Parameter Estimation
 20.2.1 Hilbert Space
 20.2.2 The Application of Hilbert Space in Forecasting
 20.2.3 The Effect of Temporal Aggregation on Forecasting
 20.2.4 Information Loss Due to Aggregation in Parameter Estimation
 20.3 Systematic Sampling of the ARIMA Process
 20.4 The Effects of Systematic Sampling and Temporal Aggregation on Causality
 20.4.1 Decomposition of Linear Relationship between Two Time Series
 20.4.2 An Illustrative Underlying Model
 20.4.3 The Effects of Systematic Sampling and Temporal Aggregation on Causality
 20.5 The Effects of Aggregation on Testing for Linearity and Normality
 20.5.1 Testing for Linearity and Normality
 20.5.2 The Effects of Temporal Aggregation on Testing for Linearity and Normality
 20.6 The Effects of Aggregation on Testing for a Unit Root
 20.6.1 The Model of Aggregate Series
 20.6.2 The Effects of Aggregation on the Distribution of the Test Statistics
 20.6.3 The Effects of Aggregation on the Significance Level and the Power of the Test
 20.6.4 Examples
 20.6.5 General Cases and Concluding Remarks
 20.7 Further Comments
References
Appendix
 Time Series Data Used for Illustrations
 Statistical Tables
Author Index
Subject Index