
Chapter 13. Introduction to nonstationary time series




1. Stationary and nonstationary processes

A time series X_t is called stationary if (a) its mean E(X_t) and variance var(X_t) do not depend on t, and (b) the covariances cov(X_t, X_{t+s}) depend only on the lag s, not on t.

Example. The process X_t = β2 X_{t-1} + ε_t, where the innovations ε_t have mean zero, variance σ_ε² and are uncorrelated, and |β2| < 1 (stability condition), is stationary. Prove that

E(X_t) = 0, var(X_t) = σ_ε²/(1 − β2²), cov(X_t, X_{t+s}) = β2^s σ_ε²/(1 − β2²).

Example. The random walk X_t = X_{t-1} + ε_t, where ε_t is the same as above, is not stationary. A more general example is the random walk with drift X_t = β1 + X_{t-1} + ε_t.

Example. A process with a time trend

(1) X_t = β1 + β2 t + ε_t

is not stationary.
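The contrast between a stable AR(1) process and a random walk can be checked by simulation. A minimal sketch in pure Python (the series lengths, the choice β2 = 0.5, and the helper `simulate` are illustrative, not from the text): for the stable process the cross-section variance settles near σ_ε²/(1 − β2²), while for the random walk it keeps growing with t.

```python
import random
import statistics

random.seed(0)

def simulate(beta2, n_series=500, n_steps=200):
    """Simulate many paths of X_t = beta2*X_{t-1} + eps_t (X_0 = 0) and
    return the cross-section variance of X at the final step."""
    finals = []
    for _ in range(n_series):
        x = 0.0
        for _ in range(n_steps):
            x = beta2 * x + random.gauss(0.0, 1.0)
        finals.append(x)
    return statistics.pvariance(finals)

# Stable AR(1): variance settles near 1/(1 - 0.5**2) ~ 1.33
var_ar = simulate(0.5)
# Random walk (beta2 = 1): variance grows roughly like n_steps
var_rw = simulate(1.0)
```

After 200 steps the random-walk variance is on the order of 200, two orders of magnitude above the stationary one, which is the whole point of the stability condition.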

Consequences of nonstationarity

The consistency of OLS estimators in the classical model can be proved if the sample variances and covariances of the explanatory variables tend to their population counterparts in the limit. If the time series are nonstationary, those population counterparts may be infinite. The least squares estimators will then be inconsistent, and the diagnostic statistics will not have their standard limiting distributions. As a consequence, the regression coefficient of an explanatory variable may appear significantly different from 0 even when the variable is not in fact a determinant of the dependent variable.

2. Difference-stationary and trend-stationary processes.

If a nonstationary process can be transformed into a stationary process by differencing once, it is said to be difference-stationary.

Differencing is one way to detrend the series.

A nonstationary time series is said to be trend-stationary if it can be transformed into a stationary process by extracting a time trend.

Example. (1) can be detrended by fitting the equation X̂_t = b1 + b2 t and defining a new variable X̃_t = X_t − X̂_t, which is just the residuals from the regression of X_t on t.
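Detrending by regression can be sketched in a few lines of pure Python (the trend coefficients 2 and 0.5 and the series length are illustrative choices): fit X on t by OLS and keep the residuals.

```python
import random

random.seed(1)

# Illustrative trend-stationary series: X_t = 2 + 0.5*t + noise
n = 100
t = list(range(1, n + 1))
x = [2.0 + 0.5 * ti + random.gauss(0.0, 1.0) for ti in t]

# OLS of x on t: b2 = cov(t, x) / var(t), b1 = mean(x) - b2 * mean(t)
mt = sum(t) / n
mx = sum(x) / n
b2 = sum((ti - mt) * (xi - mx) for ti, xi in zip(t, x)) / \
     sum((ti - mt) ** 2 for ti in t)
b1 = mx - b2 * mt

# The detrended series is just the residuals from this regression
detrended = [xi - (b1 + b2 * ti) for ti, xi in zip(t, x)]
```

The estimated slope recovers the true trend coefficient closely, and the residuals sum to zero by construction (a property of OLS with an intercept).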

Spurious regressions

Example. If two variables contain a time trend, they will have a high correlation coefficient and R² will be high. The slope coefficient will be significant despite the fact that neither variable is a determinant of the other.

Example. Granger and Newbold regressed one simulated random walk on another and found the slope significant in far too many of the simulations. A spurious regression is a false regression with significant coefficients when in fact there is no relationship between the variables.

3. Graphical techniques for detecting nonstationarity

ARMA(p,q) processes

An autoregressive moving average process ARMA(p, q) is defined by

X_t = β1 X_{t-1} + … + βp X_{t-p} + ε_t + α1 ε_{t-1} + … + αq ε_{t-q}.

If there are no lags of innovations, we obtain AR(p). If there are no lags of the dependent variable, we obtain MA(q). If the d-th difference of X_t is ARMA(p, q), we say that X_t is ARIMA(p, d, q).

The autocorrelation function is defined by

ρ_s = cov(X_t, X_{t+s}) / var(X_t).

Its graphical representation is called a correlogram. It can be shown that MA(q) has nonzero autocorrelations at only the first q lags and zero autocorrelations thereafter. Stable AR(p) processes have all autocorrelations different from zero. In both cases the autocorrelations die out quickly; for stable AR processes they decline geometrically. For a random walk they decline very slowly (approximately linearly). Illustrate in EViews.

Partial autocorrelations are more complicated; they measure the correlation of the current and lagged series after taking into account the predictive power of all the values of the series with smaller lags.

A series X_t is ARIMA(p, d, q) if its d-th difference is ARMA(p, q). Identification of ARIMA(p, d, q) consists of two stages:

1) If the correlogram exhibits slowly declining coefficients, the series is differenced until it exhibits a stationary pattern; the number of differences taken is d.

2) Inspect the correlogram of the differenced series and its partial correlogram to determine orders p and q.
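The correlogram behaviour described above can be checked numerically. A minimal sketch in pure Python (the MA(1) coefficient 0.8 and the series length are illustrative choices): the MA(1) correlogram cuts off after lag 1, while the random-walk correlogram stays near 1 at all short lags.

```python
import random

random.seed(3)

def acf(x, max_lag):
    """Sample autocorrelations rho_1 .. rho_max_lag of the series x."""
    n = len(x)
    m = sum(x) / n
    c0 = sum((xi - m) ** 2 for xi in x) / n
    return [sum((x[t] - m) * (x[t + s] - m) for t in range(n - s)) / n / c0
            for s in range(1, max_lag + 1)]

n = 2000
eps = [random.gauss(0.0, 1.0) for _ in range(n + 1)]

# MA(1): X_t = eps_t + 0.8 * eps_{t-1}; true rho_1 = 0.8/1.64, rho_s = 0 for s > 1
ma1 = [eps[t + 1] + 0.8 * eps[t] for t in range(n)]

# Random walk built from the same innovations
rw, x = [], 0.0
for t in range(n):
    x += eps[t]
    rw.append(x)

acf_ma = acf(ma1, 5)   # large at lag 1, near zero afterwards
acf_rw = acf(rw, 5)    # near 1 at every short lag, declining very slowly
```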

4. Tests of nonstationarity

State the classification of different cases from Section 13.4

Dickey-Fuller test. Describe three Dickey-Fuller tests and the augmented Dickey-Fuller test.
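As a numerical illustration of the basic (non-augmented) test with a constant, one regresses ΔX_t on X_{t-1} and looks at the t-statistic on the lagged level, comparing it not with the usual t tables but with Dickey-Fuller critical values (roughly −2.9 at the 5% level for this specification). The sketch below is pure Python with illustrative series; it is not a full implementation with proper critical-value tables.

```python
import random

random.seed(4)

def df_t_stat(x):
    """t-statistic on gamma in the Dickey-Fuller regression
    dX_t = c + gamma * X_{t-1} + e_t (no augmentation lags)."""
    y = [x[t] - x[t - 1] for t in range(1, len(x))]  # dX_t
    z = x[:-1]                                       # X_{t-1}
    n = len(y)
    mz = sum(z) / n
    my = sum(y) / n
    szz = sum((zi - mz) ** 2 for zi in z)
    szy = sum((zi - mz) * (yi - my) for zi, yi in zip(z, y))
    g = szy / szz
    c = my - g * mz
    rss = sum((yi - c - g * zi) ** 2 for zi, yi in zip(z, y))
    se = (rss / (n - 2) / szz) ** 0.5
    return g / se

rw = [0.0]   # random walk: gamma = 0, t-stat typically well above -2.9
ar = [0.0]   # stationary AR(1) with beta2 = 0.5: gamma = -0.5, t-stat far below -2.9
for _ in range(300):
    rw.append(rw[-1] + random.gauss(0.0, 1.0))
    ar.append(0.5 * ar[-1] + random.gauss(0.0, 1.0))

t_rw = df_t_stat(rw)
t_ar = df_t_stat(ar)
```

In practice one uses a packaged implementation (for example `statsmodels.tsa.stattools.adfuller` in Python, or the unit root tests built into EViews), which also handles the augmentation lags and the critical values.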

5. Cointegration

Prove that a sum of two independent stationary processes is stationary.

Variables are said to be cointegrated if each of the series taken individually is I(1), while some linear combination of them is stationary, or I(0). The stationary linear combination is called the cointegrating equation and may be interpreted as a long-run equilibrium relationship among the variables. To see if variables are cointegrated, just regress one of them on the others and test if the disturbance term is stationary.
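This two-step procedure (the Engle-Granger approach) can be sketched in pure Python. All the numbers here are illustrative: x is a simulated random walk and y = 1 + 2x + u with a stationary disturbance, so the pair is cointegrated by construction.

```python
import random

random.seed(5)

# Illustrative cointegrated pair: x_t is a random walk and
# y_t = 1 + 2*x_t + u_t, with u_t a stationary AR(1) disturbance.
n = 500
xs, ys = [], []
x, u = 0.0, 0.0
for _ in range(n):
    x += random.gauss(0.0, 1.0)
    u = 0.3 * u + random.gauss(0.0, 1.0)
    xs.append(x)
    ys.append(1.0 + 2.0 * x + u)

# Step 1: OLS of y on x estimates the cointegrating equation.
mx = sum(xs) / n
my = sum(ys) / n
b2 = sum((a - mx) * (b - my) for a, b in zip(xs, ys)) / \
     sum((a - mx) ** 2 for a in xs)
b1 = my - b2 * mx

# Step 2 would be a Dickey-Fuller test on these residuals (with the
# appropriate Engle-Granger critical values); if they look stationary,
# the variables are judged cointegrated.
resid = [b - b1 - b2 * a for a, b in zip(xs, ys)]
resid_var = sum(r * r for r in resid) / n
```

Note that when the variables really are cointegrated, the OLS estimate of the cointegrating coefficient is superconsistent: it converges to the true value unusually fast, which is visible in how close b2 lands to 2 here.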

6. Fitting models with nonstationary time series

a) For models where the variables possess deterministic trends, the fitting of spurious relationships can be avoided by detrending the variables before use.

However, if the variables are difference-stationary rather than trend-stationary, the detrending procedure is likely to give rise to misleading results. In particular, if a random walk is regressed on a time trend, as in X_t = β1 + β2 t + ε_t, the null hypothesis that β2 is zero will be rejected more often than it should be.

Further, if a series is difference-stationary, the procedure does not make it stationary. For example, in the case of a random walk, extracting a non-existent trend in the mean of the series can do nothing to alter the trend in its variance.

b) If the disturbance term in a model Y_t = β1 + β2 X_t + u_t is subject to severe positive AR(1) autocorrelation (u_t = ρ u_{t-1} + ε_t with ρ close to 1), the suggestion is to run the regression in differences rather than levels:

ΔY_t = β2 ΔX_t + Δu_t, where Δu_t = (ρ − 1) u_{t-1} + ε_t.

If ρ is close to 1, then (ρ − 1) is close to 0 and the differenced disturbance is close to the white-noise innovation ε_t. If Y_t and X_t are both unrelated I(1) processes, they are stationary in the differenced model, and the absence of any relationship will be revealed by hypothesis testing.

A major shortcoming of differencing is that it precludes the investigation of a long-run relationship (cointegration).

c) The idea of error-correction models is to employ cointegration information in the regression model, to combine short-run and long-run dynamics.

Example. Suppose that the relationship between two I(1) variables X_t and Y_t is characterized by the ADL(1,1) model

(2) Y_t = β1 + β2 Y_{t-1} + β3 X_t + β4 X_{t-1} + ε_t.

Use it to find the long-run and cointegrating relationships. Manipulate (2) and the cointegrating relationship to obtain

(3) ΔY_t = β3 ΔX_t − (1 − β2)(Y_{t-1} − β1/(1 − β2) − ((β3 + β4)/(1 − β2)) X_{t-1}) + ε_t.

This equation states that the change in Y in any period is governed by the change in X and by the discrepancy between Y_{t-1} and the value predicted by the cointegrating relationship. The latter term is called the error-correction mechanism. The point of this rearrangement is that all of the terms in (3) are I(0).
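As a worked sketch of the manipulation (assuming for (2) the ADL(1,1) notation Y_t = β1 + β2 Y_{t-1} + β3 X_t + β4 X_{t-1} + ε_t), the rearrangement runs:

```latex
% Subtract Y_{t-1} from both sides of (2):
\Delta Y_t = \beta_1 + (\beta_2 - 1) Y_{t-1} + \beta_3 X_t + \beta_4 X_{t-1} + \varepsilon_t
% Write \beta_3 X_t = \beta_3 \Delta X_t + \beta_3 X_{t-1} and collect the X_{t-1} terms:
\Delta Y_t = \beta_3 \Delta X_t + \beta_1 + (\beta_2 - 1) Y_{t-1}
           + (\beta_3 + \beta_4) X_{t-1} + \varepsilon_t
% Factor out -(1 - \beta_2) to expose the error-correction term:
\Delta Y_t = \beta_3 \Delta X_t
           - (1 - \beta_2)\left( Y_{t-1} - \frac{\beta_1}{1 - \beta_2}
             - \frac{\beta_3 + \beta_4}{1 - \beta_2}\, X_{t-1} \right)
           + \varepsilon_t
% The long-run relationship follows by setting Y_t = Y_{t-1} = Y and X_t = X_{t-1} = X in (2):
Y = \frac{\beta_1}{1 - \beta_2} + \frac{\beta_3 + \beta_4}{1 - \beta_2}\, X
```

The expression in parentheses is exactly the deviation of Y_{t-1} from the long-run relationship, which is why it serves as the error-correction term.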

Exercises. 13.1, 13.7, 13.10, 13.13.

