
Economic Applications: The Beveridge–Nelson Decomposition


 

20.1 The Beveridge–Nelson Decomposition
20.2 State Space Form and Applications
20.3 Extensions of the Beveridge–Nelson Decomposition to Nonlinear Processes
20.4 Conclusion
20.5 Exercises

References
Author Index
Data Index
Subject Index


 

1 Basic Concepts

 

1.1 Time Series Patterns

 

Time series arise in many different contexts including minute-by-minute stock prices, hourly temperatures at a weather station, daily numbers of arrivals at a medical clinic, weekly sales of a product, monthly unemployment figures for a region, quarterly imports of a country, and annual turnover of a company. That is, time series arise whenever something is observed over time. While a time series may be observed either continuously or at discrete times, the focus of this book is on discrete time series that are observed at regular intervals over time.

 

A graph of a time series often exhibits patterns, such as an upward or downward movement (trend) or a pattern that repeats (seasonal variation), that might be used to forecast future values. Graphs of four time series that display such features are presented in Fig. 1.1.

 

• Figure 1.1a shows 125 monthly US government bond yields (percent per annum) from January 1994 to May 2004. This time series appears to have a changing level with a downward drift that one would be reluctant to forecast as continuing into the future, and it seems to have no discernible seasonal pattern.

• Figure 1.1b displays 55 observations of annual US net electricity generation (billion kwh) for 1949 through 2003. This time series contains a definite upward trend that changes somewhat over time.

• Figure 1.1c presents 113 quarterly observations of passenger motor vehicle production in the UK (thousands of cars) for the first quarter of 1977 through the first quarter of 2005. For this time series there is a constant variation around a changing level. As with Fig. 1.1a, there is no trend that one would want to forecast as continuing into the future. However, there is a possibility of a seasonal pattern.

• Figure 1.1d shows 240 monthly observations of the number of short term overseas visitors to Australia from May 1985 to April 2005. There is a



 

[Four-panel figure: (a) US 10-year bond yields, percentage per annum; (b) US net electricity generation, billion kwh; (c) UK passenger vehicle production, thousands of cars; (d) overseas visitors to Australia, thousands of people; each series plotted against year.]

Fig. 1.1. Four time series showing patterns typical of business and economic data.

 

 

definite seasonal pattern in this time series, and the variation increases as the level of the time series increases. It is not possible to tell visually whether the increase is due to an increase in the seasonal fluctuations or to some other factor. While there is an upward drift, it might not be a good idea to forecast it as continuing into the future.

 

From these few examples it is clear that there is frequently a need for forecasting that takes into account trend, seasonality, and other features of the data. Specifically, we are interested in the situation where we observe a time series y_1, ..., y_n, and we wish to forecast a future observation at time n + h. In order to exploit patterns like those in Fig. 1.1, many different forecasting methods and models have been proposed.

 

 

1.2 Forecasting Methods and Models

 

A forecasting method is an algorithm that provides a point forecast: a single value that is a prediction of the value at a future time period. On the other hand, a statistical model provides a stochastic data generating process that may be used to produce an entire probability distribution for a future



 

time period n + h. A point forecast can then be obtained easily by taking the mean (or median) of the probability distribution. A model also allows the computation of prediction (forecast) intervals with a given level of confidence.
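To make the distinction concrete, the following sketch (in Python, which is not the book's own software) derives a point forecast and an 80% prediction interval from simulated future sample paths; the random-walk simulator, the horizon, and all parameter values are illustrative assumptions standing in for whatever fitted model is at hand.

```python
import numpy as np

rng = np.random.default_rng(42)

# Stand-in for a fitted statistical model: simulate many sample paths
# of y_{n+1}, ..., y_{n+h}. A Gaussian random walk is assumed here
# purely for illustration; a real model supplies its own simulator.
def simulate_paths(last_obs, h, n_paths=10_000):
    steps = rng.normal(loc=0.0, scale=1.0, size=(n_paths, h))
    return last_obs + steps.cumsum(axis=1)

paths = simulate_paths(last_obs=100.0, h=12)

# Point forecast at horizon h: mean of the predictive distribution.
point_forecast = paths[:, -1].mean()

# 80% prediction interval: 10th and 90th percentiles of the paths.
lower, upper = np.percentile(paths[:, -1], [10, 90])
print(point_forecast, lower, upper)
```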

 

We use the notation ŷ_{n+h|n} to denote a point forecast of y_{n+h} using the information available at time n. This notation does not need to distinguish between point forecasts that arise from forecasting methods and those that are derived from statistical models, because the statistical models will lead directly to point forecasting methods.

 

 

1.3 History of Exponential Smoothing

 

Historically, exponential smoothing describes a class of forecasting methods. In fact, some of the most successful forecasting methods are based on the concept of exponential smoothing. There are a variety of methods that fall into the exponential smoothing family, each having the property that forecasts are weighted combinations of past observations, with recent observations given relatively more weight than older observations. The name “exponential smoothing” reflects the fact that the weights decrease exponentially as the observations get older.
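A minimal sketch of the simplest member of the family, simple exponential smoothing, is given below (Python; the smoothing parameter α = 0.3 and the toy data are assumptions). The recursion makes the exponentially decaying weights explicit: the forecast of the next value is α y_n + α(1−α) y_{n−1} + α(1−α)² y_{n−2} + ...

```python
import numpy as np

def simple_exp_smoothing(y, alpha=0.3):
    """One-step-ahead simple exponential smoothing forecasts.

    Each forecast is a weighted average of all past observations,
    with weights alpha * (1 - alpha)**k decaying exponentially as
    the observation ages by k periods.
    """
    level = y[0]                 # initialize the level at the first value
    forecasts = []
    for obs in y:
        forecasts.append(level)  # forecast made before seeing obs
        level = alpha * obs + (1 - alpha) * level  # update the level
    return np.array(forecasts), level  # final level = forecast of y_{n+1}

y = np.array([71.0, 70.0, 69.0, 68.0, 64.0, 65.0, 72.0, 78.0, 75.0])
fitted, next_forecast = simple_exp_smoothing(y, alpha=0.3)
print(next_forecast)
```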

 

The idea seems to have originated with Robert G. Brown in about 1944 while he was working for the US Navy as an Operations Research analyst. He used the idea in a mechanical computing device for tracking the velocity and angle used in firing at submarines (Gardner 2006). In the 1950s he extended this method from continuous to discrete time series, and included terms to handle trend and seasonality. One of his first applications was forecasting demand for spare parts in the US Navy inventory system. This latter work was presented at a meeting of the Operations Research Society of America in 1956 and formed the basis of his first book on inventory control (Brown 1959). The ideas were further developed in Brown’s second book (1963).

 

Independently, Charles Holt was also working on an exponential smoothing method for the US Office of Naval Research (ONR). Holt’s method differed from Brown’s with respect to the smoothing of the trend and seasonal components. His original work was reproduced in an ONR memorandum (Holt 1957), which has been very widely cited, but was unpublished until recently when it appeared in the International Journal of Forecasting in 2004. Holt’s work on additive and multiplicative seasonal exponential smoothing became well known through a paper by his student Peter Winters (1960) which provided empirical tests for Holt’s methods. As a result, the seasonal versions of Holt’s methods are usually called Holt-Winters’ methods (and sometimes just Winters’ methods, which is rather unfair to Holt).

 

Another of Holt’s collaborators was John Muth, who later became famous in economics for formulating the concept of rational expectations. In exponential smoothing he is known for introducing two statistical models



 

(Muth 1960) for which the optimal forecasts are equivalent to those obtained from simple exponential smoothing.

 

Muth’s models were the first in a long series of statistical models that are related to forecasting using exponential smoothing. The success of the exponential smoothing methods for forecasting, and for controlling inventory, has resulted in many researchers looking for models that produce the same point forecasts as these methods. Many of these models, including those of Muth, are state space models for which the minimum mean squared error forecasts are the forecasts from simple exponential smoothing.

 

1.4 State Space Models

 

State space models allow considerable flexibility in the specification of the parametric structure. In this book, we will use the innovations formulation of the model (e.g., Anderson and Moore 1979; Aoki 1987; Hannan and Deistler 1988). Let y_t denote the observation at time t, and let x_t be a “state vector” containing unobserved components that describe the level, trend and seasonality of the series. Then a linear innovations state space model can be written as

y_t = w′x_{t-1} + ε_t,      (1.1a)
x_t = F x_{t-1} + g ε_t,    (1.1b)

where {ε_t} is a white noise series and F, g and w are coefficients. Equation (1.1a) is known as the measurement (or observation) equation; it describes the relationship between the unobserved states x_{t-1} and the observation y_t. Equation (1.1b) is known as the transition (or state) equation; it describes the evolution of the states over time. The use of identical errors (or innovations) in these two equations makes it an “innovations” state space model. Several exponential smoothing methods are equivalent to point forecasts of special cases of model (1.1); examples are given in Sect. 2.5.
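As a concrete illustration (a sketch, not the book's code), the scalar "local level" special case of (1.1) takes w = F = 1 and g = α; simulating it shows that the one-step forecasts ŷ_{t+1|t} = x_t are exactly those of simple exponential smoothing. The parameter values below are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Local level special case of the linear innovations model (1.1):
# w = F = 1, g = alpha, so y_t = x_{t-1} + eps_t and
# x_t = x_{t-1} + alpha * eps_t. The values of alpha, the initial
# level, and the noise scale are illustrative assumptions.
alpha, n = 0.3, 200
eps = rng.normal(0.0, 1.0, size=n)

x = np.empty(n + 1)
x[0] = 10.0                            # assumed initial level x_0
y = np.empty(n)
for t in range(n):
    y[t] = x[t] + eps[t]               # measurement equation (1.1a)
    x[t + 1] = x[t] + alpha * eps[t]   # transition equation (1.1b)

# One-step-ahead point forecasts are the current state, which
# reproduces simple exponential smoothing with parameter alpha.
yhat = x[:-1]
```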

 

The philosophy of state space models fits well with the approach of exponential smoothing because the level, trend and seasonal components are stated explicitly in the models. In contrast, one cannot see these components as easily in autoregressive integrated moving average (ARIMA) models (Box et al. 1994).

 

Nonlinear state space models are also possible. One form that we use in Chap. 2 is

y_t = w(x_{t-1}) + r(x_{t-1}) ε_t,    (1.2a)
x_t = f(x_{t-1}) + g(x_{t-1}) ε_t.    (1.2b)
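For example (an assumed illustration, not taken from the text), choosing w(x) = f(x) = x, r(x) = x and g(x) = αx in (1.2) yields a multiplicative-error local level model, sketched below with assumed parameter values.

```python
import numpy as np

rng = np.random.default_rng(1)

# Multiplicative-error local level model as an instance of (1.2):
# w(x) = f(x) = x, r(x) = x, g(x) = alpha * x, giving
#   y_t = x_{t-1} * (1 + eps_t),  x_t = x_{t-1} * (1 + alpha * eps_t).
# alpha, the initial level, and the noise scale are assumptions.
alpha, n = 0.2, 200
eps = rng.normal(0.0, 0.05, size=n)

level = 100.0                               # assumed initial level x_0
y = np.empty(n)
for t in range(n):
    y[t] = level * (1.0 + eps[t])           # measurement equation (1.2a)
    level = level * (1.0 + alpha * eps[t])  # transition equation (1.2b)
```

With multiplicative errors the variation in y_t grows with the level, the kind of behavior seen in the overseas visitors series of Fig. 1.1d.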

An alternative, and more common, specification is to assume that the errors in the two equations are mutually independent. That is, gε_t in (1.1b) is replaced by z_t, where z_t consists of independent white noise series that



 

are also independent of ε_t, the error in the measurement equation. The assumption that z_t and ε_t are independent provides enough constraints to ensure that the remaining parameters are estimable (termed just identified in the econometrics literature).

 

There are an infinite number of ways in which the parameter space could be constrained to achieve estimability. The purpose of this book is to present the theory and applications of the innovations formulation, wherein all of the error sources are perfectly correlated. In some papers, these are known as single source of error (SSOE) models (e.g., Ord et al. 1997). By contrast, we refer to the more common form of the state space model as having multiple sources of error (MSOE).

 

At first it may seem that innovations state space models are more restrictive than MSOE models, but this is not the case. In fact, the reverse is true. Any linear MSOE model can be written in innovations form, and any linear innovations model can be written in MSOE form. However, the innovations models can have a larger parameter space. The innovations models have several other advantages over the models with multiple sources of error, as will be seen in Chap. 13.

 

Moreover, MSOE state space models, like ARIMA models, are linear models that require both the components and the error terms to be additive. While nonlinear versions of both MSOE and ARIMA models exist, these are much more difficult to work with. In contrast, it is relatively easy to use nonlinear innovations state space models for describing and forecasting time series data, and we will use them frequently in this book.

 

MSOE models that are similar to the types of models considered in this book include dynamic linear models (Harrison and Stevens 1976; Duncan and Horn 1972; West and Harrison 1997) and structural models (Harvey 1989).

 

Modern work on state space models began with Kalman (1960) and Kalman and Bucy (1961), following which a considerable body of literature developed in engineering (e.g., Jazwinski 1970; Anderson and Moore 1979). Early work in the statistical area included the Markovian representation developed by Akaike (1973, 1974). Hannan and Deistler (1988) provided a unifying presentation of the work by engineers and statistical time series analysts for stationary time series. In economics, Aoki and Havenner (1991) looked at multivariate state space models and suggested procedures for both stationary and nonstationary data. For a review of the books in the area, see Durbin and Koopman (2001, p. 5).


 

2 Getting Started

 

Although exponential smoothing methods have been around since the 1950s, a modeling framework incorporating stochastic models, likelihood calculations, prediction intervals, and procedures for model selection was not developed until relatively recently, with the work of Ord et al. (1997) and Hyndman et al. (2002). In these (and other) papers, a class of state space models has been developed that underlies all of the exponential smoothing methods.

 

In this chapter, we provide an introduction to the ideas underlying exponential smoothing and the associated state space models. Many of the details will be skipped over in this chapter, but will be covered in later chapters.

 

Figure 2.1 shows the four time series from Fig. 1.1, along with point forecasts and 80% prediction intervals. These were all produced using exponential smoothing state space models. In each case, the particular models and all model parameters were chosen automatically with no intervention by the user. This demonstrates one very useful feature of state space models for exponential smoothing—they are easy to use in a completely automated way. In these cases, the models were able to handle data exhibiting a range of features, including very little trend, strong trend, no seasonality, a seasonal pattern that stays constant, and a seasonal pattern with increasing variation as the level of the series increases.
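In the same spirit (though not the software used to produce Fig. 2.1), the sketch below leans on statsmodels' ETSModel, one implementation of exponential smoothing state space models, and picks among a few candidate specifications by AIC; the candidate list and the synthetic data are assumptions.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.exponential_smoothing.ets import ETSModel

# Assumed synthetic data: a drifting series standing in for a real one.
rng = np.random.default_rng(2)
y = pd.Series(100 + np.cumsum(rng.normal(0.5, 2.0, size=120)))

# A small, assumed set of candidate specifications; a full automatic
# procedure would search over error, trend and seasonal components.
candidates = [
    dict(error="add", trend=None),
    dict(error="add", trend="add"),
    dict(error="add", trend="add", damped_trend=True),
]

# Fit each candidate by maximum likelihood and keep the lowest AIC.
fits = [ETSModel(y, **spec).fit(disp=False) for spec in candidates]
best = min(fits, key=lambda fit: fit.aic)

print(best.forecast(steps=12))  # point forecasts for horizons 1 to 12
```

In recent statsmodels versions, the fitted results object also exposes get_prediction, from which interval forecasts like those in Fig. 2.1 can be derived.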

 

 

2.1 Time Series Decomposition

 

It is common in business and economics to think of a time series as a combination of various components such as the trend (T), cycle (C), seasonal (S), and irregular or error (E) components. These can be defined as follows:

 

Trend (T): The long-term direction of the series

 

Seasonal (S): A pattern that repeats with a known periodicity (e.g., 12 months per year, or 7 days per week)



 

