Springer Series in Statistics

Advisors: P. Bickel, P. Diggle, S. Fienberg, U. Gather, I. Olkin, S. Zeger


Rob J. Hyndman, Anne B. Koehler, J. Keith Ord and Ralph D. Snyder

Forecasting with Exponential Smoothing

The State Space Approach


Professor Rob Hyndman
Department of Econometrics & Business Statistics
Monash University
Clayton VIC 3800, Australia
Rob.Hyndman@buseco.monash.edu.au

Professor Anne Koehler
Department of Decision Sciences & Management Information Systems
Miami University
Oxford, Ohio 45056, USA
koehleab@muohio.edu

Professor Keith Ord
McDonough School of Business
Georgetown University
Washington DC 20057, USA
ordk@georgetown.edu

Associate Professor Ralph Snyder
Department of Econometrics & Business Statistics
Monash University
Clayton VIC 3800, Australia
Ralph.Snyder@buseco.monash.edu.au

 

 

ISBN 978-3-540-71916-8 e-ISBN 978-3-540-71918-2

 

Library of Congress Control Number: 2008924784

 

© 2008 Springer-Verlag Berlin Heidelberg

 

This work is subject to copyright. All rights are reserved, whether the whole or part of the material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation, broadcasting, reproduction on microfilm or in any other way, and storage in data banks. Duplication of this publication or parts thereof is permitted only under the provisions of the German Copyright Law of September 9, 1965, in its current version, and permissions for use must always be obtained from Springer-Verlag. Violations are liable for prosecution under the German Copyright Law.

 

The use of general descriptive names, registered names, trademarks, etc. in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant protective laws and regulations and therefore free for general use.

 

Cover design: Deblik, Berlin, Germany

 

Printed on acid-free paper

 

9 8 7 6 5 4 3 2 1

 

springer.com


 

Preface

 

Exponential smoothing methods have been around since the 1950s, and are still the most popular forecasting methods used in business and industry. Initially, a big attraction was the limited requirements for computer storage. More importantly today, the equations in exponential smoothing methods for estimating the parameters and generating the forecasts are very intuitive and easy to understand. As a result, these methods have been widely implemented in business applications.

 

However, a shortcoming of exponential smoothing has been the lack of a statistical framework that produces both prediction intervals and point forecasts. The innovations state space approach provides this framework while retaining the intuitive nature of exponential smoothing in its measurement and state equations. It provides prediction intervals, maximum likelihood estimation, procedures for model selection, and much more.

 

As a result of this framework, the area of exponential smoothing has undergone a substantial revolution in the past ten years. The new innovations state space framework for exponential smoothing has been discussed in numerous journal articles, but until now there has been no systematic explanation and development of the ideas. Furthermore, the notation used in the journal articles tends to change from paper to paper. Consequently, researchers and practitioners struggle to use the new models in applications. In writing this book, we have attempted to compile all of the material related to innovations state space models and exponential smoothing into one coherent presentation. In the process, we have also extended results, filled in gaps and developed totally new material. Our goal has been to provide a comprehensive exposition of the innovations state space framework for forecasting time series with exponential smoothing.



 

Outline of the Book

 

We have written this book for people wanting to apply exponential smoothing methods in their own area of interest, as well as for researchers wanting to take the ideas in new directions. In attempting to cater for this broad audience, the book has been structured into four parts, providing increasing levels of detail and complexity.

 

Part I: Introduction (Chaps. 1 and 2)

 

If you only want a snack, then read Part I. It provides an overview of our approach to forecasting and an introduction to the state space models that underlie exponential smoothing. You will then be able to appreciate how to implement exponential smoothing in the statistical framework of innovations state space models.

 

Chapter 1 includes some general information on forecasting time series and provides an historical context. In Chap. 2, we establish the linkage between standard exponential smoothing methods and the innovations state space models. Then, we describe all parts of the forecasting process using innovations state space models in the following order: initialization, estimation, forecasting, evaluation of forecasts, model selection, and an automatic procedure for the entire process which includes finding prediction intervals.
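By way of illustration only (this sketch is not taken from the book), the entire sequence can be run in a few lines with the forecast package for R described later in this preface; the AirPassengers series, the 24-month holdout, and the default settings are illustrative choices, not the book's own examples.

library(forecast)

train <- window(AirPassengers, end = c(1958, 12))   # estimation sample
test  <- window(AirPassengers, start = c(1959, 1))  # holdout for evaluating forecasts

fit <- ets(train)             # automatic model selection and maximum likelihood estimation
fc  <- forecast(fit, h = 24)  # point forecasts with prediction intervals

summary(fit)                  # selected model, smoothing parameters, information criteria
accuracy(fc, test)            # forecast accuracy on the holdout sample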

 

Part II: Essentials (Chaps. 3–7)

 

Readers wanting a more substantial meal should go on to read Chaps. 3–7. They fill out many of the details and provide links to the most important papers in the literature. Anyone finishing the first seven chapters will be ready to begin using the models for themselves in applied work.

 

We examine linear models more closely in Chap. 3, before adding the complexity of nonlinear and heteroscedastic models in Chap. 4. These two chapters also introduce the concepts of stationarity, stability, and forecastability. Because the linear models are a subset of the general innovations state space model, the material on estimation (Chap. 5), prediction (Chap. 6), and model selection (Chap. 7) relates to the general model, with considerations of linear models and other special subgroups where informative.

 

Part III: Further Topics (Chaps. 8–17)

 

If you want the full banquet, then you should go on to read the rest of the book. Chapters 8–17 provide more advanced considerations of the details of the models, their mathematical properties, and extensions of the models. These chapters are intended for people wanting to understand the modeling framework in some depth, including other researchers in the field.



 

We consider the normalization of seasonal components in Chap. 8, and the addition of regressors to the model in Chap. 9. In Chap. 10, we address the important issue of parameter space specification, along with the concept of the minimal dimension of a model. The relationship with other standard time series models is investigated. In particular, Chap. 11 looks at ARIMA models, and Chap. 13 examines conventional state space models, which have multiple sources of randomness. An information filter for estimating the parameters in a state space model with a random seed vector is detailed in Chap. 12. The advantages of the information filter over the Kalman filter, which was originally developed for stationary data, are explained. The remaining four chapters address special issues and models for specific types of time series as follows: time series with multiple seasonal patterns in Chap. 14, time series with strictly positive values in Chap. 15, count data in Chap. 16, and vectors of time series in Chap. 17.

 

Part IV: Applications (Chaps. 18–20)

 

The final part of the book provides the after-dinner cocktails and contains applications to inventory control, economics and finance.

These applications are intended to illustrate the potentially wide reach and usefulness of the innovations state space models. Procedures for addressing the important inventory problems of nonstationary demand and the use of sales data when true demand is unknown are covered in Chap. 18 for a reorder inventory system. In Chap. 19, the natural implementation of conditional heteroscedasticity in the innovations state space models framework (i.e., a GARCH-type model) is shown and applied to examples of financial time series. In Chap. 20, the Beveridge-Nelson decomposition of a univariate time series into transitory and permanent components is presented in the linear innovations state space framework. The advantages of this formulation over other approaches to the Beveridge-Nelson decomposition are explained.

 

 

Website

 

The website http://www.exponentialsmoothing.net provides supplementary material for this book, including data sets, computer code, additional exercises, and links to other resources.

 

 

Forecasting Software


 

Time series forecasting is not a spectator sport, and any serious forecaster needs access to adequate computing power. Most of the analyses presented in this book can readily be performed using the forecast package for R (Hyndman 2007), which is available on CRAN (http://cran.r-project.org/). All of the data in the book are available in the expsmooth package for R. In addition, we provide R code at http://www.exponentialsmoothing.net for producing most of the examples in the book.
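For readers setting up the software, the following sketch shows one possible way to install the packages from CRAN and fit an innovations state space model; the data set name ukcars is an assumption about the contents of the expsmooth package, and any univariate time series can be substituted.

# Install the packages named above from CRAN (run once).
install.packages(c("forecast", "expsmooth"))

library(forecast)
library(expsmooth)   # data sets accompanying the book

# "ukcars" is assumed to be one of the expsmooth series; replace with any ts object.
fit <- ets(ukcars)            # fit an innovations state space (ETS) model
fc  <- forecast(fit, h = 8)   # forecasts with 80% and 95% prediction intervals
plot(fc)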

 

 

Acknowledgements

 

No writing project of this size is undertaken without assistance from many people. We gratefully acknowledge the contributions of several colleagues who co-authored individual chapters in the book. Their collective expertise has greatly added to the depth and breadth of coverage of the book. They are:

 

Muhammad Akram (Chap. 15);
Heather Anderson (Chap. 20);
Ashton de Silva (Chap. 17);
Phillip Gould (Chap. 14);
Chin Nam Low (Chap. 20);
Farshid Vahid-Araghi (Chap. 14).

 

We are particularly grateful to Cathy Morgan for her careful copyediting work throughout. We also received valuable suggestions from Andrey Kostenko and programming assistance from Adrian Beaumont. Their attention to detail has been greatly appreciated.

 

Each of us owes gratitude to our universities for providing excellent environments in which to work, and the research facilities necessary to write this book. Monash University was especially helpful in providing the opportunity for all four authors to spend some time together at crucial points in the process.

 

We would also like to thank Lilith Braun from Springer for keeping us on-track and for making the finished product possible.

Finally, we are each thankful to our families and friends for their support, even when we were neglectful and distracted. We do appreciate it.

 

Melbourne, Australia, Rob J. Hyndman
Oxford, Ohio, USA, Anne B. Koehler
Washington DC, USA, J. Keith Ord
Melbourne, Australia, Ralph D. Snyder
February 2008  

 

Contents

 

 

Part I Introduction

1 Basic Concepts
  1.1 Time Series Patterns
  1.2 Forecasting Methods and Models
  1.3 History of Exponential Smoothing
  1.4 State Space Models

2 Getting Started
  2.1 Time Series Decomposition
  2.2 Classification of Exponential Smoothing Methods
  2.3 Point Forecasts for the Best-Known Methods
  2.4 Point Forecasts for All Methods
  2.5 State Space Models
  2.6 Initialization and Estimation
  2.7 Assessing Forecast Accuracy
  2.8 Model Selection
  2.9 Exercises

Part II Essentials

3 Linear Innovations State Space Models
  3.1 The General Linear Innovations State Space Model
  3.2 Innovations and One-Step-Ahead Forecasts
  3.3 Model Properties
  3.4 Basic Special Cases
  3.5 Variations on the Common Models
  3.6 Exercises



 

4 Nonlinear and Heteroscedastic Innovations State Space Models
  4.1 Innovations Form of the General State Space Model
  4.2 Basic Special Cases
  4.3 Nonlinear Seasonal Models
  4.4 Variations on the Common Models
  4.5 Exercises

 

5 Estimation of Innovations State Space Models
  5.1 Maximum Likelihood Estimation
  5.2 A Heuristic Approach to Estimation
  5.3 Exercises

6 Prediction Distributions and Intervals
  6.1 Simulated Prediction Distributions and Intervals
  6.2 Class 1: Linear Homoscedastic State Space Models
  6.3 Class 2: Linear Heteroscedastic State Space Models
  6.4 Class 3: Some Nonlinear Seasonal State Space Models
  6.5 Prediction Intervals
  6.6 Lead-Time Demand Forecasts for Linear Homoscedastic Models
  6.7 Exercises
  Appendix: Derivations

7 Selection of Models
  7.1 Information Criteria for Model Selection
  7.2 Choosing a Model Selection Procedure
  7.3 Implications for Model Selection Procedures
  7.4 Exercises
  Appendix: Model Selection Algorithms

Part III Further Topics

8 Normalizing Seasonal Components
  8.1 Normalizing Additive Seasonal Components
  8.2 Normalizing Multiplicative Seasonal Components
  8.3 Application: Canadian Gas Production
  8.4 Exercises
  Appendix: Derivations for Additive Seasonality

9 Models with Regressor Variables
  9.1 The Linear Innovations Model with Regressors
  9.2 Some Examples
  9.3 Diagnostics for Regression Models
  9.4 Exercises



 

10 Some Properties of Linear Models
  10.1 Minimal Dimensionality for Linear Models
  10.2 Stability and the Parameter Space
  10.3 Conclusions
  10.4 Exercises

 

11 Reduced Forms and Relationships with ARIMA Models
  11.1 ARIMA Models
  11.2 Reduced Forms for Two Simple Cases
  11.3 Reduced Form for the General Linear Innovations Model
  11.4 Stationarity and Invertibility
  11.5 ARIMA Models in Innovations State Space Form
  11.6 Cyclical Models
  11.7 Exercises

 

