Chapitre 1 Part 2 IMEA 1

Upload: marthy23

Post on 06-Apr-2018


8/2/2019 Chapitre 1 Part 2 IMEA 1

Lecture 1: Fundamental concepts in Time Series Analysis (part 2)

Florian Pelgrin

University of Lausanne, Ecole des HEC / Department of Mathematics (IMEA-Nice)

Sept. 2011 - Jan. 2012

    Florian Pelgrin (HEC) Univariate time series Sept. 2011 - Jan. 2012 1 / 40


6. Stationarity, stability, and invertibility

Consider again a situation where the value of a time series at time t, $X_t$, is a linear function of a constant term, the last p values of $X_t$, and the contemporaneous and last q values of a white noise process, denoted by $\varepsilon_t$:

$$X_t = \mu + \sum_{k=1}^{p} \phi_k X_{t-k} + \sum_{j=0}^{q} \theta_j \varepsilon_{t-j}.$$

This process rewrites:

$$\underbrace{\Phi(L)X_t}_{\text{Autoregressive part}} = \mu + \underbrace{\Theta(L)\varepsilon_t}_{\text{Moving average part}}$$

with $\Phi(L) = 1 - \phi_1 L - \phi_2 L^2 - \cdots - \phi_p L^p$ and $\Theta(L) = 1 + \theta_1 L + \cdots + \theta_q L^q$.
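The ARMA(p,q) difference equation above can be sketched numerically. The helper below is illustrative (the name `simulate_arma` and its arguments are not from the slides); it assumes Gaussian white noise and discards a burn-in so the start-up transient dies out:

```python
import numpy as np

def simulate_arma(phi, theta, mu=0.0, n=500, sigma=1.0, burn=200, seed=None):
    """Simulate X_t = mu + sum_k phi_k X_{t-k} + eps_t + sum_j theta_j eps_{t-j}.

    phi and theta are the AR and MA coefficient lists (theta_0 = 1 is
    implicit); the first `burn` draws are discarded.
    """
    rng = np.random.default_rng(seed)
    p, q = len(phi), len(theta)
    eps = rng.normal(0.0, sigma, n + burn)
    x = np.zeros(n + burn)
    for t in range(max(p, q), n + burn):
        ar = sum(phi[k] * x[t - 1 - k] for k in range(p))
        ma = sum(theta[j] * eps[t - 1 - j] for j in range(q))
        x[t] = mu + ar + eps[t] + ma
    return x[burn:]

# illustrative ARMA(1,1) draw
x = simulate_arma(phi=[0.5], theta=[0.3], n=300, seed=0)
```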

Which conditions?

Stationarity conditions regard the autoregressive part of the previous (linear) time series model (ARMA(p,q) model).

Stability conditions also regard the autoregressive part of the previous (linear) time series model (ARMA(p,q) model).

Stability conditions are generally required to avoid explosive solutions of the stochastic difference equation: $\Phi(L)X_t = \mu + \Theta(L)\varepsilon_t$.

Invertibility conditions regard the moving average part.


    Implications of the stability (stationarity) conditions

If stability conditions hold, stationarity conditions are satisfied (the converse is not true) and $(X_t)$ is weakly stationary.

If the stochastic process $(X_t)$ is stable (and thus weakly stationary), then $(X_t)$ has an infinite moving average representation (MA($\infty$) representation):

$$X_t = \Phi^{-1}(L)\left(\mu + \Theta(L)\varepsilon_t\right) = \mu\,\Phi^{-1}(1) + C(L)\varepsilon_t = \frac{\mu}{1 - \sum_{k=1}^{p}\phi_k} + \sum_{i=0}^{\infty} c_i \varepsilon_{t-i}$$

where the coefficients $c_i$ satisfy $\sum_{i=0}^{\infty}|c_i| < \infty$.
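The coefficients $c_i$ of $C(L) = \Theta(L)/\Phi(L)$ can be computed recursively: matching powers of $L$ in $\Phi(L)C(L) = \Theta(L)$ gives $c_0 = 1$ and $c_i = \theta_i + \sum_k \phi_k c_{i-k}$. A minimal sketch (function name illustrative):

```python
import numpy as np

def ma_infinity_weights(phi, theta, n_weights=20):
    """c_i of C(L) = Theta(L)/Phi(L), via c_i = theta_i + sum_k phi_k c_{i-k}."""
    p, q = len(phi), len(theta)
    c = np.zeros(n_weights)
    c[0] = 1.0
    for i in range(1, n_weights):
        c[i] = theta[i - 1] if i <= q else 0.0      # theta_i, zero beyond lag q
        c[i] += sum(phi[k] * c[i - 1 - k] for k in range(min(i, p)))
    return c

# e.g. an AR(1) with phi = 0.5 has c_i = 0.5**i
c_ar1 = ma_infinity_weights(phi=[0.5], theta=[], n_weights=5)
```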


This representation shows the impact of past shocks, $\varepsilon_{t-i}$, on the current value of $X_t$:

$$\frac{\partial X_t}{\partial \varepsilon_{t-i}} = c_i.$$

This is an application of the Wold decomposition.

Omitting the constant term, $X_t$ and $\varepsilon_t$ are linked by a linear filter in which $C(L) = \Phi^{-1}(L)\Theta(L)$ is called the transfer function.

The coefficients $(c_i)$ are also referred to as the impulse response function coefficients.


If $(X_t)_{t\in\mathbb{Z}}$ is weakly stationary but not stable, then $(X_t)$ has the following representation:

$$X_t = \Phi^{-1}(L)\left(\mu + \Theta(L)\varepsilon_t\right) = \mu\,\Phi^{-1}(1) + C(L)\varepsilon_t = \frac{\mu}{1 - \sum_{k=1}^{p}\phi_k} + \sum_{i=-\infty}^{\infty} c_i \varepsilon_{t-i}$$

where $(c_i)$ is a sequence of constants with $\sum_{i=-\infty}^{\infty}|c_i| < \infty$.


    Determination of the stability and stationarity conditions

Definition
Consider the stochastic process defined by:

$$\Phi(L)X_t = \mu + \Theta(L)\varepsilon_t$$

with $\Phi(L) = 1 - \phi_1 L - \phi_2 L^2 - \cdots - \phi_p L^p$, $\Theta(L) = 1 + \theta_1 L + \cdots + \theta_q L^q$, and $(\varepsilon_t)$ a white noise process. This process is called stable if the moduli of all the roots $\lambda_i$, $i = 1, \ldots, p$, of the (reverse) characteristic equation:

$$\Phi(\lambda) = 0 \iff 1 - \phi_1\lambda - \phi_2\lambda^2 - \cdots - \phi_p\lambda^p = 0$$

are greater than one: $|\lambda_i| > 1$ for all $i = 1, \ldots, p$.
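This root condition can be checked mechanically with `numpy.roots`: build the coefficients of $\Phi(z)$ and test whether every root has modulus greater than one. A sketch (the helper name is an assumption, not from the slides):

```python
import numpy as np

def is_stable(phi):
    """True if all roots of Phi(z) = 1 - phi_1 z - ... - phi_p z^p lie outside
    the unit circle. np.roots expects coefficients from highest degree down."""
    coeffs = [-c for c in reversed(phi)] + [1.0]    # [-phi_p, ..., -phi_1, 1]
    roots = np.roots(coeffs)
    return bool(np.all(np.abs(roots) > 1.0))
```

For an AR(1), the single root is $1/\phi_1$, so the test reduces to $|\phi_1| < 1$.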


    Implications of the stability (stationarity) conditions

Invertibility is the counterpart to stationarity for the moving average part of the process.

If the stochastic process $(X_t)$ is invertible, then $(X_t)$ has an infinite autoregressive representation (AR($\infty$) representation):

$$\frac{\Phi(L)}{\Theta(L)}X_t = \frac{\mu}{\Theta(L)} + \varepsilon_t$$

or

$$X_t = \frac{\mu}{1 + \sum_{k=1}^{q}\theta_k} + \sum_{i=1}^{\infty} d_i X_{t-i} + \varepsilon_t$$

where the coefficients $d_i$ satisfy $\sum_{i=1}^{\infty}|d_i| < \infty$.
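The $d_i$ can be obtained from $\Pi(L) = \Phi(L)/\Theta(L)$: writing $\Pi(L) = 1 - \sum_i d_i L^i$ and matching powers of $L$ in $\Phi(L) = \Pi(L)\Theta(L)$ gives the recursion $\pi_i = -\phi_i - \sum_j \theta_j \pi_{i-j}$ with $d_i = -\pi_i$. A sketch under the slides' convention $\Theta(L) = 1 + \theta_1 L + \cdots$ (function name illustrative):

```python
import numpy as np

def ar_infinity_weights(phi, theta, n_weights=20):
    """d_i in X_t = sum_i d_i X_{t-i} + eps_t, from Pi(L) = Phi(L)/Theta(L)."""
    p, q = len(phi), len(theta)
    pi = np.zeros(n_weights + 1)
    pi[0] = 1.0
    for i in range(1, n_weights + 1):
        pi[i] = -(phi[i - 1] if i <= p else 0.0)            # -phi_i
        pi[i] -= sum(theta[j] * pi[i - 1 - j] for j in range(min(i, q)))
    return -pi[1:]                                          # d_i = -pi_i
```

For an invertible MA(1) with $\theta_1 = 0.3$, the weights alternate: $d_i = -(-0.3)^i$.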


The AR($\infty$) representation shows the dependence of the current value $X_t$ on the past values $X_{t-i}$.

The coefficients are referred to as the d-weights of an ARMA model.

If the stochastic process $(X_t)$ is not invertible, there may exist a representation of the following form (omitting the constant term), as long as the modulus of no (characteristic) root of $\Theta(\lambda)$ equals unity:

$$X_t = \sum_{i \neq 0} d_i X_{t-i} + \varepsilon_t$$


    Determination of the invertibility conditions

Definition
Consider the stochastic process defined by:

$$\Phi(L)X_t = \mu + \Theta(L)\varepsilon_t$$

with $\Phi(L) = 1 - \phi_1 L - \phi_2 L^2 - \cdots - \phi_p L^p$, $\Theta(L) = 1 + \theta_1 L + \cdots + \theta_q L^q$, and $(\varepsilon_t)$ a white noise process. This process is called invertible if the moduli of all the roots $\lambda_i$, $i = 1, \ldots, q$, of the (reverse) characteristic equation:

$$\Theta(\lambda) = 0 \iff 1 + \theta_1\lambda + \theta_2\lambda^2 + \cdots + \theta_q\lambda^q = 0$$

are greater than one: $|\lambda_i| > 1$ for all $i = 1, \ldots, q$.
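The same root test as for stability applies, this time to the moving average polynomial $\Theta(z) = 1 + \theta_1 z + \cdots + \theta_q z^q$. A sketch (helper name illustrative):

```python
import numpy as np

def is_invertible(theta):
    """True if all roots of Theta(z) = 1 + theta_1 z + ... + theta_q z^q lie
    outside the unit circle (np.roots takes highest-degree coefficient first)."""
    coeffs = list(reversed(theta)) + [1.0]          # [theta_q, ..., theta_1, 1]
    roots = np.roots(coeffs)
    return bool(np.all(np.abs(roots) > 1.0))
```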


    Three equivalent representations

Definition
A stable (and thus weakly stationary), invertible stochastic process, $\Phi(L)X_t = \mu + \Theta(L)\varepsilon_t$, has two other equivalent representation forms:

1. An infinite moving average representation:

$$X_t = \frac{\mu}{1 - \sum_{k=1}^{p}\phi_k} + \sum_{i=0}^{\infty} c_i \varepsilon_{t-i}$$

2. An infinite autoregressive representation:

$$X_t = \frac{\mu}{1 + \sum_{k=1}^{q}\theta_k} + \sum_{i=1}^{\infty} d_i X_{t-i} + \varepsilon_t$$


Each of these representations can shed light on the model from a different perspective.

For instance, the study of stationarity properties, the estimation of parameters, and the computation of forecasts can rely on different representation forms (see further).


    Examples

1. An autoregressive process of order 1:

$$(1 - \phi L)X_t = \varepsilon_t$$

where $\varepsilon_t \sim WN(0, \sigma_\varepsilon^2)$ for all t.

The stability and (weak) stationarity of $(X_t)$ depend on the (reverse) characteristic root of $\Phi(\lambda) = 1 - \phi\lambda$:

$$\Phi(\lambda) = 0 \iff \lambda = \frac{1}{\phi}.$$

The stability condition writes:

$$|\lambda| > 1 \iff |\phi| < 1.$$

The stationarity condition writes:

$$|\lambda| \neq 1 \iff |\phi| \neq 1.$$


Therefore,

1. If $|\phi| < 1$, then $(X_t)$ is stable and weakly stationary:

$$(1 - \phi L)^{-1} = \sum_{k=0}^{\infty} \phi^k L^k \quad\text{and}\quad X_t = \sum_{k=0}^{\infty} \phi^k \varepsilon_{t-k}$$

2. If $|\phi| > 1$, then a non-causal stationary solution $(X_t)$ exists:

$$(1 - \phi L)^{-1} = -\sum_{k=1}^{\infty} \phi^{-k} F^k \quad\text{and}\quad X_t = -\sum_{k=1}^{\infty} \phi^{-k} \varepsilon_{t+k}$$

3. If $|\phi| = 1$, then $(X_t)$ is neither stable nor stationary: $(1 - \phi L)$ cannot be inverted!
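As a numerical sanity check on case 1, the MA($\infty$) weights $\phi^k$ imply $\mathrm{Var}(X_t) = \sigma_\varepsilon^2 \sum_k \phi^{2k} = \sigma_\varepsilon^2/(1-\phi^2)$; truncating the sum reproduces the closed form (the values below are illustrative):

```python
# Stable AR(1): variance from truncated MA(infinity) weights vs closed form
phi, sigma2 = 0.6, 1.0
weights = [phi**k for k in range(200)]              # psi_k = phi**k
var_from_weights = sigma2 * sum(w * w for w in weights)
var_closed_form = sigma2 / (1 - phi**2)             # = 1.5625 here
```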


2. A moving average process of order 1:

$$X_t = \varepsilon_t - \theta\varepsilon_{t-1}$$

where $\varepsilon_t \sim WN(0, \sigma_\varepsilon^2)$ for all t.

$(X_t)$ is weakly stationary irrespective of the characteristic root of $\Theta(\lambda) = 1 - \theta\lambda$ (why?).

The invertibility of $(X_t)$ depends on the (reverse) characteristic root of $\Theta(\lambda) = 1 - \theta\lambda$:

$$\Theta(\lambda) = 0 \iff \lambda = \frac{1}{\theta}.$$

The invertibility condition writes:

$$|\lambda| > 1 \iff |\theta| < 1.$$

The autocorrelation function of a stationary MA(q) process cuts off after lag q ($\rho_X(h) = 0$ for $h > q$): an MA(q) process has no memory beyond q periods.

The autocorrelation (respectively, autocovariance) function of a stationary AR(p) process exhibits exponential decay towards zero (but does not vanish for lags greater than p).

The autocorrelation (respectively, autocovariance) function of a stationary ARMA(p,q) process exhibits exponential decay towards zero: it does not cut off but gradually dies out as h increases.

The autocorrelation function of a nonstationary process decreases very slowly even at very high lags, long after the autocorrelations from stationary processes have declined to (almost) zero.
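These cut-off versus decay patterns can be seen from the theoretical ACF implied by the MA($\infty$) weights, $\gamma_X(h) = \sigma_\varepsilon^2 \sum_i c_i c_{i+h}$. A sketch comparing an MA(1) and an AR(1) (helper name illustrative):

```python
import numpy as np

def acf_from_ma_weights(c, max_lag):
    """Theoretical rho(h) = sum_i c_i c_{i+h} / sum_i c_i^2 for X_t = sum c_i eps_{t-i}."""
    c = np.asarray(c, dtype=float)
    gamma0 = np.sum(c * c)
    rho = [1.0]
    for h in range(1, max_lag + 1):
        rho.append(np.sum(c[:-h] * c[h:]) / gamma0)
    return np.array(rho)

# MA(1), theta = 0.5: weights (1, 0.5, 0, ...) -> ACF cuts off after lag 1
rho_ma1 = acf_from_ma_weights([1.0, 0.5, 0.0, 0.0, 0.0], 3)
# AR(1), phi = 0.5: weights phi**k -> ACF decays geometrically, never cuts off
rho_ar1 = acf_from_ma_weights([0.5 ** k for k in range(200)], 3)
```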

Identification tools: Autocorrelations

    The sample autocorrelation function

Definition
Given a sample of T observations, $x_1, \ldots, x_T$, the sample autocorrelation function, denoted by $\hat\rho_X(h)$, is computed by:

$$\hat\rho_X(h) = \frac{\sum_{t=h+1}^{T}(x_t - \hat\mu)(x_{t-h} - \hat\mu)}{\sum_{t=1}^{T}(x_t - \hat\mu)^2}$$

where $\hat\mu$ is the sample mean:

$$\hat\mu = \frac{1}{T}\sum_{t=1}^{T} x_t.$$
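The sample ACF formula above translates directly into code; a minimal sketch (the alternating test series at the end is purely illustrative):

```python
import numpy as np

def sample_acf(x, max_lag):
    """rho_hat(h) = sum_{t=h+1}^T (x_t - xbar)(x_{t-h} - xbar) / sum_t (x_t - xbar)^2."""
    x = np.asarray(x, dtype=float)
    d = x - x.mean()
    denom = np.sum(d * d)
    rho = [1.0]
    for h in range(1, max_lag + 1):
        rho.append(np.sum(d[h:] * d[:-h]) / denom)
    return np.array(rho)

# perfectly alternating series of length T = 100: rho_hat(1) = -99/100
x = np.tile([1.0, -1.0], 50)
rho_hat = sample_acf(x, 2)
```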


Proposition
If $(X_t)_{t\in\mathbb{Z}}$ is a second order stationary process with:

$$X_t = m + \sum_{j\in\mathbb{Z}} a_j \varepsilon_{t-j}$$

where the $\varepsilon_t$'s are i.i.d. with mean zero and variance $\sigma_\varepsilon^2$, and $\sum_{j\in\mathbb{Z}}|a_j| < \infty$, …


    The partial autocorrelation function

The partial autocorrelation function is another tool for identifying the properties of an ARMA process.

It is particularly useful to identify pure autoregressive processes (AR(p)).

A partial correlation coefficient measures the correlation between two random variables at different lags after adjusting for the correlation this pair may have with the intervening lags: the PACF thus represents the sequence of conditional correlations.

A correlation coefficient between two random variables at different lags does not adjust for the influence of the intervening lags: the ACF thus represents the sequence of unconditional correlations.

Identification tools: Partial autocorrelation

Definition
The partial autocorrelation function of a stationary stochastic process $(X_t)_{t\in\mathbb{Z}}$ is defined to be:

$$a_X(h) = \mathrm{Corr}(X_t, X_{t-h} \mid X_{t-1}, \ldots, X_{t-h+1}), \quad h > 0.$$


Definition
The partial autocorrelation function of order h is defined to be:

$$a_X(h) = \mathrm{Corr}\left(\tilde X_t, \tilde X_{t-h}\right) = \frac{\mathrm{Cov}\left(\tilde X_t, \tilde X_{t-h}\right)}{\left[\mathbb{V}(\tilde X_t)\,\mathbb{V}(\tilde X_{t-h})\right]^{1/2}}$$

where

$$\tilde X_t = X_t - EL(X_t \mid X_{t-1}, \ldots, X_{t-h+1})$$

$$\tilde X_{t-h} = X_{t-h} - EL(X_{t-h} \mid X_{t-1}, \ldots, X_{t-h+1}).$$


Definition
The partial autocorrelation function of a second order stationary stochastic process $(X_t)_{t\in\mathbb{Z}}$ satisfies:

$$a_X(h) = \frac{|R^*(h)|}{|R(h)|}$$

where $R(h)$ is the autocorrelation matrix of order h defined by:

$$R(h) = \begin{pmatrix} \rho_X(0) & \rho_X(1) & \cdots & \rho_X(h-1) \\ \rho_X(1) & \rho_X(0) & \cdots & \rho_X(h-2) \\ \vdots & \vdots & \ddots & \vdots \\ \rho_X(h-1) & \rho_X(h-2) & \cdots & \rho_X(0) \end{pmatrix}$$

and $R^*(h)$ is the matrix obtained after replacing the last column of $R(h)$ by $\rho = (\rho_X(1), \ldots, \rho_X(h))'$.
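The determinant formula can be evaluated directly from a given autocorrelation sequence. A sketch using the theoretical AR(1) ACF $\rho_X(h) = \phi^h$, for which the PACF is $\phi$ at lag 1 and zero beyond (helper name illustrative):

```python
import numpy as np

def pacf_from_acf(rho, h):
    """a_X(h) = det(R_star) / det(R): R is the Toeplitz autocorrelation matrix
    of order h; R_star has its last column replaced by (rho(1), ..., rho(h))'."""
    R = np.array([[rho[abs(i - j)] for j in range(h)] for i in range(h)])
    R_star = R.copy()
    R_star[:, -1] = rho[1:h + 1]
    return np.linalg.det(R_star) / np.linalg.det(R)

# theoretical ACF of an AR(1) with phi = 0.6
rho_ar1 = [0.6 ** k for k in range(6)]
```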


Definition
The partial autocorrelation function can be viewed as the sequence of the h-th autoregressive coefficients in h-th order autoregressions. Let $a_{h\ell}$ denote the $\ell$-th autoregressive coefficient of an AR(h) process:

$$X_t = a_{h1}X_{t-1} + a_{h2}X_{t-2} + \cdots + a_{hh}X_{t-h} + \varepsilon_t.$$

Then

$$a_X(h) = a_{hh}, \quad h = 1, 2, \ldots$$

Remark: The theoretical partial autocorrelation function of an AR(p) model will be different from zero for the first p terms and exactly zero for higher order terms (why?).


    The sample partial autocorrelation function

The sample partial autocorrelation function can be obtained by different methods:

1. Find the ordinary least squares (or maximum likelihood) estimates of $a_{hh}$.
2. Use the recursive equations of the autocorrelation function (Yule-Walker equations) after replacing the autocorrelation coefficients by their estimates (see further).
3. Etc.
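The first method can be sketched as successive OLS autoregressions, keeping the last coefficient of each AR(h) fit. The function name and the simulated AR(1) check below are illustrative, not from the slides:

```python
import numpy as np

def sample_pacf(x, max_lag):
    """Sample PACF: a_hat(h) is the last OLS coefficient of an AR(h) fit."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    T = len(x)
    out = []
    for h in range(1, max_lag + 1):
        Y = x[h:]                                                   # x_t
        Z = np.column_stack([x[h - k:T - k] for k in range(1, h + 1)])  # lags 1..h
        beta, *_ = np.linalg.lstsq(Z, Y, rcond=None)
        out.append(beta[-1])                                        # a_hh
    return np.array(out)

# illustrative check on a simulated AR(1) with phi = 0.8 (fixed seed):
# the PACF estimate should be near 0.8 at lag 1 and near zero beyond
rng = np.random.default_rng(0)
T = 2000
eps = rng.normal(size=T)
x = np.zeros(T)
for t in range(1, T):
    x[t] = 0.8 * x[t - 1] + eps[t]
pacf_hat = sample_pacf(x, 3)
```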


Proposition
If $(X_t)_{t\in\mathbb{Z}}$ is a second order stationary process with:

$$X_t = m + \sum_{j\in\mathbb{Z}} a_j \varepsilon_{t-j}$$

where the $\varepsilon_t$'s are i.i.d. with mean zero and variance $\sigma_\varepsilon^2$, and $\sum_{j\in\mathbb{Z}}|a_j| < \infty$, …