What is a 'Direct Filter Approach'?


The predicate 'direct' refers to the fact that the optimization criterion directly emphasizes filter performances (instead of the classic one-step-ahead mean-square error performances prevalent in maximum likelihood approaches). The term 'filter' refers to the fact that the outcome of the optimization is not a model, but a filter. The formal mathematical background (in a mean-square error perspective) is provided in Optimal Real-Time Filters for Linear Prediction Problems.
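In the mean-square perspective, the univariate criterion can be sketched as follows (a hedged summary in our own notation; see Optimal Real-Time Filters for Linear Prediction Problems for the exact statement). Given the transfer function Γ(ω) of the (symmetric) target filter, a causal candidate filter with coefficients b, and the periodogram I_T(ω_k) of the data, the DFA minimizes the filter error directly in the frequency domain:

```latex
\min_{b}\ \frac{2\pi}{T}\sum_{k}\Big|\Gamma(\omega_k)-\hat{\Gamma}_b(\omega_k)\Big|^{2}\, I_T(\omega_k),
\qquad \hat{\Gamma}_b(\omega)=\sum_{j=0}^{L-1} b_j\, e^{-ij\omega}.
```

The periodogram stands in for the unknown spectral density of the data; a model-based approach would instead minimize the one-step-ahead forecast error and derive the filter indirectly from the fitted model.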



Scope/Generality

As shown in Optimal Real-Time Filters for Linear Prediction Problems as well as in MDFA (chapter 9), the Direct Filter Approach (DFA for short) embeds classic time series approaches (ARIMA, VAR, state space, exponential smoothing): it is more general. The approach also embeds classic filter designs such as Hodrick-Prescott, Christiano-Fitzgerald or Henderson (to name a few), see MDFA (chapter 9). Embedding would be of no added value per se. But once embedded, any of the above classic approaches can be customized, see The Trilemma Between Accuracy, Timeliness and Smoothness in Real-Time Signal Extraction and MDFA chapters 7 and 8. Briefly, a model-based proponent can tune a particular (model-based) design according to requirements which are felt to be relevant in typical economic applications. These topics will be illustrated extensively in later blog entries.

Efficiency Gains

Addressing the filter error directly, in the DFA criterion, is associated with efficiency gains (in terms of smaller mean-square filter errors) when benchmarked against classic model-based approaches. A list of links to empirical results is provided in Advances in Signal Extraction and Forecasting (search for "efficiency" and click on any of the links on the corresponding slides). In a nutshell, the gains depend on the 'difficulty' of the estimation problem (how far the target signal 'looks' into the future) as well as on the degree of misspecification of the model (in the model-based approach), see (3) for reference. In typical economic applications both components are likely to contribute significantly (in every sense of the word) to the efficiency gains of a 'Direct Filter Approach'.
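To make "addressing the filter error directly" concrete, here is a minimal, self-contained Python sketch of the univariate mean-square criterion. It is not code from the references: the AR(1) data, the filter length of 12 and the lowpass cutoff π/6 are illustrative assumptions. Since the criterion is quadratic in the coefficients b, the sketch solves it as a weighted (complex) least-squares problem rather than by iterative optimization.

```python
import numpy as np

rng = np.random.default_rng(0)
T, L = 240, 12                       # sample length and filter length (illustrative)

# Stand-in data: a simulated AR(1) series
x = np.zeros(T)
for t in range(1, T):
    x[t] = 0.7 * x[t - 1] + rng.standard_normal()

# Fourier frequencies on [0, pi] and the periodogram of the data
K = T // 2
omega = np.pi * np.arange(K + 1) / K
dft = np.array([np.sum(x * np.exp(-1j * w * np.arange(T))) for w in omega])
I = np.abs(dft) ** 2 / (2 * np.pi * T)          # periodogram

# Target: transfer function of an ideal (zero-phase) lowpass, cutoff pi/6
Gamma = (omega <= np.pi / 6).astype(float)

# Direct criterion: sum_k |Gamma(w_k) - sum_j b_j e^{-i j w_k}|^2 * I(w_k).
# Quadratic in b, so stack real/imaginary parts weighted by sqrt(I) and solve by least squares.
E = np.exp(-1j * np.outer(omega, np.arange(L)))  # (K+1) x L design matrix
w = np.sqrt(I)
A = np.vstack([np.real(E) * w[:, None], np.imag(E) * w[:, None]])
y = np.concatenate([Gamma * w, np.zeros(K + 1)])
b, *_ = np.linalg.lstsq(A, y, rcond=None)        # direct filter coefficients

def dfa_crit(b):
    """Frequency-domain mean-square filter error for coefficients b."""
    return np.sum(np.abs(Gamma - E @ b) ** 2 * I)

print(dfa_crit(b))                   # optimized criterion value
print(dfa_crit(np.ones(L) / L))      # naive equal-weight MA, for comparison
```

The optimized coefficients minimize the filter error itself; by construction they cannot do worse (on this criterion) than any other causal filter of the same length, such as the equal-weight moving average used for comparison.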

Beyond the MSE-Norm

A 'Direct Filter Approach' makes it possible to address and explore richer optimization criteria by splitting the MSE-norm into three (in fact four, but the fourth term is negligible) constituents called Accuracy, Smoothness and Timeliness, see The Trilemma Between Accuracy, Timeliness and Smoothness in Real-Time Signal Extraction. This split, the ATS-trilemma, allows one to address SEF-issues that cannot be tackled by a classic model-based paradigm, see for example Advances in Signal Extraction and Forecasting (search for "forecast dilemma") or The Trilemma Between Accuracy, Timeliness and Smoothness in Real-Time Signal Extraction, section 2.5. Performance gains are illustrated in Advances in Signal Extraction and Forecasting (search for "efficiency"): a remarkable outcome of these comparisons is that a suitably customized univariate filter can outperform a bivariate leading-indicator design in terms of Smoothness (curvature measure) and of Timeliness (peak-correlation measure). In contrast, this practically relevant tuning of (real-time) filter characteristics cannot be obtained explicitly in a classic model-based approach.
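The split itself can be sketched as follows (again a hedged summary in our own notation; the Trilemma paper gives the precise decomposition). Writing the real-time filter in polar form, with amplitude Â(ω) and phase Φ̂(ω), and assuming a zero-phase target with amplitude A(ω), the squared filter error at each frequency separates into an amplitude part and a phase part:

```latex
\big|\Gamma(\omega)-\hat{\Gamma}(\omega)\big|^{2}
= \underbrace{\big(A(\omega)-\hat{A}(\omega)\big)^{2}}_{\text{amplitude error}}
+ \underbrace{2\,A(\omega)\hat{A}(\omega)\big(1-\cos\hat{\Phi}(\omega)\big)}_{\text{phase error}}.
```

Integrating the amplitude term over the passband yields Accuracy and over the stopband Smoothness; integrating the phase term over the passband yields Timeliness, while its stopband contribution is the negligible fourth term. Re-weighting these constituents (instead of summing them with equal weight, which recovers the MSE-norm) is what customization means in this context.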

Replication

We will address the above issues on this blog, in due time, by supplying corresponding R code-snippets from MDFA chapters 7 and 8 and from The Trilemma Between Accuracy, Timeliness and Smoothness in Real-Time Signal Extraction. In particular, the above 'striking' outperformance of a bivariate leading-indicator design by a suitably customized univariate filter will be subject to scrutiny.
