Search Results

Author: Runkle, David E.

Journal Article
A fine time for monetary policy?

Recent research evaluating the effects of monetary policy is potentially tainted by the problem of time aggregation: effects may be estimated incorrectly from quarterly data if the effects of policy occur rapidly. This study evaluates whether time aggregation is a serious problem in a simple vector autoregression that includes total reserves, nonborrowed reserves, and the federal funds rate. It shows that time aggregation has little impact on evaluating the effect of monetary policy in that model. This finding suggests that time aggregation is unlikely to be important ...
Quarterly Review, Volume 19, Issue Win, Pages 18-31
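The time-aggregation issue summarized above can be illustrated with a toy univariate example (synthetic data, far simpler than the paper's VAR): the same monthly AR(1) process implies a noticeably different persistence coefficient when only every third observation is used.

```python
import numpy as np

# Illustrative sketch of time aggregation (synthetic data, not the
# paper's reserves/funds-rate VAR): a monthly AR(1) with coefficient
# rho looks like an AR(1) with coefficient rho**3 when point-sampled
# at a quarterly frequency.
rng = np.random.default_rng(0)
rho, T = 0.9, 30000
y = np.empty(T)
y[0] = 0.0
for t in range(1, T):
    y[t] = rho * y[t - 1] + rng.normal()

def ar1_coef(x):
    """OLS slope of x_t on x_{t-1} (no intercept; x is mean zero)."""
    return np.dot(x[:-1], x[1:]) / np.dot(x[:-1], x[:-1])

monthly = ar1_coef(y)        # close to 0.9
quarterly = ar1_coef(y[::3]) # close to 0.9**3, about 0.73
```

Whether this kind of distortion matters for policy conclusions in a multivariate setting is exactly the question the article investigates.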

Journal Article
Are economic forecasts rational?

This paper discusses, at an undergraduate level, how forecast rationality can be tested. It explains that forecasters should correctly use all relevant information available to them when making their predictions, and it shows that rationality can be tested by determining whether forecasters' prediction errors are themselves predictable. After addressing what data and methods can be used to test rationality, the paper presents tests of the price-forecast rationality of individual professional forecasters. Unlike the results of previous studies, these tests show that those forecasters' price predictions ...
Quarterly Review, Volume 13, Issue Spr, Pages 26-33
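The test logic described above, that rational forecast errors should be unpredictable from anything the forecaster already knew, can be sketched with synthetic data (the variable names and numbers here are illustrative, not the paper's dataset):

```python
import numpy as np

# Hypothetical example: a forecaster who fully uses a known variable
# leaves forecast errors that are pure noise. Regressing errors on
# information known at forecast time should then yield coefficients
# near zero.
rng = np.random.default_rng(0)
T = 200
info = rng.normal(size=T)                 # known when the forecast is made
truth = 2.0 + 0.5 * info + rng.normal(size=T)
forecast = 2.0 + 0.5 * info               # "rational": uses info fully
errors = truth - forecast                 # should be unpredictable noise

# Regress errors on a constant and the known variable.
X = np.column_stack([np.ones(T), info])
beta, *_ = np.linalg.lstsq(X, errors, rcond=None)
```

Under rationality both estimated coefficients should be statistically indistinguishable from zero; the paper's actual tests use formal standard errors rather than this eyeball check.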

Journal Article
Revisionist history: how data revisions distort economic policy research

This article describes how and why official U.S. estimates of the growth in real economic output and inflation are revised over time, demonstrates how big those revisions tend to be, and evaluates whether the revisions matter for researchers trying to understand the economy's performance and the contemporaneous reactions of policymakers. The conclusion may seem obvious, but it is a point ignored by most researchers: to have a good chance of understanding how policymakers make their decisions, researchers must use not the final data available, but the data available initially, when the policy ...
Quarterly Review, Volume 22, Issue Fall, Pages 3-12

Journal Article
The U.S. economy in 1989 and 1990: walking a fine line

Quarterly Review, Volume 13, Issue Win, Pages 3-10

Conference Paper
Another hole in the ozone layer: changes in FOMC operating procedure and the term structure

Proceedings, Paper 1, pt. 1

Journal Article
The Federal Reserve's Beige Book: A better mirror than crystal ball

The Region, Volume 13, Issue Mar, Pages 10-13, 28-32

Journal Article
No relief in sight for the U.S. economy

For at least the next two years, the U.S. economy will grow more slowly than it has on average since World War II. This is the forecast of a Bayesian vector autoregression model developed and used by researchers at the Minneapolis Federal Reserve Bank. The model's previous forecast, of a very weak start to the 1991-92 recovery, was remarkably accurate. Both forecasts are supported by evidence on long-term problems among consumers, in the commercial real estate industry, and at all levels of government. These problems will most likely constrain economic growth for years, although short spurts of ...
Quarterly Review, Volume 16, Issue Fall, Pages 13-20

Journal Article
Bad news from a forecasting model of the U.S. economy

This paper describes and analyzes the 1990-92 economic forecasts of a Bayesian vector autoregression model developed by researchers at the Minneapolis Fed. The model's 1990 forecast was pretty bad: too optimistic about both inflation and economic growth, especially growth in consumption and housing. An analysis of the model's errors, however, turns up no reason to think the model is unsound. Based on data available on November 30, 1990, the model predicts weak economic conditions for the next two years: a likely recession in 1991 and moderate inflation and weak overall growth in 1991-92. The ...
Quarterly Review, Volume 14, Issue Fall, Pages 2-10

Report
Statistical inference in the multinomial multiperiod probit model

Statistical inference in multinomial multiperiod probit models has been hindered in the past by the high-dimensional numerical integrations necessary to form the likelihood functions, posterior distributions, or moment conditions in these models. We describe three alternative approaches to inference that circumvent the integration problem: Bayesian inference using Gibbs sampling and data augmentation to compute posterior moments, simulated maximum likelihood (SML) estimation using the GHK recursive probability simulator, and method of simulated moments (MSM) estimation using the GHK simulator. ...
Staff Report, Paper 177
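The GHK recursive probability simulator mentioned above can be sketched in a low-dimensional case. This is a minimal illustration of the recursion for a bivariate standard normal rectangle probability, with an assumed correlation of 0.5; the paper applies the same idea in much higher dimensions inside a probit likelihood.

```python
import random
from statistics import NormalDist

# GHK sketch: estimate P(Y1 < a1, Y2 < a2) for Y ~ N(0, Sigma) by
# drawing truncated-normal innovations one at a time and multiplying
# the conditional truncation probabilities. Illustrative only.
phi = NormalDist()

def ghk_prob(a, chol, n_draws=20000, seed=1):
    """Estimate P(Y < a) for Y ~ N(0, chol @ chol.T) via GHK."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_draws):
        weight, draws = 1.0, []
        for i, ai in enumerate(a):
            # Upper bound on innovation e_i implied by earlier draws.
            partial = sum(chol[i][j] * draws[j] for j in range(i))
            bound = (ai - partial) / chol[i][i]
            p = phi.cdf(bound)        # prob. e_i stays below the bound
            weight *= p
            u = rng.random()
            # Draw e_i from N(0,1) truncated to (-inf, bound].
            draws.append(phi.inv_cdf(max(u * p, 1e-12)))
        total += weight
    return total / n_draws

# Sigma = [[1, 0.5], [0.5, 1]]; its Cholesky factor is given directly.
chol = [[1.0, 0.0], [0.5, 0.75 ** 0.5]]
est = ghk_prob([0.0, 0.0], chol)  # true value is 1/3 for correlation 0.5
```

The estimate should land near 1/3, the exact orthant probability for correlation 0.5; each weight is an unbiased simulator of the rectangle probability, which is what makes GHK usable inside SML and MSM estimation.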

Working Paper
Another hole in the ozone layer: changes in FOMC operating procedure and the term structure

FRB Atlanta Working Paper, Paper 92-15
