Search Results

Author: Rubio-Ramírez, Juan F.

Working Paper
Comparing New Keynesian models in the Euro area: a Bayesian approach

This paper estimates and compares four versions of the sticky price New Keynesian model for the Euro area, using a Bayesian approach as described in Rabanal and Rubio-Ramírez (2003). We find that the average duration of price contracts is between four and eight quarters, similar to the one estimated in the United States, while price indexation is found to be smaller. On the other hand, the average duration of wage contracts is estimated to be between one and two quarters, lower than the one found for the United States, while wage indexation is higher. Finally, the marginal likelihood indicates that ...
FRB Atlanta Working Paper , Paper 2003-30

Working Paper
Fortune or virtue: time-variant volatilities versus parameter drifting

This paper compares the role of stochastic volatility versus changes in monetary policy rules in accounting for the time-varying volatility of U.S. aggregate data. Of special interest to the authors is understanding the sources of the great moderation of business cycle fluctuations that the U.S. economy experienced between 1984 and 2007. To explore this issue, the authors build a medium-scale dynamic stochastic general equilibrium (DSGE) model with both stochastic volatility and parameter drifting in the Taylor rule and they estimate it non-linearly using U.S. data and Bayesian methods. ...
Working Papers , Paper 10-14

Working Paper
Uniform Priors for Impulse Responses

There has been a call for caution when using the conventional method for Bayesian inference in set-identified structural vector autoregressions on the grounds that the uniform prior over the set of orthogonal matrices could be nonuniform for key objects of interest. This paper challenges this call. Although the prior distributions of individual impulse responses induced by the conventional method may be nonuniform, they typically do not drive the posteriors if one does not condition on the reduced-form parameters. Importantly, when the focus is on joint inference, the uniform prior over the ...
Working Papers , Paper 22-30

Working Paper
A Gibbs Sampler for Efficient Bayesian Inference in Sign-Identified SVARs

We develop a new algorithm for inference based on structural vector autoregressions (SVARs) identified with sign restrictions. The key insight of our algorithm is to break from the accept-reject tradition associated with sign-identified SVARs. We show that embedding an elliptical slice sampling within a Gibbs sampler approach can deliver dramatic gains in speed and turn previously infeasible applications into feasible ones. We provide a tractable example to illustrate the power of the elliptical slice sampling applied to sign-identified SVARs. We demonstrate the usefulness of our algorithm by ...
Working Papers , Paper 25-19

Working Paper
Fiscal policy and minimum wage for redistribution: an equivalence result

In this paper, we derive conditions under which a minimum-wage law combined with anonymous taxes and transfers and an agent-specific tax-transfer scheme are equivalent policies.
FRB Atlanta Working Paper , Paper 2005-08

Working Paper
Dividend Momentum and Stock Return Predictability: A Bayesian Approach

A long tradition in macro finance studies the joint dynamics of aggregate stock returns and dividends using vector autoregressions (VARs), imposing the cross-equation restrictions implied by the Campbell-Shiller (CS) identity to sharpen inference. We take a Bayesian perspective and develop methods to draw from any posterior distribution of a VAR that encodes a priori skepticism about large amounts of return predictability while imposing the CS restrictions. In doing so, we show how a common empirical practice of omitting dividend growth from the system amounts to imposing the extra ...
FRB Atlanta Working Paper , Paper 2021-25

Working Paper
Reading the recent monetary history of the U.S., 1959-2007

The authors report the results of the estimation of a rich dynamic stochastic general equilibrium model of the U.S. economy with both stochastic volatility and parameter drifting in the Taylor rule. They use the results of this estimation to examine the recent monetary history of the U.S. and to interpret, through this lens, the sources of the rise and fall of the great American inflation from the late 1960s to the early 1980s and of the great moderation of business cycle fluctuations between 1984 and 2007.
Working Papers , Paper 10-15

Working Paper
Estimating dynamic equilibrium economies: linear versus nonlinear likelihood

This paper compares two methods for undertaking likelihood-based inference in dynamic equilibrium economies: a sequential Monte Carlo filter proposed by Fernández-Villaverde and Rubio-Ramírez (2004) and the Kalman filter. The sequential Monte Carlo filter exploits the nonlinear structure of the economy and evaluates the likelihood function of the model by simulation methods. The Kalman filter estimates a linearization of the economy around the steady state. The authors report two main results. First, both for simulated and for real data, the sequential Monte Carlo filter delivers a ...
FRB Atlanta Working Paper , Paper 2004-3

Working Paper
Supply-side policies and the zero lower bound

This paper examines how supply-side policies may play a role in fighting the low aggregate demand that traps an economy at the zero lower bound (ZLB) of nominal interest rates. Future increases in productivity or reductions in mark-ups triggered by supply-side policies generate a wealth effect that pulls current consumption and output up. Since the economy is at the ZLB, increases in interest rates do not undo this wealth effect, as they would outside the ZLB. The authors illustrate this mechanism with a simple two-period New Keynesian model. They discuss possible objections ...
Working Papers , Paper 11-47

Working Paper
Estimating dynamic equilibrium models with stochastic volatility

We propose a novel method to estimate dynamic equilibrium models with stochastic volatility. First, we characterize the properties of the solution to this class of models. Second, we take advantage of the results about the structure of the solution to build a sequential Monte Carlo algorithm to evaluate the likelihood function of the model. The approach, which exploits the profusion of shocks in stochastic volatility models, is versatile and computationally tractable even in large-scale models, such as those often employed by policy-making institutions. As an application, we use our algorithm ...
Working Papers , Paper 13-19
