Imagine that there is a central bank that targets the inflation rate successfully every period because there are no real shocks in the economy. In this model, inflation looks like this:

$$ \pi_t = \pi^* $$

where $ \pi_t $ and $ \pi^* $ are the inflation rate and the inflation target, respectively. The nominal interest rate in this model will always be $ \pi^* $ higher than the constant (no real shocks) real interest rate, $ \rho $, and can be written as

$$ i_t = \rho + \pi^* $$

Now suppose that the central bank is not omniscient and occasionally misses its target, either by accident or because of some unforeseen shock. The inflation rate is now

$$ \pi_t = \pi^*+ \epsilon_t $$

where $ \epsilon_t $ is the central bank's error each period. Since the errors are not serially correlated, expected inflation is still $ \pi^* $ and the nominal interest rate remains equal to $ \rho + \pi^* \; \forall t $. Now let's add some real shocks to this economy, so that the real interest rate, $ r_t $, fluctuates over time and adjusts slowly:

$$ r_t = (1 - \rho^r)\rho + \rho^r r_{t-1} + \nu_t $$

The nominal interest rate now moves around with the real shocks:

$$ i_t = r_t + \pi^* $$
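The real-rate process and the Fisher relation above can be simulated directly. Here is a minimal sketch in Python; every parameter value is an illustrative assumption, since the text specifies none:

```python
import random

random.seed(0)

RHO = 0.02        # steady-state real rate (illustrative)
RHO_R = 0.9       # persistence of the real-rate process (illustrative)
PI_STAR = 0.02    # inflation target (illustrative)
SIGMA_NU = 0.005  # std. dev. of the real shocks nu_t (illustrative)
T = 200

r = [RHO]  # start the real rate at its steady state
for _ in range(T):
    nu = random.gauss(0.0, SIGMA_NU)
    # r_t = (1 - rho^r) * rho + rho^r * r_{t-1} + nu_t
    r.append((1 - RHO_R) * RHO + RHO_R * r[-1] + nu)

# i_t = r_t + pi^*: the nominal rate inherits the real rate's persistence
i = [rt + PI_STAR for rt in r]
```

Since the target is pegged, the nominal rate is just the real rate shifted up by $ \pi^* $; all of its serial correlation comes from the real side.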

For some unknown reason, the central bank decides to adopt a floating inflation target, $ \bar\pi_t $, and sets it so that it becomes a weighted average of $ \pi_{t - 1} $ and $ \pi^* $.

$$ \bar\pi_t = (1 - \rho^\pi)\pi^* + \rho^\pi \pi_{t-1} $$

The central bank still occasionally misses its target, so $ \pi_t $ is not always equal to $ \bar\pi_t $ and is instead

$$ \pi_t = (1-\rho^\pi)\pi^* + \rho^\pi \pi_{t-1} + \epsilon_t $$

The nominal interest rate is now related to the current inflation rate, because of the autoregressive process that the rate of inflation follows, and can be expressed as

$$ i_t = r_t + (1 - \rho^\pi)\pi^* + \rho^\pi \pi_t $$
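Simulating the floating-target regime shows the key change: inflation is now serially correlated, and the nominal rate co-moves with it. This sketch holds $ r_t $ at $ \rho $ to isolate the inflation channel; all parameter values are illustrative assumptions:

```python
import random

random.seed(1)

RHO = 0.02         # steady-state real rate (illustrative)
RHO_PI = 0.8       # weight on lagged inflation in the floating target (illustrative)
PI_STAR = 0.02     # long-run inflation target (illustrative)
SIGMA_EPS = 0.004  # std. dev. of the central bank's misses (illustrative)
T = 500

pi = [PI_STAR]
for _ in range(T):
    eps = random.gauss(0.0, SIGMA_EPS)
    # pi_t = (1 - rho^pi) * pi^* + rho^pi * pi_{t-1} + eps_t
    pi.append((1 - RHO_PI) * PI_STAR + RHO_PI * pi[-1] + eps)

# With r_t fixed at rho:
# i_t = rho + (1 - rho^pi) * pi^* + rho^pi * pi_t
i = [RHO + (1 - RHO_PI) * PI_STAR + RHO_PI * p for p in pi]
```

With $ r_t $ held constant, $ i_t $ is an increasing affine function of $ \pi_t $, so the nominal rate and inflation are (here, perfectly) positively correlated, which is exactly the Taylor-rule-like comovement discussed below.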

By sheer assumption, let's say that $ \epsilon_t $ and $ \nu_t $ are negatively correlated.

What is this model now? Well, it's New Keynesian, isn't it?
Think about it: there is a central banker who knows he or she could keep the real interest rate constant (equal to its natural rate) by pegging the inflation rate, but who instead chooses to make inflation serially correlated by following a Taylor rule. The real interest rate falls when inflation is above "target," and the nominal interest rate and the rate of inflation are positively correlated. Really, all you need to do to make the dynamics exactly like those of a New Keynesian model is add a variable $ x_t $, call it the "output gap," and say that $ \dot x_t = r_t - \rho $.
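The last step, tacking on an "output gap," can be sketched with a simple Euler discretization of $ \dot x_t = r_t - \rho $ (unit time step; the discretization and all parameter values are assumptions, not part of the original setup):

```python
import random

random.seed(2)

RHO = 0.02        # steady-state real rate (illustrative)
RHO_R = 0.9       # persistence of the real-rate process (illustrative)
SIGMA_NU = 0.005  # std. dev. of the real shocks (illustrative)
T = 300

r = [RHO]
x = [0.0]  # the "output gap," starting at zero
for _ in range(T):
    # r_t = (1 - rho^r) * rho + rho^r * r_{t-1} + nu_t
    r.append((1 - RHO_R) * RHO + RHO_R * r[-1] + random.gauss(0.0, SIGMA_NU))
    # Euler step for x-dot = r_t - rho, with a unit time step
    x.append(x[-1] + (r[-1] - RHO))
```

The gap widens whenever the real rate sits above its natural level and closes when it sits below, which is all the "real side" this relabeled model needs.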

Of course, the real aspects of this "model" are mostly irrelevant, and pretty weak as assumptions go: replacing nominal rigidity with "the shocks are negatively correlated" is bad, though the AR component of the real interest rate can make sense if, for example, the capital stock takes time to adjust. What's really important is that

*NK central bankers are stupid*. They know that they should be targeting a constant rate of inflation, but they abandon that for the sake of Taylor Rules and avoiding the money demand function.
I obviously don't think that central bankers can simply choose to have their nominal target achieved every period, but they should at least refrain from "endogenizing" the money supply in favor of a tool that can mean different things at different times depending on your assumptions.
