17 January 2016

Choosing the Best Model For Each Context

At the risk of attracting the wrath of Jason Smith, I think it is safe to say that economics is too complicated for there to be one generally applicable model of everything. Because of this, a veritable plethora of economic models is available to the economic theorist. This leaves the question of which one to use in which circumstance.

Simon Wren-Lewis seems to think that economists should select between models in an ex post manner -- that is, we should see which model better represents the data and use that model from then on:
How do we know if most economic cycles are described by Real Business Cycles (RBC) or Keynesian dynamics? One big clue is layoffs: if employment is falling because workers are choosing not to work, we could have an RBC mechanism, but if workers are being laid off (and are deeply unhappy about it) this is more characteristic of a Keynesian downturn.
The issue here is that this approach can only diagnose events after the fact; it cannot support predictions, because ex ante empirical validation is impossible: there is no way to determine whether a recession is New Keynesian or Real Business Cycle in nature before the data are released.

This is why context-based validation of theory is superior to empirical validation in the case of economics. The context -- i.e., the sub-field of economics being studied -- should inform model choice almost entirely. If the field is business cycles, then the relevant model is a New Keynesian DSGE model; if the field is growth theory, then New Keynesian models are superfluous and should be set aside in favor of neoclassical models, which differ from their New Keynesian counterparts only in lacking nominal rigidity -- a feature that is irrelevant over time scales longer than a decade.

Predictions about the economy can then be made based on currently available information: it is possible to determine whether, e.g., financial frictions should be present in our business cycle model based on the current state of the economy. We knew by Q3 2008 that financial frictions were relevant, so we should have put them in any model used to predict the next few years.

Alternatively, the model I choose depends on the kind of thought experiment I embark on. Am I trying to compare PAYGO pensions with Social Security? If so, the obvious model to use is a simple OLG model without a labor-leisure trade-off or sticky prices. Choice of models is equivalent to choice of assumptions, at least in the DGE approach currently dominant in economics, and assumption choice depends entirely on the question being asked. Nominal rigidity is obviously relevant for business cycle theory, but completely useless when it comes to determining the level effect of a tax increase.
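For concreteness, a minimal version of the OLG setup alluded to here -- a sketch with notation of my own choosing, not from any particular paper -- might look like:

```latex
% Two-period OLG household with log utility, no labor-leisure choice:
\max_{c^y_t,\; c^o_{t+1}} \;\; \ln c^y_t + \beta \ln c^o_{t+1}
% subject to the budget constraints
c^y_t + s_t = w_t - \tau_t
\qquad\text{(young: wage net of pension tax)}
c^o_{t+1} = (1 + r_{t+1})\, s_t + b_{t+1}
\qquad\text{(old: savings plus pension benefit)}
% A PAYGO scheme balances period by period with population growth n:
b_{t+1} = (1 + n)\, \tau_{t+1}
```

Comparing pension schemes then amounts to comparing steady states under different tau and b rules; nothing about the question requires sticky prices or a labor margin, which is exactly the point.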

Hopefully this selection mechanism is specific enough to not be "basically feelings," as Jason Smith would suggest is the case for most of economics.


  1. I doubt you'll incur Jason's wrath, since I don't think he claims his framework or models cover all things economic. In fact, in a comment to me once he remarked something to the effect of "My models are already wrong." I can't recall what followed, but he had an example of something he doesn't even attempt to explain: I think it was modeling when a "market coordination" (to use his words) is likely to happen and cause "non-ideal" information transfer (like a panic leading to a downturn). In any case I thought "wrong" was a bit strong, but I got his point.

    Also, you write:

    "DGE approach currently dominant in economics"

    DGE = dynamic general equilibrium?

    Any reason you didn't include the "S" (the "stochastic" part) in that usage of the term?

    Have you used such models, BTW? Or created your own? Perhaps in the form of a Mathematica or Matlab script or some other language?

    Also, have you ever tried to recreate one of Jason's models and see if you can duplicate his results?

    1. Tom,

      The reason I omitted the stochastic part of DSGE is that the stochastic component that is present in, e.g., business cycle theory is not necessarily a requirement for models commonly used in the rest of macro -- perfect foresight growth models, for example. I considered removing the "dynamic" part as well so it would just be "general equilibrium," but I decided that that would be too general (excuse the pun).

      As for my personal use of D(S)GE models, the most recent time I simulated one was in this post: http://ramblingsofanamateureconomist.blogspot.com/2015/12/shut-up-about-ricardian-equivalence.html

      I use a program called Dynare (http://www.dynare.org), which works in conjunction with either Matlab or Octave (its open-source counterpart) to generate impulse response functions for stochastic models.
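      Dynare's input is its own language, but the underlying idea of a stochastic impulse response function can be sketched in a few lines of Python. This is a toy example of my own, assuming the model's linearized law of motion reduces to a single AR(1) state:

```python
import numpy as np

# Impulse response of a linearized stochastic model, traced by hand.
# Assume a single state with law of motion x_t = rho * x_{t-1} + eps_t,
# the kind of reduced form a solved DSGE model boils down to.

def impulse_response(rho: float, horizon: int, shock: float = 1.0) -> np.ndarray:
    """Path of x after a one-time unit shock at t = 0 (no further shocks)."""
    x = np.zeros(horizon)
    x[0] = shock
    for t in range(1, horizon):
        x[t] = rho * x[t - 1]  # the shock decays geometrically at rate rho
    return x

irf = impulse_response(rho=0.9, horizon=20)
# irf[h] equals 0.9 ** h: a persistent but mean-reverting response
```

      Dynare's stoch_simul does the real work of solving for the law of motion first; the tracing step afterward is this simple.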

      Otherwise, I try to develop specific models to formally explain my reasoning in posts. A few examples of this are http://ramblingsofanamateureconomist.blogspot.com/2015/12/whats-significance-of-low-real-interest.html, http://ramblingsofanamateureconomist.blogspot.com/2015/11/demystifying-neo-fisherism.html, and http://ramblingsofanamateureconomist.blogspot.com/2015/11/monetary-policy-effectiveness-in.html

      I have tried to replicate Jason's DSGE version of the ITM, but never wrote a blog post about it (mainly because Dynare is rather particular about the way you input models; simplicity comes at a price, I guess). I might compare some of what I consider to be 'the stylized facts of Information Transfer Economics' (unless Jason has already come up with his own, in which case I'll just use his) with 'the stylized facts of the strange hybrid of Monetarism and New Keynesianism that I seem to embody.'

    2. John, thanks for the info and the links. I'm familiar with both Matlab and Octave.

      What's your favorite intro economics book and/or macro text? Or did you learn reading papers? Rowe said Mankiw or Krugman or others... doesn't matter.

    3. Tom,

      No problem!

      I haven't bothered to actually buy a proper economics textbook and read through it. Mostly, I tried to find professors who post the presentations for their lectures online (like Lutz Hendricks), browsed ideas.repec.org for papers dealing with topics I was interested in, and read a bunch of economics blogs. Otherwise, I occasionally find PDFs of certain chapters of textbooks (my current favorite is Friedman and Woodford's Handbook of Monetary Economics) or whole books, like Woodford's "Interest and Prices" -- which I have yet to read through.

      If anything, I'd say the way I learned most of what I currently know about economics has been really eclectic and has left me with weird holes in my knowledge, such as a complete lack of understanding of VAR models and a bit of a disdain for dynamic programming and continuous-time models in general.

    4. Interesting. You've got time to fill those holes in I think (should you be so inclined).

      I took a look at your twitter page and saw this:

      "An iconoclast of there ever was one"

      If you Google that w/ quotes, it's all you, so I suspect the "of" should be an "if?"

      BTW, I have lots of cousins there in Torrance (which I think I read you're from). Three of them own a machine shop: DASCO I think the name is. They make aircraft parts.

    5. When did you get interested in macro?

    6. VAR models: have you read Dave Giles's blog? I can't say I understand them, but I have a better sense of what's involved. I referred to it after looking at Mark Sadowski's series of guest posts (on Marcus Nunes's site) in 2015, in which he attempts to show the efficacy of QE in the "age of ZIRP" through various channels. He and Jason got into it after that, in a long series of exchanges on Jason's site. But regardless, I found one post on Dave's site in particular (on Granger causality) which helped me understand what Sadowski was up to. Since, as usual (when trying to follow two people whose understanding far exceeds my own), I was struggling to follow their disagreement, I subtly tried to get Dave to render his opinion on parts of the exchange... as unbiased expert commentary... but he didn't really take the bait. Except on one small point. Dave has a "Reader's Forum" in which you can ask him general questions.

    7. "better sense" ... that is better than 0, which is where I was at prior to attempting to understand the Smith/Sadowski cage match.

    8. Also, this guy does forecasts:

    9. Tom,

      "When did you get interested in macro?"

      Around September 2014, I think.

      "BTW, I have lots of cousins there in Torrance (which I think I read you're from). Three of them own a machine shop: DASCO I think the name is. They make aircraft parts."

      I looked it up and DASCO is about three blocks away from my house (although I will be moving to Tokyo in August, so it'll soon be more like 5,000 miles).

      "If you Google that w/ quotes, it's all you, so I suspect the "of" should be an "if?""

      That's been there forever, I can't believe no one caught it. Thanks.

      When it comes to VAR models, part of the reason I ignore them (aside from not understanding how to use them) is that they strike me as a bit too atheoretical; Sadowski can draw all the trend lines he wants, but so long as he doesn't explain theoretically how QE was effective, I won't really care. Especially when http://informationtransfereconomics.blogspot.com/2015/07/the-sadowski-theory-of-money.html is the case.
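      For what it's worth, the reduced-form object in question can be made concrete. A two-variable VAR(1) -- with coefficients I've made up purely for illustration -- is just each variable regressed on lags of every variable, which is exactly what makes it feel atheoretical:

```python
import numpy as np

# A minimal two-variable VAR(1): y_t = A @ y_{t-1} + e_t.
# The matrix A below is assumed for the simulation; in practice it would be
# estimated from data with no structural model behind it.
rng = np.random.default_rng(0)
A = np.array([[0.5, 0.2],
              [0.0, 0.8]])  # illustrative coefficients (eigenvalues < 1, so stationary)

# Simulate the system forward from zero.
T = 5000
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = A @ y[t - 1] + rng.normal(scale=0.1, size=2)

# "Estimation" is just OLS of y_t on y_{t-1}, equation by equation.
X, Y = y[:-1], y[1:]
A_hat = np.linalg.lstsq(X, Y, rcond=None)[0].T  # recovers A approximately
```

      The zero in the bottom-left of A means the first variable never helps predict the second -- in Granger-causality terms, it doesn't Granger-cause it -- and OLS recovers A from the data without any economic theory entering the exercise.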

  2. Sometimes there are very different models that all really predict the same thing. The way they get to the result can be very different though. It can help you get a deeper understanding to be able to think about things in different ways.

    I can illustrate this with many very different ways of explaining hyperinflation. These are not in contradiction but different ways of thinking about what is going on.


    1. Vincent,

      "Sometimes there are very different models that all really predict the same thing."

      The problem is that models which predict the same thing can offer very different diagnoses of why it happens, and those diagnoses don't all apply in every situation. This is why it is best to choose models based on their assumptions rather than their results.

      Or, to take this in a slightly different direction, we could have looked at the US monetary base starting in 2009 and naively concluded that hyperinflation would occur because of the equation of exchange. This prediction has obviously been proven wrong, so looking at monetary policy through the lens of exogenous velocity is clearly the wrong approach in this case.
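      The naive reasoning runs through the equation of exchange; writing it out makes the hidden assumption explicit:

```latex
MV = PY
% Take logs and differentiate: growth rates add up,
\frac{\dot M}{M} + \frac{\dot V}{V} = \frac{\dot P}{P} + \frac{\dot Y}{Y}
% The naive step is holding V and Y fixed, so that
\frac{\dot P}{P} \approx \frac{\dot M}{M}
% i.e., tripling the base would eventually triple the price level.
```

      At the zero lower bound, velocity fell as the base expanded, so the proportionality failed.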

      Now, if we decided to make predictions in 2009 using a model in which money demand becomes indeterminate at the zero lower bound, we would have rightly concluded that there would be at best anemic inflation at the zero lower bound and that any expansion of the monetary base would be pretty much useless.

      Regarding your list of models: despite the fact that all of them give the same result of a hyperinflation (which we've failed to observe in the US, by the way), clearly not all of them apply to a given situation -- unless they really are just the same model with a different rationalization applied to each. (This is why formal mathematical models are useful: they make the qualitative differences between models easy to see.)