from absolute truth to be so specialized and to make the kind of confident quantitative claims that often emerge from the core. On the policy front, this confused
precision creates the illusion that a minor adjustment in the standard policy framework will prevent future crises, and by doing so it leaves us overly exposed to the
new and unexpected.
To be fair to our field, an enormous amount of work at the intersection of
macroeconomics and corporate finance has been chasing many of the issues that
played a central role during the current crisis, including liquidity evaporation,
collateral shortages, bubbles, crises, panics, fire sales, risk-shifting, contagion, and
the like.1 However, much of this literature belongs to the periphery of macroeconomics rather than to its core. Is the solution then to replace the current core with
the periphery? I am tempted, but I think this would address only some of our problems. The dynamic stochastic general equilibrium strategy is so attractive, and even
plain addictive, because it allows one to generate impulse responses that can be fully
described in terms of seemingly scientific statements. The model is an irresistible
snake-charmer. In contrast, the periphery is not nearly as ambitious, and it provides
mostly qualitative insights. So we are left with the tension between a type of answer
to which we aspire but that has limited connection with reality (the core) and more
sensible but incomplete answers (the periphery).
This distinction between core and periphery is not a matter of freshwater versus
saltwater economics. Both the real business cycle approach and its New Keynesian
counterpart belong to the core. Moreover, there was a time when Keynesian economics was more like the current core, in the sense of trying to build quantitative
aggregative models starting from micro-founded consumption functions and the
like. At that time, it was the rational-expectations representatives that were in
the insight-building mode, identifying key concepts for macroeconomic policy such
as time-inconsistency and endogenous expectations, without any pretense of being
realistic in all dimensions of modeling in order to obtain quantitative answers.
Moreover, this tension is not new to macroeconomics or even to economics
more broadly. In his Nobel Prize acceptance lecture, Hayek (1974) writes: "Of course,
compared with the precise predictions we have learnt to expect in the physical
sciences, this sort of mere pattern predictions is a second best with which one does
not like to have to be content. Yet the danger of which I want to warn is precisely the
belief that in order to have a claim to be accepted as scientific it is necessary to achieve
more. This way lies charlatanism and worse. To act on the belief that we possess the
knowledge and the power which enable us to shape the process of society entirely to
our liking, knowledge which in fact we do not possess, is likely to make us do much
harm."
One reading of Hayek's comment is as a reminder of the dangers of
presuming a precision and degree of knowledge we do not have. I suspect that if
Hayek were confronted with the limited choice between the core and the periphery
of macroeconomics, his vote today would be cast for the periphery. This is the
starting point of the theme I will develop in the first part of this paper. There I will
discuss the distinction between the core and the periphery of macroeconomics in
greater detail, as well as the futile nature of the "integrationist" movement, that is,
the process of gradually bringing the insights of the periphery into the dynamic
stochastic general equilibrium structure.
1
In fact, at MIT we divide the first-year Ph.D. macroeconomics sequence into four parts: methods,
growth, fluctuations, and crises. I will include specific references to this work in the main text, but see
Part VI of Tirole (2006) for a nice survey and unified explanation of several of these mechanisms.
However, when we consider Hayek's comment, we find a silver lining: a contemporary version of his paragraph, which would involve a discussion of the core
and periphery, would confront one modeling approach against other modeling
approaches, not models against narrative. This is good news. There is no doubt that
the formalization of macroeconomics over recent decades has increased its potential. We just need to be careful not to let this formalization take on a life of its own and
distract us from the ultimate goal, which is to understand the mechanisms that drive
the real economy. This progress also offers hope that we may find ways to explore
formally and explicitly the limits of our and economic agents' knowledge. This is the
second theme I develop in this paper. The idea is to place at the center of the analysis
the fact that the complexity of macroeconomic interactions limits the knowledge
we can ever attain. In thinking about analytical tools and macroeconomic policies,
we should seek those that are robust to the enormous uncertainty to which we are
confined, and we should consider what this complexity does to the actions and reactions of the economic agents whose behavior we are supposed to be capturing.
I cannot be sure that shifting resources from the current core to the periphery
and focusing on the effects of (very) limited knowledge on our modeling strategy
and on the actions of the economic agents we are supposed to model is the best next
step. However, I am almost certain that if the goal of macroeconomics is to provide
formal frameworks to address real economic problems rather than purely literature-driven ones, we had better start trying something new rather soon. The alternative of
segmenting, with academic macroeconomics playing its internal games and leaving
the real world problems mostly to informal commentators and policy discussions,
is not very attractive either, for the latter often suffer from an even deeper pretense-of-knowledge syndrome than do academic macroeconomists.
If we were to stop there, and simply use these stylized structures as just one more
tool to understand a piece of the complex problem, and to explore some potentially
perverse general equilibrium effect which could affect the insights isolated in the
periphery, then I would be fine with it. My problems start when these structures
are given a life of their own, and researchers choose to "take the model seriously"
(a statement that signals the time to leave a seminar, for it is always followed by a
sequence of naive and surreal claims).
The quantitative implications of this core approach, which are built on supposedly micro-founded calibrations of key parameters, are definitely on the surreal
side. Take for example the preferred microfoundation of the supply of capital
in the workhorse models of the core approach. A key parameter to calibrate in
these models is the intertemporal substitution elasticity of a representative agent,
which is to be estimated from micro-data. A whole literature develops around this
estimation, which narrows the parameter to certain values, which are then to be
used and honored by anyone wanting to say something about modern macroeconomics. This parameter may be a reasonable estimate for an individual agent facing
a specific micro decision, but what does it have to do with the aggregate? What
happened with the role of Chinese bureaucrats, Gulf autocrats, and the like, in the
supply of capital? A typical answer is not to worry about it, because this is all "as if."
But then, why do we call this strategy microfoundations rather than reduced-form?
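To fix ideas, here is a minimal sketch of the object being calibrated, under the standard textbook assumption of constant-relative-risk-aversion utility (my illustration, not the specification of any particular paper). With period utility $u(C) = C^{1-\gamma}/(1-\gamma)$, the representative agent's consumption Euler equation is
\[ C_t^{-\gamma} = \beta \, \mathbb{E}_t\!\left[ (1 + r_{t+1}) \, C_{t+1}^{-\gamma} \right], \]
and the elasticity of intertemporal substitution is $1/\gamma$: a single parameter, estimated from micro-data on individual consumption choices, which the core model then treats as governing the entire aggregate supply of capital.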
My point is that by some strange herding process the core of macroeconomics
seems to transform things that may have been useful modeling short-cuts into a part
of a new and artificial reality, and now suddenly everyone uses the same language,
which in the next iteration gets confused with, and eventually replaces, reality.
Along the way, this process of make-believe substitution raises our presumption of
knowledge about the workings of a complex economy and increases the risks of a
pretense of knowledge about which Hayek warned us.
After much trial and error, these core models have managed to generate
reasonable numbers for quantities during plain-vanilla, second-order business cycle
fluctuations. However, the structural interpretation attributed to these results is
often naïve at best, and more often is worse than that. For example, while these
models have been successful in matching some aggregate quantities, they have done
much more poorly on prices. But in what sense is it a good general equilibrium fit
if the quantities are right but not the prices?
Incidentally, this process of selective measures of success also weakens the
initial motivation for building the microfoundations of macroeconomics, which is
to make the theory testable. A theory is no longer testable when rejection is used not
to discard the theory, but to select the data moments under which the core model
is to be judged. This practice means that well-known major failures just become
"puzzles," which are soon presumed to be orthogonal to the output from the quantitative model that is to be taken seriously.2
2
In a similar spirit, but applied to ad hoc dynamics models, Lucas (2003) writes: "There's an interesting
footnote in Patinkin's book. Milton Friedman had told him that the rate of change of price in any one
market ought to depend on excess demand and supply in all markets in the system. Patinkin is happy
about this suggestion because he loves more generality, but if you think about Friedman's review of
Lange, of Lange's book, what Friedman must have been trying to tell Patinkin is that he thinks the theory
is empty, that anything can happen in this model. And I think he's got a point."
But isn't abstraction what good economic models are about, for only then can
we isolate the essence of our concerns? Yes, but with certain requirements to which
I think the core has failed to adhere but which the periphery has gotten mostly right.
The periphery uses abstraction to remove the inessential, but a typical periphery
paper is very careful that the main object of study is anchored by sensible assumptions. It is fine to be as goofy as needed to make things simpler along inessential
dimensions, but it is important not to sound funny on the specific issue that is to
be addressed.
Instead, core macroeconomics often has aimed not for a realistic anchor and a
simplification of the rest, but for being only half-goofy on everything: preferences
and production functions that do not represent anyone but that could be found
in an introductory microeconomics textbook, the same for markets, and so on. By
now, there is a whole set of conventions and magic parameter values resulting
in an artificial world that can be analyzed with the rigor of micro-theory but that
speaks of no particular real-world issue with any reliability.
Integration?
One possible reaction to my remarks is that I am too impatient; that with
enough time, we will arrive at an El Dorado of macroeconomics where the key
insights of the periphery are incorporated into a massive dynamic stochastic
general equilibrium model. After all, there has been an enormous collective effort
in recent decades in building such models, with an increasing number of bells
and whistles representing various microeconomic frictions. The research departments of central banks around the world have become even more obsessed than
academics with this agenda.
However, I think this incremental strategy may well have overshot its peak and
may lead us to a minimum rather than a maximum in terms of capturing realistic
macroeconomic phenomena. We are digging ourselves, one step at a time, deeper
and deeper into a Fantasyland, with economic agents who can solve richer and
richer stochastic general equilibrium problems containing all sorts of frictions.
Because the progress is gradual, we do not seem to notice as we accept what are
increasingly absurd behavioral conventions and stretch the intelligence and information of underlying economic agents to levels that render them unrecognizable.
The beauty of the simplest barebones real business cycle model is, in fact, in its
simplicity. It is a coherent description of equilibrium in a frictionless world, where it
is reasonable to expect that humans can deal with its simplicity. I would rather stop
there (perhaps with space for adding one nominal rigidity) and simply acknowledge
that it is a benchmark, not a shell or a steppingstone for everything we study in
macroeconomics, which is unfortunately the way the core treats it today.3
3
An extreme form of this view of the real business cycle model as the steppingstone for everything else
is the so-called "gap approach," which essentially views and constrains the research agenda of macroeconomics to studying the failures of the maximization conditions of this very particular model (Chari,
policy work. In trying to add a degree of complexity to the current core models, by
bringing in aspects of the periphery, we are simultaneously making the rationality
assumptions behind that core approach less plausible.
Moreover, this integrationist strategy does not come with an assurance that it
will take us to the right place, as there is an enormous amount of path dependence
in the process by which elements are incorporated into the core; and the baggage
we are already carrying has the potential to distort the selection of the mechanisms
of the periphery that are incorporated. Given the enormous complexity of the task
at hand, we can spend an unacceptably long time wandering in surrealistic worlds
before gaining any traction into reality.
We ultimately need to revisit the ambitious goal of the core, of having a framework for understanding the whole, from shocks to transmission channels, all of them
interacting with each other. The issue is how to do this without over-trivializing the
workings of the economy (in the fundamental sense of overestimating the power
of our approximations) to a degree that makes the framework useless as a tool for
understanding significant events and dangerous for policy guidance. I don't have
the answer to this fundamental dilemma, but it does point in the direction of much
more diversification of research and methodology than we currently accept. It also
points in the direction of embracing, rather than sweeping under the rug, the
complexity of the macroeconomic environment. I turn to the latter theme next.
6
Durlauf (2004) offers a thoughtful survey and discussion of the econophysics literature and its limitations for economic policy analysis, precisely because it does not use models that adequately respect the
purposefulness of individual behavior. This of course is a statement of the current state of affairs in the
literature, not of its potential, which is probably substantial. See Bak, Chen, Scheinkman, and Woodford
(1993), Scheinkman and Woodford (1994), Arthur, Durlauf, and Lane (1997), Durlauf (1993, 1997),
Brock (1993), Brock and Durlauf (2001), and Acemoglu, Ozdaglar, and Tahbaz-Salehi (2010) for early steps
in this direction.
Having said this, some of the motivations for the econophysics literature do
strike a chord with the task ahead for macroeconomists. For example, Albert and
Barabási (2002), in advocating for the use of statistical mechanics tools for complex
networks, write:
Physics, a major beneficiary of reductionism, has developed an arsenal of
successful tools for predicting the behavior of a system as a whole from the
properties of its constituents. We now understand how magnetism emerges
from the collective behavior of millions of spins . . . The success of these
modeling efforts is based on the simplicity of the interactions between the
elements: there is no ambiguity as to what interacts with what, and the interaction strength is uniquely determined by the physical distance. We are at a
loss, however, to describe systems for which physical distance is irrelevant or
for which there is ambiguity as to whether two components interact . . . there
is an increasingly voiced need to move beyond reductionist approaches and
try to understand the behavior of the system as a whole. Along this route,
understanding the topology of the interactions between the components, i.e.,
networks, is unavoidable . . .
In any event, I will not review this literature here and instead will focus on
arguments about cumbersome linkages and agents' confusion about these linkages
that are closer to mainstream macroeconomics.
Dominoes and Avalanches
Allen and Gale's (2000) model of financial networks and the inherent
fragility of some of these structures provided an early and elegant example of
how linkages can cause substantial instability with respect to shocks different
from those the network was designed to handle. Recently, Shin (2009) shows how
fire sales can greatly magnify the domino mechanism highlighted by Allen and
Gale (2000). Another example of promising research in this style is Rotemberg
(2009), which uses graph theory to study how interconnectedness affects firms'
ability to make use of an exogenous amount of liquidity. Rotemberg studies a
situation in which all firms are solvent; that is, the payments that any particular
firm is expected to make do not exceed the payments it is entitled to receive.
He finds that interconnectedness can exacerbate the difficulties that firms
have in meeting their obligations in periods where liquidity is more difficult to
obtain.
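To make the domino mechanism concrete, here is a deliberately stripped-down sketch (my own toy example, not the Allen and Gale or Rotemberg models; all parameter values are illustrative). Banks sit in a ring, each holding a claim on its neighbor; when a bank's losses exceed its capital buffer it fails, and part of the loss is passed to the bank holding a claim on it.

def cascade(n_banks=10, capital_buffer=10.0, exposure=40.0,
            loss_given_default=0.5, initial_loss=15.0):
    """Propagate an initial loss at bank 0 around a ring of banks; return the number of failures."""
    failed = [False] * n_banks
    losses = [0.0] * n_banks
    losses[0] = initial_loss
    to_check = [0]
    while to_check:
        i = to_check.pop()
        if failed[i] or losses[i] <= capital_buffer:
            continue                       # bank i can absorb its losses (or has already failed)
        failed[i] = True                   # bank i's buffer is wiped out...
        creditor = (i - 1) % n_banks       # ...so the bank holding a claim on it takes a hit
        losses[creditor] += loss_given_default * exposure
        to_check.append(creditor)
    return sum(failed)

print(cascade(initial_loss=8.0))     # 0: the shock is absorbed locally
print(cascade(initial_loss=15.0))    # 10: the whole ring fails

With these illustrative numbers, an initial loss of 8 causes no failures, while a loss of 15, barely above a single bank's buffer, topples the entire ring; this is the sense in which linkages can turn a local shock into a systemic one.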
Of course, the complex-systems literature itself offers fascinating examples of
the power of interconnectedness. Bak, Chen, Scheinkman, and Woodford (1993)
and Scheinkman and Woodford (1994) bring methods and metaphors from statistical mechanics to macroeconomics. They argue that local, nonlinear interactions
can allow small idiosyncratic shocks to generate large aggregate fluctuations, rather
than washing out via the law of large numbers. They discuss a kind of macroeconomic instability called "self-organized criticality," comparing the economy to a
sand hill: at first, a tiny grain of sand dropped on the hill causes no aggregate effect,
but as the slope of the hill increases, eventually one grain of sand can be sufficient
to cause an avalanche. In the limit, aggregate fluctuations may emerge from hard-to-detect and purely idiosyncratic shocks.
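For readers who want to see the metaphor in action, the following is a minimal sketch of the standard Bak-Tang-Wiesenfeld sandpile (my own Python illustration, not taken from the papers cited above; grid size, threshold, and number of drops are arbitrary choices). Identical one-grain shocks are dropped on a grid; a site holding four or more grains "topples," passing one grain to each neighbor, which may topple in turn.

import random

N = 20                     # grid is N x N (illustrative choice)
THRESHOLD = 4              # a site topples once it holds this many grains
grid = [[0] * N for _ in range(N)]

def drop_grain():
    """Drop one grain at a random site; return the avalanche size (number of topplings)."""
    i, j = random.randrange(N), random.randrange(N)
    grid[i][j] += 1
    topplings = 0
    unstable = [(i, j)]
    while unstable:
        x, y = unstable.pop()
        if grid[x][y] < THRESHOLD:
            continue                      # already relaxed by an earlier toppling
        grid[x][y] -= THRESHOLD           # the site topples...
        topplings += 1
        if grid[x][y] >= THRESHOLD:       # ...and may still be unstable
            unstable.append((x, y))
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nx, ny = x + dx, y + dy
            if 0 <= nx < N and 0 <= ny < N:   # grains pushed off the edge are lost
                grid[nx][ny] += 1
                if grid[nx][ny] >= THRESHOLD:
                    unstable.append((nx, ny))
    return topplings

sizes = [drop_grain() for _ in range(50000)]
print("largest avalanche:", max(sizes), " median:", sorted(sizes)[len(sizes) // 2])

Most drops cause no toppling at all, while occasionally a single grain sets off an avalanche that sweeps across much of the grid; that is the sense in which large aggregate fluctuations can emerge from identical, hard-to-detect idiosyncratic shocks.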
Panics
In a complex environment, agents need to make decisions based on information that is astonishingly limited relative to all the things that are going on and that
have the potential to percolate through the system. This degree of ignorance is not
something agents with frontal lobes like to face. Anxiety is also part of the frontal
lobe domain! Reactions that include anxiety and even panic are key ingredients for
macroeconomic crises.
Put differently, a complex environment has an enormous potential to generate
truly confusing surprises. This fact of life needs to be made an integral part of
macroeconomic modeling and policymaking. Reality is immensely more complex
than models, with millions of potential weak links. After a crisis has occurred,
it is relatively easy to highlight the link that blew up, but before the crisis, it is a
different matter. All market participants and policymakers know their own local
world, but understanding all the possible linkages across these different worlds is
too complex. The extent to which the lack of understanding of the full network
matters to economic agents varies over the cycle. The importance of this lack of
understanding is at its most extreme level during financial crises, when seemingly
irrelevant and distant linkages are perceived to be relevant. Moreover, this change
in paradigm, from irrelevant to critical linkages, can trigger massive uncertainty,
which can unleash destructive flights to quality.
Benoit Mandelbrot, the mathematician perhaps best known for his work on
fractal geometry, once drew a parallel from economics to storms, which can only
be predicted after they form. Mandelbrot (2008, in a PBS NewsHour interview
with Paul Solman on October 21, 2008) said: "[T]he basis of weather forecasting
is looking from a satellite and seeing a storm coming, but not predicting that the
storm will form. The behavior of economic phenomena is far more complicated
than the behavior of liquids or gases."
Financial crises represent an extreme manifestation of complexity in macroeconomics, but this element probably permeates the entire business cycle, in
part through fluctuations in the perceived probability that complexity and its
consequences will be unleashed in the near future. These fluctuations could be
endogenous, arising from local phenomena as highlighted by the formal complexity literature in the physical sciences, but they could also come in response to a more
conventional macroeconomic shock, such as an oil or aggregate demand shock,
especially once these interact with more conventional financial-accelerator-type
mechanisms (like those in Kiyotaki and Moore, 1997; Bernanke and Gertler,
1989).
7
In Caballero and Krishnamurthy (2008b), we place the origins of the current crisis in this framework.
We argue that perhaps the single largest change in the financial landscape over the last five years was in
complex credit products: collateralized debt obligations, collateralized loan obligations, and the like.
Market participants had no historical record to measure how these financial structures would behave
during a time of stress. These two factors, complexity and lack of history, are the preconditions for
rampant uncertainty. When the AAA subprime tranches began to experience losses, investors became
uncertain about their investments. Had the uncertainty remained confined to subprime mortgage
investments, the financial system could have absorbed the losses without too much dislocation. However,
investors started to question the valuation of the other credit products, not just mortgages, that had
been structured in much the same way as subprime investments. The result was a freezing up across the
entire credit market. The policy response to this initial freezing was timid, which kept the stress on the
financial system alive until a full-blown "sudden financial arrest" episode developed (after Lehman's
demise). See Caballero (2009) for an analogy between sudden cardiac arrest and sudden financial arrest.
8
Insurance programs during the crisis included: a temporary program created by the U.S. Treasury
Department to insure money-market funds; nonrecourse funding for the purchase of asset-backed
securities through the Term Asset-Backed Securities Loan Facility (TALF); and a temporary increase
in deposit insurance from $100,000 to $250,000. A notable example from outside the United States was
the U.K. Asset Protection Scheme, which backed over half a trillion pounds in post-haircut assets for two
British banks. See Caballero (2009), Madigan (2009), and IMF (2009).
9
Under our proposal, the government would issue tradable insurance credits (TICs) which would be
purchased by financial institutions, some of which would have minimum holding requirements. During
a systemic crisis, each TIC would entitle its holder to attach a government guarantee to some of its assets.
All regulated financial institutions would be allowed to hold and use TICs, and possibly hedge funds,
private equity funds, and corporations as well. In principle, TICs could be used as a flexible and readily
available substitute for many of the facilities that were created by the Federal Reserve during the crisis.
As Lucas (2003), following on a Friedman (1946) lead, points out: "[T]he theory is never really solved.
What are the predictions of Patinkin's model? The model is too complicated to work them out. All
the dynamics are the mechanical auctioneer dynamics that Samuelson introduced, where anything
can happen. . . . You can see from his verbal discussion that he's reading a lot of economics into these
dynamics. What are people thinking? What are they expecting? . . . He's really thinking about intertemporal substitution. He doesn't know how to think about it, but he is trying to." In the specific example
used here by Lucas, the Lucas-Rapping (1969) model, for example, did represent significant progress
over the Patinkin modeling of labor dynamics. It solved the "how" part of the formal underpinning of
Patinkin's dynamics. But as I point out in the text, this kind of solution is insufficient.
relationship is extremely limited. In such cases, the main problem is not in how
to formalize an intuition, but in the assumption that the structural relationship is
known with precision. Superimposing a specific optimization paradigm is not the
solution to this pervasive problem, as much of the difficulty lies precisely in not
knowing which, and whose, optimization problem is to be solved. For this reason
the solution is not simply to explore a wide range of parameters for a specific
mechanism. The problem is that we do not know the mechanism, not just that we
don't know its strength.
But how do we go about doing policy analysis in models with some loosely
specified blocks not pinned down by specific first-order conditions? Welcome to
the real world! This task is what actual policymakers face. Academic models often
provide precise policy prescriptions because the structure, states, and mechanisms are sharply defined. In contrast, policymakers do not have these luxuries.
Thoughtful policymakers use academic insights to think about the type of policies
they may want to consider, but then try to understand the implications of such
policies when some (or most) of the assumptions of the underlying theoretical
model do not hold. However, this kind of robustness analysis is nearly absent in
our modeling. In this sense, and as I mentioned earlier, the work of Hansen and
Sargent (2007) and others on robust control in policymaking points in the right
direction, although I think we need to go much, much further in reducing the
amount and type of knowledge policymakers and economic agents are assumed
to possess.
One primary driving force behind modern macroeconomics (both core and
periphery) was an attempt to circumvent the Lucas critique: the argument that market participants take the policy regime into account and so estimates of economic
market participants take the policy regime into account and so estimates of economic
parameters for one policy regime may well not be valid if the policy regime changes.
If we now replace some first-order conditions by empirical relationships and their
distributions, doesn't this critique return to haunt us? The answer must be yes, at
least to some extent. But if we do not have true knowledge about the relationship
and its source, then assuming the wrong specific first-order condition can also be a
source of misguided policy prescription. Both the ad-hoc model and the particular
structural model make unwarranted specific assumptions about agents adaptation
to the new policy environment. The Lucas critique is clearly valid, but for many
(most?) policy questions we haven't yet found the solution; we only have the
pretense of a solution.
Ultimately, for policy prescriptions, it is important to assign different weights
to those that follow from blocks over which we have true knowledge, and those
that follow from very limited knowledge. Some of this has already been done in the
asset pricing literature: for example, Ang, Dong, and Piazzesi (2007) use arbitrage
theory to constrain an otherwise nonstructural econometric study of the yield curve
and Taylor's rule. Perhaps a similar route can be followed in macroeconomics to
gauge the order of magnitude of some key effects and mechanisms, which can then
be combined with periphery insights to generate back-of-the-envelope-type calculations. For now, we shouldn't pretend that we know more than this, although this
is no reason to give up hope. We have made enormous progress over the last few
decades in the formalization of macroeconomics. We just got a little carried away
with the beautiful structures that emerged from this process.
and informal commentators cannot be the right approach. I do not have the
answer. But I suspect that whatever the solution ultimately is, we will accelerate our
convergence to it, and reduce the damage we do along the transition, if we focus on
reducing the extent of our pretense-of-knowledge syndrome.
I thank Daron Acemoglu, David Autor, Abhijit Banerjee, Olivier Blanchard, Peter
Diamond, Francesco Giavazzi, Jonathan Goldberg, Chad Jones, Bengt Holmström, Arvind
Krishnamurthy, John List, Guido Lorenzoni, James Poterba, Alp Simsek, Robert Solow, and
Timothy Taylor for their comments. Of course they are not responsible for my tirade.
References
Acemoglu, Daron, Asuman Ozdaglar, and Alireza Tahbaz-Salehi. 2010. "Cascades in Networks and Aggregate Volatility." Available at: http://econ-www.mit.edu/faculty/acemoglu/paper.
Albert, Réka, and Albert-László Barabási. 2002. "Statistical Mechanics of Complex Networks." Reviews of Modern Physics, 74(1): 47–97.
Allen, Franklin, and Douglas Gale. 2000. "Financial Contagion." Journal of Political Economy, 108(1): 1–33.
Ang, Andrew, Sen Dong, and Monika Piazzesi. 2007. "No-Arbitrage Taylor Rules." Available at SSRN: http://ssrn.com/abstract=621126.
Arthur, William B., Steven N. Durlauf, and David A. Lane, eds. 1997. The Economy as an Evolving Complex System II. Redwood City: Addison-Wesley.
Bak, Per, Kan Chen, Jose Scheinkman, and Michael Woodford. 1993. "Aggregate Fluctuations from Independent Sectoral Shocks: Self-Organized Criticality in a Model of Production and Inventory Dynamics." Ricerche Economiche, 47(1): 3–30.
Bernanke, Ben, and Mark Gertler. 1989. "Agency Costs, Net Worth and Business Fluctuations." American Economic Review, 79(1): 14–31.
Brock, William A. 1993. "Pathways to Randomness in the Economy: Emergent Nonlinearity and Chaos in Economics and Finance." Estudios Económicos, 8(1): 3–55.
Brock, William, and Steven N. Durlauf. 2001. "Discrete Choice with Social Interactions." Review of Economic Studies, 68(2): 235–60.
Caballero, Ricardo J. 2009. "Sudden Financial Arrest." Prepared for the Mundell-Fleming Lecture delivered at the Tenth Jacques Polak Annual Research Conference, IMF, November 8.
Caballero, Ricardo J. 2010. "Crisis and Reform: Managing Systemic Risk." Prepared for the XI Angelo Costa Lecture delivered in Rome on March 23.
Caballero, Ricardo J., and Arvind Krishnamurthy. 2008a. Collective Risk Management