Nancy L. Stokey

There is some truth in the old adage that in economics the questions never change, only the answers.

Are the formal models of modern economics only an embellishment of older ideas? They are more than that, for two reasons. First, they clarify exactly what is being asserted, providing a more solid base from which further theoretical arguments can proceed. In addition, they provide a guide for empirical work,...

The old adage is only partly correct: the questions get sharper and clearer, even if entirely satisfactory answers remain elusive.


Greg Kaplan

Hubbard, Skinner, and Zeldes (1995) and Rosenzweig and Wolpin (1993) illustrate two alternative ways in which structural models of consumption can be disciplined by data and turned into quantitative laboratories;...

Attanasio and Weber (1995) ... illustrate the pitfalls of using aggregate data to test models of heterogeneous households. By aggregating microdata in exactly the way prescribed by theory, they show that there can be large differences between the dynamics of the log of mean consumption (the focus of representative agent models and what is measured in aggregate data) and the dynamics of the mean of log consumption (the focus of heterogeneous agent models, which can be constructed only with household-level data).
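The gap between the two series can be made concrete with a small simulation. The numbers below are illustrative assumptions, not figures from Attanasio and Weber (1995): if log consumption is approximately normal across households with dispersion sigma_t, the log of mean consumption exceeds the mean of log consumption by roughly sigma_t**2 / 2, so the two series diverge whenever dispersion changes over time.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000  # number of simulated households (illustrative)

# Assumed paths: mean log consumption grows steadily while the
# cross-sectional dispersion of log consumption widens over time.
mu = [0.00, 0.02, 0.04]
sigma = [0.3, 0.4, 0.5]

for t, (m, s) in enumerate(zip(mu, sigma)):
    log_c = rng.normal(m, s, n)   # household log consumption
    c = np.exp(log_c)             # household consumption levels
    # log of mean (aggregate-data object) vs. mean of log (micro object):
    # for lognormal data these differ by about s**2 / 2.
    print(t, np.log(c.mean()), log_c.mean())
```

Because dispersion rises over time in this sketch, log mean consumption grows faster than mean log consumption, even though every household faces the same trend in mu: exactly the kind of divergence that can mislead tests run on aggregate data alone.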

Two JPE-published papers represent some of the best early examples of how to utilize a quantitative structural model of consumption for effective policy analysis.

Hubbard et al. (1995) is a classic example of the power of a calibrated structural model.

Imrohoroglu (1989) ... was motivated by Lucas's (1987) famous costs of business cycles calculation. Lucas had shown that in representative agent economies the welfare costs of business cycles are small both because fluctuations in aggregate income are themselves small and because these fluctuations have only a second-order effect on welfare. It was natural to conjecture that in heterogeneous agent economies with incomplete markets this quantitative conclusion might be overturned, both because fluctuations in individual income can be substantial and because the presence of liquidity constraints means that for some households these fluctuations have a first-order effect on welfare.
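The "second-order" point can be seen in the back-of-the-envelope form of Lucas's calculation. With CRRA utility and log consumption fluctuating around trend with standard deviation sigma, the fraction of lifetime consumption an agent would give up to eliminate fluctuations is approximately lambda = (gamma / 2) * sigma**2. The sigma below is roughly the postwar US value Lucas used; the gamma values are illustrative.

```python
# Sketch of Lucas's (1987) welfare-cost approximation:
# lambda ~= (gamma / 2) * sigma**2, second order in sigma.
sigma = 0.032  # approx. std. dev. of log consumption around trend

for gamma in (1, 5, 10):  # illustrative risk-aversion coefficients
    lam = 0.5 * gamma * sigma ** 2
    print(f"gamma={gamma:2d}: welfare cost ~= {lam:.4%} of consumption")
```

Even with gamma = 10 the cost is about half a percent of consumption, because sigma enters only squared; this is the smallness that heterogeneous agent models with large idiosyncratic risk and binding liquidity constraints were conjectured to overturn.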

She finds that when aggregate shocks change the extent of unemployment risk faced by households, the welfare cost of business cycles can be four to five times larger than in a corresponding representative agent economy.

Krusell and Smith (1998) wanted to understand how the equilibrium business cycle dynamics of macroeconomic variables in this heterogeneous agent economy compare to those in a corresponding representative agent economy ... If the macroeconomic dynamics of the two economies were not too different, it would provide some justification for the common practice of studying macroeconomics through the lens of a single representative agent.

It is important to remember that their finding of indistinguishability between the aggregate dynamics of the heterogeneous agent and representative agent economies is conceptually different from their finding of approximate aggregation... when the model is modified to better match the empirical distribution of wealth, the comovement of consumption and income looks very different from the corresponding representative agent economy.


Lars Peter Hansen

Slutsky (1927) and Yule (1927) ... Their vision was to view economic time series as linear responses to current and past independent and identically distributed impulses or shocks.

Frisch pioneered the use of impulse response functions in economic dynamics. His ambition was to provide explicit economic interpretations for how current-period shocks alter economic time series in current and future time periods.

Empirical evidence comes into play because econometricians face uncertainty about the underlying parameters of the rational expectations equilibrium and use data to infer their values.


James J. Heckman

Chicago economics emphasized the value of economic models in interpreting data, guiding its collection, making forecasts, and constructing policy counterfactuals.

The Chicago approach was in stark contrast with the prevalent methods used in the labor economics of that time.... This was a largely atheoretical institutionalist approach that focused on thick description, not explanation. It made generalizations to summarize data, but the summaries were largely disconnected from analytical economics and often from each other.

The Chicago approach emphasized the value of price theory in interpreting data.... In an essay on Wesley Clair Mitchell, Friedman crystallized the Chicago approach to scientific economics:

The ultimate goal of science in any field is a theory - an integrated "explanation" of observed phenomena that can be used to make valid predictions about phenomena not yet observed. Many kinds of work can contribute to this ultimate goal and are essential for its attainment: the collection of observations about the phenomena in question; the organization and arrangements of observations and the extraction of empirical generalizations from them; the development of improved methods of measuring or analyzing observations; the formulation of partial or complete theories to integrate existing evidence. (Friedman 1950, 465)

 The interplay between data and theory was the hallmark of the Chicago approach. 

Interpretation of data - understanding the problems being studied and the mechanisms generating them - was a crucial part of policy analysis. All of these activities were essential for the scientific analysis of counterfactuals that is the basis for rigorous policy analysis. Both data and theory were taken very seriously. Economic theory was viewed as an engine for analysis and empirical discovery, and not as an end in itself. .... Models that were discordant with data were revised and tested on the same and new data.

Only when the standard tools failed would the theory be amended. This approach was in stark contrast to that of the institutionalists who often favored ad hoc generalizations to "let the facts speak for themselves." They typically made up new models one empirical finding at a time and lacked a common core of basic principles that applied across multiple domains.

Stigler's (1961) search theory explained price (and wage) dispersion not as a failure of competitive markets - as had the institutionalists - but as a consequence of costly search.

The three basic ingredients of Chicago economics were central to the field... (a) stable preferences for agents, (b) agents responding to incentives, and (c) equilibrium.

Theory is used to interpret data. Data are used to test theory. Understanding the mechanisms producing empirically estimated "effects" is essential for principled counterfactual analysis and for explaining phenomena. It was never enough to say an intervention "worked." It was required that analysts understand the mechanisms producing "the facts," their generality across multiple empirical domains, and their relevance for public policy.


John List

In contrast to other sciences, the experimental approach has not yet progressed to the point of being the cornerstone of the scientific method in economics, but it has progressed sufficiently to find itself at the center of key debates and is well represented in every major economics journal.