But there’s a second shift in progress — a sort of Stage 2 of the data revolution in economics. The tools of economists are changing. The core of economic theory, as it’s practiced today, is based on individual optimization. For example, economists often assume that businesses maximize profits or minimize costs. This is known as a structural model, because economists usually assume that this sort of optimization represents the deep, fundamental structure of the economy, just as everything in your body is made up of atoms and molecules. Comparing this kind of model to data is called structural estimation, and for a while it formed the core of empirical economics.
For a few decades, economists used to imagine how the world works, write down a theory describing their idea, and call it a day. If some statisticians came along and found support for the theory, well, great! But usually they didn’t, and that was fine too. As one old joke put it, if an idea worked in practice, economists would ask whether it worked in theory. What changed was the explosion of affordable information technology, which made it far easier to gather and analyze data. By the ’90s, there was such a huge stock of untested theories and such a wealth of new data that it made more sense for young, smart economists to turn their efforts in empirical directions. Unlike in physics, where theory and experiment call for very different skill sets, most economists found they could switch from theory to data relatively easily. Prizes like the prestigious John Bates Clark Medal, awarded to rising economics stars under age 40, started to flow to people whose work emphasized data.