Grant's Interest Rate Observer - May 26, 2000
Secrets of the GDP
Technology is widely regarded as the greatest single source of the U.S. economic achievement of the past decade. But there is another mighty wellspring, a purely technical and statistical one. This is the method by which technological innovation is represented in the national income accounts. In the analysis that follows, our favorite sons of Harvard, James Medoff and Andrew Harless, examine the distortions introduced into a host of government macroeconomic data by the use of so-called hedonic price adjustments.
It was Medoff and Harless who, writing in the March 31 issue of Grant's, uncovered the true foundation of the supposed productivity miracle of 1999. In fact, it was mainly a statistical illusion. Now the economists take the analysis further, disclosing a systematic corruption in the federal reporting on production, growth and inflation. To date, the net result of these innumerable distortions has been bullish for dollar-denominated financial assets. However, the better understood the distortion becomes, the less bullish it is likely to seem.
"Once up on a time," write Medoff and Harless, "it was easy to tell how fast the economy was growing. We knew what was produced, what was bought and sold, and how much it cost. We knew how much things cost last year. It was simple enough to figure out how much this year's products would have cost at last year's prices. Then we could compare the amount produced this year with the amount produced last year.
"Those days are gone. Technological change has become rapid and discontinuous. The mix of available products changes so quickly that we can no longer talk meaningfully about last year's prices for this year's products. Nonetheless, people ask how fast the economy is growing, and the government is obligated to provide an answer. The answer provided may or may not be a reasonable one, but in any case, it's an answer, not the answer. Investors and policy-makers who treat the official answer as the answer do so at their peril.
"One important method used in the U.S. to compare products across time periods is hedonic price indexing. The idea of hedonic price indexing is this: instead of putting a price on a particular product, put a price on each of the product's characteristics. The prices of product characteristics are estimated by looking at a range of products with different characteristics. For example, in the case of computers, there might be a Ôprice of processor speed' (in dollars per megahertz), a Ôprice of the IBM brand name,' and so on. These Ôhedonic prices' can then be used to generate a Ôtheoretical price' for any product whose characteristics are known. Government statisticians generate theoretical prices for this year's products using last year's hedonic prices, and they compare these theoretical prices with the actual prices observed this year. It's a powerful technique, but, like all powerful things, it can be dangerous.
"The technique is associated with the late Harvard economist Zvi Griliches, who used it to study automobile quality change in a 1961 study. It has been used in the U.S. national accounts since 1987. Currently, it is used with housing, semiconductors, cell phones, digital switches, import and export items, and, most notably, computer hardware and software. These last two categories are particularly important because they have come to represent the most important factor in the growth of investment, which is the most volatile component of gross domestic product.
"According to a study by our associate Lorenzo Isla, information technology has made a larger and larger contribution to total fixed investment--from 5.2% in 1960 to 15.2% in 1988 to 27.2% in the first quarter of 2000. Investment, in turn, accounts for about two-thirds of the year-to-year variation in GDP growth. Thus, again according to the Isla study, IT (hardware and software) accounted for 39% of the total GDP growth in the first quarter of 2000. In other words, GDP growth--the holy grail of the bond market, the currency market and the Federal Open Market Committee--is importantly determined by a statistical technique that probably not one professional investor in a hundred has ever even heard of.
"There are several potential problems with hedonic price indexing. First, while the validity of the technique depends on having the right set of hedonic pricing characteristics, the choice of characteristics is essentially a subjective one. Second, the technique is not well suited to discontinuous technological change: It relies on the premises that this year's goods and last year's goods can be described in terms of the same characteristics and that the characteristics mean the same thing this year as they did last year. In a world of dramatic innovation, rapid obsolescence and compatibility constraints, the applicability of the hedonic price concept is dubious. Finally, hedonic pricing depends on specifying in a simple mathematical form the way each characteristic should affect the price. That may be easy when the characteristic is something, like computer memory, that can be added or removed from a product, and might simply cost a certain number of dollars per megabyte. But when the characteristic is something abstract, like processor speed, a simple mathematical relationship is likely to be misleading. "To see how hedonic pricing can lead to distortions, consider the way in which the rise in processor speed occurs. Chip-makers are constantly introducing new processors that are marginally faster than the fastest previously available. The cost increment for the newest, fastest processor over the previous model is generally substantial, and the price of the old best falls when a new leader comes to take its place. Yet the drop in the price of a mainstream, Ôfast-enough-for-typical-use' processor is relatively small.
"Today, for example, the cost increment between a system with a 600 MHz Pentium III and an identical system with a 700 MHz Pentium III can be as little as $70 (70 cents per MHz). Yet the cost increment from 733 MHz to 800 MHz can be as much as $200 ($3 per MHz). And it can cost nearly $800 ($6 per MHz) to go from 866 MHz to 1 gigahertz (1,000 MHz). The recent introduction of the 1GHz chip is pushing down prices near the high end, but people who get along fine with 600 MHz--people who use computers for business rather than rocket science--won't see much price benefit. Typical hedonic models really do give a single price for processor speed in dollars per MHz, and when they price processor speed, they price the marketing manager's computer along with that of the rocket scientist. "In this situation, a hedonic regression will estimate the Ôhedonic price of processor speed' as some kind of compromise between the $6 high end and the 70 cents (or less) low end. Let's say $3. When this figure is applied across the board to produce theoretical prices, the theoretical price for the high-end or low-end machine will tend to underestimate the actual price, whereas the theoretical price for a mid- range machine will tend to overestimate the actual price. The theoretical price of a 1GHz machine selling at $2,700 might be more like $2,550, but the theoretical price of a 733 MHz machine selling at $1,600 might be more like $1,750.
"Now what will happen if Intel introduces a new Pentium III that runs at 1.1 GHz (1,100 MHz)? The cost of a 1 GHz computer will fall by several hundred dollars--but most people won't care. The cost of a 600 MHz computer will change little. The cost of a 733 MHz computer will fall by maybe $70, a small change in dollar terms, but enough to make it economical for mainstream purchasers, since they can now get more power (compared to, say, a 600 MHz machine) at a negligible cost increment. So, when the government collects its data, it will find many people buying computers around the 733 MHz range. And let's say they pay around $1,530 for the machine that used to cost $1,600. It used to cost $1,600, but its theoretical price was $1,750, so the $70 price drop looks like a $220 price drop. This will be partly offset by the reverse effect at the high end--1GHz computers whose theoretical price was understated--but since that's a smaller market, the aggregate effect will be smaller. And as for the old low-end computers, which also had understated theoretical prices, they won't be selling any more. The net effect is that it looks like computer prices in general have fallen significantly, when in fact the only large drops have been at the high end.
"So what effect does this scenario have on the statistics? For one thing, obviously, inflation will be understated, because the government averages a lot of huge price drops for computers that really only had small price drops. The other side of the coin is that growth will be overstated. The new year's economy produces lots of 733 MHz computers, which, in terms of last year's theoretical prices, are replacing (on the production line, that is), old, low-end computers that had been undervalued. Therefore, real output growth, and anything derived from it, is overstated. Productivity appears to be growing quickly, and anything Ôreal' in the national accounts--real GDP growth, real profit growth, real consumption growth, etc.--is actually less real than you might think.
"Another point worth noting in connection with this example is that the (hypothetical) new 1.1 GHz computers will also appear to have experienced a price drop and to be reflecting a productivity increase, even though, really, their price could not have dropped, since they didn't even exist before. In this example, it's true, the Ôapparent price drop' for the 1.1 GHz computer would be understated relative to a Ôperfectly fit' hedonic model. But the fact remains, as far as the 1.1 GHz machine is concerned, there was no actual price drop, and there was no increase in productivity for any actual product: simply, a new product was introduced. Philosophically, it's hard to regard this as a real increase in productivity (unless, of course, someone can show that these new computers make the rest of the economy more productive). Otherwise, wouldn't we have to regard every new invention as an increase in productivity?
"Some of the problems with hedonic pricing can be minimized by taking more care in the specification of the model. In the processor-speed example, for instance, the specific problem of having a single price for processor speed could be overcome by specifying a slightly more complicated mathematical relationship between the processor speed and price. In particular, a quadratic function would fit better than a linear function. We asked Michael Holdway, who oversees hedonic computer pricing models at the Bureau of Labor Statistics, why a quadratic specification wasn't used. Deadlines and resources, he replied: Neither the money nor the time is available to revamp the technique. To which we would add, not too cynically, that the policy-makers to whom Holdway reports may also lack the will to commit the necessary funds to statistical reform. After all, what could be better than the economic picture the existing hedonic method presents? Where would the bull market be without it?
"In principle, a flawed application of the hedonic pricing methodology could result in a bias in either direction. In recent U.S. data, the direction of bias is clear: the rate of decline in computer prices is being exaggerated, as rapid introductions of new technology are interpreted as large declines in the price of existing technology. Modest growth in the dollar value of hardware and software produced is reported as awesome growth in Ôreal' production, because this year's computers are so tremendously valuable in terms of last year's theoretical prices. The result is lower apparent inflation rates and higher apparent growth rates for the economy as a whole. Politicians, be they Republicans who want to limit Social Security cost-of-living allowances, or Democrats, who want to elect Al Gore, seem quite happy with the bias. If politicians are happy, investors are overjoyed: When the growth rate exceeds the cost of capital, no multiple of earnings is too high to pay.
"A flawed yardstick may be better than no yardstick at all, provided everyone uses the same flawed yardstick. But in Germany, for example, they don't use hedonic price indexes. Comparisons between European growth rates and U.S. growth rates, based on the official statistics, may be, as the saying goes, like comparing apples and oranges. Would you buy the currency of an orange grove because its fruit is sweeter than that of an apple orchard? Plainly, in the real-world currency markets, people have been buying the dollar and selling the euro for the reason (among others) that the U.S. is growing faster than Germany. Without knowing exactly how much hedonic distortions have exaggerated the rate of the U.S. expansion, it is safe to say that the U.S. growth premium is smaller than it looks.
"In this world of shifting sand, we would suggest setting one's growth concept on firm bedrock. Instead of looking at growth in output, look at growth in employment. Human beings are the same species as they were last year and the year before, so there is no need to find ways of comparing disparate items.
"As a little experiment, we took a look at what the inflation rate would look like if the underlying concept of real growth were employment growth rather than output growth. The result is a sort of Ôlabor value inflation rate,' the growth rate of nominal GDP minus the growth rate of employment. We thus adopt, implicitly, the Ôlabor theory of value' (associated with Karl Marx but actually originated by decent, capitalist economists well before he got his hands on it), not because it is the correct theory of value but because it is a reasonable, and less slippery, alternative, to the theories implicit in the official statistics.
"The accompanying graph shows what we found. Until recently, the Ôbig picture' painted by our measure has been similar to that painted by the official GDP deflator. Over the past six years, however, we've seen an unprecedented divergence. (We checked all the way back to 1950 and found no precedent for a divergence continuing unabated for three years, let alone six.) What does this divergence mean? It might mean that we've entered a new era in which productivity not only rises quickly, but also accelerates its rise each consecutive year. Or it might mean that there's something flaky about the recent statistics. "We won't be putting our money on the first interpretation."