Negative Revisions in Government Data
Several observers have noted a pattern of revisions in government data — the story always seems to get worse. This has been particularly pronounced over the last few months.
Mike Panzner, writing at The Big Picture (one of our featured sites), does a nice job of explaining the significance of the issue.
Panzner then does an initial check by looking at four different data series, summarized in this chart.
He draws a careful conclusion. It summarizes both what we can see from the data and what we cannot. Panzner writes:
Based on a quick read of a graph of the data (see below), it does seem as though the pattern of negative revisions has been trending higher lately, especially during the past year or so, suggesting that the cynics may be on to something.
This is an excellent introduction to a topic of interest.
In his influential weekly market letter, John Mauldin generates new hypotheses. He cites the Panzner work and also notices some other features from the chart. His key insight pertains to the period of 2003-04, when initial estimates were revised upward. The Mauldin conclusion is that while there is no conspiracy, the government methods are poor at turning points and the birth/death methodology is "a wild-eyed guess based on past trends, which by definition we know will change at economic turning points."
First, this is an interesting and important topic. There are some very naive people who believe that an entire government bureaucracy switches allegiance the day after an election. There is a cottage industry aimed at convincing people to ignore data. One of our missions at "A Dash" is to show that this is wrong.
Second, the evidence of the negative revisions is clear. It is worth investigating, and we have been doing so for several weeks.
Third, we do not agree with the "turning point" argument. We hope to convince John Mauldin of this in the weeks ahead. The BLS has several reports showing that the job creation modeling has done well with economic changes. He is correct in his observations about the data, but we believe he is wrong about the cause.
So how does one explain the Mauldin observation about 2003-04?
In general, we like to have complete results before writing. We are going to make an exception in this case to foster cooperation with others who are looking into the same topic. We all seek the same objective.
We have been analyzing the payroll employment data, so that is the specific subject. The results might apply to other series, but we do not yet know.
Our key finding is the following: The negative revisions have little effect on the not-seasonally-adjusted data. The revisions mostly relate to the seasonally adjusted reports, which are by far the more publicized and more important.
We'll provide some data to support this in the near future. We have a much more specific hypothesis for our research. More later.
The most important implication of this result is, of course, that there is no conspiracy. Anyone who understands government already knew that, but this is some additional supporting data. If a month is revised downward because of a changed seasonal adjustment, some other month has to be revised upward. For any twelve-month period, the unadjusted job changes and the seasonally adjusted job changes are (approximately) equal.
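The arithmetic behind that offsetting is easy to verify. Here is a minimal sketch with entirely hypothetical numbers (not actual BLS figures): if additive seasonal factors sum to zero over the year, then revising the factors pushes some months down and others up while leaving the twelve-month total of adjusted job changes unchanged.

```python
# Hypothetical monthly payroll changes in thousands -- NOT actual BLS data.
nsa = [-150, 80, 120, 200, 250, 180, -90, 110, 160, 140, 60, -40]

# Additive seasonal factors, constructed to sum to exactly zero across
# the year, so adjustment redistributes changes without altering the total.
old_f = [-180, 40, 90, 150, 210, 130, -140, 70, 120, 100, 20]
old_f.append(-sum(old_f))  # force the twelve factors to sum to zero

# A revised set of factors, e.g. after an annual recalculation.
new_f = [-160, 50, 80, 140, 230, 120, -150, 90, 110, 110, 10]
new_f.append(-sum(new_f))

# Seasonally adjusted changes under the old and the revised factors.
sa_old = [n - f for n, f in zip(nsa, old_f)]
sa_new = [n - f for n, f in zip(nsa, new_f)]

# Month-by-month revisions: some negative, some positive...
revisions = [b - a for a, b in zip(sa_old, sa_new)]
print(revisions)

# ...but the annual totals all agree, so the revisions net to zero.
print(sum(sa_old), sum(sa_new), sum(nsa), sum(revisions))
```

Because both sets of factors sum to zero, any downward revision in one month is necessarily offset by upward revisions elsewhere in the same twelve-month span.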
The second question relates to understanding seasonal adjustments. The BLS uses the Census Bureau's X-12-ARIMA method, an approach with many decades of development and support. We are doing some tests with this model, available free to any researcher, and expect to report more soon. We wonder how anyone can have a strong opinion without some personal experience with seasonal adjustments via this model.
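The X-12 machinery is elaborate, but the classical idea behind it can be sketched in a few lines. The following toy ratio-to-moving-average decomposition (with entirely made-up data; this is not the BLS procedure) shows the basic mechanics of estimating seasonal factors and dividing them out:

```python
# Toy classical decomposition -- a crude ancestor of the X-12-ARIMA
# approach. The trend and seasonal pattern below are invented.

# Synthetic series: a linear trend times a 12-month seasonal pattern
# whose factors average exactly 1.0.
true_seasonal = [0.92, 0.96, 1.00, 1.04, 1.08, 1.06,
                 1.02, 0.98, 0.94, 0.96, 1.00, 1.04]
data = [(100 + i) * true_seasonal[i % 12] for i in range(36)]

# Step 1: a centered 2x12 moving average estimates the trend.
trend = [None] * len(data)
for i in range(6, len(data) - 6):
    trend[i] = (0.5 * data[i - 6] + sum(data[i - 5:i + 6])
                + 0.5 * data[i + 6]) / 12.0

# Step 2: ratios of data to trend isolate the seasonal component.
ratios = [[] for _ in range(12)]
for i, t in enumerate(trend):
    if t is not None:
        ratios[i % 12].append(data[i] / t)

# Step 3: average the ratios for each calendar month and normalize
# so the twelve factors average to 1.0.
raw = [sum(r) / len(r) for r in ratios]
mean = sum(raw) / 12.0
est_factors = [f / mean for f in raw]

# Step 4: dividing by the estimated factors yields the adjusted
# series, which should track the underlying trend closely.
sa = [data[i] / est_factors[i % 12] for i in range(len(data))]
```

The normalization in Step 3 is the same point made above about revisions: because the factors average to one, adjustment reshuffles a year's job changes across months without changing the annual total.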
And finally, we wonder if the revisions will turn upward at some point.
This is a very difficult subject, and we are trying to cooperate with others who are taking a look. It would be a great topic for thesis research. As John Mauldin wisely suggests, it might not have much immediate payoff for those of us managing investments.