Weighing the Week Ahead: The Chartists Take Over

THE CURRENT MARKET ARGUMENT PITS the charts against the cheap, the increasingly worried tape readers who see an enduring downtrend emerging versus the spreadsheet studiers who spy increasing value with every percent decline in the Dow, and contend that stocks are over-anticipating a recession relapse. –Michael Santoli

Michael Santoli, writing in this week’s Barron’s, captures the current picture quite well.  In a useful article, he goes on to describe the “death cross” and other technical indicators to worry about, and he also offers some countering viewpoints from those comparing asset classes.

This is a continuation of last week’s theme, and for a good reason — the issue has not changed!  We have a bit more evidence for our weekly review.  If you spend a few minutes, you will see something about the employment reports that you probably have not read before.

Background on “Weighing the Week Ahead”

There are many good services that do a complete list of every event for the upcoming week, so that is not my mission.  Instead, I try to single out what will be most important in the coming week.  If I am correct, my theme for the week is what we will be watching on TV and reading in the mainstream media.  It is a focus on what I think is important for my trading and client portfolios.

Last Week’s Data

There is a lively debate over the meaning of last week’s data.

The Good

There was little support for those claiming a high recession probability.

  • The ECRI’s interpretation of their own indicators is quite clear.  They see a slowing in the rate of growth, but no recession.  Since recessions come from economic shocks, a lower rate of growth leaves the economy more susceptible.  This implies a higher frequency of recessions in the future, but they are definitely not predicting one right now.  A number of pundits have taken one of the published indicators and done their own home-brewed analysis.  I explained why that was incorrect in this article, but you can see and hear Lakshman Achuthan for yourself in this video.
  • The ISM manufacturing index report of 56.2 is lower than last month, but still implies GDP growth of 4.8%.  The readings so far this year are consistent with GDP growth of 5.5%.  (A back-of-the-envelope version of this conversion is sketched just after this list.)
  • The payroll employment report showed net job growth, excluding the census worker layoffs, of 100K workers, much better than I expected.  (I believe it is correct to include full-time government jobs in this count.  Government jobs are real, and the measurement works both ways.  It is not like the census workers.  It is a small difference this month, but will be important in the next few months if state and local agencies lay off more workers).  As I expected, the report was viewed as pretty reasonable when first announced, and stock futures traded higher.  By the end of the day, it was routinely described as a “bad report.”
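For readers who want to see the arithmetic behind the ISM figures above, here is a minimal sketch.  It assumes a simple linear mapping from the PMI to annualized GDP growth, with a break-even PMI of 43.2 and a slope of about 0.37 points of GDP per PMI point; both constants are my assumptions, chosen only so the mapping reproduces the 4.8% figure cited for a PMI of 56.2, and they stand in for whatever relationship the ISM actually estimates.

```python
# Back-of-the-envelope PMI-to-GDP conversion (illustrative only).
# Assumed linear mapping: annualized real GDP growth (%) ~ SLOPE * (PMI - BREAKEVEN).
# Both constants are assumptions chosen to reproduce the 4.8% figure for a PMI of 56.2.

BREAKEVEN_PMI = 43.2   # assumed PMI level consistent with roughly zero GDP growth
SLOPE = 0.37           # assumed percentage points of GDP growth per PMI point

def implied_gdp_growth(pmi: float) -> float:
    """Annualized real GDP growth implied by a PMI reading, under the assumed mapping."""
    return SLOPE * (pmi - BREAKEVEN_PMI)

print(f"PMI 56.2 implies roughly {implied_gdp_growth(56.2):.1f}% GDP growth")
print(f"PMI 58.0 implies roughly {implied_gdp_growth(58.0):.1f}% GDP growth")
```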

It is interesting to note the viewpoint of Barry Ritholtz, an influential and always skeptical consumer of economic data.  He thinks GDP will be in the 1.5% – 2.5% range for the next few quarters.  He does not give the basis for this forecast (which is about 1% lower than the economic consensus), but he concludes that the data do not support a current recession prediction.

The Bad

Much of the bad news came from the interpretation of the employment data.  Since this was the main economic story for the week, I want to give it a little extra attention.

Key facts you need to know

The payroll employment data and the unemployment rate come from two different sources, but both are surveys.  Any survey has various potential sources of error.  The BLS does a great job of pointing these out.

Sampling Error

Sampling error means that the businesses or households chosen for the survey (even if all respond) may not accurately represent the actual population characteristic you are trying to measure.  If you learned this in a statistics class, you probably worked with a jar containing a known percentage of black and white marbles from which you drew repeated samples.  As you take more samples, you come to realize that they reflect the underlying population on average, but often deviate.  There are statistical methods for estimating sampling error.  The BLS often cites a 90% confidence interval.
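To make the marble-jar idea concrete, here is a small simulation sketch; the 60% black share and the sample size are arbitrary choices of mine, not anything from the BLS.

```python
import random

# Simulate the statistics-class example: a jar with a known share of black
# marbles, sampled repeatedly.  The true share and sample size are arbitrary.
TRUE_BLACK_SHARE = 0.60
SAMPLE_SIZE = 200

random.seed(42)

def one_sample_share() -> float:
    """Draw one sample of marbles and return the observed share of black."""
    draws = [random.random() < TRUE_BLACK_SHARE for _ in range(SAMPLE_SIZE)]
    return sum(draws) / SAMPLE_SIZE

shares = [one_sample_share() for _ in range(10)]
print("True share: 0.60")
print("Observed shares:", [round(s, 3) for s in shares])
# Each sample hovers around 0.60 but rarely hits it exactly.
# That sample-to-sample deviation is sampling error.
```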

This is understood but usually overlooked in discussing the big headline numbers.  For the payroll report, for example, the 90% confidence interval is about 104,000 jobs.  Viewed in this way, we could infer that, given our survey result, actual job growth would be between -4,000 and +204,000 about 90% of the time.
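To spell out that arithmetic, here is a tiny sketch using only the numbers quoted above: a point estimate of +100,000 with a 90% margin of +/- 104,000 spans -4,000 to +204,000.

```python
# The headline payroll arithmetic, spelled out.
point_estimate = 100_000   # reported job growth ex-census, from the discussion above
margin_90pct = 104_000     # approximate 90% confidence interval for the monthly change

low = point_estimate - margin_90pct
high = point_estimate + margin_90pct
print(f"90% confidence interval: {low:+,} to {high:+,} jobs")
# -> 90% confidence interval: -4,000 to +204,000 jobs
```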

When the discussion turns to the various “internals” of these reports, the participants seem to forget that these are also survey results.  Each item has its own 90% confidence interval.  Here are some selected examples:

  • Unemployment rate: +/- 0.16%
  • Size of labor force: +/- 490,000
  • Month-over-month change in labor force: +/- 400,000

Do you see why some of the glib commentary about the unemployment rate and the number of people dropping out of the labor force might be a bit overstated?  It is dangerous to infer too much from monthly changes, even when they seem like large numbers.  The sampling errors are actually quite small in proportion to the entire size of the labor force, but the monthly changes that draw so much comment are often no larger than those errors.  Source:  http://www.bls.gov/opub/ee/empearn201005.pdf  (pp. 233-34).
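A simple discipline is to compare any reported monthly change against its margin of error before drawing a conclusion.  The sketch below does exactly that; the margins are the figures listed above, while the example changes are hypothetical placeholders, not actual data.

```python
# Compare reported monthly changes against their 90% margins of error.
# Margins are the figures quoted above; the example changes are hypothetical.
MARGINS = {
    "unemployment rate (pct pts)": 0.16,
    "labor force change, MoM": 400_000,
}

def within_noise(change: float, margin: float) -> bool:
    """True if the reported change is smaller than its margin of error."""
    return abs(change) < margin

example_changes = {
    "unemployment rate (pct pts)": -0.2,   # hypothetical monthly change
    "labor force change, MoM": -350_000,   # hypothetical monthly change
}

for name, change in example_changes.items():
    margin = MARGINS[name]
    verdict = "within noise" if within_noise(change, margin) else "outside noise"
    print(f"{name}: change {change:+,} vs margin {margin:,} -> {verdict}")
```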

Here are a few more examples:

  • Average weekly hours: +/- 1.65%
  • Average weekly earnings: +/- 1.65%
  • Construction, monthly change: +/- 24,000

The change in weekly hours, viewed as such a negative this month, is well within the “noise” level.  Last month’s positive reading was only a bit outside the confidence interval.  People regularly comment on changes in various subgroup categories when the change is well within the confidence interval.  Here is the source for these and other results from the establishment survey.

Non-Sampling Error

The sources of non-sampling error are many, so any brief summary cannot do justice to the topic.  For today, let me pick the single most important example.  Many pundits who are critics of the headline NFP number think that the “internal” measures are just fine.  I especially have in mind those who have incorrectly focused their criticism on the birth/death adjustment.

The real question about estimating job creation lies in the BLS approach to survey non-respondents.  By treating non-respondents as similar to those who did respond, the BLS uses job losses from business deaths to impute job gains from business births.  This is not an assumption, but rather a conclusion drawn from a lot of data.  It worked very well for many years, and improved the ability to capture cyclical turning points.  In a recent article I showed that this imputation method had, in the most recent period we can measure, overstated job growth, perhaps by as much as 100K per month.

The astute readers of “A Dash” may have already spotted the key conclusion:

If the imputation of new jobs is inaccurate, so are the internal measures.

In effect, if you use hours worked as an indicator, you cannot multiply the hours times the number of jobs to get a “simulated job change” as many like to do.  If there is something wrong with the underlying job count, this method breaks down in a serious way.
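Here is a stylized sketch of that point; every number below is hypothetical, chosen only to show how an error in the job count propagates into an hours-based aggregate.

```python
# Stylized illustration: an error in the job count flows straight into any
# "hours times jobs" aggregate.  All numbers are hypothetical.
avg_weekly_hours = 34.1        # survey-based average workweek (placeholder)
reported_jobs = 130_000_000    # reported payroll level (placeholder)
overstatement = 100_000        # possible monthly overstatement from the imputation

true_jobs = reported_jobs - overstatement

reported_aggregate_hours = avg_weekly_hours * reported_jobs
true_aggregate_hours = avg_weekly_hours * true_jobs

bias = reported_aggregate_hours - true_aggregate_hours
print(f"Aggregate weekly hours overstated by {bias:,.0f} hours")
# Any "simulated job change" built from these aggregate hours inherits the
# same overstatement, so the internal measure is no cleaner than the
# headline count it rests on.
```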

Leading Birth/Death critics like Barry Ritholtz will not identify this problem.  Here is why.  In this column he accurately points out the error of those like Rick Santelli who (each month) subtract the raw B/D adjustment from the seasonally adjusted job change.  So far, so good.  Barry’s own error comes in a false distinction between what he calls a “measurement” of job change (what comes from the sample) and the birth/death adjustment.  In fact, the imputation step is not a measurement in any sense of the word.  It is a modeling process — an empirically derived conclusion that is no different from the birth/death adjustment, and vastly more important.

By not treating job creation as a two-step process, Barry has focused on the birth/death adjustment, leading nearly everyone else to follow in his footsteps.  It is my hope that he will take another look at what he calls the “measurement” part of job creation.

To summarize, if you share my concern about imputing the characteristics of existing businesses to business deaths and births, you will have the same concern about measures like hours worked.

Employment Summary

The employment situation is not as good as people portrayed it to be last month, nor as bad as argued this month.  We are plodding along at a below-trend growth rate.  It is not enough to reduce the unemployment rate, but it is also not at recession levels.

Others reaching similar conclusions include Gene Epstein (A Second Dip Still Unlikely).  Many of those calling this a bad report were trying to explain a market that was slightly lower on the day.  Fifteen minutes before the close, the market was a touch higher.  Would the stories have been different?

If you need to know the market change on the day to interpret economic data, you need to sharpen your skill set.

The Ugly

Two things bother me.

  • The really ugly news of the week relates to technical analysis. We now have a long list of technical problems.
    • Death crosses — the 50-day moving average crossing below the 200-day (a simple check for this is sketched after this list).
    • Breakdown of the trading range.
    • Head and shoulders pattern with the neckline threatened.
    • Dow Theory sell signal.
    • Elliott Wave disaster forecast.
    • Art Cashin and the cocktail napkin charting.  Art accurately conveys what the floor is thinking.  They are all worried about the charts.
  • The strength in the Euro.  This is supposed to be a positive, but the market is not reacting.  Those predicting dollar/Euro parity seem to be mistaken.  Meanwhile, US equities and commodities have broken a pattern of rallies on Euro strength.
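Since the “death cross” gets so much of the attention, here is a minimal sketch of how one would check for it from a daily price series, as promised in the list above.  The prices are made-up placeholders; the logic is simply the 50-day simple moving average crossing from above to below the 200-day.

```python
# Minimal "death cross" check: flag the day the 50-day simple moving average
# crosses below the 200-day.  The price series is a placeholder, not real data.

def sma(prices, window):
    """Simple moving average; None until enough observations have accumulated."""
    return [
        sum(prices[i - window + 1 : i + 1]) / window if i >= window - 1 else None
        for i in range(len(prices))
    ]

def death_cross_days(prices):
    """Indices where the 50-day SMA crosses from above to below the 200-day SMA."""
    fast, slow = sma(prices, 50), sma(prices, 200)
    crossings = []
    for i in range(1, len(prices)):
        if None in (fast[i - 1], slow[i - 1], fast[i], slow[i]):
            continue
        if fast[i - 1] >= slow[i - 1] and fast[i] < slow[i]:
            crossings.append(i)
    return crossings

# Placeholder series: a slow advance followed by a decline, long enough for
# both moving averages to be defined.
prices = [100 + 0.1 * d for d in range(250)] + [125 - 0.3 * d for d in range(150)]
print("Death cross on day(s):", death_cross_days(prices))
```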

The technicians are all so very certain about the move lower, the various targets, and the inevitability of it all.  The true believers (of either political or economic persuasion) are even more confident.

The Week Ahead

There is little in the way of data this week.  ISM services will add to the ISM manufacturing.  Initial claims will add to the four-week moving average that we favor.
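For anyone who wants to reproduce that smoothing, the four-week moving average is just the mean of the last four weekly initial-claims readings; the weekly figures in the sketch below are made up.

```python
# Four-week moving average of weekly initial jobless claims.
# The weekly figures are placeholders, not actual data.
weekly_claims = [472_000, 459_000, 468_000, 475_000, 454_000]

def four_week_average(claims):
    """Average of the most recent four weekly readings."""
    return sum(claims[-4:]) / 4

print(f"Four-week moving average: {four_week_average(weekly_claims):,.0f}")
# Averaging over four weeks damps the week-to-week noise in the series,
# which is why we prefer it to any single weekly print.
```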

It is a week open for those with charts and technical analysis, while we await the onset of earnings reports.

Our Own Forecast

Our own indicators turned bearish right after our May 9th report and have been neutral or bearish since then.  We have switched to neutral once again, and that is our vote in the weekly Ticker Sense Blogger Sentiment Poll.  Here is what we see:

  • Only 24% of our 55 ETFs have a positive rating, and three of these are inverse ETFs.  This remains pretty weak.
  • 68% of our 55 sectors are in our “penalty box,” down from 93% last week.  This is a small improvement in our key measure of risk.
  • Our universe has a median strength of only -16 (computed as sketched below).
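For readers curious about the arithmetic behind these summary measures, here is a minimal sketch.  The actual ratings come from our models, so the tickers and strength values below are hypothetical placeholders that only show how the percentages and the median strength are computed.

```python
from statistics import median

# Hypothetical universe: (ticker, strength rating, in_penalty_box).
# The values are placeholders; real ratings come from the model.
universe = [
    ("SPY", -20, True), ("QQQ", -25, True), ("XLE", -12, True),
    ("GLD",   8, False), ("TLT",  15, False), ("SH",   18, True),
    ("XLF", -30, True), ("XLU",  -5, False), ("IWM", -22, True),
    ("EEM", -16, True),
]

positive = sum(1 for _, strength, _ in universe if strength > 0)
in_box = sum(1 for _, _, boxed in universe if boxed)
strengths = [strength for _, strength, _ in universe]

print(f"Positive ratings: {positive / len(universe):.0%}")
print(f"In penalty box:   {in_box / len(universe):.0%}")
print(f"Median strength:  {median(strengths)}")
```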

[For more on the penalty box see this article.  For more on the system ratings, you can write to etf at newarc dot com for our free report package or to be added to the (free) weekly email list.  You can also write personally to me with questions or comments, and I’ll do my best to answer.]

For short-term accounts we were mostly neutral last week, with a small short lean.  For long-term investors, our asset allocation program has shifted into a heavier allocation to bonds.

3 comments

  • Paul in Kansas City July 5, 2010  

    As always, your analysis is very helpful. I think you can make a case for focusing on the Energy MLP and Royalty Trust sectors, which dovetails nicely with your oil exploration post. These may be a little more resilient to the technical indicators everyone appears to be fearing! At least they look cheap on the spreadsheet 🙂 !!

  • Mike C July 6, 2010  

    For long-term investors our asset allocation program has shifted into a heavier allocation to bonds.
    You’ve lost me here. Can you clarify the rationale? I thought you were in this camp:
    http://www.cnbc.com/id/38042727
    Strategist: Stocks Reach Cheapest in 60 Years
    The U.S. stock market is the cheapest since 1951, according to a model comparing earnings valuations to corporate bond yields.
    The S&P 500 has a so-called earnings yield of 9.1 percent, which when compared to corporate bond yields at 6.1% forms the largest gap in six decades, wrote Michael Darda, chief strategist and economist at MKM Partners.
    Darda calculated the ‘earnings yield’ by taking the inverse of his current market multiple of 11, which is based on the trailing four-quarters corporate profit data released in the GDP report. This method is similar to the ‘Fed Model’ allegedly used by Alan Greenspan, which took the earnings yield and compared it to government bond yields.

    According to this model or any of its variants, wouldn’t the long-term move be to shift out of bonds into stocks and not the other way around?

  • Jeff Miller July 6, 2010  

    Mike C – The asset allocation program is a model-driven method that uses our sectors to adjust positions in reaction to the market “message.” Like humans, it can get defensive – and even short – without regard to earnings.
    I do not report changes in this program unless there is something dramatic. More often I discuss Vince’s trading programs (Oscar and Felix) or my own approach.
    We have four different programs. For most clients some combination is best.
    As I have frequently mentioned here, on a forward earnings basis stocks are extremely attractive for long-term investors.
    And thanks for asking!
    Jeff