Will We Ever Get to ‘Full Employment’?
By Paul Solman
A man enters a Shoe Carnival store in Morton Grove, Ill. Photo by Tim Boyle/Getty Images.
Paul Solman answers questions from the NewsHour audience on business and economic news here on his Making Sense page. Here is Thursday’s query:
Question: No one is satisfied with current numbers for employment or unemployment levels. My question: What is the metric (level) that we’re supposed to be at? In the 1960s, 4 percent unemployment was considered to be full employment. In recent years, the accepted number seems to have drifted upward: 4 percent, 5 percent, 5.5 percent or even 6 percent.
Paul Solman: There are several issues with regard to the unemployment data. First, what is “full employment”? (That’s the question you pose, Gary).
Second, there’s a closely related question: At what point does the unemployment rate become so low that it triggers inflation — the so-called “NAIRU,” or non-accelerating inflation rate of unemployment?
Third: How accurate is the measure of employment we are currently using, and how has it changed over time?
As to questions one and two, the answer has changed over the years and decades. In the 1990s alone, the full-employment definition dropped from something like 6 percent to a number closer to 4 percent. That’s because the official unemployment rate did drop to 4 percent under President Clinton in the year 2000. At the same time, inflation rose only slightly — back to a touch over 3 percent, where it had been four years earlier.
Yes, what kept employment up was a dot-com boom that seemed, in retrospect, to have been unsustainable. But while it lasted, 96 percent of us, by the official measure, were not only willing to work, but were actually gainfully employed, without causing a bidding war for our services — a bidding war that would constitute noticeable wage inflation, which would in turn inevitably lead to price inflation to cover the higher cost of labor.
An even lower number constituted full employment in 1929. It’s hard to compare data between then and now, but while the inflation rate hovered around zero, the official unemployment rate at the peak of the 1920s was 2.9 percent. Roarin’ indeed.
Or consider Japan. It boasted an unemployment rate that never exceeded 3 percent for 40 years, from the early 1950s to the early 1990s. Sure, there was a no-layoff policy. Granted, women didn’t join the workforce as they did in America.
But hey, an average of something like 2 percent unemployment over nearly half a century is, well, an average of something like 2 percent unemployment over nearly half a century. And while inflation in Japan did spike in the 1970s, just as it did in America, no one, then or since, blamed it on low unemployment.
But all of this leads, like it or not, to question three: When we talk about “full employment” of 4 percent (or even 6 percent) these days, do we mean the same thing we used to back when?
The answer is no. The government has changed the way it measures unemployment, and as a result, some Americans who used to be counted as officially unemployed no longer are — because they haven’t looked for work recently enough. That’s why I devised our monthly U-7 statistic, which includes everyone who says they want to be working full time but isn’t. Compared to the official unemployment number of 7.6 percent, U-7 is running at 16 percent.
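The arithmetic behind a broader measure like U-7 is simple: people who want full-time work but aren’t counted as unemployed get added to both the jobless count and the labor force. The sketch below illustrates the mechanics with hypothetical counts chosen only for illustration — they are not actual BLS or “Solman Scale” figures.

```python
# Illustration of how a headline unemployment rate and a broader measure
# diverge. All counts (in millions) are HYPOTHETICAL, picked only to show
# the arithmetic -- not actual BLS or U-7 data.

def unemployment_rate(unemployed: float, labor_force: float) -> float:
    """Unemployed as a share of the labor force, in percent."""
    return 100.0 * unemployed / labor_force

officially_unemployed = 11.8   # searched for work recently enough to count
want_full_time_uncounted = 14.0  # want full-time work, but not officially unemployed
official_labor_force = 155.0

official = unemployment_rate(officially_unemployed, official_labor_force)

# A broader measure adds the uncounted group to both the numerator
# and the labor-force denominator:
broad = unemployment_rate(
    officially_unemployed + want_full_time_uncounted,
    official_labor_force + want_full_time_uncounted,
)

print(f"official: {official:.1f}%  broad: {broad:.1f}%")
```

With these made-up counts, the official rate lands near the 7.6 percent mentioned above while the broader rate roughly doubles it — the same qualitative gap the column describes between the official number and U-7.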
There’s another significant difference between official unemployment today and the same statistic years ago. As we explained in a story back in 2003, many more Americans are now either on disability or in prison than in previous years. Were they out on the street looking for work, today’s official unemployment number, as well as our “Solman Scale” U-7, would be several percentage points higher than they are.
Final thought. All of this rumination about the underestimation of true unemployment has one unambiguous implication for the definition of “full employment”: it could probably fall below 4 percent without triggering inflation.