Bleg: What Changed in the Late 70s?

Gentle readers will be aware that I look at (and graph) a lot of data sets, slicing and dicing them every which way to try and suss out how the world — or at least the economy — works.

One thing I and many others keep running across is what seems to be a fundamental shift in the late 70s. The economy after that period seems to work differently than it did in the post-war years. You see big moves in the data ’75-’80, and a lot of historic highs and lows since then — often sustained at those high and low levels (though often with more volatility around the average).

At least one profoundly important indicator has been steadily increasing ever since that time — the duration of high unemployment during and especially after recessions:

Look at the purple, black, brown, and red lines — ’81, ’90, ’01, and ’07, in that order. What’s going on here?

I’ve pointed out several items over the years that had their inflection point in 1981 (notably the increase in the federal debt starting then, following 35 years of decline), and attributed those direction shifts to the rise of Reaganomics. I still believe that, but there are a lot of other changes that seemed to begin in the Carter years. Some of those 1981 inflection points (not including the federal debt runup) could be attributable to whatever changed in the preceding five years.

Again, lots of other people have noticed this, across a lot of different data sets and analyses. It seems like the economy changed fundamentally in the late 70s — the before seems to be structurally different from the after. The economy acts differently now.

Here’s my question: what changed? Here are some possibilities off the top of my head.

The first thing that comes to mind is inflation/stagflation. That was the defining economic reality of those years, not tamed until ’83/’84. Maybe people got scared of that syndrome, and started acting differently as a result. But: for 20 years post-’85, inflation was as tame as it’s ever been. Would people keep reacting to the fear of stagflation even after all that time? (Yes, the righties are still waving that flag as a way to dismiss all ideas vaguely Keynesian, but…)

Then there’s OPEC. Could the conscious and intentional manipulation of world oil prices since its inception have caused what we’ve seen? Maybe, but the truth is OPEC has pretty limited power. Supply and demand still overwhelm their price-fixing ability. They can only tweak prices at the margins.

Technology? Globalization? International or domestic monetary or trade policies?

You can probably tell I’m flailing here. But as far as I can tell I’m not alone. I haven’t found a single convincing explanation for the phenomenon. My only inkling of an explanation for the above graph is this, and I’m hard-pressed to relate it specifically to the late seventies. (The rapid decline in the cost of computers?)

Thoughts?



Comments

26 responses to “Bleg: What Changed in the Late 70s?”

  1. jazzbumpa

    I wasn’t aware that anyone else had noticed a basic change in the late 70’s. I just came to that realization myself, looking at the relationship between deficits and inflation. In the post-war years, until the late 70’s, there was a positive correlation. Then it went negative for a while, and now it has settled in at something close to zero. I haven’t posted this level of detail yet, but the preliminary work is here:

    http://jazzbumpa.blogspot.com/2010/12/of-deficits-and-inflation.html

    http://jazzbumpa.blogspot.com/2010/12/of-deficits-and-inflation-part-2.html
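
    (A minimal sketch of the kind of rolling correlation described above, assuming annual series for the deficit and inflation are already in a CSV. The file name, column names, and the 15-year window are illustrative assumptions, not the actual calculation behind those posts.)

    ```python
    import pandas as pd

    # Hypothetical input: one row per year with the federal deficit as a
    # percent of GDP and the CPI inflation rate. The file and column names
    # are invented for illustration.
    df = pd.read_csv("deficit_inflation.csv", index_col="year")

    # Trailing 15-year rolling correlation between the two series. A sign
    # flip around 1980 would show up as the correlation sliding from
    # positive toward negative, then settling near zero in later windows.
    rolling_corr = df["deficit_pct_gdp"].rolling(window=15).corr(df["cpi_inflation"])

    print(rolling_corr.dropna().round(2))
    ```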

    I’d welcome your comments.

    Clearly, something changed ca. 1980, or a little before.

    Cheers!
    JzB

  2. Chris T

    The microprocessor was invented in the early 1970’s. This development was absolutely critical both to fully integrating digital computers into most applications and to the development of the PC.

  3. jazzbumpa

    Chris –

    OK. That’s a fact. How does that relate to what is commonly called the great moderation, but I call the great stagnation?

    Is there a straight line between the microprocessor and post recession unemployment duration?

  4. Chris T

    Digital computers were difficult to integrate for many applications prior to the invention of the microprocessor due to the CPU being made up of several different components. By combining all of the functions of a CPU onto one chip, the cost of manufacturing digital CPUs fell significantly and it became much easier to incorporate them into other products.

    Employers now had the option of using computers in place of human labor for many tasks, and tended to do so rather than rehire labor following recessions. As computers have become ever more capable, the range of tasks they can perform keeps expanding, and labor takes ever longer to recover following a recession (because recovery depends on new jobs being created rather than workers being rehired into the same ones as before).

  5. Asymptosis

    Jazzbumpa, your (first) post was one of the inspirations. It is damned interesting, and your two questions to Chris here are apropos.

    Re Chris’s answer to you, it’s much in line with what I said here:

    http://www.asymptosis.com/why-prosperity-requires-a-welfare-state.html

    With followups here:

    http://www.asymptosis.com/are-machines-replacing-humans-or-am-i-a-luddite.html

    and here:

    http://www.asymptosis.com/robin-hansons-reply-to-the-luddites.html

    The fact that Hanson even deigned to comment over here suggests to me that he could have been feeling defensive of his views, and that maybe this argument has some legs…

    I’m not sure how that could explain the great moderation, though. Secondary effects at best, no?

  6. Chris T

    I’m ‘Chris’ in the comments on the above posts.

    Note that in the last two recessions productivity growth has been historically high:

    http://www.frbsf.org/publications/economics/letter/2010/el2010-28.html

    Also note that, contrary to popular belief, American manufacturing had been growing unabated up until the previous recession, even as manufacturing employment fell off a cliff:

    http://4.bp.blogspot.com/_4jIlyJ10uJU/S4PWmrc4o-I/AAAAAAAAGGg/A0CB7xzYApw/s1600-h/Industrial+Production.JPG

    Employment in durable goods:
    http://1.bp.blogspot.com/_4jIlyJ10uJU/S4PYHgzVXNI/AAAAAAAAGGw/6jmZsYrR9Rs/s1600-h/Goods+Producing+Industries.JPG
    (Courtesy fivethirtyeight.com)

    Also note the previous manufacturing employment high was about 1979. The only way to explain this is automation.

  7. Asymptosis

    Yeah, the 1930s saw the greatest productivity boom of the 20th century.

    Recessions are capitalism’s way of sticking it to the little guy.

    (Well, with the help of the fed, which frantically stomps on anything that looks like wage growth…oh, I mean “structural inflation.”)

  8. jazzbumpa

    The other thing I’ve been thinking about, at my place and in the discussions at Angry Bear, that might dovetail into this discussion is the LUMP OF LABOR FALLACY fallacy. Technology is supposed to create new jobs, even as it eliminates old ones. That seems not to be happening.

    Contra Chris, another thing that has happened since roughly 1980 is that GDP growth has been in decline. It’s hard to relate that to microprocessor use – at least the connection isn’t obvious to me.

    Another major change is the wealth disparity increase, attributable to two factors: income disparity (top 10% capturing all generated wealth as median income stagnated) and loss of tax progressivity.

    BTW, I now have part 3 of my “inflation vs deficit” series posted.

    http://jazzbumpa.blogspot.com/2011/01/of-deficits-and-inflation-part-3.html

    Cheers!
    JzB

  9. zb

    I think a significant contributor was deregulation in general, and especially of financial markets, in the US, but also in the world. I think we gained benefits from these changes (silk clothing from China, iPhones, a range of consumer goods, development in the countries making those products, more rapid transmission of innovative ideas from other countries — Israeli startups are a good example). But I also think that there was a fundamental change in the structure of society in the 20th century, a change whose consequences we’re still figuring out.

    In looking at unemployment specifically, I’m thinking we’re moving towards a “freelance” society where all employment is short term. The consequence is that workers have to capture the full value of their employment at the instant they produce the work (because long term relationships are going to be unreliable). But that means everyone has to accept and embrace a lot more volatility (than, perhaps, human beings are built for).

  10. Asymptosis

    jazzbumpa: I attribute both the decline in GDP growth and the increasing inequality to Reaganomics; we haven’t been taxing (progressively) and redistributing enough to keep the log rolling.

    zb: “a significant contributor was deregulation in general, and especially of financial markets, in the US, but also in the world.”

    I agree, but that didn’t happen in the 70s, did it? That’s the Reaganomics effect, no?

    I’m kind of liking the “invention of the microprocessor” notion, but need to ponder more. Real widespread use (micros) wasn’t till the eighties, but corporations started buying minis by the truckload in the late 70s, replacing lots of clerical white-collar work.

  11. Chris T

    Technology is supposed to create new jobs, even as it eliminates old ones. Seems to not be happening.

    It’s creating jobs, just not as quickly as it’s been replacing them. This is an issue I have with the lump of labor ‘fallacy’; even if technology creates more jobs than it destroys in the long term, it doesn’t need to do both concurrently. There can be a significant lag between job destruction and creation (particularly since it’s easier to think of ways to apply technology to existing applications than to create all new ones). This does not mean I oppose technology, far from it, but I also don’t think we should completely dismiss the negative effects on those displaced.

    Contra Chris, another thing that has happened since roughly 1980 is that GDP growth has been in decline. It’s hard to relate that to microprocessor use – at least the connection isn’t obvious to me.

    Has it, though, or has it been more drawn out? It’s true annual growth rates have slowed, but so has the decline in GDP during recessions (at least prior to the most recent one). If you calculate growth per year averaged over the previous ten years (a moving average), the slowdown isn’t nearly as bad (and actually starts, on a decadal basis, in the early 70s). This is probably a better way to look at it anyway, since we care about aggregate growth, not just year to year. (I’ll put together a graph and attendant data later tonight.)
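
    (A minimal sketch of that smoothing, assuming annual real GDP growth rates are on hand in a CSV. The file name, column name, and the trailing ten-year window are assumptions for illustration, not Chris’s actual spreadsheet.)

    ```python
    import pandas as pd

    # Hypothetical input: annual real GDP growth rates in percent, one row per year.
    gdp_growth = pd.read_csv("real_gdp_growth.csv", index_col="year")["growth_pct"]

    # Trailing ten-year moving average of annual growth. Smoothing this way
    # trades year-to-year noise for a picture of aggregate growth per decade.
    smoothed = gdp_growth.rolling(window=10).mean()

    print(smoothed.dropna().round(2))
    ```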

    Another major change is the wealth disparity increase, attributable to two factors: income disparity (top 10% capturing all generated wealth as median income stagnated) and loss of tax progressivity.

    This is readily explainable by technological change. Prior to cheap digital computers, labor was the only game in town for performing most tasks and was the limiting factor in production. Thus much of the wealth generated had to be sunk back into labor and wages rose. Now, there is an alternative to labor and the demand for it has fallen, removing the upward pressure on wages.

  12. Asymptosis

    chris: “(I’ll put together a graph and attendant data later tonight.)”

    Would like to see that. Thanks.

    Steve

  13. Chris T

    @Asymptosis
    Steve, check your e-mail.

  14. Asymptosis

    Chris, haven’t gotten to your files yet. Will probably tomorrow.

    More on the 70s: could it just be about the stock market?

    http://www.ritholtz.com/blog/wp-content/uploads/2011/01/Real_Stock_Growth_Log.png

    Prices took a long dive, ’68-’82… (inflation adjusted, which brings us back to the stagflation thing…)

  15. Chris T

    @Asymptosis
    Some things I can think of that happened around that time:
    Bretton Woods collapsed in the early 70’s (a combination of other countries recovering economically from WWII and massive U.S. government deficits during the Vietnam War while the last vestiges of the gold standard were still in place), and the United States became a net oil importer around 1970.

    Deregulation happens too late to explain it (railroad deregulation wasn’t until Ford, and the bulk of it happened under Carter).

  16. Olav Martin Kvern

    Steve wrote: “Real widespread use (micros) wasn’t till the eighties, but corporations started buying minis by the truckload in the late 70s, replacing lots of clerical white-collar work.”

    Did they? My strong impression at the time was that widespread adoption of minicomputers simply increased the number of data points a corporation could monitor and process. It’s only anecdotal evidence (at best), but the people I knew in accounting at the time were thrilled by how much more they could do–and, far from being laid off, were in great demand. So…can you cite something? Did we really see a significant dip in clerical employment in the late 70s? Even if we did, wasn’t there job growth in other sectors? (I was working at a company that quadrupled in size because of the Data General Nova series. Again, it’s anecdotal, but still.)

    I’ll admit a philosophical bias: I don’t think that mini/microcomputers have resulted in net job loss, overall. I do think that changes in global communications (cheap transport and telephones, mostly) have changed business practices, and that those changes have resulted in local (i.e., national, for the USA) job loss. Maybe I’m defining computing in too narrow a sense…?

    There is something weird about the 70s, for sure. I like Chris T’s most recent comment–the combination of historical factors cited feels like the start of an answer.

  17. Big Sis

    Hiya Olav, long-time! Hope all is well.

    I agree with you. I first went looking for a job in the early 80’s, and at that time my company (Visa, which was pretty high-tech as companies went) was just starting to get dedicated word-processors. These were bought for the secretaries, whose job continued to be what it had always been: take handwritten drafts from people and type them up… (and oh god, the editing and review process!). (I remember interviewing at Lanier during my job search. Those word processors were rather expensive, if I remember right, and single-purpose, so not designed for everyone’s desktop. They were basically typewriter replacements and nothing else.)

    Only a few years later, when everyone in the company finally started to get actual computers on their desks, did that process start to change. At that time, I did have one colleague (a young-side-of-middle-aged man) who found it increasingly difficult to do his job because he just couldn’t type.

  18. Big Sis

    Actually, when I started at Visa in ’82, as far as I know there were just typewriters in use. (and we programmers shared pools of terminals. We reviewed code on greenbar printouts). We started getting word-processors for the secretaries shortly after that.

  19. Chris T

    Maybe I’m defining computing in too narrow a sense…?

    A major problem I’ve found when discussing this topic is that people get hung up on a ‘computer’ as the thing they use on a day-to-day basis (a PC), when in fact what makes it a computer is the small microprocessor inside the case, which is a type of integrated circuit. The neat thing about integrated circuits is that they can be put into virtually anything electronic and can execute algorithms (a series of logical steps) to control those devices. Those algorithms can vary from the simple to the incredibly complex.

    This is important because the majority of activities in the economy can be written out as a series of logical steps (even activities that technically can’t, such as driving, can be ‘faked’ with a sophisticated enough logic tree). Prior to the integrated circuit, almost all such logic steps could only be performed by humans.

    The microprocessor made performing such tasks digitally cost effective relative to human labor.
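
    (A toy illustration of that point. The task, thresholds, and rules below are invented for the example, not drawn from anything above: a routine clerical judgment written out as the kind of explicit logical steps cheap embedded logic can execute.)

    ```python
    # Invented example: a routine invoice-approval decision expressed as a
    # short series of logical steps, the sort of rule-following work that
    # migrates from clerks to cheap processors.
    def approve_invoice(amount: float, vendor_on_file: bool, matches_po: bool) -> str:
        if not vendor_on_file:
            return "route to purchasing"   # unknown vendor: a human still decides
        if matches_po and amount <= 10_000:
            return "auto-approve"          # routine, documented, small: no clerk needed
        if matches_po:
            return "route to manager"      # documented but large
        return "reject"                    # no matching purchase order

    if __name__ == "__main__":
        print(approve_invoice(2_500.00, vendor_on_file=True, matches_po=True))
    ```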

  20. Olav Martin Kvern

    @Chris T
    re: “…people get hung up on a ‘computer’ as the thing they use on a day to day basis (a PC)”

    Not my bias, that’s for sure! (I build microcontroller-based musical instruments as a hobby, and have been working with electronics since the 1970s.) My point was that, as far as I can tell, there weren’t major dislocations of labor due to mini/microcomputer adoption in the 1970s. Steve’s original point (or question?) was about clerical workers being displaced by mini/micro adoption in the 1970s. I think we might see something like that in the late 1980s, but not in the late 1970s. Looking back, the rate of change in the workplace seems very slow, relative to the rate of technological change.

  21. ZZZZ

    You should ask Stirling Newberry what he thinks. He wrote this a few years ago:
    http://web.archive.org/web/20060611063241/http://www.bopnews.com/archives/006046.html

    This is the missing graph from the article (curse you archive.org for stripping images from archived articles):
    http://www.imagebam.com/image/5f1a7c114123433

    Since your spam filter thinks I’m writing spam, I’ll add a few paragraphs from the linked articles:

    The American economy has two different faces: one is a basic engine of how growth works, and the other is a basic assumption of the over all rate of growth. The way the economy works is by generating sprawl and new city centers. People move to them, provide services and establish first mover advantages, and the government’s role is to keep oil, water and roads rolling.

    This system of growth – glimpsed in embryo in a few cities in the 1920’s and turned into a way of life after the second world war – is what began with a big bang after the war. Let’s take a look at payroll growth by percentage of payrolls, that is, how many jobs are added based on the number of jobs that exist. To smooth out the recessions, this chart takes a 10-year moving average. That is, over the previous 10 years, starting with the data in 1939. It includes the World War II jobs boom, and continues to the present. The average for the entire period is .19%. That is, on average the American economy added one fifth of one percent of its current job total to payrolls.

  22. ZZZZ

    Comment Part 2 (because you have a shitty spam filter on your site), again adding paragraphs from the linked articles to avoid filters:

    Tangentially, this is somewhat related to the above article:
    http://www.correntewire.com/three_polar_politics_post_petroleum_america

    As long as the possibility of starting up the land casino is there, as long as the present generation can believe that the next generation will pay heavily for access to it, there will be no substantial change in the American political landscape. The question will be between two wings of the political spectrum over which can best maintain the present system.

    This is also an interesting article on the Penultimate crisis (unrelated to the above article):
    http://agonist.org/stirling_newberry/20080901/global_gloom

    In history a great deal of time is spent on the ultimate crisis of an order; it is the time of mythic action, great good, and great evil. But less attention is often paid to the penultimate crisis, the moment before the moment. This is an error, because the ultimate crisis is the ultimate crisis because of the attitudes brought to it from the one just before.

    At the point of ultimate crisis, certain groups, hammered by the previous solution, vow not to accept again terms of the same kind. Others, buffered by the previous solution, become complacent in their ability to impose, by force if need be, the same terms again.

  23. Asymptosis

    Sorry folks, catching up.

    I agree that minis can’t explain the 70s weirdness — that was sort of a stab in the dark responding to Chris’s suggestion re: the microprocessor. As Ole and Janet say, and as accepted theory generally predicts, companies didn’t so much lay off clerical staff (some, I’m sure, but…) as fulfill felt needs that they could never have met before: generating sales reports daily, weekly, or monthly instead of quarterly, for instance.

    I think it’s different now, and that technology and productivity advances *are* explanatory for the ever-longer “jobless recoveries.” Technology combined with globalization has now transformed the whole eco(no)system. We’re on the knowledge end of a global economy, and to an ever-greater extent only knowledge workers can benefit from that. (By definition, 50% of people have an IQ below 100.) That wasn’t true in the 70s.

    Recessions are where that effect is effected. They give companies the leverage and incentive to cut jobs and wages (“we can’t help it!”), and we see these huge productivity jumps as a result — a self-perpetuating cycle.

    This dynamic has been systematically enforced by the fed and mainstream economists: any time wages start to increase, they call it “structural inflation” (because wages are “sticky” and hard to bring down), freak out that we’re going back to the late seventies or the Weimar Republic, and clamp down on growth. When profits or assets increase, it’s just the market working, and they leave the faucet open. It’s true that those things are *not* so sticky, but still…

    ZZZZ, haven’t had time to look at all your stuff yet, but it looks interesting and will definitely do so.

  24. Chris T

    I think it’s different now, and that technology and productivity advances *are* explanatory for the ever-longer “jobless recoveries.”

    Manufacturing would have been the first affected by automation with respect to jobs, and 1979 being the peak in manufacturing employment bears this out.

    Based on anecdotal evidence (I know, I know), automation in the white collar world didn’t really get going until the 1990’s.

  25. Leroy Dumonde

    One word: PATCO. Reagan crushed the unions.

    You see, American workers used to be more highly trained and highly paid across the board. And because of that the USA used to have a much more cash based economy.

    The outcome is that what used to be business inventory recessions have now become household balance-sheet recessions. Up until the 1990s, recessions were caused by either some external shock (e.g. OPEC price shocks) or Fed policy (e.g. Paul Volcker throwing a brick at the economy’s head). Nowadays, recessions are caused by household debt overhang.

    So, way back when, the economy tanked because businesses had to recalibrate their inventory. And in so doing they used to hoard labor, because they knew the economy would come roaring back once said recalibration was done. And they knew this because they knew that households would have nothing constraining them from going back to the consumption party once they were all back to working full time.

    But now it’s households that have to recalibrate their balance sheets by paying down the debt incurred from joining the consumption party via borrowing. And this is out of the control of businesses, which thus have no reason to hoard workers.

    You see, business owners used to have reason to keep many people on at reduced pay and/or hours, or to lay them off with a guarantee that they’d be hired back, because they knew that they and other owners would straighten things out. On top of that, those employees were careerists (many of whom had been through an apprenticeship), and so owners didn’t want to lose them when the boom came back. But now it’s all in the hands of consumers, who are much less reliable at sorting things out (which, by the way, argues for a way larger emphasis on home economics in high school).

    At any rate, this surely doesn’t explain it all, but I think it’s a very large aspect that is ignored because of most economists’ knee-jerk union-hating attitude.