Gramercy Images News

A Financial Novelty weblog

Archive for the ‘The Archdruid Report’ tag

Man, Conqueror of Nature, Dead at 408

without comments

EARLY SELFIE, a photo by WilliamBanzai7/Colonel Flick on Flickr.

 

by John Michael Greer

Man, the conqueror of Nature, died Monday night of a petroleum overdose, the medical examiner’s office confirmed this morning. The abstract representation of the human race was 408 years old. The official announcement has done nothing to quell the rumors of suicide and substance abuse that have swirled around the death scene since the first announcement yesterday morning, adding new legal wrinkles to the struggle already under way over Man’s inheritance.

Man’s closest associates disagree about what happened. His longtime friend and confidant Technology thinks it was suicide. “Sure, Man liked to have a good time,” he said at a press conference Tuesday evening, “and he was a pretty heavy user, but it wasn’t like he was out of control or anything. No, I’m sure he did it on purpose. Just a couple of weeks ago we were hanging out at his place, looking up at the moon and talking about the trips we made out there, and he turned to me and said, ‘You know, Tech, that was a good time—a really good time. I wonder if I’ll ever do anything like that again.’ He got into moods like that more and more often in the last few years. I tried to cheer him up, talking about going to Mars or what have you, and he’d go along with it but you could tell his heart wasn’t in it.”

Other witnesses told a different story. “It was terrifying,” said a housekeeper who requested that her name not be given. “He was using more and more of the stuff every day, shooting it up morning, noon and night, and when his connections couldn’t get him as much as he wanted, he’d go nuts. You’d hear him screaming at the top of his lungs and pounding his fists on the walls. Everybody on the staff would hide whenever that happened, and it happened more and more often—the amount he was using was just unbelievable. Some of his friends tried to talk him into getting help, or even just cutting back a little on his petroleum habit, but he wouldn’t listen.”

The medical examiner’s office and the police are investigating Man’s death right now. Until their report comes out, the tragic end of humanity’s late self-image remains shrouded in mystery and speculation.

 

 

A Tumultuous Family Saga

“He was always a rebel,” said Clio, the muse of history, in an exclusive interview in her office on Parnassus this morning. “That was partly his early environment, of course. He was born in the household of Sir Francis Bacon, remember, and brought up by some of the best minds of seventeenth-century Europe; an abstract image of humanity raised by people like that wasn’t likely to sit back and leave things as they were, you know. Still, I think there were strong family influences too. His father was quite the original figure himself, back in the day.”

Though almost forgotten nowadays, Man’s father Everyman, the abstract representation of medieval humanity, was as mediagenic in his own time as his son became later on. The star of a wildly popular morality play and the subject of countless biographies, Everyman was born in extreme poverty in a hovel in post-Roman Europe, worked his way up to become a wealthy and influential figure in the Middle Ages and Renaissance, then stepped aside from his financial and political affairs to devote his last years to religious concerns. Savage quarrels between father and son kept the broadsheet and pamphlet press fed with juicy stories all through the seventeenth and eighteenth centuries, and eventually led to their final breach over Darwin’s theory of evolution in 1859.

By that time Man was already having problems with substance abuse. “He was just using coal at first,” Technology reminisced. “Well, let’s be fair, we both were. That was the hot new drug in those days. It was cheap, you could get it without too much hassle, and everybody on the cutting edge was using it. I remember one trip we took together—it was on one of the early railroads, at thirty miles an hour. We thought that was really fast. Were we innocent back then, or what?”

Clio agreed with that assessment. “I don’t think Man had any idea what he was getting into, when he started abusing coal,” she said. “It was an easy habit to fall into, very popular in avant-garde circles just then, and nobody yet knew much about the long term consequences of fossil fuel abuse. Then, of course, he started his campaign to conquer Nature, and he found out very quickly that he couldn’t keep up the pace he’d set for himself without artificial help. That was when the real tragedy began.”

The Conquest of Nature

It’s an open question when Man first decided to conquer Nature. “The biographers all have their own opinions on that,” Clio explained, gesturing at a shelf loaded with books on Man’s dramatic and controversial career. “Some trace it back to the influence of his foster-father Francis Bacon, or the other mentors and teachers he had in his early days. Others say that the inspiration came from the crowd he ran with when he was coming of age in the eighteenth and nineteenth centuries. He used to tell interviewers that it was a family thing, that everyone in his family all the way back to the Stone Age had been trying to conquer Nature and he was just the one who finally succeeded, but that won’t stand up to any kind of scrutiny. Examine the career of Everyman, for example, and you’ll find that he wasn’t interested in conquering Nature; he wanted to conquer himself.”

“The business about conquering Nature?” Technology said. “He got into that back when we were running around being young and crazy. I think he got the idea originally from his foster-father or one of the other old guys who taught him when he was a kid, but as far as I know it wasn’t a big deal to him until later. Now I could be wrong, you know. I didn’t know him that well in those days; I was mostly just doing my thing then, digging mines, building water mills, stuff like that. We didn’t get really close until we both got involved in this complicated coal deal; we were both using, but I was dealing, too, and I could get it cheaper than anybody else—I was using steam, and none of the other dealers knew how to do that. So we got to be friends and we had some really wild times together, and now and then when we were good and ripped, he’d get to talking about how Nature ought to belong to him and one of these days he was going to hire some soldiers and just take it.

“Me, I couldn’t have cared less, except that Man kept on bringing me these great technical problems, really sweet little puzzles, and I’ve always been a sucker for those. He figured out how I was getting the coal for him so cheap, you see, and guessed that I could take those same tricks and use them for his war against Nature. For me, it was just a game: for Nature, against Nature, I couldn’t care less. Just give me a problem and let me get to work on it, and I’m happy.

“But it wasn’t just a game for him. I think it was 1774 when he really put me to work on it. He’d hired some mercenaries by then, and was raising money and getting all kinds of stuff ready for the war. He wanted steam engines so, like the man said, it was steam engine time—I got working on factories, railroads, steamships, all the rest. He already had some of his people crossing the border into Nature to seize bits of territory before then, but the eighteenth century, that’s when the invasion started for real. I used to stand next to him at the big rallies he liked to hold in those days, with all the soldiers standing in long lines, and he’d go into these wild rants about the glorious future we were going to see once Nature was conquered. The soldiers loved it; they’d cheer and grab their scientific instruments and lab coats and go conquer another province of Nature.”

The Triumphant Years

It was in 1859, Technology recalled, that Man first started using petroleum. “He’d just had the big spat with his dad over this Darwin dude: the worst fight they ever had, and in fact Man never spoke to the old man again. Man was still steaming about the fight for days afterwards, and then we heard that this guy named Edwin Drake over in Pennsylvania could get you something that was an even bigger rush than coal. Of course Man had to have some, and I said to myself, hey, I’ll give it a try—and that was all she wrote, baby. Oh, we kept using coal, and a fair bit of it, but there’s nothing like petroleum.

“What’s more, Man figured out that that’s what he needed to finish his conquest of Nature. His mercs had a good chunk of Nature by then, but not all of it, not even half, and Man was having trouble holding some of the territory he’d taken—there were guerrillas behind his lines, that sort of thing. He’d pace around at headquarters, snapping at his staff, trying to figure out how to get the edge he needed to beat Nature once and for all. ‘I’ve gotta have it all, Tech,’ he’d say sometimes, when we were flopped on the couch in his private quarters with a couple of needles and a barrel of petroleum, getting really buzzed. ‘I’ve conquered distance, the land, the surface of the sea—it’s not enough. I want it *all*.’ And you know, he got pretty close.”

Petroleum was the key, Clio explained. “It wasn’t just that Man used petroleum, all his soldiers and his support staff were using it too, and over the short term it’s an incredibly powerful drug; it gives users a rush of energy that has to be seen to be believed. Whole provinces of Nature that resisted every attack in the first part of the war were overrun once Man started shipping petroleum to his forces. By the 1950s, as a result, the conquest of Nature was all but complete. Nature still had a few divisions holed up in isolated corners where they couldn’t be gotten at by Man’s forces, and partisan units were all over the conquered zone, but those were minor irritations at that point. It was easy enough for Man and his followers to convince themselves that in a little while the last holdouts would be defeated and Nature would be conquered once and for all.

“That’s when reality intervened, though, because all those years of abusing coal, petroleum, and other substances started to catch up with Man. He was in bad shape, and didn’t know it—and then he started having problems feeding his addiction.”

On and Off the Wagon

“I forget exactly how it happened,” Technology recounted. “It was some kind of disagreement with his suppliers—he was getting a lot of his stuff from some Arab guys at that point, and he got into a fight with them over something, and they said, ‘Screw you, man, if you’re going to be like that we’re just not going to do business with you any more.’ So he tried to get the stuff from somebody else, and it turned out the guy from Pennsylvania was out of the business, and the connections he had in Texas and California couldn’t get enough. The Arab guys had a pretty fair corner on the market. So Man went into withdrawal, big time. We got him to the hospital, and the doctor took one look at him and said, ‘You gotta get into rehab, now.’ So me and some of his other friends talked him into it.”

“The records of his stays in rehab are heartbreaking,” Clio said, pulling down a tell-all biography from her shelf. “He’d start getting the drug out of his system, convince himself that he was fine, check himself out, and start using again almost immediately. Then, after a little while, he’d have problems getting a fix, end up in withdrawal, and find his way back into rehab. Meanwhile the war against Nature was going badly as the other side learned how to fight back effectively. There were rumors of ceasefire negotiations, even a peace treaty between him and Nature.”

“I went to see him in rehab one day,” said Technology. “He looked awful. He looked *old*—like his old man Everyman. He was depressed, too, talking all the time about this malaise thing. The thing is, I think if he’d stuck with it then he could have gotten off the stuff and straightened his life out. I really think he could have done it, and I tried to help. I brought him some solar panels, earth-sheltered housing, neat stuff like that, to try to get him interested in something besides the war on Nature and his petroleum habit. That seemed to cheer him up, and I think all his friends had high hopes for a while.

“Then the next thing I heard, he was out of rehab. He just couldn’t hack it any longer. I went to his place, and there he was, laughing and slapping everybody’s back and full of big ideas and bigger plans, just like before. That’s what it looked like at first, but the magic was gone. He tried to do a comeback career, but he just couldn’t get it back together, and things went downhill from there.”

The Final Years

The last years of Man’s career as the representation of the human race were troubled. “The war against Nature wasn’t going well by then,” Clio explained. “Man’s forces were holding onto the most important provinces and cities, but insurgencies were springing up all over—drug-resistant microbes here, herbicide-tolerant weeds there. Morale was faltering, and a growing fraction of Man’s forces in the struggle against Nature no longer believed in what they were doing. They were in it for the money, nothing more, and the money was running out. Between the costs of the war, the costs of Man’s lavish lifestyle, and the rising burden of his substance abuse problem, Man was in deep financial trouble; there’s reason to believe that he may have been engaged in outright fraud to pay his bills during the last few years of his life.”

Meanwhile, Man was becoming increasingly isolated. “He’d turned his back on most of his friends,” said the anonymous housekeeper quoted earlier. “Art, Literature, Philosophy—he stopped talking to any of them, because they kept telling him to get off the stuff and straighten out his life. I remember the last time Science came to visit—she wanted to talk to Man about the state of the atmosphere, and Man literally threw her out of the house and slammed the door in her face. I was working downstairs in the laundry, where you usually can’t hear much, but I could hear Man screaming, ‘I own the atmosphere! I own the planet! I own the solar system! I own the goddam *stars*! They’re mine, mine, *mine*—how dare you tell me what to do with my property?’ He went on like that for a while, then collapsed right there in the entry. A couple of us went up, carried him into his bedroom, and got him cleaned up and put to bed. We had to do that pretty often, the last year or so.”

His longtime friend Technology was apparently the last person to see Man alive. “I went over to his place Monday afternoon,” Technology recalled. “I went there pretty often, and we’d do some stuff and hang out, and I’d start rapping about all kinds of crazy stuff, omniscient supercomputers, immortal robot bodies, stuff like that. I told him, ‘Look, Man, if you want to get into stuff like omniscience and immortality, go talk to Religion. That’s her bag, not mine.’ But he didn’t want to do that; he had some kind of falling out with her a while back, you know, and he wanted to hear it from me, so I talked it up. It got him to mellow out and unwind, and that’s what mattered to me.

“Monday, though, we get to talking, and it turns out that the petroleum he had was from this really dirty underground source in North Dakota. I said to him, ‘Man, what the frack were you thinking?’ He just looked at me and said, ‘I’ve gotta have the stuff, Tech. I’ve gotta have the stuff.’ Then he started blubbering, and I reached out to, like, pat his shoulder—and he just blew up at me. He started yelling about how it was my fault he was hooked on petroleum, my fault the war against Nature wasn’t going well, my fault this and that and blah blah blah. Then he got up and stormed out of the room and slammed the door behind him. I should have gone after him, I know I should have, but instead I just shook my head and left. Maybe if I’d gone and tried to talk him down, he wouldn’t have done it.”

“Everything was quiet,” the housekeeper said. “Too quiet. Usually we’d hear Man walking around, or he’d put some music on or something, but Monday night, the place might as well have been empty. Around ten o’clock, we were really starting to wonder if something was wrong, and two of us from the housekeeping staff decided that we really had to go check on Man and make sure he was all right. We found him in the bathroom, lying on the floor. It was horrible—the room stank of crude oil, and there was the needle and all his other gear scattered around him on the floor. We tried to find a pulse, but he was already cold and stiff; I went and called for an ambulance anyway, and—well, you know the rest.”

The Troubled Aftermath

Man’s death leaves a great many questions unanswered. “By the time Everyman died,” Clio explained, “everyone knew who his heir would be. Man had already taken over his father’s role as humanity’s idealized self-image. That hasn’t happened this time, as you know. Man didn’t leave a will, and his estate is a mess—it may be years before the lawyers and the accountants finish going through his affairs and figure out whether there’s going to be anything at all for potential heirs to claim. Meanwhile there are at least half a dozen contenders for the role of abstract representation of the human race, and none of them is a clear favorite. It may be a long time before all the consequences are sorted out.”

Meanwhile, one of the most important voices in the debate has already registered an opinion. Following her invariable habit, Gaia refused to grant any personal interviews, but a written statement to the media was delivered by a spokesrabbit on Tuesday evening. “Please accept My sympathy for the tragic demise of Man, the would-be conqueror of Nature,” it read. “I hope it will not be out of place, though, to suggest that whomever My human children select as their new self-image might consider being a little less self-centered—not to mention a little less self-destructive.”

 

[WilliamBanzai]

[Man, Conqueror of Nature, Dead at 408]

[The Archdruid Report]

Written by testudoetlepus

December 23rd, 2013 at 3:13 pm

The End of the Shale Bubble?

without comments

by John Michael Greer

It’s been a little more than a year since I launched the present series of posts on the end of America’s global empire and the future of democracy in the wake of this nation’s imperial age. Over the next few posts I plan on wrapping that theme up and moving on. However traumatic the decline and fall of the American empire turns out to be, after all, it’s just one part of the broader trajectory that this blog seeks to explore, and other parts of that trajectory deserve discussion as well.

I’d planned to have this week’s post take last week’s discussion of voluntary associations further, and talk about some of the other roles that can be filled, in a time of economic contraction and social disarray, by groups of people using the toolkit of democratic process and traditional ways of managing group activities and assets. Still, that topic is going to have to wait another week, because one of the other dimensions of the broader trajectory just mentioned is moving rapidly toward crisis.

It’s hard to imagine that anybody in today’s America has escaped the flurry of enthusiastic media coverage of the fracking phenomenon. Still, that coverage has included so much misinformation that it’s probably a good idea to recap the basics here. Hydrofracturing—“fracking” in oil industry slang—is an old trick that has been used for decades to get oil and natural gas out of rock that isn’t porous enough for conventional methods to get at them. As oil and gas extraction techniques go, it’s fairly money-, energy- and resource-intensive, and so it didn’t see a great deal of use until fairly recently.

Then the price of oil climbed to the vicinity of $100 a barrel and stayed there. Soaring oil prices drove a tectonic shift in the US petroleum industry, making it economically feasible to drill for oil in deposits that weren’t worth the effort when prices were lower. One of those deposits was the Bakken shale, a sprawling formation of underground rock in the northern Great Plains, which was discovered back in the 1970s and then sat neglected for decades due to low oil prices. To get any significant amount of oil out of the Bakken, you have to use fracking technology, since the shale isn’t porous enough to let go of its oil any other way. Once the rising price of crude oil made the Bakken a paying proposition, drilling crews headed that way and got to work, launching a lively boom.

Another thoroughly explored rock formation further east, the Marcellus shale, attracted attention from the drilling rigs for a different reason, or rather a different pair of reasons. The Marcellus contains no oil to speak of, but some parts of it have gas that is high in natural gas liquids—“wet gas” is the industry term for this—and since those liquids can replace petroleum in some applications, they can be sold at a much higher price than natural gas. Meanwhile, companies across the natural gas industry looked at the ongoing depletion of US coal reserves, and the likelihood of government mandates favoring natural gas over coal for power generation, and decided that these added up to a rosy future for natural gas prices. Several natural gas production firms thus started snapping up leases in the Marcellus country of Pennsylvania and neighboring states, and a second boom got under way.

As drilling in the Bakken and Marcellus shales took off, several other shale deposits, some containing oil and natural gas, others just natural gas, came in for the same sort of treatment. The result was a modest temporary increase in US petroleum production, and a more substantial but equally temporary increase in US natural gas production. It could never be anything more than temporary, for reasons hardwired into the way fracking technology works.

If you’ve ever shaken a can of soda pop good and hard and then opened it, you know something about fracking that countless column inches of media cheerleading on the subject have sedulously avoided. The technique is different, to be sure, but the effect of hydrofracturing on oil and gas trapped in shale is not unlike the effect of a hard shake on the carbon dioxide dissolved in soda pop: in both cases, you get a sudden rush toward the outlet, which releases most of what you’re going to get. Oil and gas production from fracked wells thus starts out high but suffers ferocious decline rates—up to 90% in the first year alone. Where a conventional, unfracked well can produce enough oil or gas to turn a profit for decades if it’s well managed, fracked wells in tight shales like the Bakken and Marcellus quite often cease to be a significant source of oil or gas within a few years of drilling.
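
To see that decline arithmetic concretely, here is a minimal sketch in Python using the standard Arps decline-curve formulas. The parameters are made up for illustration, chosen only so the fracked well loses roughly 90% of its output in the first year (the figure quoted above); they are not data from any actual well.

```python
import math

# Illustrative decline curves, not field data. The shale well follows an Arps
# hyperbolic decline with parameters picked so that output falls ~90% in the
# first year; the conventional well follows a gentle exponential decline.

def hyperbolic_rate(qi, di, b, t):
    """Arps hyperbolic decline: production rate after t years."""
    return qi / (1.0 + b * di * t) ** (1.0 / b)

def exponential_rate(qi, di, t):
    """Arps exponential decline: production rate after t years."""
    return qi * math.exp(-di * t)

qi = 1000.0  # initial production rate, barrels/day (hypothetical)
for year in range(6):
    shale = hyperbolic_rate(qi, di=9.0, b=1.0, t=year)    # ~90% gone after year one
    conventional = exponential_rate(qi, di=0.05, t=year)  # ~5% per year decline
    print(f"year {year}: shale {shale:7.1f} b/d   conventional {conventional:7.1f} b/d")
```

Run it and the soda-pop analogy shows up in the numbers: the fracked well has surrendered most of what it will ever produce within the first couple of years, while the conventional well is still flowing at close to its initial rate.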

The obvious response to this problem is to drill more wells, and this accordingly happened. That isn’t a panacea, however. Oil and gas exploration is a highly sophisticated science, and oil and gas drilling companies can normally figure out the best sites for wells long before the drill bit hits the ground. Since they are in business to make money, they normally drill the best sites first. When that sensible habit intersects with the rapid production decline rates found in fracked wells, the result is a brutal form of economic arithmetic: as the best sites are drilled and the largest reserves drained, drilling companies have to drill more and more wells to keep the same amount of oil or gas flowing. Costs go up without increasing production, and unless prices rise, profits get hammered and companies start to go broke.
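
That treadmill can be put into a toy calculation. The per-well decline schedule below is an assumption for illustration only, loosely echoing the decline rates discussed above; the point is the shape of the result, in which most of each year's drilling soon goes merely to replacing output lost from older wells.

```python
# Toy "drilling treadmill": how many new wells must be drilled each year just
# to hold total output steady, given a fixed per-well decline schedule. The
# schedule is an illustrative assumption, not data from any real field.

# Fraction of a well's initial rate still flowing in each year of its life.
DECLINE_SCHEDULE = [1.00, 0.10, 0.06, 0.04, 0.03]  # ~90% drop after year one

def simulate(target_output, years, initial_rate=1.0):
    wells_by_vintage = []  # new wells drilled in each simulated year
    for year in range(years):
        # Output still flowing from wells drilled in earlier years.
        legacy = sum(
            count * initial_rate * DECLINE_SCHEDULE[age]
            for age, count in enumerate(reversed(wells_by_vintage), start=1)
            if age < len(DECLINE_SCHEDULE)
        )
        # New wells drilled this year must make up the shortfall.
        new_wells = max(0.0, (target_output - legacy) / initial_rate)
        wells_by_vintage.append(new_wells)
        print(f"year {year}: legacy output {legacy:6.1f}, new wells needed {new_wells:6.1f}")

simulate(target_output=100.0, years=8)
```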

They start to go broke even more quickly if the price of the resource they’re extracting goes down as the costs of maintaining production go up. In the case of natural gas, that’s exactly what happened. Each natural gas production company drew up its projections of future prices on the assumption that ordinary trends in production would continue. As company after company piled into shale gas, though, production soared, and the harsh economic downturn that followed the 2008 housing market crash kept plummeting natural gas prices from spurring increased use of the resource; so many people were so broke that even cheap natural gas was too expensive for any unnecessary use.

Up to that point, the fracking story followed a trajectory painfully familiar to anyone who knows their way around the economics of alternative energy. From the building of the first solar steam engines before the turn of the last century, through the boom-and-bust cycle of alternative energy sources in the late 1970s, right up to the ethanol plants that were launched with so much fanfare a decade ago and sold for scrap much more quietly a few years later, the pattern’s the same, a repeated rhythm of great expectations followed by shattered dreams.

Here’s how it works. A media panic over the availability of some energy resource or other sparks frantic efforts to come up with a response that won’t require anybody to change their lifestyles or, heaven help us, conserve. Out of the flurry of available resources and technologies, one or two seize the attention of the media and, shortly thereafter, the imagination of the general public. Money pours into whatever the chosen solution happens to be, as investors convince themselves that there’s plenty of profit to be made backing a supposedly sure thing, and nobody takes the time to ask hard questions. In particular, investors tend to lose track of the fact that something can be technically feasible without being economically viable, and rosy estimates of projected cash flow and return on investment take the place of meaningful analysis.

Then come the first financial troubles, brushed aside by cheerleading “analysts” as teething troubles or the results of irrelevant factors certain to pass off in short order. The next round of bad news follows promptly, and then the one after that; the first investors begin to pull out; sooner or later, one of the hot companies that has become an icon in the new industry goes suddenly and messily bankrupt, and the rush for the exits begins. Barring government subsidies big enough to keep some shrunken form of the new industry stumbling along thereafter, that’s usually the end of the road for the former solution du jour, and decades can pass before investors are willing to put their money into the same resource or technology again.

That’s the way that the fracking story started, too. By the time it was well under way, though, a jarring new note had sounded: the most prestigious of the US mass media suddenly started parroting the most sanguine daydreams of the fracking industry. They insisted at the top of their lungs that the relatively modest increases in oil and gas production from fracked shales marked a revolutionary new era, in which the United States would inevitably regain the energy independence it last had in the 1950s, and prosperity would return for all—or at least for all who jumped aboard the new bandwagon as soon as possible. Happy days, we were told, were here again.

What made this barrage of propaganda all the more fascinating was the immense gaps that separated it from the realities on and under the ground in Pennsylvania and North Dakota. The drastic depletion rates from fracked wells rarely got a mention, and the estimates of how much oil and gas were to be found in the various shale deposits zoomed upwards with wild abandon. Nor did the frenzy stop there; blatant falsehoods were served up repeatedly by people who had every reason to know that they were false—I’m thinking here of the supposedly energy-literate pundits who insisted, repeatedly and loudly, that the Green River shale in the southwest was just like the Bakken and Marcellus shales, and would yield abundant oil and gas once it was fracked. (The Green River shale, for those who haven’t been keeping score, contains no oil or gas at all; instead, it contains kerogen, a waxy hydrocarbon goo that would have turned into oil or gas if it had stayed deep underground for a few million years longer, and kerogen can’t be extracted by fracking—or, for that matter, by any other economically viable method.)

Those who were paying attention to all the hoopla may have noticed that the vaporous claims being retailed by the mainstream media around the fracking boom resembled nothing so much as the equally insubstantial arguments most of the same media were serving up around the housing boom in the years immediately before the 2008 crash. The similarity isn’t accidental, either. The same thing happened in both cases: Wall Street got into the act.

A recent report from financial analyst Deborah Rogers, Shale and Wall Street (you can download a copy in PDF format here), offers a helpful glimpse into the three-ring speculative circus that sprang up around shale oil and shale gas during the last three years or so. Those of my readers who suffer from the delusion that Wall Street might have learned something from the disastrous end of the housing bubble are in for a disappointment: the same antics, executed with the same blissful disregard for basic honesty and probity, got trotted out again, with results that will be coming down hard on what’s left of the US economy in the months immediately ahead of us.

If you remember the housing bubble, you know what happened. Leases on undrilled shale fields were bundled and flipped on the basis of grotesquely inflated claims of their income potential; newly minted investment vehicles of more than Byzantine complexity—VPPs, “volumetric production payments,” are an example you’ll be hearing about quite a bit in a few months, once the court cases begin—were pushed on poorly informed investors and promptly began to crash and burn; as the price of natural gas dropped and fracking operations became more and more unprofitable, “pump and dump” operations talked up the prospects of next to worthless properties, which could then be unloaded on chumps before the bottom fell out. It’s an old story, if a tawdry one, and all the evidence suggests that it’s likely to finish running its usual course in the months immediately ahead.

There are at least two points worth making as that happens. The first is that we can expect more of the same in the years immediately ahead. Wall Street culture—not to mention the entire suite of economic expectations that guides the behavior of governments, businesses, and most individuals in today’s America—assumes that the close-to-zero return on investment that’s become standard in the last few years is a temporary anomaly, and that a good investment ought to bring in what used to be considered a good annual return: 4%, 6%, 8%, or more. What only a few thinkers on the fringes have grasped is that such returns are only normal in a growing economy, and we no longer have a growing economy.

Sustained economic growth, of the kind that went on from the beginning of the industrial revolution around 1700 to the peak of conventional oil production around 2005, is a rare anomaly in human history. It became a dominant historical force over the last three centuries because cheap abundant energy from fossil fuels could be brought into the economy at an ever-increasing rate, and it stopped because geological limits to fossil fuel extraction put further increases in energy consumption permanently out of reach. Now that fossil fuels are neither cheap nor abundant, and the quest for new energy sources vast and concentrated enough to replace them has repeatedly drawn a blank, we face several centuries of sustained economic contraction—which means that what until recently counted as the ground rules of economics have just been turned on their head.
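
The arithmetic behind that reversal is simple enough to sketch. The rates below are illustrative assumptions, not forecasts: a portfolio compounding at the 8% a year that Wall Street culture treats as normal, set against an economy gently contracting at 1% a year. The paper claims grow without limit; the real wealth supposed to back them shrinks.

```python
# Compound-growth arithmetic behind the point above. Both rates are
# illustrative assumptions, not forecasts.

def compound(principal, rate, years):
    """Value of principal after compounding at the given annual rate."""
    return principal * (1.0 + rate) ** years

for years in (10, 25, 50, 100):
    paper_claims = compound(1.0, 0.08, years)   # an 8%/yr "good return"
    real_economy = compound(1.0, -0.01, years)  # a gently contracting economy
    print(f"{years:3d} years: paper claims x{paper_claims:8.1f}, real economy x{real_economy:5.2f}")
```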

You will not find many people on Wall Street capable of grasping this. The burden of an outdated but emotionally compelling economic orthodoxy, to say nothing of a corporate and class culture that accords economic growth the sort of unquestioned aura of goodness other cultures assign to their gods, makes the end of growth and the coming of permanent economic decline unthinkable to the financial industry, or for that matter to the millions of people in the industrial world who rely on investments to pay their bills. There’s a strong temptation to assume that those 8% per annum returns must still be out there, and when something shows up that appears to embody that hope, plenty of people are willing to rush into it and leave the hard questions for later. Equally, of course, the gap thus opened between expectations and reality quickly becomes a happy hunting ground for scoundrels of every stripe.

Vigorous enforcement of the securities laws might be able to stop the resulting spiral into a permanent bubble-and-bust economy. For all the partisan bickering in Washington DC, though, a firm bipartisan consensus since the days of George W. Bush has placed even Wall Street’s most monumental acts of piracy above the reach of the law. The Bush and Obama administrations both went out of their way to turn a blind eye toward the housing bubble’s spectacular frauds, and there’s no reason to think Obama’s appointees in the Justice Department will get around to doing their jobs this time either. Once the imminent shale bust comes and goes, in other words, it’s a safe bet that there will be more bubbles, each one propping up the otherwise dismal prospects of the financial industry for a little while, and then delivering another body blow to the economies of America and the world as it bursts.

This isn’t merely a problem for those who have investments, or those whose jobs depend in one way or another on the services the financial industry provides when it’s not too busy committing securities fraud to get around to it. The coming of a permanent bubble-and-bust economy puts a full stop at the end of any remaining prospect for even the most tentative national transition away from our current state of dependence on fossil fuels. Pick a project, any project, from so sensible a step as rebuilding the nation’s long-neglected railroads all the way to such pie-in-the-sky vaporware as solar power satellites, and it’s going to take plenty of investment capital. If it’s to be done on any scale, furthermore, we’re talking about a period of decades in which more capital every year will have to flow into the project.

The transition to a bubble-and-bust economy makes that impossible. Bubbles last for an average of three years or so, so even if the bubble-blowers on Wall Street happen by accident on some project that might actually help, it will hardly have time to get started before the bubble turns to bust, the people who invested in the project get burned, and the whole thing tumbles down into disillusionment and bankruptcy. If past experience is anything to go by, furthermore, most of the money thus raised will be diverted from useful purposes into the absurd bonuses and salaries bankers and brokers think society owes them for their services.

Over the longer run, a repeated drumbeat of failed investments and unpunished fraud puts the entire system of investment itself at risk. The trust that leads people to invest their assets, rather than hiding them in a hole in the ground, is a commons; like any commons, it can be destroyed by abuse; and since the federal government has abandoned its statutory duty to protect that commons by enforcing laws against securities fraud, a classic tragedy of the commons is the most likely outcome, wrecking the system by which our society directs surplus wealth toward productive uses and putting any collective response to the end of the fossil fuel age permanently out of reach.

 

 

All these are crucial issues. Still, there’s a second point of more immediate importance. I don’t think anybody knows exactly how big the shale bubble has become, but it’s been one of Wall Street’s few really large profit centers over the last three years.  It’s quite possible that the bubble is large enough to cause a major financial panic when it bursts, and send the United States and the world down into yet another sharp economic downturn.  As Yogi Berra famously pointed out, it’s tough to make predictions, especially about the future; still, I don’t think it’s out of place to suggest that sensible preparations for hard times might be wise just now, and if any of my readers happen to have anything invested in the shale or financial industries, I’d encourage them to consider other options in the fairly near term.

 

The End of the Shale Bubble?

[John Michael Greer]

Written by testudoetlepus

March 4th, 2013 at 2:19 pm

The Center Cannot Hold

without comments

by John Michael Greer

When William Butler Yeats put the phrase I’ve used as the title for this week’s post into the powerful and prescient verses of “The Second Coming,” he had deeper issues in mind than the crisis of power in a declining American empire. Still, the image is anything but irrelevant here; the political evolution of the United States over the last century has concentrated so many of the responsibilities of government in Washington DC that the entire American system is beginning to crack under the strain.

This is admittedly not the way you’ll hear the centralization of power in America discussed by those few voices in our national conversation who discuss it at all. On the one hand are the proponents of centralized power, who insist that leaving any decision at all in the hands of state or local authorities is tantamount to handing it over to their bogeyman du jour—whether that amounts to the bedsheet-bedecked Southern crackers who populate the hate speech of the left, say, or the more diverse gallery of stereotypes that plays a similar role on the right. On the other hand are those who insist that the centralization of power in America is the harbinger of a totalitarian future that will show up George Orwell as an incurable optimist.

I’ve already talked in a number of previous posts about the problems with this sort of thinking, with its flattening out of the complexities of contemporary politics into an opposition between warm fuzzy feelings and cold prickly ones. To pursue the point a little further, I’d like to offer two unpopular predictions about the future of American government. The first is that the centralization of power in Washington DC has almost certainly reached its peak, and will be reversing in the decades ahead of us. The second is that, although there will inevitably be downsides to that reversal, it will turn out by and large to be an improvement over the system we have today. These predictions unfold from a common logic; both are consequences of the inevitable failure of overcentralized power.

It’s easy to get caught up in abstractions here, and even easier to fall into circular arguments around the functions of political power that attract most of the attention these days—for example, the power to make war. I’ll be getting to this latter a bit further on in this post, but I want to start with a function of government slightly less vexed by misunderstandings. The one I have in mind is education.

In the United States, for a couple of centuries now, the provision of free public education for children has been one of the central functions of government. Until fairly recently, in most of the country, it operated in a distinctive way. Under legal frameworks established by each state, local school districts were organized by the local residents, who also voted to tax themselves to pay the costs of building and running schools. Each district was managed by a school board, elected by the local residents, and had extensive authority over the school district’s operations.

In most parts of the country, school districts weren’t subsets of city, township, or county governments, or answerable to them; they were single-purpose independent governments on a very small scale, loosely supervised by the state and much more closely watched by the local voters. On the state level, a superintendent of schools or a state board of education, elected by the state’s voters, had a modest staff to carry out the very limited duties of oversight and enforcement assigned by the state legislature. On the federal level, a bureaucracy not much larger supervised the state boards of education, and conducted the even more limited duties assigned it by Congress.

Two results of that system deserve notice. First of all, since individual school districts were allowed to set standards, choose textbooks, and manage their own affairs, there was a great deal of diversity in American education. While reading, writing, and ’rithmetic formed the hard backbone of the school day, and such other standards as history and geography inevitably got a look in as well, what else a given school taught was as varied as local decisions could make it. What the local schools put in the curriculum was up to the school board and, ultimately, to the voters, who could always elect a reform slate to the school board if they didn’t like what was being taught.

Second, the system as a whole gave America a level of public literacy and general education that was second to none in the industrial world, and far surpassed the poor performance of the far more lavishly funded education system the United States has today. In a previous post, I encouraged readers to compare the Lincoln-Douglas debates of 1858 to the debates in our latest presidential contest, and to remember that most of the people who listened attentively to Lincoln and Douglas had what then counted as an eighth-grade education. The comparison has plenty to say about the degeneration of political thinking in modern America, but it has even more to say about the extent to which the decline in public education has left voters unprepared to get past the soundbite level of thinking.

Those of my readers who want an even more cogent example are encouraged to leaf through a high school textbook from before the Second World War. You’ll find that the reading comprehension, reasoning ability, and mathematical skill expected as a matter of course from ninth-graders in 1930 is hard to find among American college graduates today. If you have kids of high school age, spend half an hour comparing the old textbook with the one your children are using today. You might even consider taking the time to work through a few of the assignments in the old textbook yourself.

Plenty of factors have had a role in the dumbing-down process that gave us our current failed system of education, to be sure, but I’d like to suggest that the centralization of power over the nation’s educational system in a few federal bureaucracies played a crucial role. To see how this works, again, a specific example is useful. Let’s imagine a child in an elementary school in Lincoln, Nebraska, who is learning how to read. Ask yourself this: of all the people concerned with her education, which ones are able to help that individual child tackle the daunting task of figuring out how to transform squiggles of ink into words in her mind?

The list is fairly small, and her teacher and her parents belong at the top of it. Below them are a few others: a teacher’s aide if her classroom has one, an older sibling, a friend who has already managed to learn the trick. Everyone else involved is limited to helping these people do their job. Their support can make that job somewhat easier—for example, by making sure that the child has books, by seeing to it that the classroom is safe and clean, and so on—but they can’t teach reading. Each supporting role has supporting roles of its own; thus the district’s purchasing staff, who keep the school stocked with textbooks, depend on textbook publishers and distributors, and so on. Still, the further you go from the child trying to figure out that C-A-T means “cat,” the less effect any action has on her learning process.

Now let’s zoom back 1200 miles or so to Washington DC and the federal Department of Education. It’s a smallish federal bureaucracy, which means that in the last year for which I was able to find statistics, 2011, it spent around $71 billion. Like many other federal bureaucracies, its existence is illegal. I mean that quite literally; the US constitution assigns the federal government a fairly limited range of functions, and “those powers necessary and convenient” to exercise them; by no stretch of the imagination can managing the nation’s public schools be squeezed into those limits. Only the Supreme Court’s embarrassingly supine response to federal power grabs during most of the twentieth century allows the department to exist at all.

So we have a technically illegal bureaucracy running through $71 billion of the taxpayers’ money in a year, which is arguably not a good start. The question I want to raise, though, is this: what can the staff of the Department of Education do that will have any positive impact on that child in the classroom in Lincoln, Nebraska? They can’t teach the child themselves; they can’t fill any of the supporting roles that make it possible for the child to be taught. They’re 1200 miles away, enacting policies that apply to every child in every classroom, irrespective of local conditions, individual needs, or any of the other factors that make teaching a child to read different from stamping out identical zinc bushings.

There are a few—a very few—things that can usefully be done for education at the national level. One of them is to make sure that the child in Lincoln is not denied equal access to education because of her gender, her skin color, or the like. Another is to provide the sort of overall supervision to state boards of education that state boards of education traditionally provided to local school boards. There are a few other things that belong on the same list. All of them can be described, to go back to a set of ideas I sketched out a couple of weeks ago, as measures to maintain the commons.

Public education is a commons. The costs are borne by the community as a whole, while the benefits go to individuals: the children who get educated, the parents who don’t have to carry all the costs of their children’s education, the employers who don’t have to carry all the costs of training employees, and so on. Like any other commons, this one is vulnerable to exploitation when it’s not managed intelligently, and like most commons in today’s America, this one has taken quite a bit of abuse lately, with the usual consequences. What makes this situation interesting, in something like the sense of the apocryphal Chinese proverb, is that the way the commons of public education is being managed has become the principal force wrecking the commons.

The problem here is precisely that of centralization. The research for which economist Elinor Ostrom won her Nobel Prize a few years back showed that, by and large, effective management of a commons is a grassroots affair; those who will be most directly affected by the way the commons is managed are also its best managers. The more distance between the managers and the commons they manage, the more likely failure becomes, because two factors essential to successful management simply aren’t there. The first of them is immediate access to information about how management policies are working, or not working, so that those policies can be adjusted immediately if they go wrong; the second is a personal stake in the outcome, so that the managers have the motivation to recognize when a mistake has been made, rather than allowing the psychology of previous investment to seduce them into pursuing a failed policy right into the ground.

Those two factors don’t function in an overcentralized system. Politicians and bureaucrats don’t get to see the consequences of their failed decisions up close, and they don’t have any motivation to admit that they were wrong and pursue new policies—quite the contrary, in fact. Consider, for example, the impact of the No Child Left Behind (NCLB) Act, pushed through Congress by bipartisan majorities and signed with much hoopla by George W. Bush in 2002. In the name of accountability—a term that in practice means “finding someone to punish”—the NCLB Act requires mandatory standardized testing at specific grade levels, and requires every year’s scores to be higher than the previous year’s, in every school in the nation. Teachers and schools that fail to accomplish this face draconian penalties.

My readers may be interested to know that next year, by law, every child in America must perform at or above grade level. It’s reminiscent of the imaginary town of Lake Wobegon—“where all the children are above average”—except that this is no joke; what’s left of America’s public education system is being shredded by the efforts of teachers and administrators to save their jobs in a collapsing economy, by teaching to the tests and gaming the system, under the pressure of increasingly unreal mandates from Washington DC. Standardized test scores have risen slightly; meaningful measures of literacy, numeracy, and other real-world skills have continued to move raggedly downward, and you can bet that the only response anybody in Washington is going to be willing to discuss is yet another round of federal mandates, most likely even more punitive and less effective than the current set.

Though I’ve used education as an example, nearly every part of American life is pervaded by the same failed logic of overcentralization. Another example? Consider the Obama administration’s giddy pursuit of national security via drone attacks. As currently operated, Predator drones are the ne plus ultra in centralized warfare; each drone attack has to be authorized by Obama himself, the drone is piloted via satellite link from a base in Nevada, and you can apparently sit in the situation room in the White House and watch the whole thing live. Hundreds of people have been blown to kingdom come by these attacks so far, in the name of a war on terror that Obama’s party used to denounce.

Now of course that habit only makes sense if you’re willing to define young children and wedding party attendees as terrorists, which seems a little extreme to me. Leaving that aside, though, there’s a question that needs to be asked: is it working? Since none of the areas under attack are any less full of anti-American insurgents than they have been, and the jihadi movement has been able to expand its war dramatically in recent weeks into Libya and Mali, the answer is pretty clearly no. However technically superlative the drones themselves are, the information that guides them comes via the notoriously static-filled channels of intelligence collection and analysis, and the decision to use them takes place in the even less certain realms of tactics and strategy; nor is it exactly bright, if you want to dissuade people from seeking out Americans and killing them, to go around vaporizing people nearly at random in parts of the world where avenging the murder of a family member is a sacred duty.

In both cases, and plenty of others like them, we have other alternatives, but all of them require the recognition that the best response to a failed policy isn’t a double helping of the same. That recognition is nowhere in our collective conversation at the moment. It would be useful if more of us were to make an effort to put it there, but there’s another factor in play. The center really cannot hold, and as it gives way, a great many of today’s political deadlocks will give way with it.

Eliot Wigginton, the teacher in rural Georgia who founded the Foxfire project and thus offered the rest of us an elegant example of what can happen when a purely local educational venture is given the freedom to flower and bear fruit, used to say that the word “learn” is properly spelled F-A-I-L. That’s a reading lesson worth taking to heart, if only because we’re going to have some world-class chances to make use of it in the years ahead. One of the few good things about really bad policies is that they’re self-limiting; sooner or later, a system that insists on embracing them is going to crash and burn, and once the rubble has stopped bouncing and the smoke clears away, it’s not too hard for the people standing around the crater to recognize that something has gone very wrong. In that period of clarity, it’s possible for a great many changes to be made, especially if there are clear alternatives available and people advocating for them.

In the great crises that ended each of America’s three previous rounds of anacyclosis—in 1776, in 1861, and in 1933—a great many possibilities that had been unattainable due to the gridlocked politics of the previous generation suddenly came within reach. In those past crises, the United States was an expanding nation, geographically, economically, and in terms of its ability to project power in the world; the crisis immediately ahead bids fair to arrive in the early stages of the ensuing contraction. That difference has important effects on the nature of the changes before us.

Centralized power is costly—in money, in energy, in every other kind of resource. Decentralized systems are much cheaper. In the days when the United States was mostly an agrarian society, and the extravagant abundance made possible by a global empire and reckless depletion of natural resources had not yet arrived, the profoundly localized educational system I sketched out earlier was popular because it was affordable. Even a poor community could count on being able to scrape together the political will and the money to establish a school district, even if that meant a one-room schoolhouse with one teacher taking twenty-odd children a day through grades one through eight. That the level of education that routinely came out of such one-room schoolhouses was measurably better than that provided by today’s multimillion-dollar school budgets is just one more irony in the fire.

 

 

On the downside of America’s trajectory, as we descend from empire toward whatever society we can manage to afford within the stringent limits of a troubled biosphere and a planet stripped of most of its nonrenewable resources, local systems of the one-room schoolhouse variety are much more likely to be an option than centralized systems of the sort we have today. That shift toward the affordably local will have many more consequences; I plan on exploring another of them next week.

 

The Center Cannot Hold

[John Michael Greer]

Written by testudoetlepus

February 7th, 2013 at 8:02 pm

We Don’t Live In Neverland

without comments

by John Michael Greer

The return to an older American concept of government as the guarantor of the national commons, the theme of last week’s post here on The Archdruid Report, is to my mind one of the crucial steps that might just succeed in making a viable future for the post-imperial United States. A viable future, mind you, does not mean one in which any significant number of Americans retain any significant fraction of the material abundance we currently get from the “wealth pump” of our global empire. The delusion that we can still live like citizens of an imperial power when the empire has gone away will be enormously popular, not least among those who currently insist they want nothing to do with the imperial system that guarantees their prosperity, but it’s still a delusion.

The end of American empire, it deserves repeating, means the end of a system in which the five per cent of humanity that live in the United States get to dispose of a quarter of the planet’s energy and a third of its raw materials and industrial product. Even if the fossil fuels that undergird the industrial product weren’t depleting out of existence—and of course they are—the rebalancing of global wealth driven by the decline of one empire and the rise of another will involve massive and often traumatic impacts, especially for those who have been living high on the hog under the current system and will have to get used to a much smaller portion of the world’s wealth in the years immediately ahead. Yes, dear reader, if you live in the United States or its inner circle of allies—Canada, Britain, Australia, Japan, and a few others—this means you.

I want to stress this point, because habits of thought already discussed in this sequence of posts make it remarkably difficult for most Americans to think about a future that isn’t either all warm fuzzy or all cold prickly. If an imagined future is supposed to be better than the one we’ve got, according to these habits of thought, it has to be better in every imaginable way, and if it’s worse, it has to be worse just as uniformly. Suggest that the United States might go hurtling down the far side of its imperial trajectory and come out of the process as a Third World nation, as I’ve done here, and you can count on blank incomprehension or self-righteous anger if you go on to suggest that the nation that comes out the other side of this process might still be able to provide a range of basic social goods to its citizens, and might even recover some of the values it lost a century ago in the course of its headlong rush to empire.

Now in fact I’m going to suggest this, and indeed I’ve already sketched out some of the steps that individual Americans might choose to take to lay the foundations for that project. Still, it’s also worth noting that the same illogic shapes the other end of the spectrum of possible futures. These days, if you pick up a book offering a vision of a better future or a strategy to get there, it’s usually a safe bet that you can read the thing from cover to cover without finding a single reference to any downsides, drawbacks, or tradeoffs that might be involved in pursuing the vision or enacting the strategy. Since every action in the real world has downsides, drawbacks, and tradeoffs, this is not exactly a minor omission, nor does the blithe insistence on ignoring such little details offer any reason to feel confident that the visions and strategies will actually work as advertised.

One example in particular comes to mind here, because it has immediate relevance to the project of this series of posts. Those of my readers who have been following the peak oil scene for any length of time will have encountered any number of enthusiastic discussions of relocalization: the process, that is, of disconnecting from the vast and extravagant global networks of production, consumption, and control that define so much of industrial society, in order to restore or reinvent local systems that will be more resilient in the face of energy shortages and other disruptions, and provide more security and more autonomy to those who embrace them.

A very good case can be made for this strategy. On the one hand, the extreme centralization of the global economy has become a source of massive vulnerabilities straight across the spectrum from the most abstract realms of high finance right down to the sprawling corporate structures that put food on your table. Shortfalls of every kind, from grain and fuel to financial capital, are becoming a daily reality for many people around the world as soaring energy costs put a galaxy of direct and indirect pressures on brittle and overextended systems. That’s only going to become worse as petroleum reserves and other vital resources continue to deplete. As this process continues, ways of getting access to necessities that are deliberately disconnected from the global economic system, and thus less subject to its vulnerabilities, are going to be well worth having in place.

At the same time, participation in the global economy brings with it vulnerabilities of another kind. For anyone who has to depend for their daily survival on the functioning of a vast industrial structure which is not answerable to the average citizen, talk about personal autonomy is little more than a bad joke, and the ability of communities to make their own choices and seek their own futures in such a context is simply another form of wishful thinking. Many people involved in efforts to relocalize have grasped this, and believe that deliberately standing aside from systems controlled by national governments and multinational corporations offers one of the few options for regaining personal and community autonomy in the face of an increasingly troubled future.

There are more points that can be made in favor of relocalization schemes, and you can find them rehashed endlessly on pro-relocalization websites all over the internet. For our present purposes, though, this fast tour of the upside will do, because each of these arguments comes with its own downside, which by and large you won’t find mentioned anywhere on those same websites.

The downside to the first argument? When you step out of the global economy, you cut yourself off from the imperial wealth pump that provides people in America with the kind of abundance they take for granted, and the lifestyles that are available in the absence of that wealth pump are far more restricted, and far more impoverished, than most would-be relocalizers like to think. Peasant cultures around the world are by and large cultures of poverty, and there’s a good reason for that: by the time you, your family, and the other people of your village have provided food on the table, thatch on the roof, a few necessary possessions, and enough of the local equivalent of cash to cover payments to the powers that be, whether those happen to be feudal magnates or the local property tax collector, you’ve just accounted for every minute of labor you can squeeze out of a day.

That’s the rock on which the back-to-the-land movement of the Sixties broke; the life of a full-time peasant farmer scratching a living out of the soil is viable, and it may even be rewarding, but it’s not the kind of life that the pampered youth of the Baby Boom era was willing to put up with for more than a fairly brief interval. It may well be that economic relocalization is still the best available option for dealing with the ongoing unraveling of the industrial economy—in fact, I’d agree that this is the case—but I wonder how many of its proponents have grappled with the fact that what they’re proposing may amount to no more than a way to starve with dignity while many others are starving without it.

The downside to the second argument is subtler, but in some ways even more revealing. The best way to grasp it is to imagine two relocalization projects, one in Massachusetts and the other in South Carolina. The people in both groups are enthusiastic about the prospect of regaining their personal autonomy from the faceless institutions of a centralized society, and just as eager to bring back home to their own communities the power to make choices and pursue a better future. Now ask yourself this: what will these two groups do if they get that power? And what will the people in Massachusetts think about what the people in South Carolina will do once they get that power?

I’ve conducted a modest experiment of sorts along these lines, by reminding relocalization fans in blue states what people in red states are likely to do with the renewed local autonomy the people in the blue states want for themselves, and vice versa. Every so often, to be sure, I run across someone—more often on the red side of the line than on the blue one—whose response amounts to “let ‘em do what they want, so long as they let us do what we want.” Far more often, though, people on either side are horrified to realize that their opposite numbers on the other side of America’s widening cultural divide would use relocalization to enact their own ideals in their own communities.

More than once, in fact, the response has amounted to a flurry of proposals to hedge relocalization about with restrictions so that it can only be used to support the speaker’s own political and social agendas, with federal bureaucracies hovering over every relocalizing community, ready to pounce on any sign that a community might try to do something that would offend sensibilities in Boston or San Francisco, on the one hand, or the Bible Belt on the other. You might think, dear reader, that it would be obvious that this would be relocalization in name only; you might also think that it would be just as obvious that those same bureaucracies would fall promptly into the hands of the same economic and political interests that have made the current system as much of a mess as it is. Permit me to assure you that in my experience, among a certain segment of the people who like to talk about relocalization, these things are apparently not obvious at all.

By this point in the discussion, I suspect most of my readers have come to believe that I’m opposed to relocalization schemes. Quite the contrary, I think they’re among the best options we have, and the fact that they have significant downsides, drawbacks, and tradeoffs does not nullify that. Every possible strategy, again, has downsides, drawbacks, and tradeoffs; whatever we choose to do to face the onset of the Long Descent, as individuals, as communities, or as a nation, problems are going to ensue and people are going to get hurt. Trying to find an option that has no downsides simply guarantees that we will do nothing at all; and in that case, equally, problems are going to ensue and people are going to get hurt. That’s how things work in the real world—and it may be worth reminding my readers that we don’t live in Neverland.

Thus I’d like to suggest that a movement toward relocalization is another crucial ingredient of a viable post-imperial America. In point of fact, we’ve got the structures in place to do the thing already; the only thing that’s lacking is a willingness to push back, hard, against certain dubious habits in the US political system that have rendered those structures inoperative.

Back in 1787, when the US constitution was written, the cultural differences between Massachusetts and South Carolina were very nearly as sweeping as they are today. That’s one of the reasons why the constitution as written left most internal matters in the hands of the individual states, and assigned to the federal government only those functions that concerned the national commons as a whole: war, foreign policy, minting money, interstate trade, postal services, and a few other things. The list was expanded in a modest way before the rush to empire, so that public health and civil rights, for example, were brought under federal supervision over the course of the 19th century. Under the theory of government I described last week, these were reasonable extensions, since they permitted the federal government to exercise its function of securing the national commons.

Everything else remained in the hands of the states and the people. In fact, the tenth amendment to the US constitution specifically requires that any power not granted to the federal government in so many words be left to the states and the people—a principle which, perhaps not surprisingly, has been roundly ignored by everyone in Washington DC for most of a century now. Under the constitution and its first nineteen amendments, in fact, the states were very nearly separate countries that happened to have an army, navy, foreign policy, and postal system in common.

Did that system have problems? You bet. What rights you had and what benefits you could expect as a citizen depended to a huge extent on where you lived—not just which state, but very often which county and which township or city as well. Whole classes of citizens might be deprived of their rights or the protection of the laws by local politicians or the majorities that backed them, and abuses of power were pervasive. All of that sounds pretty dreadful, until you remember that the centralization of power that came with America’s pursuit of empire didn’t abolish any of those things; it simply moved them to a national level. Nowadays, serving the interests of the rich and influential at the expense of the public good is the job of the federal government, rather than the local sheriff, and the denial of civil rights and due process that used to be restricted to specific ethnic and economic subgroups within American society now gets applied much more broadly.

Furthermore, one of the things that’s rendered the US government all but incapable of taking any positive action at all in the face of a widening spiral of crises is precisely the insistence, by people in Massachusetts, South Carolina, and the other forty-eight states as well, that their local views and values ought to be the basis of national policy. The rhetoric that results, in tones variously angry and plaintive, amounts to “Why can’t everyone else be reasonable and do it my way?”—which is not a good basis for the spirit of compromise necessary to the functioning of democracy, though it makes life easy for advocacy groups who want to shake down the citizenry for another round of donations to pay for the never-ending fight.

One of the few things that might succeed in unsticking the gridlock, so that the federal government could get back to doing the job it’s supposed to do, would be to let the people in Massachusetts, South Carolina, and the other forty-eight states pursue the social policies they prefer on a state by state basis. Yes, that would mean that people in South Carolina would do things that outraged the people in Massachusetts, and people in Massachusetts would return the favor. Yes, it would also mean that abuses and injustices would take place. Of course abuses and injustices take place now, in both states and all the others as well, but the ones that would take place in the wake of a transfer of power over social issues back to the states would no doubt be at least a little different from the current ones.

Again, the point of relocalization schemes is not that they will solve every problem. They won’t, and in fact they will certainly cause new problems we don’t have yet. The point of relocalization schemes is that, all things considered, if they’re pursued intelligently, the problems that they will probably solve are arguably at least a little worse than the problems that they will probably cause. Does that sound like faint praise? It’s not; it’s as much as can be expected for any policy this side of Neverland, in the real world, where every solution brings new problems of its own.

Now in fact relocalization has at least two other benefits that tip the balance well into positive territory. One of them is an effect I haven’t discussed in this series of posts, and I haven’t seen covered anywhere else in the peak oil blogosphere yet; it will need a post of its own, and that will have to wait a week. The other, though, is a simple matter of resilience.

The more territory has to be governed from a single political center, all things considered, the more energy and resources will be absorbed in the process of governing. This is why, before the coming of the industrial age, nations on the scale of the present United States of America rarely existed, and when they did come into being, they generally didn’t last for more than a short time. In an age of declining energy availability and depleting resources, the maintenance costs of today’s sprawling, centralized United States government won’t be affordable for long. Devolving all nonessential functions of the central government to the individual states, as the US constitution mandates, might just cut costs to the point that some semblance of civil peace and democratic governance can hang on for the long term. That probably doesn’t seem like much to those whose eyes are fixed on fantasies of a perfect world, and who are convinced they can transform it from fantasy to reality as soon as everyone else stops being unreasonable and agrees with them. Still, it’s better than most potential outcomes available to us in the real world—and again, we don’t live in Neverland.


We Don’t Live In Neverland

[John Michael Greer]

Written by testudoetlepus

February 6th, 2013 at 12:03 am

Restoring the Commons

without comments

by John Michael Greer

The hard work of rebuilding a post-imperial America, as I suggested in last week’s post, is going to require the recovery or reinvention of many of the things this nation chucked into the dumpster with whoops of glee as it took off running in pursuit of its imperial ambitions. The basic skills of democratic process are among the things on that list; so, as I suggested last month, are the even more basic skills of learning and thinking that undergird the practice of democracy.

All that remains crucial. Still, it so happens that a remarkably large number of the other things that will need to be put back in place are all variations of a common theme. What’s more, it’s a straightforward theme—or, more precisely, would be straightforward if so many people these days weren’t busy trying to pretend that the concept at its center either doesn’t exist or doesn’t present the specific challenges that have made it so problematic in recent years. The concept in question? The mode of collective participation in the use of resources, extending from the most material to the most abstract, that goes most often these days by the name of “the commons.”

The redoubtable green philosopher Garrett Hardin played a central role decades ago in drawing attention to the phenomenon in question with his essay “The Tragedy of the Commons.” It’s a remarkable work, and it’s been rendered even more remarkable by the range of contortions engaged in by thinkers across the economic and political spectrum in their efforts to evade its conclusions. Those maneuvers have been tolerably successful; I suspect, for example, that many of my readers will recall the flurry of claims a few years back that the late Nobel Prize-winning economist Elinor Ostrom had “disproved” Hardin with her work on the sustainable management of resources.

In point of fact, she did no such thing. Hardin demonstrated in his essay that an unmanaged commons faces the risk of a vicious spiral of mismanagement that ends in the destruction of the commons; Ostrom got her Nobel, and deservedly so, by detailed and incisive analysis of the kinds of management that prevent Hardin’s tragedy of the commons from taking place. A little later in this essay, we’ll get to why those kinds of management are exactly what nobody in the mainstream of American public life wants to talk about just now; the first task at hand is to walk through the logic of Hardin’s essay and understand exactly what he was saying and why it matters.

Hardin asks us to imagine a common pasture, of the sort that was common in medieval villages across Europe. The pasture is owned by the village as a whole; each of the villagers has the right to put his cattle out to graze on the pasture. The village as a whole, however, has no claim on the milk the cows produce; that belongs to the villager who owns any given cow. The pasture is a collective resource, from which individuals are allowed to extract private profit; that’s the basic definition of a commons.

In the Middle Ages, such arrangements were common across Europe, and they worked well because they were managed by tradition, custom, and the immense pressure wielded by informal consensus in small and tightly knit communities, backed up where necessary by local manorial courts and a body of customary law that gave short shrift to the pursuit of personal advantage at the expense of others. The commons that Hardin asks us to envision, though, has no such protections in place. Imagine, he says, that one villager buys additional cows and puts them out to graze on the common pasture. Any given pasture can only support so many cows before it suffers damage; to use the jargon of the ecologist, it has a fixed carrying capacity for milk cows, and exceeding the carrying capacity will degrade the resource and lower its future carrying capacity. Assume that the new cows raise the total number of cows past what the pasture can support indefinitely, so once the new cows go onto the pasture, the pasture starts to degrade.

Notice how the benefits and costs sort themselves out. The villager with the additional cows receives all the benefit of the additional milk his new cows provide, and he receives it right away. The costs of his action, by contrast, are shared with everyone else in the village, and their impact is delayed, since it takes time for pasture to degrade. Thus, according to today’s conventional economic theories, the villager is doing the right thing. Since the milk he gets is worth more right now than the fraction of the discounted future cost of the degradation of the pasture he will eventually have to carry, he is pursuing his own economic interest in a rational manner.

The other villagers, faced with this situation, have a choice of their own to make. (We’ll assume, again, that they don’t have the option of forcing the villager with the new cows to get rid of them and return the total herd on the pasture to a level it can support indefinitely.) They can do nothing, in which case they bear the costs of the degradation of the pasture but gain nothing in return, or they can buy more cows of their own, in which case they also get more milk, but the pasture degrades even faster. According to most of today’s economic theories, the latter choice is the right one, since it allows them to maximize their own economic interest in exactly the same way as the first villager. The result of the process, though, is that a pasture that would have kept a certain number of cattle fed indefinitely is turned into a barren area of compacted subsoil that won’t support any cattle at all. The rational pursuit of individual advantage thus results in permanent impoverishment for everybody.
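The payoff asymmetry at work here is easy to put into numbers. What follows is a minimal Python sketch of the logic just described, not anything taken from Hardin’s essay itself: the villager count, herd sizes, milk values, and degradation rate are all invented for illustration.

```python
# Toy model of the unmanaged commons described above. All the numbers
# (villagers, carrying capacity, milk value, degradation rate) are
# illustrative assumptions, not figures from Hardin's essay.

VILLAGERS = 10
CARRYING_CAPACITY = 20   # cows the pasture can feed indefinitely
MILK_VALUE = 1.0         # value of one cow's yearly milk on intact pasture

def my_income(my_extra, others_extra, years=10):
    """Total milk income of one villager who adds my_extra cows while
    each of the other nine villagers adds others_extra.

    The extra milk is private and immediate; the overgrazing damage is
    shared by the whole village and delayed, since the pasture loses a
    little more of its health each year it is overstocked.
    """
    my_cows = 2 + my_extra                        # everyone starts with 2
    herd = my_cows + (VILLAGERS - 1) * (2 + others_extra)
    condition = 1.0                               # pasture health, 1.0 = intact
    income = 0.0
    for _ in range(years):
        if herd > CARRYING_CAPACITY:
            # lose 5% of remaining health per excess cow per year
            condition *= 0.95 ** (herd - CARRYING_CAPACITY)
        income += my_cows * MILK_VALUE * condition
    return income

print("all restrain:              ", round(my_income(0, 0), 2))  # 20.0
print("I defect, others restrain: ", round(my_income(1, 0), 2))  # ~22.9
print("I restrain, others defect: ", round(my_income(0, 1), 2))  # ~3.4
print("everyone defects:          ", round(my_income(1, 1), 2))  # ~4.4
```

Adding a cow pays whichever way the neighbors jump (22.9 beats 20.0, and 4.4 beats 3.4), yet a village of defectors ends up far poorer than a village that shows restraint. That asymmetry, private immediate gain set against shared delayed loss, is the whole engine of the spiral Hardin describes.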

This may seem like common sense. It is common sense, but when Hardin first published “The Tragedy of the Commons” in 1968, it went off like a bomb in the halls of academic economics. Since Adam Smith’s time, one of the most passionately held beliefs of capitalist economics has been the insistence that individuals pursuing their own economic interest without interference from government or anyone else will reliably produce the best outcome for everybody. You’ll still hear defenders of free market economics making that claim, as if nobody but the Communists ever brought it into question. That’s why very few people like to talk about Hardin’s tragedy of the commons these days; it makes it all but impossible to uphold a certain bit of popular, appealing, but dangerous nonsense.

Does this mean that the rational pursuit of individual advantage always produces negative results for everyone? Not at all. The theorists of capitalism can point to equally cogent examples in which Adam Smith’s invisible hand passes out benefits to everyone, and a case could probably be made that this happens more often than the opposite. The fact remains that the opposite does happen, not merely in theory but also in the real world, and that the consequences of the tragedy of the commons can reach far beyond the limits of a single village.

Hardin himself pointed to the destruction of the world’s oceanic fisheries by overharvesting as an example, and it’s a good one. If current trends continue, many of my readers can look forward, over the next couple of decades, to tasting the last seafood they will ever eat. A food resource that could have been managed sustainably for millennia to come is being annihilated in our lifetimes, and the logic behind it is that of the tragedy of the commons: participants in the world’s fishing industries, from giant corporations to individual boat owners and their crews, are pursuing their own economic interests, and exterminating one fishery after another in the process.

Another example? The worldwide habit of treating the atmosphere as an aerial sewer into which wastes can be dumped with impunity. Every one of my readers who burns any fossil fuel, for any purpose, benefits directly from being able to vent the waste CO2 directly into the atmosphere, rather than having to cover the costs of disposing of it in some other way. As a result of this rational pursuit of personal economic interest, there’s a very real chance that most of the world’s coastal cities will have to be abandoned to the rising oceans over the next century or so, imposing trillions of dollars of costs on the global economy.

Plenty of other examples of the same kind could be cited. At this point, though, I’d like to shift focus a bit to a different class of phenomena, and point to the Glass-Steagall Act, a piece of federal legislation that was passed by the US Congress in 1933 and repealed in 1999. The Glass-Steagall Act made it illegal for banks to engage in both consumer banking activities such as taking deposits and making loans, and investment banking activities such as issuing securities; banks had to choose one or the other. The firewall between consumer banking and investment banking was put in place because in its absence, in the years leading up to the 1929 crash, most of the banks in the country had gotten in over their heads in dubious financial deals linked to stocks and securities, and the collapse of those schemes played a massive role in bringing the national economy to the brink of total collapse.

By the 1990s, such safeguards seemed unbearably dowdy to a new generation of bankers, and after a great deal of lobbying the provisions of the Glass-Steagall Act were eliminated. Those of my readers who didn’t spend the last decade hiding under a rock know exactly what happened thereafter: banks went right back to the bad habits that got their predecessors into trouble in 1929, profited mightily in the short term, and proceeded to inflict major damage on the global economy when the inevitable crash came in 2008.

That is to say, actions performed by individuals (and those dubious “legal persons” called corporations) in the pursuit of their own private economic advantage garnered profits over the short term for those who engaged in them, but imposed long-term costs on everybody. If this sounds familiar, dear reader, it should. When individuals or corporations profit from their involvement in an activity that imposes costs on society as a whole, that activity functions as a commons, and if that commons is unmanaged the tragedy of the commons is a likely result. The American banking industry before 1933 and after 1999 functioned, and currently functions, as an unmanaged commons; between those years, it was a managed commons. While it was an unmanaged commons, it suffered from exactly the outcome Hardin’s theory predicts; when it was a managed commons, by contrast, a major cause of banking failure was kept at bay, and the banking sector was more often a source of strength than a source of weakness to the national economy.

It’s not hard to name other examples of what I suppose we could call “commons-like phenomena”—that is, activities in which the pursuit of private profit can impose serious costs on society as a whole—in contemporary America. One that bears watching these days is food safety. It is to the immediate financial advantage of businesses in the various industries that produce food for human consumption to cut costs as far as possible, even if this occasionally results in unsafe products that cause sickness and death to people who consume them; the benefits in increased profits are immediate and belong entirely to the business, while the costs of increased morbidity and mortality are borne by society as a whole, provided the company’s legal team is good enough to keep the inevitable lawsuits at bay. Once again, the asymmetry between benefits and costs produces a calculus that brings unwelcome outcomes.

The American political system, in its pre-imperial and early imperial stages, evolved a distinctive response to these challenges. The Declaration of Independence, the wellspring of American political thought, defines the purpose of government as securing the rights to life, liberty, and the pursuit of happiness. There’s more to that often-quoted phrase than meets the eye. In particular, it doesn’t mean that governments are supposed to provide anybody with life, liberty, or happiness; their job is simply to secure for their citizens certain basic rights, which may be inalienable—that is, they can’t be legally transferred to somebody else, as they could under feudal law—but are far from absolute. What citizens do with those rights is their own business, at least in theory, so long as their exercise of their rights does not interfere too drastically with the ability of others to do the same thing. The assumption, then and later, was that citizens would use their rights to seek their own advantage, by means as rational or irrational as they chose, while the national community as a whole would cover the costs of securing those rights against anyone and anything that attempted to erase them.

That is to say, the core purpose of government in the American tradition is the maintenance of the national commons. It exists to manage the various commons and commons-like phenomena that are inseparable from life in a civilized society, and thus has the power to impose such limits on people (and corporate pseudopeople) as will prevent their pursuit of personal advantage from leading to a tragedy of the commons in one way or another. Restricting the capacity of banks to gamble with depositors’ money is one such limit; restricting the freedom of manufacturers to sell unsafe food is another, and so on down the list of reasonable regulations. Beyond those necessary limits, government has no call to intervene; how people choose to live their lives, exercise their liberties, and pursue happiness is up to them, so long as it doesn’t put the survival of any part of the national commons at risk.

As far as I know, you won’t find that definition taught in any of the tiny handful of high schools that still offer civics classes to young Americans about to reach voting age. Still, it’s a neat summary of generations of political thought in pre-imperial and early imperial America. These days, by contrast, it’s rare to find this function of government even hinted at. Rather, the function of government in late imperial America is generally seen as a matter of handing out largesse of various kinds to any group organized or influential enough to elbow its way to a place at the feeding trough. Even those people who insist they are against all government entitlement programs can be counted on to scream like banshees if anything threatens those programs from which they themselves benefit; the famous placard reading “Government Hands Off My Medicare” is an embarrassingly good reflection of the attitude that most American pseudoconservatives adopt in practice, however loudly they decry government spending in theory.

A strong case can be made, though, for jettisoning the notion of government as national sugar daddy and returning to the older notion of government as guarantor of the national commons. The central argument in that case is simply that in the wake of empire, the torrents of imperial tribute that made the government largesse of the recent past possible in the first place will go away. As the United States loses the ability to command a quarter of the world’s energy supplies and a third of its natural resources and industrial product, and has to make do with the much smaller share it can expect to produce within its own borders, the feeding trough in Washington DC—not to mention its junior equivalents in the fifty state capitals, and so on down the pyramid of American government—is going to run short.

In point of fact, it’s already running short. That’s the usually unmentioned factor behind the intractable gridlock in our national politics: there isn’t enough largesse left to give every one of the pressure groups and veto blocs its accustomed share, and the pressure groups and veto blocs are responding to this unavoidable problem by jamming up the machinery of government with ever more frantic efforts to get whatever they can. That situation can only end in crisis, and probably in a crisis big enough to shatter the existing order of things in Washington DC; after the rubble stops bouncing, the next order of business will be piecing together some less gaudily corrupt way of managing the nation’s affairs.

That process of reconstruction might be furthered substantially if the pre-imperial concept of the role of government were to get a little more air time these days. I’ve spoken at quite some length here and elsewhere about the very limited contribution that grand plans and long discussions can make to an energy future that’s less grim than the one toward which we’re hurtling at the moment, and there’s a fair bit of irony in the fact that I’m about to suggest exactly the opposite conclusion with regard to the political sphere. Still, the circumstances aren’t the same. The time for talking about our energy future was decades ago, when we still had the time and the resources to get new and more sustainable energy and transportation systems in place before conventional petroleum production peaked and sent us skidding down the far side of Hubbert’s peak. That time is long past, the options remaining to us are very narrow, and another round of conversation won’t do anything worthwhile to change the course of events at this point.

That’s much less true of the political situation, because politics are subject to rules very different from the implacable mathematics of petroleum depletion and net energy. At some point in the not too distant future, the political system of the United States of America is going to tip over into explosive crisis, and at that time ideas that are simply talking points today have at least a shot at being enacted into public policy. That’s exactly what happened at the beginning of the three previous cycles of anacyclosis I traced out in a previous post in this series. In 1776, 1861, and 1933, ideas that had been on the political fringes not that many years beforehand redefined the entire political dialogue, and in all three cases this was possible because those once-fringe ideas had been widely circulated and widely discussed, even though most of the people who circulated and discussed them never imagined that they would live to see those ideas put into practice.

There are plenty of ideas about politics and society in circulation on the fringes of today’s American dialogue, to be sure. I’d like to suggest, though, that there’s a point to reviving an older, pre-imperial vision of what government can do, and ought to do, in the America of the future. A political system that envisions its role as holding an open space in which citizens can pursue their own dreams and experiment with their own lives is inherently likely to be better at dissensus than more regimented alternatives, whether those come from the left or the right—and dissensus, to return to a central theme of this blog, is the best strategy we’ve got as we move into a future where nobody can be sure of having the right answers.


Restoring the Commons

[John Michael Greer]

Written by testudoetlepus

January 25th, 2013 at 9:54 pm

The Road Down from Empire

without comments


by John Michael Greer

Here in the Appalachians, at least, there’s something about the month of January that encourages sober thoughts. Maybe it’s the weather, which is pretty reliably gray and cold; maybe it’s the arrival of the bills from the holiday season just ended, or the awkward way that those bills usually arrive about the same time that the annual crop of New Year’s resolutions start landing in the recycle bin. Pick your reason, but one way or another it seems like a good time to circle back and finish up the theme I’ve been developing here for most of a year now, the decline and fall of America’s global empire and the difficult task of rebuilding something worthwhile in its wake.

The hard work of reinventing democracy in a post-imperial America, the subject of several of last month’s posts, is only one facet of this broader challenge. I’ve mentioned before that the pursuit of empire is a drug, and like most other drugs, it makes you feel great at the time and then wallops you the next morning. It’s been just over a hundred years now since the United States launched itself on its path to global empire, and the hangover that was made inevitable by that century-long bender is waiting in the wings. I suspect one of the reasons the US government is frantically going through the empties in the trash, looking for one that still has a few sips left in it, is precisely that first dim dawning awareness of just how bad the hangover is going to be.

It’s worth taking a few moments to go over some of the more visible signposts of the road down from empire. To begin with, the US economy has been crippled by a century of imperial tribute flowing in from overseas. That’s what happened to our manufacturing sector; once the rest of the industrial world recovered from the Second World War, manufacturers in an inflated tribute economy couldn’t compete with the lower costs of factories in less extravagantly overfunded parts of the world, and America’s industrial heartland turned into the Rust Belt. As the impact of the tribute economy spread throughout US society, in turn, it became next to impossible to make a living doing anything productive, and gaming the imperial system in one way or another—banking, investment, government contracts, you name it—turned into the country’s sole consistent growth industry.

That imposed distortions on every aspect of American society, which bid fair to cripple its ability to pick up the pieces when the empire goes away. As productive economic sectors withered, the country’s educational system reoriented itself toward the unproductive, churning out an ever-expanding range of administrative specialties for corporations and government while shutting down what was once a world-class system of vocational and trade schools. We now have far more office fauna than any sane society needs, and a drastic shortage of people who have any less abstract skill set. For the time being, we can afford to offshore jobs, or import people from other countries to do them at substandard wages; as our empire winds down and those familiar bad habits stop being possible, the shortage of Americans with even the most basic practical skills will become a massive economic burden.

Meanwhile the national infrastructure is caught in a downward spiral of malign neglect made inevitable by the cash crunch that always hits empires on the way down. Empire is an expensive habit; the long-term effects of the imperial wealth pump on those nations subjected to its business end mean that the income from imperial arrangements goes down over time, while the impact of the tribute economy at home generally causes the costs of empire to go up over time. The result can be seen on Capitol Hill day by day, as one fantastically expensive weapons system after another sails through Congress with few dissenting votes, while critically important domestic programs are gutted by bipartisan agreement, or bog down in endless bickering. The reliable result is a shell of a nation, seemingly strong when observed from outside but hollowing out within, and waiting for the statistically inevitable shove that will launch it on its final skid down the rough slope into history’s compost bin.

You may well be thinking, dear reader, that the logical response of a nation caught in a predicament of this sort would be to bite the bullet, back away from empire in a deliberate fashion, and use the last bit of income from the tribute economy to pay for the expenses of rebuilding a domestic economy of a more normal kind. You’d be right, too, but there are compelling reasons why very few empires in history have had the great good sense to manage their decline in this manner. Imperial China did it in the fifteenth century, scrapping a burgeoning maritime empire in the Indian Ocean, and of course Britain did it after 1945, though that was largely because a 500-pound gorilla named the United States was sitting on Britannia’s prostrate body, informing her politely that in future, the global empire would be American, thank you very much; other than that, examples are few and far between.

The logic here is easy to follow. Any attempt to withdraw from imperial commitments will face concerted resistance from those who profit from the status quo, while those who claim to oppose empire are rarely willing to keep supporting a policy of imperial retreat once it turns out, as it inevitably does, that the costs of that policy will include a direct impact on their own incomes or the value of their investments. Thus politicians who back a policy of withdrawal from empire can count on being pilloried by their opponents as traitors to their country, and abandoned by erstwhile allies who dislike empire in the abstract but want to retain lifestyles that only an imperial tribute economy can support. Since politicians are, after all, in the business of getting into office and staying there, their enthusiasm for such self-sacrificing policies is understandably limited.

The usual result is a frantic effort to kick the can as far as possible down the road, so that somebody else has to deal with it. Most of what’s going on in Washington DC these days can be described very exactly in those terms. Despite popular rhetoric, America’s politicians these days are not unusually wicked or ignorant; they are, by and large, roughly as ethical as their constituents, and rather better educated—though admittedly neither of these is saying much. What distinguishes them from the statesmen of an earlier era, rather, is that they are face to face with an insoluble dilemma that their predecessors in office spent the last few decades trying to ignore. As the costs of empire rise, the profits of empire dwindle, the national economy circles the drain, the burden of deferred maintenance on the nation’s infrastructure grows, and the impact of the limits to growth on industrial civilization worldwide becomes ever harder to evade, they face the unenviable choice between massive trouble now and even more massive trouble later; being human, they repeatedly choose the latter, and console themselves with the empty hope that something might turn up.

It’s a common hope these days. I’ve commented here more than once about the way that the Rapture, the Singularity, and all the other apocalyptic fantasies on offer these days serve primarily as a means by which people can pretend to themselves that the future they’re going to get isn’t the one that their actions and evasions are busily creating for them. The same is true of a great many less gaudy fictions about the future—the much-ballyhooed breakthroughs that never quite get around to happening, the would-be mass movements that never attract anyone but the usual handful of activists, the great though usually unspecified leaps in consciousness that will allegedly happen any day now, and all the rest of it. The current frenzy of meretricious twaddle in the media about how shale gas is going to make the US a net energy exporter gets a good share of its impetus from the same delusive hope—though admittedly the fact that a great many people have invested a great deal of money in companies in the fracking business, and are trying to justify their investments using the same sort of reasoning that boosted the late housing bubble, also has more than a little to do with it.

There’s likely to be plenty more of the same thing in the decades ahead. Social psychologists have written at length about what James Howard Kunstler has usefully termed the psychology of previous investment, the process by which people convince themselves to throw good money after bad, or to remain committed to a belief system even though all available evidence demonstrates that it isn’t true and doesn’t work. The critical factor in such cases is the emotional cost of admitting that the decision to buy the stock, adopt the belief system, or make whatever other mistake is at issue, was in fact a mistake. The more painful it is to make that admission, the more forcefully most people will turn away from the necessity to do so, and it’s safe to assume that they’ll embrace the most consummate malarkey if doing so allows them to insist to themselves that the mistake wasn’t a mistake after all.

As America stumbles down from its imperial peak, in other words, the one growth industry this country will have left will consist of efforts to maintain the pretense that America doesn’t have an empire, that the empire isn’t falling, and that the fall doesn’t matter anyway. (Yes, those statements are mutually contradictory. Get used to it; you’ll be hearing plenty of statements in the years to come that are even more incoherent.) As the decline accelerates, anyone who offers Americans a narrative that allows them to pretend they’ll get the shiny new future that our national mythology promises them will be able to count on a large and enthusiastic audience. The narratives being marketed for this purpose need not be convincing; they need not even be sane. So long as they make it possible for Americans to maintain the fiction of a brighter future in the teeth of the facts, they’ll be popular.

The one bit of hope I can offer here is that such efforts at collective make-believe don’t last forever. Sooner or later, the fact of decline will be admitted and, later still, accepted; sooner or later, our collective conversation will shift from how America can maintain perpetual growth to how America can hold onto what it has, then to how America can recover some of what it lost, and from there to figuring out how America—or whatever grab bag of successor societies occupies the territory currently held by the United States—can get by in the harsh new deindustrial world that grew up around it while nobody was looking. It’s a normal process in an age of decline, and can be traced in the literature of more than one civilization before ours.

It bears remembering, though, that individuals are going through the same process of redefinition all by themselves. This process differs from the five stages of peak oil, which I’ve discussed elsewhere, in that it’s not primarily about the emotional impact of loss; it’s a matter of expectations, and of the most pragmatic sort of economic expectations at that. Consider a midlevel managerial employee in some corporation or other whose job, like so many other jobs these days, is about to go away forever. Before the rumors start flying, she’s concerned mostly with clawing her way up the corporate ladder and increasing her share of the perks and privileges our society currently grants to its middle classes. Then the rumors of imminent layoffs start flying, and she abruptly has to shift her focus to staying employed. The pink slips come next, bearing bad news, and her focus shifts again, to getting a new job; when that doesn’t happen and the reality of long term joblessness sinks in, a final shift of focus takes place, and she has to deal with a new and challenging world.

This has already happened to a great many people in America. It’s going to happen, over the years ahead, to a great many more—probably, all things considered, to a large majority of people in the American middle class, just as it happened to a large majority of the industrial working class a few decades further back. Not everyone, it has to be said, will survive the transition; alcoholism, drug abuse, mental and physical illness, and suicide are among the standard risks run by the downwardly mobile. A fair number of those who do survive will spend the rest of their lives clinging to the vain hope that something will happen and give them back what they lost.

It’s a long, rough road down from empire, and the losses involved are not merely material in nature. Basing one’s identity on the privileges and extravagances made possible by the current US global empire may seem like a silly thing to do, but it’s very common. To lose whatever markers of status are respected in any given social class, whether we’re talking about a private jet and a Long Island mansion, a fashionable purse and a chic condo in an upscale neighborhood, or a pickup and a six-pack, can be tantamount to losing one’s identity if that identity has no more solid foundation—and a great many marketing firms have spent decades trying to insure that most Americans never think of looking for more solid foundations.

That last point has implications we’ll be exploring in a later sequence of posts. For the time being, though, I want to talk a bit about what all this means to those of my readers who have already come to terms with the reality of decline, and are trying to figure out how to live their lives in a world in which the conventional wisdom of the last three hundred years or so has suddenly been turned on its head. The first and, in many ways, the most crucial point is one that’s been covered here repeatedly already: you are going to have to walk the road down from empire yourself. Nobody else is going to do it for you, and you can’t even assume that anybody else will make it easier for you. What you can do, to make it a little easier than it will otherwise be, is to start walking it before you have to.

That means, to return to a slogan I’ve used more than once in this blog, using LESS—Less Energy, Stuff, and Stimulation. The more energy you need to maintain your everyday lifestyle, the more vulnerable you’ll be to sudden disruptions when the sprawling infrastructure that supplies you with that energy starts running into serious trouble. Today, routine blackouts and brownouts of the electrical grid, and rationing or unpredictable availability of motor fuel, have become everyday facts of life in Third World nations that used to have relatively reliable access to energy. As America’s global empire unravels and the blowback from a century of empire comes home to roost, we can expect the same thing here. Get ready for that in advance, and you won’t face a crisis when it happens.

The same is true of the extravagant material inputs most Americans see as necessities, and of the constant stream of sensory stimulation that most Americans use to numb themselves to the unwelcome aspects of their surroundings and their lives. You will be doing without those at some point. The sooner you learn how to get by in their absence, the better off you’ll be—and the sooner you get out from under the torrent of media noise you’ve been taught to use to numb yourself, the sooner you can start assessing the world around you with a relatively clear head, and the sooner you’ll notice just how far down the arc of America’s descent we’ve already come.

Using LESS isn’t the only thing that’s worth doing in advance, of course. I’ve discussed elsewhere, for example, the need to develop the skills that will enable you to produce goods or provide services for other people, using relatively simple tools and, if at all possible, the energy of your own muscles. As the imperial tribute economy winds down and the United States loses the ability to import cheap goods and cheap labor from abroad, people will still need goods and services, and will pay for them with whatever measure of value is available—even if that amounts to their own unskilled labor. There are plenty of other steps that can be taken to prepare for life in a post-imperial society skidding down the far side of Hubbert’s peak, and the sooner you start taking those steps, the better prepared you will be to cope with that unfamiliar world.

Still, it may be possible to go further than that. In several of December’s posts here I raised the possibility that, in the wake of empire, the deliberate cultivation of certain core skills—specifically, clear reasoning, public speaking, and democratic process—might make it possible to kickstart a revival of America’s formerly vibrant democratic traditions. The same principle, I’d like to suggest, can be applied more generally. Certain core insights that were central to pre-imperial America’s more praiseworthy achievements, but were tossed into the dumpster during the rush to empire, could be revived and put back to work in the post-imperial era. If that can be done at all, it’s going to involve a lot of work and a willingness to challenge some widely held notions of contemporary American culture, but I think the attempt is worth making. We’ll begin that discussion next week.


The Road Down from Empire

[John Michael Greer]

Written by testudoetlepus

January 18th, 2013 at 3:24 pm

Into an Unknown Country

without comments

by John Michael Greer

Was it just my imagination, or was the New Year’s celebration just past even more halfhearted than those of the last few years? My wife and I welcomed 2013 with a toast, and breakfasted the next morning on the traditional good-luck foods—rice and beans, corn bread, greens and bacon—that I learned to enjoy back when I was studying old-fashioned Southern folk magic. Outside our little house, though, the midnight air seemed remarkably quiet; the whoops, horns, and firecrackers of New Years past were notable mostly by their absence, and the next day’s hush seemed less a matter of hangovers than a not unreasonable dread of what 2013 might have in store for us all.

No doubt some of that was a function of the media panic about the so-called Fiscal Cliff. The New Yorker scored a palpable hit by headlining a piece on the subject “Washington Celebrates Solving Totally Unnecessary Crisis They Created,” but there’s more to it than that. What, after all, was this “fiscal cliff”? A measure that would have repealed some of the tax breaks and hikes in Federal spending put in place since 2000, and thus reduced the annual Federal deficit by a modest amount. All that yelling, in other words, was provoked by the possibility that the US government might have to take a few steps in the direction of living within its means. If the frantic struggle to avert that outcome is any measure of the kind of statesmanship we can expect from the White House and Congress in the year to come, it’s no wonder that hiding under the mattress has so much evident appeal just now.

There’s more involved in the evident lack of enthusiasm for the new year, though, than the latest clown acts playing in the three-ring circus that is today’s Washington DC. A great many of the comforting rationalizations that have played so large a role in justifying a continued reliance on the unsustainable are wearing very thin. Consider the claims, retailed by the media at ever-increasing volume these days, that recent upturns in the rate of domestic petroleum production in the US offer a conclusive disproof to the idea of peak oil, and herald the arrival of a new age of cheap abundant fuel. Courtesy of Jim Kunstler’s latest blog post, I’d like to offer a chart of US petroleum production, from 1920 to now, that puts those claims in perspective.

See the tiny little uptick in production over there on the far right? That’s the allegedly immense rise in petroleum production that drives all the rhetoric. If that blip doesn’t look like a worldchanging event to you, dear reader, you’re getting the message. It isn’t a worldchanging event; it’s the predictable and, by the way, repeatedly predicted result of the rise in oil prices from around $30 a barrel to between three and four times that, following the 2008 spike and crash. Triple or quadruple the price of any other commodity, and sources of that commodity that weren’t economically feasible to produce at the lower price will suddenly become paying propositions, too. (Yes, that’s spelled “Bakken shale” in the present tense.) If the price of oil triples or quadruples again over the next few years, we’ll probably see another increase on the same very modest scale. That increase still won’t be a worldchanging event, though the economic impact of another round of price increases on that scale might be.
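That supply response is ordinary breakeven arithmetic, and a few invented numbers make the mechanism plain. The sketch below assumes made-up per-barrel production costs for hypothetical plays; none of the figures are real, and the names are placeholders.

```python
# Breakeven sketch of the supply response described above. The plays
# and per-barrel production costs are invented for illustration; the
# mechanism, not the figures, is the point.

PLAYS = {
    "conventional onshore": 15,   # assumed $/barrel to produce
    "deepwater offshore":   55,
    "tight (shale) oil":    70,
    "arctic frontier":     130,
}

def economic_plays(price_per_barrel):
    """Return the plays worth producing at a given oil price."""
    return [name for name, cost in PLAYS.items() if cost < price_per_barrel]

print(economic_plays(30))    # only cheap conventional oil pays
print(economic_plays(100))   # triple the price and the marginal plays
                             # suddenly pencil out: the blip on the chart
print(economic_plays(300))   # another tripling would unlock still
                             # costlier barrels, at a price the rest of
                             # the economy has to absorb
```

Each jump in price unlocks only the next, costlier tier of supply; nothing on the list brings back the cheap oil the original curve was built on.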

More generally, we’ve got a real shortage of worldchanging events just now. There are good reasons for that, just as there are equally—well, equally strong, if not equally good—reasons why so many people are pinning all their hopes on a worldchanging event of one kind or another. Therapists like to point out that if you always do what you’ve always done, you’ll always get what you’ve always gotten, and of late it’s become a truism (though it’s also a truth) that doing the same thing and expecting to get different results is a good working definition of insanity. The attempt to find some way around that harsh but inescapable logic is the force that drove the prophetic hysteria about 2012, and drives end-of-the-world delusions more generally: if the prospect of changing the way you live terrifies you, but the thought of facing the consequences of the way you live terrifies you just as much, daydreaming that some outside force will come along and change everything for you can be a convenient way to avoid having to think about the future you’re making for yourself.

With that in mind, and with an eye toward the year ahead of us, I’d like to attend to three New Year customs that haven’t gotten as much attention here on The Archdruid Report as they probably should. First, I’d like to go over my predictions for the year just finished, and see how well they did; second, I’d like to offer up some predictions for the year to come; and third, I’d like to make some suggestions for what my readers might consider doing about it all.

My 2012 predictions appeared in the first January post here last year. Here they are:

“I’d like to suggest that when we take a backwards look in the early days of 2013, we will most likely see that that’s what happened in 2012, too: a slow worsening across a wide range of trends, punctuated by localized crises and regional disasters. I’d like to predict, in fact, that when we take that backward look, the US dollar and the Euro will both still exist and be accepted as legal tender, though the Eurozone may have shed a couple of countries who probably shouldn’t have joined it in the first place; that stock markets around the world will have had another volatile year, but will still be trading. Here in the US, whoever is unlucky enough to win the 2012 presidential election will be in the middle of an ordinary transition to a new term of office; the new Congress will be gearing up for another two years of partisan gridlock; gas stations will still have gas for sale and grocery stores will be stocked with groceries; and most Americans will be making the annual transition between coping with their New Year’s hangovers and failing to live up to their New Year’s resolutions, just as though it was any other year.

“Official US statistics will no doubt insist that the unemployment rate has gone down…but the number of people out of work in the United States will likely set another all-time record; the number of people in severe economic trouble will have gone up another good-sized notch, and public health clinics will probably be seeing the first wave of malnutrition-caused illness in children. If you happen to have spent the year in one of the areas unfortunate enough to get hit by the hard edge of the increasingly unstable weather, you may have had to spend a week or two in an emergency shelter while the flood waters receded or the wreckage got hauled away, and you might even notice that less and less gets rebuilt every year.

“Unless that happens, though, or unless you happen to pay close attention to the things that don’t usually make the evening news, you may well look back in the first days of 2013 and think that business as usual is still ongoing. You’d be right, too, so long as you recognize that there’s been a stealthy change in what business as usual now means. Until the peak of world conventional petroleum production arrived in 2005, by and large, business as usual meant the continuation of economic growth. Since then, by and large, it has meant the continuation of economic decline.”

No countries left the Eurozone in 2012, and if malnutrition-caused illness in children has had a notable uptick in America, I haven’t yet heard of it. Other than that, I think it’s fair to say that I called it. I’d like to put on my sorcerer’s cap, furthermore, and gaze a little deeper into the mists of futurity; I thus predict that just as 2012 looked like a remake of 2011 a little further down the curve of decline, 2013 will look a good deal like 2012, but with further worsening along the same broad array of trends and yet another round of local crises and regional disasters. The number of billion-dollar weather disasters will tick up further, as will the number of Americans who have no job—though, to be sure, the official unemployment rate and other economic statistics will be gimmicked then as now. The US dollar, the Euro, and the world’s stock markets will still be in business at year’s end, and there will still be gas for sale in gas stations, groceries for sale in grocery stores, and more people interested in the Super Bowl than in global warming or peak oil, as 2013 gives way to 2014.

As the year unfolds, I’d encourage my readers to watch the fracking bubble. Yes, it’s a speculative bubble of the classic sort, one that has soaked up a vast amount of investment money over the last few years, and the glorious future of American energy independence being touted by the media has the same function, and the same relationship to reality, as the glorious future of endlessly rising house prices that got waved around with equal abandon in 2006 and 2007. I don’t expect the bubble to pop this year—my best guess at this point is that that’ll happen in 2014—but it’s already losing air as the ferocious decline rates experienced by fracked oil and gas wells gnaw the bottom out of the fantasy. Expect the new year to bring more strident claims of the imminent arrival of a shiny new future of energy abundance, coupled with a steady drumbeat of bad financial news suggesting, in essence, that the major players in that end of the oil and gas industry are well and truly fracked.

I’d also encourage my readers to watch the climate. The tendency to focus on predicted apocalypses to come while ignoring the reality of ongoing collapse in the present is as evident here as in every other corner of contemporary culture; whether or not the planet gets fried to a crackly crunch by some more or less distant future date, it’s irrefutable that the cost of weather-related disasters across the world has been climbing year over year for decades, and this is placing an increasingly harsh burden on local and regional economies here in the US and elsewhere. It’s indicative that many coastal towns in Louisiana and Mississippi that were devastated by Hurricane Katrina have never been rebuilt, and it’s probably a safe bet that a similar fate waits for a fair number of the towns and poorer neighborhoods hit hardest by Hurricane Sandy. As global warming pumps more heat into the heat engine we call Earth’s climate, the inevitable result is more extreme weather—drier droughts, fiercer storms, more serious floods, and so on down a litany that’s become uncomfortably familiar in recent years.

Most of the infrastructure of industrial society was built during the period of abnormally good weather we call the twentieth century. A fair amount of it, as New York subway riders have had reason to learn, is poorly designed to handle extreme weather, and if those extremes become normal, the economics of maintaining such complex systems as the New York subways in the teeth of repeated flooding start to look very dubious indeed. I don’t expect to see significant movements out of vulnerable coastal areas quite yet, but if 2011’s Hurricane Irene and 2012’s Hurricane Sandy turn out to have a bouncing baby sibling who decides to pay a visit to the Big Apple in 2013, 2014 might see the first businesses relocating further inland, perhaps to the old mill towns of the southern Hudson valley and the eastern end of Pennsylvania, perhaps further still.

That’s speculative. What isn’t speculative is that all the trends that have been driving the industrial world down the arc of the Long Descent are still in play, and so are all the parallel trends that are pushing America’s global empire along its own trajectory toward history’s dustbin. Those things haven’t changed; even if anything could be done about them, which is far from certain, nothing is being done about them; indeed, outside of a handful of us on the fringes of contemporary culture, nobody is even talking about the possibility that something might need to be done about them. That being the case, it’s a safe bet that the trends I’ve sketched out will continue unhindered, and give us another year of the ordinary phenomena of slowly accelerating decline and fall.

That, in turn, leads to the question of what my readers might do about it all.

My advice hasn’t changed. It’s a source of some amusement to me, though, that no matter how clearly I try to communicate that advice, a fair number of people will hear what they want to hear, or perhaps what they expect to hear, rather than what I’m saying. Over the course of this last week, for example, several people commenting on this post on one of the many other forums where it appears insisted with some heat that I claimed that activism was worthless, while one of the commenters here on The Archdruid Report took me to task for what he thought was a rejection of community in favor of an unworkable go-it-alone approach.

Not so. What I’m saying is that any meaningful response to the crisis of our time has to begin on the individual level, with changes in our own lives. To say that it should begin there doesn’t mean that it should end there; what it does mean is that without the foundation of personal change, neither activism nor community building nor anything else is going to do much. We’ve already seen what happens when climate activists go around insisting that other people ought to decrease their carbon footprint, while refusing to do so themselves, and the results have not exactly been good. Equally, if none of the members of a community are willing to make the changes necessary to decrease their own dependence on a failing industrial system, just what good is the community as a whole supposed to do?

A great many people like to insist that changing your own life isn’t enough, and then act as though that means that changing your own life isn’t necessary. Again, not so. If industrial society as a whole has to stop dumping excess carbon dioxide into the atmosphere, dear reader, that means among many other things that you, personally, have to stop contributing your share of that excess. Equally, if industrial society as a whole is running short of fossil fuels, that means among many other things that you, personally, are going to have to get used to living without them. That being the case, why not start with the part of the problem about which you can actually do something—your own consumption of fossil fuels and your own production of carbon dioxide—and then go from there?

Political activism, community building, and a great many other proposed responses to the crisis of our time are entirely valid and workable approaches if those who pursue them start by making the changes in their own lives they expect other people to make in turn. Lacking that foundation, they go nowhere. It’s not even worth arguing any more about what happens when people try to get other people to do the things they won’t do themselves; we’ve had decades of that, it hasn’t helped, and it’s high time that the obvious lessons get drawn from that fact. Once again, if you always do what you’ve always done…

That being said, here are some suggested New Year’s resolutions for those of my readers who are interested in being part of the solution:

1. Caulk, weatherstrip, and insulate the place where you live. Most Americans can cut between 5% and 25% of their total annual energy use by weatherizing their homes. None of the work is rocket science; your local hardware store can sell you everything you need for a very modest amount of money, and there are plenty of sources in print and online that can teach you everything you need to know. The sooner you get to work, the sooner you start saving money, and the sooner a good chunk of your share of excess carbon dioxide stops messing with the atmosphere. (For a rough sense of how quickly the work pays for itself, see the sketch after this list.)

2. Make at least one commute or run at least one errand a week on foot, by bicycle, or by public transit. A great many Americans don’t actually need cars at all. A good many of those who do, due to a half century of idiotic land use planning, need them a great deal less often than they think. The best way to learn this is to experience what it’s like to travel by some other means. It’s long past time to ditch the “yuppie logic” that suggests that it’s a good idea to drive a mile to the health club to get on a treadmill and get the exercise you didn’t get by walking to the health club. It’s also long past time to ditch the equally false logic that insists that getting there faster is the only thing that matters.

3. If you take a vacation, take the train. Traveling by train uses a small fraction of the fuel per mile that a plane needs, and the trip is part of the vacation rather than an ordeal to endure between one place and the next. Give it a try. If you live in the US, you might also consider supporting the National Association of Railroad Passengers, which lobbies for expanded passenger rail service and offers a discount on fares for members.

4. Buy it used. This applies to everything from cars, should you actually need one, to the cheapest of trinkets. By buying a used product rather than a new one, you save the energy cost of manufacturing the new product, and you also keep things out of the waste stream. Used computers are particularly worth your while; if you live in a tolerably large urban area in the US, you can often get more computers than you need by letting your circle of friends know that you’ll take used but working devices off their hands for free. You won’t be able to play the latest computer games on them, sure, but if you’re obsessed with playing the latest computer games, you don’t need a computer; you need a life. Speaking of getting a life…

5. Turn off the boob tube. Better still, if you can talk the people you live with into it, get rid of the thing altogether. Commercial television exists to fill your brain with emotionally manipulative imagery that lures you into buying products you wouldn’t otherwise need or want. Public television? Replace “products” with “opinions” and you’re not too far off. (Huge rapacious corporations spend millions of dollars to fund public TV programs; I hope none of my readers are naive enough to think that these corporations do this out of some vague sense of moral obligation.) You don’t need any of that stuff cluttering up your brain. While you’re at it…

6. Take up an art, craft, or hobby. Once you turn off the TV, you’re going to have the one luxury that nobody in a modern consumer society is ever supposed to have: actual, unstructured free time. It’s worth luxuriating in that for a bit, but pretty soon you’ll find that you want to do something with that time, and one of the best options is to learn how to do something interesting with your hands. Three quarters of a century ago, most people had at least one activity that gave them something creative to do in their off hours, and a good many of those activities also produced useful and valuable things. Unless you’re at least seventy years old or come from a very unusual family, you have no idea how many arts, crafts and hobbies Americans used to pursue, or how little money it takes to get started with most of them. By the way, if you think you’re too old to take up playing the guitar or doing some other seemingly complicated skill, you’re not.

7. Do without something this year. This is the scary one for most people in today’s consumer society. To be able to have something, and choose not to have it, challenges some of the deepest of modern taboos. Give it a try. The point isn’t to strike an assumed pose of ecological virtue, by the way, so don’t tell anybody what you’re doing without, or even that you’re doing without something. Nor is this about “being good” in some socially approved manner, so don’t choose something that you’re supposed to want to do without. Just quietly neglect to make something part of your life, and pay attention to your own emotional reactions. If you’re like most people in today’s America, you’ll be in for a wild ride, but the destination is worth reaching.
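For readers who like to see the arithmetic behind resolution #1, here is a back-of-the-envelope payback estimate. This sketch is mine, not Greer’s; every dollar figure in it is an invented placeholder, so substitute your own utility bills and hardware-store receipts.

    # Rough payback estimate for home weatherization -- an illustrative
    # sketch only. Every number below is an assumption; plug in your own.

    def payback_years(annual_energy_cost, savings_fraction, upfront_cost):
        """Years until the materials pay for themselves out of savings."""
        annual_savings = annual_energy_cost * savings_fraction
        return upfront_cost / annual_savings

    # Example: a $2,400/year energy bill, the 5%-25% savings range cited
    # above, and $300 of caulk, weatherstripping, and insulation.
    for fraction in (0.05, 0.25):
        years = payback_years(2400, fraction, 300)
        print(f"{fraction:.0%} savings -> payback in {years:.1f} years")

Even at the low end of the savings range, the materials pay for themselves in two or three heating seasons; at the high end, in a matter of months.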

So there you are. As we head deeper into the unknown country of 2013, have a happy and sustainable new year!

 

A couple of notes might be worth placing here for fans of my writing. First of all, my latest peak oil book, Not The Future We Ordered: The Psychology of Peak Oil and the Myth of Eternal Progress, is available for preorder. Karnac Press, the publisher, is a specialty press publishing mostly in the field of psychology; the book is primarily intended for psychologists, therapists, and members of the healing professions, who will need to know what they’re dealing with as the psychological impacts of peak oil take their toll, but it may also be of interest to peak oil readers generally. Much of what’s covered in Not The Future We Ordered hasn’t appeared here or in any of my other books, so it may be worth a look.

I’m also pleased to announce that I’ve been offered a position as contributing editor and monthly columnist with PeakProsperity.com (formerly ChrisMartenson.com). My first column there will be appearing later this month. My working plan at this point is to head deeper into the territory I explored in my book The Wealth of Nature, with an eye toward the practical and personal implications of the end of the age of abundance. This is a paid gig, and so the meat of my monthly columns will be in the subscribers-only area, but I plan on doing my level best to make sure it’s worth the price of admission. Again, might be worth a look.

 

 

Into an Unknown Country

[John Michael Greer]

Written by testudoetlepus

January 3rd, 2013 at 5:44 pm

The Beginning of the World

without comments

by John Michael Greer

Friday was, as I’m sure most of my readers noticed, an ordinary day. Here in the north central Appalachians, it was chilly but not unseasonably so, with high gray clouds overhead and a lively wind setting the dead leaves aswirl; wrens and sparrows hopped here and there in my garden, poking among the recently turned soil of the beds. No cataclysmic earth changes, alien landings, returning messiahs, or vast leaps of consciousness disturbed their foraging. They neither knew nor cared that one of the great apocalyptic delusions of modern times was reaching its inevitable end around them.

The inimitable Dr. Rita Louise, on whose radio talk show I spent a couple of hours on Friday, may have summed it up best when she wished her listeners a happy Mayan Fools Day. Not that the ancient Mayans themselves were fools, far from it, but then they had precisely nothing to do with the competing fantasies of doom and universal enlightenment that spent the last decade and more buzzing like flies around last Friday’s date.

It’s worth taking a look back over the genesis of the 2012 hysteria, if only because we’re certain to see plenty of reruns in the years ahead. In the first half of the 20th century, as archeologists learned to read dates in the Mayan Long Count calendar, it became clear that one of the major cycles of the old Mayan timekeeping system would roll over on that day. By the 1970s, that detail found its way into alternative culture in the United States, setting off the first tentative speculations about a 2012 apocalypse, notably drug guru Terence McKenna’s quirky “Timewave Zero” theory.

It was the late New Age promoter Jose Arguelles, though, who launched the 2012 fad on its way with his 1984 book The Mayan Factor and a series of sequels, proclaiming that the rollover of the Mayan calendar in 2012 marked the imminent transformation of human consciousness that the New Age movement was predicting so enthusiastically back then. The exactness of the date made an intriguing contrast with the vagueness of Arguelles’ predictions about it, and this contrast left ample room for other authors in the same field to jump on the bandwagon and redefine the prophecy to fit whatever their own eschatological preferences happened to be. This they promptly did.

Early on, 2012 faced plenty of competition from alternative dates for the great transformation. The year 2000 had been a great favorite for a century, and became 2012’s most important rival, but it came and went without bringing anything more interesting than another round of sordid business as usual. Thereafter, 2012 reigned supreme, and became the center of a frenzy of anticipation that was at least as much about marketing as anything else. I can testify from my own experience that for a while there, late in the last decade, if you wanted to write a book about anything even vaguely tangential to New Age subjects and couldn’t give it a 2012 spin, many publishers simply weren’t interested.

So the predictions piled up. The fact that no two of them predicted the same thing did nothing to weaken the mass appeal of the date. Neither did the fact, which became increasingly clear as the last months of 2012 approached, that a great many people who talked endlessly about the wonderful or terrible things that were about to happen weren’t acting as though they believed a word of it. That was by and large as true of the New Age writers and pundits who fed the hysteria as it was of their readers and audiences; I long ago lost track of the number of 2012 prophets who, aside from scheduling a holiday trip to the Yucatan or some other fashionable spot for the big day, acted in all respects as though they expected the world to keep going in its current manner straight into 2013 and beyond.

That came as a surprise to me. Regular readers may recall my earlier speculation that 2012 would see scenes reminiscent of the “Great Disappointment” of 1844, with crowds of true believers standing on hilltops waiting for their first glimpse of alien spacecraft descending from heaven or what have you. Instead, in the last months of this year, some of the writers and pundits most deeply involved in the 2012 hysteria started claiming that, well, actually, December 21st wasn’t going to be the day everything changed; it would, ahem, usher in a period of transition of undefined length during which everything would sooner or later get around to changing. The closer last Friday came, the more evasive the predictions became, and Mayan Fools Day and its aftermath were notable for the near-total silence that spread across the apocalyptic end of the blogosphere. Say what you will about Harold Camping, at least he had the courage to go on the air after his May prophecy flopped and admit that he must have gotten his math wrong somewhere.

Now of course Camping went on at once to propose a new date for the Rapture, which flopped with equal inevitability a few months later. It’s a foregone conclusion that some of the 2012 prophets will do the same thing shortly, if only to kick the apocalypse marketing machine back into gear. It’s entirely possible that they’ll succeed in setting off a new frenzy for some other date, because the social forces that make apocalyptic fantasies so tempting to believe just now have not lost any of their potency.

The most important of those forces, as I’ve argued in previous posts, is the widening mismatch between the fantasy of entitlement that has metastasized through contemporary American society, on the one hand, and the ending of an age of fossil-fueled imperial extravagance on the other. As the United States goes bankrupt trying to maintain its global empire, and industrial civilization as a whole slides down the far side of a dizzying range of depletion curves, it’s becoming harder by the day for Americans to make believe that the old saws of upward mobility and an ever brighter future have any relevance to their own lives—and yet those beliefs are central to the psychology, the self-image, and the worldview of most Americans. The resulting cognitive dissonance is hard to bear, and apocalyptic fantasies offer a convenient way out. They promise that the world will change, so that the believers don’t have to.

That same frantic desire to ignore the arrival of inescapable change pervades today’s cultural scene, even in those subcultures that insist most loudly that change is what they want. In recent months, to cite only one example, nearly every person who’s mentioned to me the claim that climate change could make the Earth uninhabitable has gone on to ask, often in so many words, “So why should I consume less now?” The overt logic here is usually that individual action can’t possibly be enough. Whether or not that’s true is anyone’s guess, but cutting your own carbon footprint actually does something, which is more than can be said for sitting around enjoying a standard industrial world lifestyle while waiting for that imaginary Kum Ba Ya moment when everyone else in the world will embrace limits not even the most ardent climate change activists are willing to accept themselves.

Another example? Consider the rhetoric of elite privilege that clusters around the otherwise inoffensive label “1%.” That rhetoric plays plenty of roles in today’s society, but one of them pops up reliably any time I talk about using less. Why, people ask me in angry tones, should they give up their cars when the absurdly rich are enjoying gigantic luxury yachts? Now of course we could have a conversation about the total contribution to global warming of cars owned by people who aren’t rich, compared to that of the fairly small number of top-end luxury yachts that usually figure in such arguments, but there’s another point that needs to be raised. None of the people who make this argument to me have any control over whether rich people have luxury yachts. All of them have a great deal of control over whether and how often they themselves use cars. Blaming the global ecological crisis on the very rich thus functions, in practice, as one more way to evade the necessity of unwelcome change.

Along these same lines, dear reader, as you surf the peak oil and climate change blogosphere and read the various opinions on display there, I’d encourage you to ask yourself what those opinions amount to in actual practice. A remarkably large fraction of them, straight across the political landscape from furthest left to furthest right and including all stops in between, add up to demands that somebody else, somewhere else, do something. Since the people making such demands rarely do anything to pressure, or even to encourage, those other people elsewhere to do whatever it is they’re supposed to do, it’s not exactly hard to do the math and recognize that here again, these opinions amount to so many ways of insisting that the people holding them don’t have to give up the extravagant and unsustainable lifestyles most people in the industrial world think of as normal and justifiable.

There’s another way to make the same point, which is that most of what you’ll see being proposed in the peak oil and climate change blogosphere has been proposed over and over and over again already, without the least impact on our predicament. From the protest marches and the petitions, through the latest round of grand plans for energy futures destined to sit on the shelves cheek by jowl with the last round, right up to this week’s flurry of buoyantly optimistic blog posts lauding any technofix you care to name from cold fusion and algal biodiesel to shale gas and drill-baby-drill: been there, done that, used the T-shirt to wipe another dozen endangered species off the face of the planet, and we’re still stuck in the same place. The one thing next to nobody wants to talk about is the one thing that distinguished the largely successful environmental movement of the 1960s and 1970s from the largely futile environmental movement since that time, which is that activists in the earlier movement were willing to start the ball rolling by making the necessary changes in their own lives first.

The difficulty, of course, is that making these changes is precisely what many of today’s green activists are desperately trying to avoid. That’s understandable, since transitioning to a lifestyle that’s actually sustainable involves giving up many of the comforts, perks, and privileges central to the psychology and identity of people in modern industrial societies. In today’s world of accelerating downward mobility, especially, the thought of taking any action that might result in being mistaken for the poor is something most Americans in particular can’t bear to contemplate—even when those same Americans recognize on some level that sooner or later, like it or not, they’re going to end up poor anyway.

Those of my readers who would like to see this last bit of irony focused to incandescence need only get some comfortably middle class eco-liberal to start waxing lyrical about life in the sustainable world of the future, when we’ll all have to get by on a small fraction of our current resource base. This is rarely difficult; I field such comments quite often, sketching out a rose-colored contrast between today’s comfortable but unsatisfying lifestyles and the more meaningful and fulfilling existence that will be ours in a future of honest hard work in harmony with nature. Wait until your target is in full spate, and then point out that he could embrace that more meaningful and fulfilling lifestyle right now by the simple expedient of discarding the comforts and privileges that stand in the way. You’ll get to watch backpedaling on a heroic scale, accompanied by a flurry of excuses meant to justify your target’s continued dependence on the very comforts and privileges he was belittling a few moments before.

What makes the irony perfect is that, by and large, the people whom you’ll hear criticizing the modern lifestyles they themselves aren’t willing to renounce aren’t just mouthing verbal noises. They realize, many of them, that the lifestyles that industrial societies provide even to their more privileged inmates are barren of meaning and value, that the pursuit and consumption of an endless series of increasingly shoddy manufactured products is a very poor substitute for a life well lived, and that stepping outside the narrowing walls of a world defined by the perks of the consumer economy is the first step toward a more meaningful existence. They know this; what they lack, by and large, is the courage to act on that knowledge, and so they wander the beach like J. Alfred Prufrock in Eliot’s poem, letting the very last inch or so of the waves splash over their feet—the bottoms of their trousers rolled up carefully, to be sure, to keep them from getting wet—when they know that a running leap into the green and foaming water is the one thing that can save them. Thus it’s not surprising that their daydreams cluster around imaginary tidal waves that will come rolling in from the deep ocean to sweep them away and make the whole question moot.

This is why it’s as certain as anything can be that within a year or so at most, a good many of the people who spent the last decade or so talking endlessly about last Friday will have some other date lined up for the end of the world, and will talk about it just as incessantly. It’s that or face up to the fact that the only way to live up to the ideals they think they espouse is to walk straight toward the thing they most fear, which is the loss of the perks and privileges and comforts that define their identity—an identity many of them hate, but still can’t imagine doing without.

Meanwhile, of course, the economy, the infrastructure, and the resource flows that make those perks and privileges and comforts possible are coming apart around them. There’s a great deal of wry amusement to be gained from watching one imaginary cataclysm after another seize the imagination of the peak oil scene or society as a whole, while the thing people think they’re talking about—the collapse of industrial civilization—has been unfolding all around them for several years now, in exactly the way that real collapses of real civilizations happen in the real world.

Look around you, dear reader, as the economy stumbles through another round of contraction papered over with increasingly desperate fiscal gimmicks, the political system of your country moves ever deeper into dysfunction, jobs and livelihoods go away forever, whatever social safety net you’re used to having comes apart, towns and neighborhoods devastated by natural disasters are abandoned rather than being rebuilt, and the basic services that once defined a modern society stop being available to a larger and larger fraction of the people of the industrial world. This is what collapse looks like. This is what people in the crumbling Roman Empire and all those other extinct civilizations saw when they looked out the window. To those in the middle of the process, as I’ve discussed in previous posts, it seems slow, but future generations with the benefit of hindsight will shake their heads in wonder at how fast industrial civilization went to pieces.

I commented in a post at the start of this year that the then-current round of fast-collapse predictions—the same predictions, mind you, that had been retailed at the start of the year before, the year before that, and so on—were not only wrong, as of course they turned out to be, but missed the collapse that was already under way. The same point holds good for the identical predictions that will no doubt be retailed over the next few weeks, insisting that this is the year when the stock market will plunge to zero, the dollar and/or the Euro will lose all their value, the economy will seize up completely and leave the grocery shelves bare, and so on endlessly; or, for that matter, that this is the year when cold fusion or algal biodiesel or some other vaporware technology will save us, or the climate change Kum Ba Ya moment I mentioned earlier will get around to happening, or what have you.

It’s as safe as a bet can be that none of these things will happen in 2013, either. Here again, though, the prophecies in question are not so much wrong as irrelevant. If you’re on a sinking ocean liner and the water’s rising fast belowdecks, it’s not exactly useful to get into heated debates with your fellow passengers about whether the ship is most likely to be vaporized by aliens or eaten by Godzilla. In the same way, it’s a bit late to speculate about how industrial civilization will collapse, or how to prevent it from collapsing, when the collapse is already well under way. What matters at that stage in the game is getting some sense of how the process will unfold, not in some abstract sense but in the uncomfortably specific sense of where you are, with what you have, in the days and weeks and months and years immediately ahead of you; that, and then deciding what you are going to do about it.

With that in mind, dear reader, I’d like to ask you to do something right now, before going on to the paragraph after this one. If you’re in the temperate or subarctic regions of the northern hemisphere, and you’re someplace where you can adjust the temperature, get up and go turn the thermostat down three degrees; if that makes the place too chilly for your tastes, take another moment or two to put on a sweater. If you’re in a different place or a different situation, do something else simple to decrease the amount of energy you’re using at this moment. Go ahead, do it now; I’ll wait for you here.

Have you done it? If so, you’ve just accomplished something that all the apocalyptic fantasies, internet debates, and protest marches of the last two decades haven’t: you’ve decreased, by however little, the amount of carbon dioxide going into the atmosphere. That sweater, or rather the act of putting it on instead of turning up the heat, has also made you just a little less dependent on fossil fuels. In both cases, to be sure, the change you’ve made is very small, but a small change is better than no change at all—and a small change that can be repeated, expanded, and turned into a stepping stone on the way to bigger changes, is infinitely better than any amount of grand plans and words and handwaving that never quite manage to accomplish anything in the real world.

Turning down your thermostat, it’s been said repeatedly, isn’t going to save the world. That’s quite true, though it’s equally true that the actions that have been pursued by climate change and peak oil activists to date don’t look particularly likely to save the world, either, and let’s not even talk about what wasn’t accomplished by all the wasted breath over last Friday’s nonevent. That being the case, taking even the smallest practical steps in your own life and then proceeding from there will take you a good deal further than waiting for the mass movements that never happen, the new technologies that never pan out, or for that matter the next deus ex machina some canny marketer happens to pin onto another arbitrary date in the future, as a launching pad for the next round of apocalyptic hysteria.

Meanwhile, a world is ending. The promoters of the 2012 industry got that right, though they missed just about everything else; the process has been under way for some years now, and it won’t reach its conclusion in our lifetimes, but what we may as well call the modern world is coming to an end around us. The ancient Mayans knew, however, that the end of one world is always the beginning of another, and it’s an interesting detail of all the old Mesoamerican cosmological myths that the replacement for the old world doesn’t just pop into being. Somebody has to take action to make the world begin. It’s a valid point, and one that can be applied to our present situation, when so many people are sitting around waiting for the end and so few seem to be willing to kickstart the beginning in the only way that matters—that is, by making actual changes in their own lives. The deindustrial world of the future is poised to begin, but someone has to begin it. Shall we?

 

The Beginning of the World

[John Michael Greer]


Written by testudoetlepus

December 27th, 2012 at 4:06 pm

Enacting Democracy

without comments

by John Michael Greer

The recovery of reason, the theme of last week’s post here on The Archdruid Report, has implications that go well past the obvious. One of the examples that comes first to mind is also directly relevant to the theme of this series of posts, and it unfolds from an experience that many people have mentioned to me in recent years: the inability of Americans with different beliefs to sit down and have a constructive conversation about their disagreements.

Those of my readers who have tried to do this any time recently, unless they were very lucky, will have found stalemate the all but inevitable outcome. Each side trots out its favorite talking points, most of them sound bites culled from popular media of one kind or another. When these fail to have the expected effect on the other side, both sides try again, with similar results, until finally one or both sides withdraw into frustration and hostility.

Though it’s unpopular these days to point this out, both sides in the current American culture wars follow this same wearily predictable pattern. Yes, I’m familiar with the recent flurry of liberal psychologists who insist that conservatives are just too irrational to accept what liberals see as self-evident truths; I don’t buy their claims, not least because I’ve watched liberals behave with exactly the same degree of illogic in parallel situations. The problem on both sides, as I see it, is the debasement of thinking discussed in last week’s post: the malign transformation of our inner discourse into a set of arbitrary linkages between verbal noises and simple emotional reactions. If a verbal noise produces warm fuzzy emotions in one person and cold prickly emotions in another, they are not going to be able to communicate unless both are able to get past that unthinking reaction—and getting past that unthinking reaction is something that very few Americans these days are able to do.

There’s another useful way to speak of the confusion of language in today’s America, and that’s to point out that nearly all our collective discourse has been reduced to phatic communication. That seemingly exotic phrase describes a very familiar process: the use of verbal noises to signal belonging and readiness for social interaction. When two men sit down in a bar here in Cumberland, and one says to the other, “So, how about them Ravens?”—we’re halfway between Baltimore and Pittsburgh, so in football season it’s either that or “How about them Steelers?”—the question needn’t indicate any interest in the team in question. Rather, it’s a formal way to acknowledge the other person’s presence and claim membership in a community. In a different context, the question might be “Nice weather, isn’t it?” or some other equally vacant utterance. The form varies but the content—or more precisely the lack of content—remains identical.

Much of today’s political discourse serves exactly the same purpose: it signals readiness for social interaction and claims membership in a specific political subculture, and that’s basically all it does. The verbal noises that get used for phatic communication in that context vary even with fairly small shifts across the political landscape, but if you sit in on a discussion among people who more or less agree with each other’s politics, you can usually figure out pretty quickly what the relevant warm-fuzzy and cold-prickly phrases are, and once you’ve done that you can identify yourself either as a member of the community or as an outsider with a very few words. It’s an experiment I recommend, partly for the entertainment value, and partly because there are few better ways to learn just how much of what passes for political thought these days is a set of essentially content-free signals meant to define the boundaries of a group.

It’s really quite remarkable to watch the range of things that get turned into phatic labels for political subcultures these days. Not long ago, for example, “Merry Christmas” and “Happy Holidays” were equally content-free phatic utterances used from the middle of November to the end of the year across most of American society. These days, “Merry Christmas” has been turned into a phatic badge on the rightward end of the contemporary culture wars, and “Happy Holidays” is well on its way to becoming a phatic badge of equal force on the left. Myself, I have no problem wishing my Christian neighbors a merry Christmas—that is what they’re celebrating, after all—and wishing a happy Hanukkah, a blessed solstice, or even a merry Krampustide to those who celebrate these other festivities; one of the benefits of being able to use language for purposes other than phatic communication is that, when a phatic noise is the right thing to use, you can choose your signals deliberately to get the results you want.

It thus probably needs to be said that there’s nothing wrong with phatic communication. Human beings are social primates, with the normal set of social primate instincts and reactions, and casual comments about football teams and the weather are no more objectionable in themselves than the grunts and postures baboons use to accomplish the same ends. The problem here is simply a function of the fact that human language has functions other than phatic communication, and when those other functions are of crucial importance, staying stuck in phatic communication doesn’t help much.

There’s an old word, dialectic, that may be worth introducing here. No, it doesn’t have anything to do with Marxism; long before Hegel’s time, it was used for exactly the kind of communication that’s most lacking in American society these days, the kind in which two or more people sit down and say, in effect, “let us reason together.” The ancient philosopher Plotinus described dialectic as the most precious part of philosophy, and the point’s a valid one; the ability to sit down with someone who disagrees with you about some important issue, discuss the matter, determine what common ground exists and where the differences of opinion lie, and either resolve the disagreement or sort out the questions of fact and value that have to be settled in order to resolve it, represents a high level of the practical wisdom that philosophy once upon a time was meant to cultivate.

Dialectic is a learned skill, and not a particularly difficult one, either. Anyone who can tell the difference between a fact and an opinion, recognize a dozen or so of the standard logical fallacies, follow an argument step by step from its premises to its conclusion, and forbear from dragging the discussion down to the level of personal slurs, can pick it up promptly given a competent teacher and a little practice. In the ancient world, dialectic was the way that philosophy was taught: a teacher would start a conversation with a couple of senior students on some specific theme, and go from there. If the dialogue that followed was any good, it wouldn’t simply rehash existing knowledge, but turn into an adventure of the mind that broke new ground; those of my readers who are familiar with the dialogues of Plato, which were meant to imitate dialectic at work, will have some sense of how this worked.

Pass beyond the circle of students around a teacher, and dialectic merges into rhetoric. That’s a word that gets plenty of use these days, nearly always with a heavy cargo of cold pricklies attached to it. Until quite recently, though, rhetoric was well understood as one of the essential skills of citizenship: the ability to stand up and explain, in clear, concise, and compelling language, what you think about a given issue. Of all the skills of democracy, it’s hard to think of one more thoroughly misplaced than this one. How many times, dear reader, have you heard people bemoaning the fact that people in America aren’t willing to listen to one another? There’s a reason for that, though it’s not one you’re likely to hear; it’s that next to nobody in this country seems to be able to make a cogent, sensible comment on an issue—on any issue—and then sit down, shut up, and let somebody else take the floor. It seems to have been completely forgotten nowadays that competent rhetoric makes the listener want to keep listening.

Rhetoric is another learned skill. There are plenty of good textbooks on the subject, ranging from ancient Greek texts to online tutorials packed with the latest buzzwords, and there’s also a voluntary organization—Toastmasters International—that teaches rhetorical skills via a network of local clubs. It’s not particularly difficult to learn, either. The great obstacle here is the terror of public speaking that’s efficiently instilled in American schoolchildren by the culture of bullying that pervades our public schools, and that can be outgrown; I had a world-class case of it not all that many years ago, for example. The benefits to learning it are not small, and are far from limited to its crucial role in fostering democracy, but we’ll stay focused on this latter for now.

When citizens can stand up in a meeting and present their points of view in concise, thoughtful, and convincing words, democracy becomes possible. When they can’t—when the only thing that takes place in a meeting is a collection of verbal noises denoting “warm fuzzy!” and “cold prickly!” to those others present who happen to link noises and emotions in the same way the speaker does—democracy is not an option, because it’s impossible to establish any shared basis for communication between those with different emotional reactions to any given set of verbal noises. Transform those noises into words with mutually agreed meanings and you can get past that barrier, but transforming verbal noises into words with mutually agreed meanings is a skill very few Americans know any more.

The ability to converse in a reasoned and reasonable fashion, and the ability to present a viewpoint in a clear, cogent, and convincing manner, are thus among the core skills of democratic process that have been lost by contemporary American society and need to be recovered in a hurry. Add these to the basic capacity to reason discussed in last week’s post, and you’ve got all the foundations for democratic process. You don’t yet have anything built on those foundations, but that’s the next step. Democratic process itself comprises one more set of skills—the skills that allow a group of people to meet together, discuss controversial issues, and agree on a collective response to them.

Those skills are not to be found in the so-called consensus methods that have kept activists on the Left spinning their wheels uselessly for three decades now. I trust my readers remember the flood of self-congratulatory verbiage put forth by the Occupy movement in 2011; that movement vanished with scarcely a trace once the weather turned cold last year, and despite loud claims that it would pop back up again in the spring, it did no such thing. There were a good many factors behind its failure, but among them was the manipulative behavior of activists who seized control of the movement using a supposedly egalitarian consensus system that placed all effective power, and a great deal of donated money, in their unelected and unsupervised hands.

After months of circular debate that never quite managed to result in meaningful action, the vast majority of the protesters were convinced that their concerns would not be addressed and their efforts were wasted, and simply went home. This would be significant enough if it was new; in point of fact, it’s been the outcome of nearly every attempt at organized protest since the early 1980s, when the current suite of manipulative pseudoconsensus methods were adopted across most of the activist Left. If you want to know why the Left accomplished next to nothing for thirty years, while activists on the right were getting candidates into office and laws on the books, that’s an important part of the reason.

This is all the more embarrassing in that the toolkit of democratic process has been sitting on the shelf the whole time, waiting for somebody to notice that liberal and radical groups in the past used to use methods of organization that, however unfashionable they have become, actually work. There are a lot of details, and entire books in fine print have been written on the minutiae, but the core elements of democratic process can be described in a paragraph.

This is how it works. Everyone has an equal voice and an equal vote, but the right to participate depends on willingness to follow the rules, and members can be ejected for abusive behavior; the chairperson of the meeting, and the handful of other people needed to make it work, are elected to be impartial referees of the process, and can be overruled or removed by vote if they abuse their positions; one person speaks at a time, and the chairperson determines who speaks next; an overly longwinded speaker can be told to shut up by the chairperson, or by vote of the members; once a vote takes place on any issue, the issue can’t be brought back up for debate again without a 2/3 majority, to keep a minority with an agenda from holding the meeting hostage; and the goal of the meeting, and of every part of the process, is to come to a decision, act on it, and get home at a reasonable hour.

That’s democratic process. It evolved organically over many centuries from its origins in the rough but functional practices of Anglo-Saxon tribal assemblies, and like other organic systems, it looks much sloppier but works much better than the idealized abstractions cooked up by radicals on either end of the spectrum. It’s easy to compare it unfavorably to one or another of those idealized abstractions, but the proof of the pudding is in the eating; those who want to demonstrate that some other system is as effective as democratic process are welcome to use that other system on smaller scales, with voluntary organizations and local communities, and prove that it works. That was, after all, how democratic process emerged as the default option in the Western world: in actual practice, in an assortment of voluntary organizations, local communities, political parties and protest groups, it proved to be more effective than the alternatives.
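Since the paragraph above amounts to a procedural protocol, it may be worth noticing just how mechanical its rules are. What follows is a toy model of my own devising, not anything Greer proposes; the class structure, the simple-majority details, and everything beyond the 2/3 reconsideration rule he states are illustrative assumptions.

    # A toy model of the meeting rules sketched above. Purely illustrative;
    # real parliamentary procedure covers far more cases than this.

    class Meeting:
        def __init__(self, chair, members):
            self.chair = chair                  # elected, removable referee
            self.members = set(members) | {chair}
            self.decided = {}                   # question -> recorded outcome
            self.floor = None                   # one person speaks at a time

        def recognize(self, member):
            """The chairperson determines who speaks next."""
            if member not in self.members:
                raise ValueError("only members in good standing may speak")
            self.floor = member

        def eject(self, member, votes_for, votes_against):
            """Members can be ejected for abusive behavior by vote."""
            if votes_for > votes_against:
                self.members.discard(member)
                if self.floor == member:
                    self.floor = None

        def vote(self, question, votes_for, votes_against):
            """Settle a question by majority and record the outcome."""
            if question in self.decided:
                raise RuntimeError("already decided; move to reconsider first")
            self.decided[question] = votes_for > votes_against
            return self.decided[question]

        def reconsider(self, question, votes_for, votes_against):
            """Reopening a settled question takes a two-thirds majority,
            so a minority with an agenda can't hold the meeting hostage."""
            total = votes_for + votes_against
            if total and votes_for / total >= 2 / 3:
                del self.decided[question]
                return True
            return False

    # A bare majority can't reopen what the meeting has already settled:
    m = Meeting("chair", ["ann", "bob", "carol"])
    m.vote("rent the hall", votes_for=3, votes_against=1)
    print(m.reconsider("rent the hall", votes_for=2, votes_against=2))  # False

The point of the exercise is that every rule in the paragraph reduces to a simple, checkable mechanism, which is a good part of what makes the process teachable and learnable.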

I should say, finally, that even the most lively revival of the core skills of democracy isn’t likely to affect the political sphere much for a couple of decades at least; if nothing else, the sheer inertia of a political dialogue debased as far as ours has been will take at least a generation to pass off. The point in reviving these things now is to lay foundations for the future. Right now, in the fading years of the Age of Abundance, it’s fairly easy to learn the things I’ve discussed in last week’s and this week’s post; the intellectual resources needed for such a project can be found readily in libraries and on the internet, and a great many people have enough spare time to invest in such a project that much could be done. The further we proceed into resource depletion, infrastructure breakdown, environmental instability, and the rest of the bubbling witch’s brew we’ve cooked up for ourselves in the cauldron of the near future, the less true that is likely to be. Thus any effort to make democratic process and the skills that make it possible available to the far future has to begin now, or soon.

It’s a good season to keep such logic in mind. Those of my readers who have gardens, or are planning to plant one in the new year, will already be glancing through seed catalogs and roughing out, at least in the mind’s eye, what they will need for the spring planting. In the same sense, though on a larger and more daunting scale, those of us who are watching the stormclouds of a greater winter gather on the horizon should be thinking about what seeds they intend to store for a more distant springtime. To my mind, at least, there is no greater challenge and no more important work to be done.

In the meantime, I wish each of you a blessed solstice, or whatever other festival your own faith or traditions assign to this time of year. Next week, when winter is here and the partying is done, we’ll have a lot more to talk about.

 

 

End of the World of the Week #53

The last months of 1999, the subject of last week’s End of the World of the Week, were in many ways just a running start for one of the most wildly popular apocalyptic dates in history, the year 2000. An astonishing number of predictions of all kinds clustered around that impressively round number. For decades beforehand, in fact, the odds were pretty good that any projection of trends in the fairly near future would begin, “In the year 2000…” Apocalyptic prophecies were among the many notions that clustered around that year, with the widely ballyhooed Y2K bug only the most heavily publicized among them. As far back as the 13th century, the Catholic theologian Peter Olivi predicted that the Second Coming would take place that year. Isaac Newton made the same prediction in an early prophetic work, before settling on 2060 in his later writings. Puritan pastor Jonathan Edwards, author of the famous sermon Sinners in the Hands of an Angry God, tapped 2000 as the beginning of Christ’s millennial reign, as did Edgar Cayce and Sun Myung Moon.

Plenty of more exotic apocalypses were pinned on the same date. Popular psychic Ruth Montgomery proclaimed that the Earth would be knocked off its axis. Melody Mehta, a less widely known figure in the same business, insisted that a comet would knock Mars’ moon Phobos out of its orbit and send it careening into the Earth. On a less cosmic scale, financial writers Peter Jay and Michael Stewart predicted economic collapse and the rise of dictatorships in Europe and the US in a book somewhat unoriginally titled Apocalypse 2000. No matter what kind of apocalypse you were in the market for, 2000 had something on offer.

Except, of course, that none of them happened. In fact, the vast majority of all the predictions made for the year 2000, from the most exotic to the most prosaic, landed with a thud. The fact that so many predictions clustered around that date, it turned out, showed only that when people try to predict the future, some dates are more popular than others.

 

 

— for more failed end time prophecies, see my book Apocalypse Not

 

Enacting Democracy

[John Michael Greer]

Written by testudoetlepus

December 20th, 2012 at 5:56 pm

On The Border

without comments

The topic of last week’s post, the likely fate of Israel in the twilight years of American empire, makes a good example of more than one common theme. As I commented in that earlier discussion, Israel is one of several American client states for whom the end of our empire will also be the end of the line. At the same time, it also highlights a major source of international tension that bids fair to bring in a bumper crop of conflict in the decades before us.

The word “irredentism” doesn’t get a lot of play in the media just now, but my readers may wish to keep it in mind; there’s every reason to think they will hear it fairly often in the future. It’s the conviction, on the part of a group of people, that they ought to regain possession of some piece of real estate that their ancestors owned at some point in the past. It’s an understandably popular notion, and its only drawback is the awkward detail that every corner of the planet, with the exception of Antarctica and a few barren island chains here and there, is subject to more than one such claim. The corner of the Middle East currently occupied by the state of Israel has a remarkable number of irredentist claims on it, but there are parts of Europe and Asia that could match it readily—and of course it only takes one such claim on someone else’s territory to set serious trouble in motion.

It’s common enough for Americans, if they think of irredentism at all, to think of it as somebody else’s problem. Airily superior articles in the New York Times and the like talk about Argentina’s claim to the Falklands or Bolivia’s demand for its long-lost corridor to the sea, for example, as though nothing of the sort could possibly spill out of other countries to touch the lives of Americans. I can’t think of a better example of this country’s selective blindness to its own history, because the great-grandmother of irredentist crises is taking shape right here in North America, and there’s every reason to think it will blow sky-high in the not too distant future.

That’s the third and last of the hot button topics I want to discuss as we close in on the end of the current sequence of posts on the end of American empire, and yes, I’m talking about the southern border of the United States.

Many Americans barely remember that the southwestern quarter of the United States used to be the northern half of Mexico. Most of them never learned that the Mexican War, the conflict that made that happen, was a straightforward act of piracy. (As far as I know, nobody pretended otherwise at the time—the United States in those days had not yet fallen into the habit of dressing up its acts of realpolitik in moralizing cant.) North of the Rio Grande, if the Mexican War comes to mind at all, it’s usually brushed aside with bland insouciance: we won, you lost, get over it. South of the Rio Grande? Every man, woman and child knows all the details of that war, and they have not gotten over it.

That might not matter much on this side of the border, except for two things. The first, which I’ve discussed here several times, is the dominant fact of 21st century North American geopolitics, the failure of US settlement in the dryland West. In the heyday of American expansion, flush with ample wealth from undepleted resources and unexhausted topsoil, the United States flung a pattern of human ecology nurtured on the well-watered soils of the Ohio and upper Mississippi valleys straight across the continent, dotting the Great Plains and the dry lands between the mountains with farms and farm towns. The dream was that these would follow the same trajectory as their predecessors further east, and turn into a permanently settled agricultural hinterland feeding wealth into newborn cities.

The Dust Bowl of the 1930s was the first sign that this grand fantasy was not going to be fulfilled. Behind the catastrophic impact of farming techniques poorly suited to the fragile western soils was a deeper, natural cycle of drought, one that the native peoples of the West knew well but white settlers were by and large too arrogant to learn. Since then, as the vulnerability of agriculture on the southern Plains to cyclical drought and other ecological challenges has become more and more clear, the usual response—throw more money and technology at it—has solved problems in the near term by turning them into insoluble predicaments in the longer term. Thus, for example, farmers faced with drought turned to irrigation using water from underground aquifers that date from the Ice Age and haven’t been replenished since then, gaining temporary prosperity at the cost of permanent ruin later on.

The details vary from region to region but the effect is the same. Across the dryland West, from the Great Plains to the Cascade and Sierra Nevada ranges, a new kind of ghost town is emerging alongside the old breed from the days of the gold and silver rushes. Homes, churches, schools, city halls sit empty as tumbleweeds roll down the streets; with the decline of the old agricultural economy, all the townsfolk, or all but a few stubborn retirees, have gone elsewhere. There are county-sized areas in several of the Plains states these days that once again fit the old definition of frontier: fewer than two non-Native American people per square mile. In response, the vacuum is being filled by the nearest nation that has enough spare people and cultural vitality for the job.

I encourage those of my readers who doubt this claim to book a long bus trip through any of the major agricultural regions of the United States west of the Mississippi valley. You’ll want the run that stops at every other two-bit farm town along the way, because that’s where you’re going to see a significant part of America’s future: the towns that are Mexican by every standard except for a few lines on a map. It’s not just that the signs are all in Spanish; the movie posters in the video shop windows are for Mexican movies, the snacks in the gas stations are Mexican brands, the radio announcers are talking excitedly about Mexican sports teams and the people on the street are wearing Mexican fashions. Such towns aren’t limited these days to the quarter of the United States that used to be half of Mexico; they can be found in most of the country’s agricultural regions, and increasingly beyond them as well.

In the United States, this isn’t something you talk about. There’s plenty of rhetoric about immigration from Mexico, to be sure, but nearly all of it focuses on the modest fraction of those immigrants who cross into the US illegally. Behind that focus is another thing people in the United States don’t talk about, which is the bitter class warfare between America’s middle class and its working class. Illegal immigration is good for the middle class, because illegal immigrants—who have effectively no rights and thus can be paid starvation wages for unskilled and semiskilled labor—drive down the cost of labor, and thus decrease the prices of goods and services that middle class people want. By the same token, illegal immigration is bad for the working class, because the same process leaves working class Americans with shrinking paychecks and fewer job opportunities.

Nobody in the middle class wants to admit that it’s in their economic interest to consign the American working class to misery and impoverishment; nobody in the working class wants to use the language of class warfare, for fear of handing rhetorical weapons to the next class down; so both sides bicker about a convenient side issue, which in this case happens to be illegal immigration, and they bicker about it in the shrill moral language that afflicts discussions of most issues in today’s America, so that the straightforward political and economic issues don’t come up. Meanwhile, the demographic shift continues, and redefines the future history and cultural landscape of the North American continent.

Students of history will recognize in the failure of US settlement in the dryland West a familiar pattern, one that is also under way on the other side of the Pacific—the Russian settlement of Siberia is turning into a dead end of the same kind, and immigrants from China and other Asian countries are flooding northwards there, quite probably laying the foundations for a Greater China that may someday extend west to the Urals and north to the Arctic Ocean. Still, there’s another pattern at work in North America. To make sense of it, a glance at one of the core sources of inspiration for this blog—the writings of Arnold Toynbee—will be helpful.

Central to Toynbee’s project, and to the sprawling 12-volume work A Study of History that came out of it, was the idea of putting corresponding stages in the rise and fall of civilizations side by side, and seeing what common factors could be drawn from the comparison. Simple in theory, that proved to be a gargantuan undertaking in practice, which is why nearly all of Toynbee’s career as a writer of history was devoted to that one project. The result is a core resource for the kind of work I’m trying to do in this blog: the attempt to gauge the shape of our future by paying attention to the ways similar patterns have worked out in the historic past.

One pattern that has plenty of examples on offer is the evolution of borderland regions caught between an imperial power and a much poorer and less technologically complex society. Imperial China and central Asia, the Roman world and the Germanic barbarians, the Toltecs of ancient Mexico and their Chichimec neighbors to the north—well, the list goes on. It’s a very common feature of history, and it unfolds in a remarkably precise and stereotyped way.

The first phase of that unfoldment begins with the rise and successful expansion of the imperial power. That expansion quite often involves the conquest of lands previously owned by less wealthy and powerful nations next door. For some time thereafter, neighboring societies that are not absorbed in this way are drawn into the imperial power’s orbit and copy its political and cultural habits—German tribal chieftains mint their own pseudo-Roman coins and drape themselves in togas, people very far from America copy the institutions of representative democracy and don blue jeans, and so on. A successful empire has a charisma that inspires imitation, and while it retains its ascendancy, that charisma makes the continued domination of its borderlands easy to maintain.

It’s when the ascendancy fails and the charisma crumbles that things start to get difficult. Toynbee uses a neat if untranslatable Latin pun to denote the difference: the charisma of a successful imperial power makes its borderlands a limen or doorway, while the weakening of its power and the collapse of its charisma compels it to replace the limen with a limes, a defensive wall. Very often, in fact, it’s when a physical wall goes up along the border that the imperial power, in effect, serves notice to its historians that its days are numbered.

Once the wall goes up, literally or figuratively, the focus shifts to the lands immediately outside it, and those lands go through a series of utterly predictable stages. As economic and political stresses mount along the boundary, social order collapses and institutions disintegrate, leaving power in the hands of a distinctive social form, the warband—a body of mostly young men whose sole trade is violence, and who are bound by personal loyalties to a charismatic warlord. At first, nascent warbands strive mostly with one another and with the crumbling institutions of their own countries, but before long their attention turns to the much richer pickings to be found on the other side of the wall. Raids and counter-raids plunge the region into a rising spiral of violence that the warbands can afford much more easily than the imperial government.

The final stages of the process depend on the broader pattern of decline. In Toynbee’s analysis, a civilization in decline always divides into a dominant minority, which maintains its power by increasingly coercive means, and an internal proletariat—that is, the bulk of the population, who are formally part of the civilization but receive an ever smaller share of its benefits and become ever more alienated from its values and institutions. This condition applies to the imperial state and its inner circle of allies; outside that core lies the world of the external proletariat—in the terms used in earlier posts here, these are the peoples subjected to the business end of the imperial wealth pump, whose wealth flows inward to support the imperial core but who receive few benefits in exchange.

The rise of warband culture drives the collapse of that arrangement. As warbands rise, coalesce, and begin probing across the border, the machinery that concentrates wealth in the hands of the dominant minority begins to break apart; tax revenues plunge as wealth turns into warband plunder, and the imperial state’s capacity to enforce its will dwindles. The end comes when the internal proletariat, pushed to the breaking point by increasingly frantic demands from the dominant minority, throws its support to the external proletariat—or, more to the point, to the successful leadership of one or more of the external proletariat’s biggest warbands—and the empire begins its final collapse into a congeries of protofeudal statelets. Much more often than not, that’s how the final crisis of a civilization unfolds; it’s also one standard way that common or garden variety empires fall, even when they don’t take a civilization down with them.

As the United States faces the end of its overseas empire and the drastic contraction of an economy long inflated by imperial tribute, in other words, it faces a massive difficulty much closer to home: a proud and populous nation on its southern border, with a vibrant culture but disintegrating political institutions, emergent warbands of the classic type, a large and growing demographic presence inside US borders, and a burning sense of resentment directed squarely at the United States. This is not a recipe for a peaceful imperial decline.

Nor is there much hope that the classic pattern can be evaded: the wall has already gone up, in the most literal sense, and the usual consequences are following. The warbands? The US media calls them “drug gangs,” since their involvement in drug smuggling across the border makes good copy. They haven’t yet completed the trajectory that will make them the heirs of the Huns and Visigoths, and in particular, the rock-star charisma that surrounds great warlords in an age of imperial collapse has only just begun to flicker around the most successful leaders of the nascent Mexican warbands. Give it time; the glorification of the gangster life that pervades popular culture toward the bottom of the socioeconomic pyramid these days shows that the seeds of that change have long since been planted.

Can anything be done to prevent this from proceeding all the way to its normal completion? At this stage in the game, probably not. An empire in the days of its power can sometimes stop the spiral by conquering the entire region—not merely the border area, but all the way out to the nearest major geographical barrier—and absorbing it fully into the imperial system; that’s why Gaul, which had been a source of constant raids against Roman interests early on, didn’t produce many warbands of its own in the years of decline until it was conquered and settled by Germanic tribes from points further east. Had the United States conquered all of Mexico in the 1840s, admitted its states into the Union, and integrated Mexican society fully into the American project, that might have worked, but it’s far too late in the day for that; the polarization of the borderlands is already a fact, so is the bitterness of a dispossessed people, and so is the ongoing unraveling of American power.

The other endpoint of the process—the only other endpoint of the process that can be found anywhere in recorded history—is the collapse of the imperial power. The United States has prepared plenty of other disasters for itself, by way of its unusually clueless choices in recent decades, and some of them are likely to hit well before the defense of the southern border becomes its most pressing and insoluble security problem. Still, I would encourage those of my readers who live in the dryland West, especially those within a state or so of the southern border, to keep an eye open for the first tentative raids, and perhaps to read up on what happened to those parts of the Roman Empire most directly exposed to warband incursions in the twilight years of Roman rule.

I would also like to ask any of my readers who are incensed by the above to stop, take a deep breath, and pay attention to what is and is not being said here. Again, the shrill rhetoric of moral judgment that treats every political question as an opportunity for self-righteous indignation, popular as it is, has no particular value in this context. More than a century and a half ago, American politicians decided to go to war with Mexico; over the next century or so, as a result of that decision and its cascading consequences, the social order basic to any viable society will most likely be shredded over a sizable part of what is now the United States, and stay that way for a good long time. That’s simply one of the things that can happen when an empire falls, and it’s something many of us can expect to see here in America in the years ahead.

End of the World of the Week #50

As previous entries in this series have shown, predicting the end of the world is a chancy business, and your likelihood of being proved wrong and made to eat crow is very high. There’s at least one way to avoid that awkward detail, though—make sure you don’t survive to see the failure of the prophecy—and a certain number of apocalyptic true believers have used that escape hatch.

The Order of the Solar Temple—l’Ordre du Temple Solaire, for purists—was one of those. It emerged out of the New Age scene in the late 1980s, attracting a wealthy clientele in Quebec and a variety of European countries with a free mix of New Age philosophy and rituals borrowed from a range of occult traditions. Its founders, Luc Jouret and Joseph Di Mambro, started with a set of utopian fantasies of the usual sort, but as time passed and a New Age of peace and brotherhood unaccountably failed to dawn, they strayed further and further into the apocalyptic flip side of those fantasies. By the early 1990s the Solar Temple was preaching that the middle of that decade would see vast environmental catastrophes that would exterminate most if not all of the human race.

Most prophets of doom prefer to wait around, like Harold Camping, to see the end arrive, but Jouret, Di Mambro, and many of their followers were made of sterner stuff. That’s why they killed themselves en masse over a period of a few days early in October, 1994. The vast environmental catastrophes failed to arrive, of course, but that was no longer anything Jouret or Di Mambro had to worry about.

— for more failed end time prophecies, see my book Apocalypse Not

On The Border

[John Michael Greer]

Written by testudoetlepus

November 29th, 2012 at 4:19 pm