Gramercy Images News

A Financial Novelty weblog

Archive for the ‘The Archdruid Report’ tag

Dark Age America: The Collapse of Political Complexity

without comments

 


 

by John Michael Greer

The senility that afflicts ruling elites in their last years, the theme of the previous post in this sequence, is far from the only factor leading the rich and influential members of a failing civilization to their eventual destiny as lamppost decorations or some close equivalent. Another factor, at least as important, is a lethal mismatch between the realities of power in an age of decline and the institutional frameworks inherited from a previous age of ascent.

That sounds very abstract, and appropriately so. Power in a mature civilization is very abstract, and the further you ascend the social ladder, the more abstract it becomes. Conspiracy theorists of a certain stripe have invested vast amounts of time and effort in quarrels over which specific group of people it is that runs everything in today’s America. All of it was wasted, because the nature of power in a mature civilization precludes the emergence of any one center of power that dominates all others.

Look at the world through the eyes of an elite class and it’s easy to see how this works. Members of an elite class compete against one another to increase their own wealth and influence, and form alliances to pool resources and counter the depredations of their rivals. The result, in every human society complex enough to have an elite class in the first place, is an elite composed of squabbling factions that jealously resist any attempt at further centralization of power. In times of crisis, that resistance can be overcome, but in less troubled times, any attempt by an individual or faction to seize control of the whole system faces the united opposition of the rest of the elite class.

One result of the constant defensive stance of elite factions against each other is that as a society matures, power tends to pass from individuals to institutions. Bureaucratic systems take over more and more of the management of political, economic, and cultural affairs, and the policies that guide the bureaucrats in their work slowly harden until they are no more subject to change than the law of gravity. Among its other benefits to the existing order of society, this habit—we may as well call it policy mummification—limits the likelihood that an ambitious individual can parlay control over a single bureaucracy into a weapon against his rivals.

Our civilization is no exception to any of this. In the modern industrial world, some bureaucracies are overtly part of the political sphere; others—we call them corporations—are supposedly apart from government, and still others like to call themselves “non-governmental organizations” as a form of protective camouflage. They are all part of the institutional structure of power, and thus function in practice as arms of government. They have more in common than this; most of them have the same hierarchical structure and organizational culture; those that are large enough to matter have executives who went to the same schools, share the same values, and crave the same handouts from higher up the ladder. No matter how revolutionary their rhetoric, for that matter, upsetting the system that provides them with their status and its substantial benefits is the last thing any of them want to do.

All these arrangements make for a great deal of stability, which the elite classes of mature civilizations generally crave. The downside is that it’s not easy for a society that’s proceeded along this path to change its ways to respond to new circumstances. Getting an entrenched bureaucracy to set aside its mummified policies in the face of changing conditions is generally so difficult that it’s often easier to leave the old system in place while redirecting all its important functions to another, newly founded bureaucracy oriented toward the new policies. If conditions change again, the same procedure repeats, producing a layer cake of bureaucratic organizations that all supposedly exist to do the same thing.

Consider, as one example out of many, the shifting of responsibility for US foreign policy over the years. Officially, the State Department has charge of foreign affairs; in practice, its key responsibilities passed many decades ago to the staff of the National Security Council, and more recently have shifted again to coteries of advisers assigned to the Office of the President. In each case, what drove the shift was the attachment of the older institution to a set of policies and procedures that stopped being relevant to the world of foreign policy—in the case of the State Department, the customary notions of old-fashioned diplomacy; in the case of the National Security Council, the bipolar power politics of the Cold War era—but could not be dislodged from the bureaucracy in question due to the immense inertia of policy mummification in institutional frameworks.

The layered systems that result are not without their practical advantages to the existing order. Multiple bureaucracies provide even more stability than a single bureaucracy, since it’s often necessary for the people who actually have day-to-day responsibility for this or that government function to get formal approval from the top officials of the agency or agencies that used to have that responsibility. Even when those officials no longer have any formal way to block a policy they don’t like, the personal and contextual nature of elite politics means that informal options usually exist. Furthermore, since the titular headship of some formerly important body such as the US State Department confers prestige but not power, it makes a good consolation prize to be handed out to also-rans in major political contests, a place to park well-connected incompetents, or what have you.

Those of my readers who recall the discussion of catabolic collapse three weeks ago will already have figured out one of the problems with the sort of system that results from the processes just sketched out: the maintenance bill for so baroque a form of capital is not small. In a mature civilization, a large fraction of available resources and economic production ends up being consumed by institutions that no longer have any real function beyond perpetuating their own existence and the salaries and prestige of their upper-level functionaries. It’s not unusual for the maintenance costs of unproductive capital of this kind to become so great a burden on society that the burden itself forces a crisis—that was one of the major forces that brought on the French Revolution, for instance. Still, I’d like to focus for a moment on a different issue, which is the effect that the institutionalization of power and the multiplication of bureaucracy has on the elites who allegedly run the system from which they so richly benefit.

France in the years leading up to the Revolution makes a superb example, one that John Kenneth Galbraith discussed with his trademark sardonic humor in his useful book The Culture of Contentment. The role of the ruling elite in pre-1789 France was occupied by close equivalents of the people who fill that same position in America today: the “nobility of the sword,” the old feudal aristocracy, who had roughly the same role as the holders of inherited wealth in today’s America, and the “nobility of the robe,” who owed their position to education, political office, and a talent for social climbing, and thus had roughly the same role as successful Ivy League graduates do here and now. These two elite classes sparred constantly against each other, and just as constantly competed against their own peers for wealth, influence, and position.

One of the most notable features of both sides of the French elite in those days was just how little either group actually had to do with the day-to-day management of public affairs, or for that matter of their own considerable wealth. The great aristocratic estates of the time were bureaucratic societies in miniature, ruled by hierarchies of feudal servitors and middle-class managers, while the hot new financial innovation of the time, the stock market, allowed those who wanted their wealth in a less tradition-infested form to neglect every part of business ownership but the profits. Those members of the upper classes who held offices in government, the church, and the other venues of power presided decorously over institutions that were perfectly capable of functioning without them.

The elite classes of mature civilizations almost always seek to establish arrangements of this sort, and understandably so. It’s easy to recognize the attractiveness of a state of affairs in which the holders of wealth and influence get all the advantages of their positions and have to put up with as few as possible of the inconveniences thereof. That said, this attraction is also a death wish, because it rarely takes the people who actually do the work long to figure out that a ruling class in this situation has become entirely parasitic, and that society would continue to function perfectly well were something suitably terminal to happen to the titular holders of power.

This is why most of the revolutions in modern history have taken place in nations in which the ruling elite has followed its predilections and handed over all its duties to subordinates. In the case of the American revolution, the English nobility had been directly involved in colonial affairs in the first century or so after Jamestown. Once it left the colonists to manage their own affairs, the latter needed very little time to realize that the only thing they had to lose by seeking independence was the steady hemorrhage of wealth from the colonies to England. In the case of the French and Russian revolutions, much the same thing happened without the benefit of an ocean in the way: the middle classes who actually ran both societies recognized that the monarchy and aristocracy had become disposable, and promptly disposed of them once a crisis made it possible to do so.

The crisis just mentioned is a significant factor in the process. Under normal conditions, a society with a purely decorative ruling elite can keep on stumbling along indefinitely on sheer momentum. It usually takes a crisis—Britain’s military response to colonial protests in 1775, the effective bankruptcy of the French government in 1789, the total military failure of the Russian government in 1917, or what have you—to convince the people who actually handle the levers of power that their best interests no longer lie with their erstwhile masters. Once the crisis hits, the unraveling of the institutional structures of authority can happen with blinding speed, and the former ruling elite is rarely in a position to do anything about it. All they have ever had to do, and all they know how to do, is issue orders to deferential subordinates. When there are none of these latter to be found, or (as more often happens) when the people to whom the deferential subordinates are supposed to pass the orders are no longer interested in listening, the elite has no options left.

The key point to be grasped here is that power is always contextual. A powerful person is a person able to exert particular kinds of power, using particular means, on some particular group of other people, and someone thus can be immensely powerful in one setting and completely powerless in another. What renders the elite classes of a mature society vulnerable to a total collapse of power is that they almost always lose track of this unwelcome fact. Hereditary elites are particularly prone to fall into the trap of thinking of their position in society as an accurate measure of their own personal qualifications to rule, but it’s also quite common for those who are brought into the elite from the classes immediately below to think of their elevation as proof of their innate superiority. That kind of thinking is natural for elites, but once they embrace it, they’re doomed.

It’s dangerous enough for elites to lose track of the contextual and contingent nature of their power when the mechanisms through which power is enforced can be expected to remain in place—as was the case in the American colonies in 1776, France in 1789, and Russia in 1917. It’s far more dangerous if the mechanisms of power themselves are in flux. That can happen for any number of reasons, but the one that’s of central importance to the theme of this series of posts is the catabolic collapse of a declining civilization, in which the existing mechanisms of power come apart because their maintenance costs can no longer be met.

That poses at least two challenges to the ruling elite, one obvious and the other less so. The obvious one is that any deterioration in the mechanisms of power limits the ability of the elite to keep the remaining mechanisms of power funded, since a great deal of power is always expended in paying the maintenance costs of power. Thus in the declining years of Rome, for example, the crucial problem the empire faced was precisely that the sprawling system of imperial political and military administration cost more than the imperial revenues could support, but the weakening of that system made it even harder to collect the revenues on which the rest of the system depended, and forced more of what money there was to go for crisis management. Year after year, as a result, roads, fortresses, and the rest of the infrastructure of Roman power sank under a burden of deferred maintenance and malign neglect, and the consequences of each collapse became more and more severe because there was less and less in the treasury to pay for rebuilding when the crisis was over.
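The feedback loop in that Roman example can be made concrete with a toy calculation. The sketch below is not a formal model of catabolic collapse, just a minimal illustration with invented coefficients: revenue and maintenance costs both scale with the surviving infrastructure, and whatever maintenance goes unpaid becomes decay.

```python
# Toy illustration (invented numbers): infrastructure yields revenue,
# revenue pays maintenance; any shortfall erodes the infrastructure,
# which cuts the next year's revenue in turn.

capital = 100.0  # stock of roads, fortresses, administrative machinery
for year in range(1, 16):
    revenue = 0.5 * capital           # what the weakened system can still collect
    maintenance_due = 0.6 * capital   # what keeping all of it running would cost
    shortfall = max(0.0, maintenance_due - revenue)
    capital = max(0.0, capital - shortfall)   # deferred maintenance becomes decay
    print(f"year {year:2d}: capital {capital:6.1f}, revenue {revenue:5.1f}")
```

Because the shortfall is proportional to whatever stock remains, the decline compounds year after year: exactly the pattern of deferred maintenance feeding on itself described above.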

That’s the obvious issue. More subtle is the change in the nature of power that accompanies the decay in the mechanisms by which it’s traditionally been used. Power in a mature civilization, as already noted, is very abstract, and the people who are responsible for administering it at the top of the social ladder rise to those positions precisely because of their ability to manage abstract power through the complex machinery that a mature civilization provides them. As the mechanisms collapse, though, power stops being abstract in a hurry, and the skills that allow the manipulation of abstract power have almost nothing in common with the skills that allow concrete power to be wielded.

Late imperial Rome, again, is a fine example. There, as in other mature civilizations, the ruling elite had a firm grip on the intricate mechanisms of social control at their uppermost and least tangible end. The inner circle of each imperial administration—which sometimes included the emperor himself, and sometimes treated him as a sock puppet—could rely on sprawling many-layered civil and military bureaucracies to put their orders into effect. They were by and large subtle, ruthless, well-educated men, schooled in the intricacies of imperial administration, oriented toward the big picture, and completely dependent on the obedience of their underlings and the survival of the Roman system itself.

The people who replaced them, once the empire actually fell, shared none of these characteristics except the ruthlessness. The barbarian warlords who carved up the corpse of Roman power had a completely different set of skills and characteristics: raw physical courage, a high degree of competence in the warrior’s trade, and the kind of charisma that attracts cooperation and obedience from those who have many other options. Their power was concrete, personal, and astonishingly independent of institutional forms. That’s why Odoacer, whose remarkable career was mentioned in an earlier post in this sequence, could turn up alone in a border province, patch together an army out of a random mix of barbarian warriors, and promptly lead them to the conquest of Italy.

There were a very few members of the late Roman elite who could exercise power in the same way as Odoacer and his equivalents, and they’re the exceptions that prove the rule. The greatest of them, Flavius Aetius, spent many years of his youth as a hostage in the royal courts of the Visigoths and the Huns and got his practical education there, rather than in Roman schools. He was for all practical purposes a barbarian warlord who happened to be Roman by birth, and played the game as well as any of the other warlords of his age. His vulnerabilities were all on the Roman side of the frontier, where the institutions of Roman society still retained a fingernail grip on power, and so—having defeated the Visigoths, the Franks, the Burgundians, and the massed armies of Attila the Hun, all for the sake of Rome’s survival—he was assassinated by the emperor he served.

Fast forward some fifteen centuries and it’s far from difficult to see how the same pattern of elite extinction through the collapse of political complexity will likely work out here in North America. The ruling elites of our society, like those of the late Roman Empire, are superbly skilled at manipulating and parasitizing a fantastically elaborate bureaucratic machine which includes governments, business firms, universities, and many other institutions among its components. That’s what they do, that’s what they know how to do, and that’s what all their training and experience has prepared them to do. Thus their position is exactly equivalent to that of French aristocrats before 1789, but they’re facing the added difficulty that the vast mechanism on which their power depends has maintenance costs that their civilization can no longer meet. As the machine fails, so does their power.

Nor are they particularly well prepared to make the transition to a radically different way of exercising power. Imagine for a moment that one of the current US elite—an executive from a too-big-to-fail investment bank, a top bureaucrat from inside the DC beltway, a trust-fund multimillionaire with a pro forma job at the family corporation, or what have you—were to turn up in some chaotic failed state on the fringes of the industrial world, with no money, no resources, no help from abroad, and no ticket home. What’s the likelihood that, without anything other than whatever courage, charisma, and bare-knuckle fighting skills he might happen to have, some such person could equal Odoacer’s feat, win the loyalty and obedience of thousands of gang members and unemployed mercenaries, and lead them in a successful invasion of a neighboring country?

There are people in North America who could probably carry off a feat of that kind, but you won’t find them in the current ruling elite. That in itself defines part of the path to dark age America: the replacement of a ruling class that specializes in managing abstract power through institutions with a ruling class that specializes in expressing power up close and in person, using the business end of the nearest available weapon. The process by which the new elite emerges and elbows its predecessors out of the way, in turn, is among the most reliable dimensions of decline and fall; we’ll talk about it next week.

 

 

Dark Age America: The Collapse of Political Complexity

[John Michael Greer]

| Gramercy Images |

Written by testudoetlepus

October 9th, 2014 at 4:06 pm

The End of Employment

without comments

 

by John Michael Greer

Nothing is easier, as the Long Descent begins to pick up speed around us, than giving in to despair—and nothing is more pointless. Those of us who are alive today are faced with the hugely demanding task of coping with the consequences of industrial civilization’s decline and fall, and saving as many as possible of the best achievements of the last few centuries so that they can cushion the descent and enrich the human societies of the far future. That won’t be easy; so? The same challenge has been faced many times before, and quite often it’s been faced with relative success.

The circumstances of the present case are in some ways more difficult than past equivalents, to be sure, but the tools and the knowledge base available to cope with them are almost incomparably greater. All in all, factoring in the greater challenges and the greater resources, it’s probably fair to suggest that the challenge of our time is about on a par with other eras of decline and fall. The only question that still remains to be settled is how many of the people who are awake to the imminence of crisis will rise to the challenge, and how many will fail to do so.

The suicide of peak oil writer Mike Ruppert two days ago puts a bit of additional emphasis on that last point. I never met Ruppert, though we corresponded back in the days when his “From The Wilderness” website was one of the few places on the internet that paid any attention at all to peak oil, and I don’t claim to know what personal demons drove him to put a bullet through his brain. Over the last eight years, though, as the project of this blog has brought me into contact with more and more people who are grappling with the predicament of our time, I’ve met a great many people whose plans for dealing with a postpeak world amount to much the same thing. Some of them are quite forthright about it, which at least has the virtue of honesty. Rather more of them conceal the starkness of that choice behind a variety of convenient evasions, the insistence that we’re all going to die soon anyway being far and away the most popular of these just now.

I admit to a certain macabre curiosity about how that will play out in the years ahead. I’ve suspected for a while now, for example, that the baby boomers will manage one final mediagenic fad on the way out, and the generation that marked its childhood with coonskin caps and hula hoops and its puberty with love beads and Beatlemania will finish with a fad for suicide parties, in which attendees reminisce to the sound of the tunes they loved in high school, then wash down pills with vodka and help each other tie plastic bags over their heads. Still, I wonder how many people will have second thoughts once every other option has gone whistling down the wind, and fling themselves into an assortment of futile attempts to have their cake when they’ve already eaten it right down to the bare plate. We may see some truly bizarre religious movements, and some truly destructive political ones, before those who go around today insisting that they don’t want to live in a deindustrial world finally get their wish.

There are, of course, plenty of other options. The best choice for most of us, as I’ve noted here in previous posts, follows a strategy I’ve described wryly as “collapse first and avoid the rush”: getting ahead of the curve of decline, in other words, and downshifting to a much less extravagant lifestyle while there’s still time to pick up the skills and tools needed to do it competently. Despite the strident insistence from defenders of the status quo that anything less than business as usual amounts to heading straight back to the caves, it’s entirely possible to have a decent and tolerably comfortable life on a tiny fraction of the energy and resource base that middle class Americans think they can’t possibly do without. Mind you, you have to know how to do it, and that’s not the sort of knowledge you can pick up from a manual, which is why it’s crucial to start now and get through the learning curve while you still have the income and the resources to cushion the impact of the inevitable mistakes.

This is more or less what I’ve been saying for eight years now. The difficulty at this stage in the process, though, is that a growing number of Americans are running out of time. I don’t think it’s escaped the notice of many people in this country that despite all the cheerleading from government officials, despite all the reassurances from dignified and clueless economists, despite all those reams of doctored statistics gobbled down whole by the watchdogs-turned-lapdogs of the media and spewed forth undigested onto the evening news, the US economy is not getting better. Outside a few privileged sectors, times are hard and getting harder; more and more Americans are slipping into the bleak category of the long-term unemployed, and a great many of those who can still find employment work at part-time positions for sweatshop wages with no benefits at all.

Despite all the same cheerleading, reassurances, and doctored statistics, furthermore, the US economy is not going to get better: not for more than brief intervals by any measure, and not at all if “better” means returning to some equivalent of America’s late 20th century boomtime. Those days are over, and they will not return. That harsh reality is having an immediate impact on some of my readers already, and that impact will only spread as time goes on. For those who have already been caught by the economic downdrafts, it’s arguably too late to collapse first and avoid the rush; willy-nilly, they’re already collapsing as fast as they can, and the rush is picking up speed around them as we speak.

For those who aren’t yet in that situation, the need to make changes while there’s still time to do so is paramount, and a significant number of my readers seem to be aware of this. One measure of that is the number of requests for personal advice I field, which has gone up steeply in recent months. Those requests cover a pretty fair selection of the whole gamut of human situations in a failing civilization, but one question has been coming up more and more often of late: the question of what jobs might be likely to provide steady employment as the industrial economy comes apart.

That’s a point I’ve been mulling over of late, since its implications intersect the whole tangled web in which our economy and society are snared just now. In particular, it assumes that the current way of bringing work together with workers, and turning the potentials of human mind and muscle toward the production of goods and services, is likely to remain in place for the time being, and it’s becoming increasingly clear to me that this won’t be the case.

It’s important to be clear on exactly what’s being discussed here. Human beings have always had to produce goods and services to stay alive and keep their families and communities going; that’s not going to change. In nonindustrial societies, though, most work is performed by individuals who consume the product of their own labor, and most of the rest is sold or bartered directly by the people who produce it to the people who consume it. What sets the industrial world apart is that a third party, the employer, inserts himself into this process, hiring people to produce goods and services and then selling those goods and services to buyers. That’s employment, in the modern sense of the word; most people think of getting hired by an employer, for a fixed salary or wage, to produce goods and services that the employer then sells to someone else, as the normal and natural state of affairs—but it’s a state of affairs that is already beginning to break down around us, because the surpluses that make that kind of employment economically viable are going away.

Let’s begin with the big picture. In any human society, whether it’s a tribe of hunter-gatherers, an industrial nation-state, or anything else, people apply energy to raw materials to produce goods and services; this is what we mean by the word “economy.” The goods and services that any economy can produce are strictly limited by the energy sources and raw materials that it can access.

A principle that ecologists call Liebig’s law of the minimum is relevant here: the amount of anything that a given species or ecosystem can produce in a given place and time is limited by whichever resource is in shortest supply. Most people get that when thinking about the nonhuman world; it makes sense that plants can’t use extra sunlight to make up for a shortage of water, and that you can’t treat soil deficient in phosphates by adding extra nitrates. It’s when you apply this same logic to human societies that the mental gears jam up, because we’ve been told so often that one resource can always be substituted for another that most people believe it without a second thought.
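Liebig’s law is easy to state as arithmetic. The sketch below is a minimal illustration in Python; the resource names and quantities are invented purely for the example.

```python
# Liebig's law of the minimum: output is capped by whichever input is
# scarcest relative to how much of it one unit of output requires.

def liebig_output(available, per_unit):
    """Maximum units producible from the given resource stocks."""
    return min(available[r] / per_unit[r] for r in per_unit)

# Invented figures: a crop needing water, nitrogen, and sunlight.
available = {"water": 500.0, "nitrogen": 80.0, "sunlight": 10000.0}
per_unit = {"water": 10.0, "nitrogen": 2.0, "sunlight": 50.0}

print(liebig_output(available, per_unit))  # 40.0: nitrogen is the limit

available["sunlight"] *= 10                # extra sunlight changes nothing
print(liebig_output(available, per_unit))  # still 40.0
```

Extra sunlight cannot substitute for missing nitrogen; only more of the limiting resource raises the output, which is the point that applies just as mercilessly to human economies.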

What’s going on here, though, is considerably more subtle than current jargon reflects. Examine most of the cases of resource substitution that find their way into economics textbooks, and you’ll find that what’s happened is that a process of resource extraction that uses less energy on a scarcer material has been replaced by another process that takes more energy but uses more abundant materials. The shift from high-quality iron ores to low-grade taconite that reshaped the iron industry in the 20th century, for example, was possible because ever-increasing amounts of highly concentrated energy could be put into the smelting process without making the resulting iron too expensive for the market.

The point made by this and comparable examples is applicable across the board to what I’ve termed technic societies, that subset of human societies—ours is the first, though probably not the last—in which a large fraction of total energy per capita comes from nonbiological sources and is put to work by way of machines rather than human or animal muscles. Far more often than not, in such societies, concentrated energy is the limiting resource. Given an abundant enough supply of concentrated energy at a low enough price, it would be possible to supply a technic society with raw materials by extracting dissolved minerals from seawater or chewing up ordinary rock to get a part per million or so of this or that useful element. Lacking that—and there are good reasons to think that human societies will always be lacking that—access to concentrated energy is where Liebig’s law bites down hard.

Another way to make this same point is to think of how much of any given product a single worker can make in a day using a set of good hand tools, and to compare that with the quantity of the same thing that the same worker could make using the successive generations of factory equipment, from the steam-driven and belt-fed power tools of the late 19th century straight through to the computerized milling machines and assembly-line robots of today. The difference can be expressed most clearly as a matter of the amount of energy being applied directly and indirectly to the manufacturing process—not merely the energy driving the tools through the manufacturing process, but the energy that goes into manufacturing and maintaining the tools, supporting the infrastructure needed for manufacture and maintenance, and so on through the whole system involved in the manufacturing process.

Maverick economist E.F. Schumacher, whose work has been discussed in this blog many times already, pointed out that the cost per worker of equipping a workplace is one of the many crucial factors that mainstream economic thought invariably neglects. That cost is usually expressed in financial terms, but underlying the abstract tokens we call money is a real cost in energy, expressed in terms of the goods and services that have to be consumed in the process of equipping and maintaining the workplace. If you have energy to spare, that’s not a problem; if you don’t, on the other hand, you’re actually better off using a less complex technology—what Schumacher called “intermediate technology” and the movement in which I studied green wizardry thirty years ago called “appropriate technology.”

The cost per worker of equipping a workplace, in turn, also has a political dimension—a point that Schumacher did not neglect, though nearly all other economists pretend that it doesn’t exist. The more costly it is to equip a workplace, the more certain it is that workers won’t be able to set themselves up in business, and the more control the very rich will then have over economic production and the supply of jobs. As Joseph Tainter pointed out in The Collapse of Complex Societies, social complexity correlates precisely with social hierarchy; one of the functions of complexity, in the workplace as elsewhere, is thus to maintain existing social pecking orders.

Schumacher’s arguments, though, focused on the Third World nations of his own time, which had very little manufacturing capacity at all—most of them, remember, had been colonies of European empires, assigned the role of producing raw materials and buying finished products from the imperial center as part of the wealth pump that drove them into grinding poverty while keeping their imperial overlords rich. His advice centered on how those client nations could build their own economies and extract themselves from the political grip of their former overlords, rather than importing high-tech factories that their upper classes would inevitably control. The situation is considerably more challenging when your economy is geared to immense surpluses of concentrated energy, and the supply of energy begins to run short—and of course that’s the situation we’re in today.

Even if it were just a matter of replacing factory equipment, that would be a huge challenge, because all those expensive machines—not to mention the infrastructure that manufactures them, maintains them, supplies them, and integrates their products into the wider economy—count as sunk costs, subject to what social psychologists call the “Concorde fallacy,” the conviction that it’s less wasteful to keep on throwing money into a failing project than to cut your losses and do something else. The real problem is that it’s not just factory equipment; the entire economy has been structured from the ground up to use colossal amounts of highly concentrated energy, and everything that’s been invested in that economy since the beginning of the modern era thus counts as a sunk cost to one degree or another.

What makes this even more challenging is that very few people in the modern industrial world actually produce goods and services for consumers, much less for themselves, by applying energy to raw materials. The vast majority of today’s employees, and in particular all those who have the wealth and influence that come with high social status, don’t do this. Executives, brokers, bankers, consultants, analysts, salespeople—well, I could go on for pages: the whole range of what used to be called white-collar jobs exists to support the production of goods and services by the working joes and janes managing all the energy-intensive machinery down there on the shop floor. So does the entire vast maze of the financial industry, and so do the legions of government bureaucrats—local, state, and federal—who manage, regulate, or oversee one or another aspect of economic activity.

All these people are understandably just as interested in keeping their jobs as the working joes and janes down there on the shop floor, and yet the energy surpluses that made it economically viable to perch such an immensely complex infrastructure on top of the production of goods and services for consumers are going away. The result is a frantic struggle on everyone’s part to make sure that the other guy loses his job first. It’s a struggle that all of them will ultimately lose—as the energy surplus needed to support it dwindles away, so will the entire system that’s perched on that high but precarious support—and so, as long as that system remains in place, getting hired by an employer, paid a regular wage or salary, and given work and a workplace to produce goods and services that the employer then sells to someone else, is going to become increasingly rare and increasingly unrewarding.

That transformation is already well under way. Nobody I know personally who works for an employer in the sense I’ve just outlined is prospering in today’s American economy. Most of the people I know who are employees in the usual sense of the word are having their benefits slashed, their working conditions worsened, their hours cut, and their pay reduced by one maneuver or another, and the threat of being laid off is constantly hovering over their heads. The few exceptions are treading water and hoping to escape the same fate. None of this is accidental, and none of it is merely the result of greed on the part of the very rich, though admittedly the culture of executive kleptocracy at the upper end of the American social pyramid is making things a good deal worse than they might otherwise be.

The people I know who are prospering right now are those who produce goods and services for their own use, and provide goods and services directly to other people, without having an employer to provide them with work, a workplace, and a regular wage or salary. Some of these people have to stay under the radar screen of the current legal and regulatory system, since the people who work in that system are trying to preserve their own jobs by making life difficult for those who try to do without their services. Others can do things more openly. All of them have sidestepped as many as possible of the infrastructure services that are supposed to be part of an employee’s working life—for example, they aren’t getting trained at universities, since the US academic industry these days is just another predatory business sector trying to keep itself afloat by running others into the ground, and they aren’t going to banks for working capital for much the same reason. They’re using their own labor, their own wits, and their own personal connections with potential customers, to find a niche in which they can earn the money (or barter for the goods) they need or want.

I’d like to suggest that this is the wave of the future—not least because this is how economic life normally operates in nonindustrial societies, where the vast majority of people in the workforce are directly engaged in the production of goods and services for themselves and their own customers. The surplus that supports all those people in management, finance, and so on is a luxury that nonindustrial societies don’t have. In the most pragmatic of economic senses, collapsing now and avoiding the rush involves getting out of a dying model of economics before it drags you down, and finding your footing in the emerging informal economy while there’s still time to get past the worst of the learning curve.

Playing by the rules of a dying economy, that is, is not a strategy with a high success rate or a long shelf life. Those of my readers who are still employed in the usual sense of the term may choose to hold onto that increasingly rare status, but it’s not wise for them to assume that such arrangements will last indefinitely; using the available money and other resources to get training, tools, and skills for some other way of getting by would probably be a wise strategy. Those of my readers who have already fallen through the widening cracks of the employment economy will have a harder row to hoe in many cases; for them, the crucial requirement is getting access to food, shelter, and other necessities while figuring out what to do next and getting through any learning curve that might be required.

All these are challenges; still, like the broader challenge of coping with the decline and fall of a civilization, they are challenges that countless other people have met in other places and times. Those who are willing to set aside currently popular fantasies of entitlement and the fashionable pleasures of despair will likely be in a position to do the same thing this time around, too.

 

The End of Employment

John Michael Greer

| Gramercy Images |

Written by testudoetlepus

April 21st, 2014 at 7:13 pm

Man, Conqueror of Nature, Dead at 408

without comments


EARLY SELFIE, a photo by WilliamBanzai7/Colonel Flick on Flickr.

 

by John Michael Greer

Man, the conqueror of Nature, died Monday night of a petroleum overdose, the medical examiner’s office confirmed this morning. The abstract representation of the human race was 408 years old. The official announcement has done nothing to quell the rumors of suicide and substance abuse that have swirled around the death scene since the first announcement yesterday morning, adding new legal wrinkles to the struggle already under way over Man’s inheritance.

Man’s closest associates disagree about what happened. His longtime friend and confidant Technology thinks it was suicide. “Sure, Man liked to have a good time,” he said at a press conference Tuesday evening, “and he was a pretty heavy user, but it wasn’t like he was out of control or anything. No, I’m sure he did it on purpose. Just a couple of weeks ago we were hanging out at his place, looking up at the moon and talking about the trips we made out there, and he turned to me and said, ‘You know, Tech, that was a good time—a really good time. I wonder if I’ll ever do anything like that again.’ He got into moods like that more and more often in the last few years. I tried to cheer him up, talking about going to Mars or what have you, and he’d go along with it but you could tell his heart wasn’t in it.”

Other witnesses told a different story. “It was terrifying,” said a housekeeper who requested that her name not be given. “He was using more and more of the stuff every day, shooting it up morning, noon and night, and when his connections couldn’t get him as much as he wanted, he’d go nuts. You’d hear him screaming at the top of his lungs and pounding his fists on the walls. Everybody on the staff would hide whenever that happened, and it happened more and more often—the amount he was using was just unbelievable. Some of his friends tried to talk him into getting help, or even just cutting back a little on his petroleum habit, but he wouldn’t listen.”

The medical examiner’s office and the police are investigating Man’s death right now. Until their report comes out, the tragic end of humanity’s late self-image remains shrouded in mystery and speculation.

 

 

A Tumultuous Family Saga

“He was always a rebel,” said Clio, the muse of history, in an exclusive interview in her office on Parnassus this morning. “That was partly his early environment, of course. He was born in the household of Sir Francis Bacon, remember, and brought up by some of the best minds of seventeenth-century Europe; an abstract image of humanity raised by people like that wasn’t likely to sit back and leave things as they were, you know. Still, I think there were strong family influences too. His father was quite the original figure himself, back in the day.”

Though almost forgotten nowadays, Man’s father Everyman, the abstract representation of medieval humanity, was as mediagenic in his own time as his son became later on. The star of a wildly popular morality play and the subject of countless biographies, Everyman was born in extreme poverty in a hovel in post-Roman Europe, worked his way up to become a wealthy and influential figure in the Middle Ages and Renaissance, then stepped aside from his financial and political affairs to devote his last years to religious concerns. Savage quarrels between father and son kept the broadsheet and pamphlet press fed with juicy stories all through the seventeenth and eighteenth centuries, and eventually led to their final breach over Darwin’s theory of evolution in 1859.

By that time Man was already having problems with substance abuse. “He was just using coal at first,” Technology reminisced. “Well, let’s be fair, we both were. That was the hot new drug in those days. It was cheap, you could get it without too much hassle, and everybody on the cutting edge was using it. I remember one trip we took together—it was on one of the early railroads, at thirty miles an hour. We thought that was really fast. Were we innocent back then, or what?”

Clio agreed with that assessment. “I don’t think Man had any idea what he was getting into, when he started abusing coal,” she said. “It was an easy habit to fall into, very popular in avant-garde circles just then, and nobody yet knew much about the long term consequences of fossil fuel abuse. Then, of course, he started his campaign to conquer Nature, and he found out very quickly that he couldn’t keep up the pace he’d set for himself without artificial help. That was when the real tragedy began.”

The Conquest of Nature

It’s an open question when Man first decided to conquer Nature. “The biographers all have their own opinions on that,” Clio explained, gesturing at a shelf loaded with books on Man’s dramatic and controversial career. “Some trace it back to the influence of his foster-father Francis Bacon, or the other mentors and teachers he had in his early days. Others say that the inspiration came from the crowd he ran with when he was coming of age in the eighteenth and nineteenth centuries. He used to tell interviewers that it was a family thing, that everyone in his family all the way back to the Stone Age had been trying to conquer Nature and he was just the one who finally succeeded, but that won’t stand up to any kind of scrutiny. Examine the career of Everyman, for example, and you’ll find that he wasn’t interested in conquering Nature; he wanted to conquer himself.”

“The business about conquering Nature?” Technology said. “He got into that back when we were running around being young and crazy. I think he got the idea originally from his foster-father or one of the other old guys who taught him when he was a kid, but as far as I know it wasn’t a big deal to him until later. Now I could be wrong, you know. I didn’t know him that well in those days; I was mostly just doing my thing then, digging mines, building water mills, stuff like that. We didn’t get really close until we both got involved in this complicated coal deal; we were both using, but I was dealing, too, and I could get it cheaper than anybody else—I was using steam, and none of the other dealers knew how to do that. So we got to be friends and we had some really wild times together, and now and then when we were good and ripped, he’d get to talking about how Nature ought to belong to him and one of these days he was going to hire some soldiers and just take it.

“Me, I couldn’t have cared less, except that Man kept on bringing me these great technical problems, really sweet little puzzles, and I’ve always been a sucker for those. He figured out how I was getting the coal for him so cheap, you see, and guessed that I could take those same tricks and use them for his war against Nature. For me, it was just a game: for Nature, against Nature, I couldn’t care less. Just give me a problem and let me get to work on it, and I’m happy.

“But it wasn’t just a game for him. I think it was 1774 when he really put me to work on it. He’d hired some mercenaries by then, and was raising money and getting all kinds of stuff ready for the war. He wanted steam engines so, like the man said, it was steam engine time—I got working on factories, railroads, steamships, all the rest. He already had some of his people crossing the border into Nature to seize bits of territory before then, but the eighteenth century, that’s when the invasion started for real. I used to stand next to him at the big rallies he liked to hold in those days, with all the soldiers standing in long lines, and he’d go into these wild rants about the glorious future we were going to see once Nature was conquered. The soldiers loved it; they’d cheer and grab their scientific instruments and lab coats and go conquer another province of Nature.”

The Triumphant Years

It was in 1859, Technology recalled, that Man first started using petroleum. “He’d just had the big spat with his dad over this Darwin dude: the worst fight they ever had, and in fact Man never spoke to the old man again. Man was still steaming about the fight for days afterwards, and then we heard that this guy named Edwin Drake over in Pennsylvania could get you something that was an even bigger rush than coal. Of course Man had to have some, and I said to myself, hey, I’ll give it a try—and that was all she wrote, baby. Oh, we kept using coal, and a fair bit of it, but there’s nothing like petroleum.

“What’s more, Man figured out that that’s what he needed to finish his conquest of Nature. His mercs had a good chunk of Nature by then, but not all of it, not even half, and Man was having trouble holding some of the territory he’d taken—there were guerrillas behind his lines, that sort of thing. He’d pace around at headquarters, snapping at his staff, trying to figure out how to get the edge he needed to beat Nature once and for all. ‘I’ve gotta have it all, Tech,’ he’d say sometimes, when we were flopped on the couch in his private quarters with a couple of needles and a barrel of petroleum, getting really buzzed. ‘I’ve conquered distance, the land, the surface of the sea—it’s not enough. I want it all.’ And you know, he got pretty close.”

Petroleum was the key, Clio explained. “It wasn’t just that Man used petroleum, all his soldiers and his support staff were using it too, and over the short term it’s an incredibly powerful drug; it gives users a rush of energy that has to be seen to be believed. Whole provinces of Nature that resisted every attack in the first part of the war were overrun once Man started shipping petroleum to his forces. By the 1950s, as a result, the conquest of Nature was all but complete. Nature still had a few divisions holed up in isolated corners where they couldn’t be gotten at by Man’s forces, and partisan units were all over the conquered zone, but those were minor irritations at that point. It was easy enough for Man and his followers to convince themselves that in a little while the last holdouts would be defeated and Nature would be conquered once and for all.

“That’s when reality intervened, though, because all those years of abusing coal, petroleum, and other substances started to catch up with Man. He was in bad shape, and didn’t know it—and then he started having problems feeding his addiction.”

On and Off the Wagon

“I forget exactly how it happened,” Technology recounted. “It was some kind of disagreement with his suppliers—he was getting a lot of his stuff from some Arab guys at that point, and he got into a fight with them over something, and they said, ‘Screw you, man, if you’re going to be like that we’re just not going to do business with you any more.’ So he tried to get the stuff from somebody else, and it turned out the guy from Pennsylvania was out of the business, and the connections he had in Texas and California couldn’t get enough. The Arab guys had a pretty fair corner on the market. So Man went into withdrawal, big time. We got him to the hospital, and the doctor took one look at him and said, ‘You gotta get into rehab, now.’ So me and some of his other friends talked him into it.”

“The records of his stays in rehab are heartbreaking,” Clio said, pulling down a tell-all biography from her shelf. “He’d start getting the drug out of his system, convince himself that he was fine, check himself out, and start using again almost immediately. Then, after a little while, he’d have problems getting a fix, end up in withdrawal, and find his way back into rehab. Meanwhile the war against Nature was going badly as the other side learned how to fight back effectively. There were rumors of ceasefire negotiations, even a peace treaty between him and Nature.”

“I went to see him in rehab one day,” said Technology. “He looked awful. He looked old—like his old man Everyman. He was depressed, too, talking all the time about this malaise thing. The thing is, I think if he’d stuck with it then he could have gotten off the stuff and straightened his life out. I really think he could have done it, and I tried to help. I brought him some solar panels, earth-sheltered housing, neat stuff like that, to try to get him interested in something besides the war on Nature and his petroleum habit. That seemed to cheer him up, and I think all his friends had high hopes for a while.

“Then the next thing I heard, he was out of rehab. He just couldn’t hack it any longer. I went to his place, and there he was, laughing and slapping everybody’s back and full of big ideas and bigger plans, just like before. That’s what it looked like at first, but the magic was gone. He tried to do a comeback career, but he just couldn’t get it back together, and things went downhill from there.”

The Final Years

The last years of Man’s career as representation of the human race were troubled. “The war against Nature wasn’t going well by then,” Clio explained. “Man’s forces were holding onto the most important provinces and cities, but insurgencies were springing up all over—drug-resistant microbes here, herbicide-tolerant weeds there. Morale was faltering, and a growing fraction of Man’s forces in the struggle against Nature no longer believed in what they were doing. They were in it for the money, nothing more, and the money was running out. Between the costs of the war, the costs of Man’s lavish lifestyle, and the rising burden of his substance abuse problem, Man was in deep financial trouble; there’s reason to believe that he may have been engaged in outright fraud to pay his bills during the last few years of his life.”

Meanwhile, Man was becoming increasingly isolated. “He’d turned his back on most of his friends,” said the anonymous housekeeper quoted earlier. “Art, Literature, Philosophy—he stopped talking to any of them, because they kept telling him to get off the stuff and straighten out his life. I remember the last time Science came to visit—she wanted to talk to Man about the state of the atmosphere, and Man literally threw her out of the house and slammed the door in her face. I was working downstairs in the laundry, where you usually can’t hear much, but I could hear Man screaming, ‘I own the atmosphere! I own the planet! I own the solar system! I own the goddam stars! They’re mine, mine, mine—how dare you tell me what to do with my property?’ He went on like that for a while, then collapsed right there in the entry. A couple of us went up, carried him into his bedroom, and got him cleaned up and put to bed. We had to do that pretty often, the last year or so.”

His longtime friend Technology was apparently the last person to see Man alive. “I went over to his place Monday afternoon,” Technology recalled. “I went there pretty often, and we’d do some stuff and hang out, and I’d start rapping about all kinds of crazy stuff, omniscient supercomputers, immortal robot bodies, stuff like that. I told him, ‘Look, Man, if you want to get into stuff like omniscience and immortality, go talk to Religion. That’s her bag, not mine.’ But he didn’t want to do that; he had some kind of falling out with her a while back, you know, and he wanted to hear it from me, so I talked it up. It got him to mellow out and unwind, and that’s what mattered to me.

“Monday, though, we get to talking, and it turns out that the petroleum he had was from this really dirty underground source in North Dakota. I said to him, ‘Man, what the frack were you thinking?’ He just looked at me and said, ‘I’ve gotta have the stuff, Tech. I’ve gotta have the stuff.’ Then he started blubbering, and I reached out to, like, pat his shoulder—and he just blew up at me. He started yelling about how it was my fault he was hooked on petroleum, my fault the war against Nature wasn’t going well, my fault this and that and blah blah blah. Then he got up and stormed out of the room and slammed the door behind him. I should have gone after him, I know I should have, but instead I just shook my head and left. Maybe if I’d gone and tried to talk him down, he wouldn’t have done it.”

“Everything was quiet,” the housekeeper said. “Too quiet. Usually we’d hear Man walking around, or he’d put some music on or something, but Monday night, the place might as well have been empty. Around ten o’clock, we were really starting to wonder if something was wrong, and two of us from the housekeeping staff decided that we really had to go check on Man and make sure he was all right. We found him in the bathroom, lying on the floor. It was horrible—the room stank of crude oil, and there was the needle and all his other gear scattered around him on the floor. We tried to find a pulse, but he was already cold and stiff; I went and called for an ambulance anyway, and—well, you know the rest.”

The Troubled Aftermath

Man’s death leaves a great many questions unanswered. “By the time Everyman died,” Clio explained, “everyone knew who his heir would be. Man had already taken over his father’s role as humanity’s idealized self-image. That hasn’t happened this time, as you know. Man didn’t leave a will, and his estate is a mess—it may be years before the lawyers and the accountants finish going through his affairs and figure out whether there’s going to be anything at all for potential heirs to claim. Meanwhile there are at least half a dozen contenders for the role of abstract representation of the human race, and none of them is a clear favorite. It may be a long time before all the consequences are sorted out.”

Meanwhile, one of the most important voices in the debate has already registered an opinion. Following her invariable habit, Gaia refused to grant any personal interviews, but a written statement to the media was delivered by a spokesrabbit on Tuesday evening. “Please accept My sympathy for the tragic demise of Man, the would-be conqueror of Nature,” it read. “I hope it will not be out of place, though, to suggest that whomever My human children select as their new self-image might consider being a little less self-centered—not to mention a little less self-destructive.”

 

[WilliamBanzai]

[Man, Conqueror of Nature, Dead at 408]

[The Archdruid Report]

Written by testudoetlepus

December 23rd, 2013 at 3:13 pm

The End of the Shale Bubble?

without comments

by John Michael Greer

It’s been a little more than a year since I launched the present series of posts on the end of America’s global empire and the future of democracy in the wake of this nation’s imperial age. Over the next few posts I plan on wrapping that theme up and moving on. However traumatic the decline and fall of the American empire turns out to be, after all, it’s just one part of the broader trajectory that this blog seeks to explore, and other parts of that trajectory deserve discussion as well.

I’d planned to have this week’s post take last week’s discussion of voluntary associations further, and talk about some of the other roles that can be filled, in a time of economic contraction and social disarray, by groups of people using the toolkit of democratic process and traditional ways of managing group activities and assets. Still, that topic is going to have to wait another week, because one of the other dimensions of the broader trajectory just mentioned is moving rapidly toward crisis.

It’s hard to imagine that anybody in today’s America has escaped the flurry of enthusiastic media coverage of the fracking phenomenon. Still, that coverage has included so much misinformation that it’s probably a good idea to recap the basics here. Hydrofracturing—“fracking” in oil industry slang—is an old trick that has been used for decades to get oil and natural gas out of rock that isn’t porous enough for conventional methods to get at them. As oil and gas extraction techniques go, it’s fairly money-, energy- and resource-intensive, and so it didn’t see a great deal of use until fairly recently.

Then the price of oil climbed to the vicinity of $100 a barrel and stayed there. Soaring oil prices drove a tectonic shift in the US petroleum industry, making it economically feasible to drill for oil in deposits that weren’t worth the effort when prices were lower. One of those deposits was the Bakken shale, a sprawling formation of underground rock in the northern Great Plains, which was discovered back in the 1970s and had sat neglected ever since due to low oil prices. To get any significant amount of oil out of the Bakken, you have to use fracking technology, since the shale isn’t porous enough to let go of its oil any other way. Once the rising price of crude oil made the Bakken a paying proposition, drilling crews headed that way and got to work, launching a lively boom.

Another thoroughly explored rock formation further east, the Marcellus shale, attracted attention from the drilling rigs for a different reason, or rather a different pair of reasons. The Marcellus contains no oil to speak of, but some parts of it have gas that is high in natural gas liquids—“wet gas” is the industry term for this—and since those liquids can replace petroleum in some applications, they can be sold at a much higher price than natural gas. Meanwhile, companies across the natural gas industry looked at the ongoing depletion of US coal reserves, and the likelihood of government mandates favoring natural gas over coal for power generation, and decided that these added up to a rosy future for natural gas prices. Several natural gas production firms thus started snapping up leases in the Marcellus country of Pennsylvania and neighboring states, and a second boom got under way.

As drilling in the Bakken and Marcellus shales took off, several other shale deposits, some containing oil and natural gas, others just natural gas, came in for the same sort of treatment. The result was a modest temporary increase in US petroleum production, and a more substantial but equally temporary increase in US natural gas production. It could never be anything more than temporary, for reasons hardwired into the way fracking technology works.

If you’ve ever shaken a can of soda pop good and hard and then opened it, you know something about fracking that countless column inches of media cheerleading on the subject have sedulously avoided. The technique is different, to be sure, but the effect of hydrofracturing on oil and gas trapped in shale is not unlike the effect of a hard shake on the carbon dioxide dissolved in soda pop: in both cases, you get a sudden rush toward the outlet, which releases most of what you’re going to get. Oil and gas production from fracked wells thus starts out high but suffers ferocious decline rates—up to 90% in the first year alone. Where a conventional, unfracked well can produce enough oil or gas to turn a profit for decades if it’s well managed, fracked wells in tight shales like the Bakken and Marcellus quite often stop being a significant source of oil or gas within a few years of drilling.
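To put rough numbers on that soda-pop effect, here is a minimal Python sketch of a fracked well’s decline curve. The decline rates are assumed placeholders rather than measured Bakken or Marcellus figures, but the shape of the output is the point: most of a well’s lifetime production arrives in the first year or two.

    # Decline-curve sketch; the rates below are assumed placeholders,
    # not measured Bakken or Marcellus figures.
    initial_rate = 1000.0                            # barrels per day in year one
    annual_decline = [0.90, 0.50, 0.35, 0.25, 0.20]  # fraction lost each year

    rate, cumulative = initial_rate, 0.0
    for year, decline in enumerate(annual_decline, start=1):
        yearly_output = rate * 365
        cumulative += yearly_output
        print(f"Year {year}: {rate:7.1f} bbl/day, cumulative {cumulative:9.0f} bbl")
        rate *= 1 - decline                          # decline going into next year

With these illustrative numbers, better than four-fifths of the five-year total comes out in the first year alone, which is why a fracked well’s early flow says so little about its long-term worth.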

The obvious response to this problem is to drill more wells, and this accordingly happened. That isn’t a panacea, however. Oil and gas exploration is a highly sophisticated science, and oil and gas drilling companies can normally figure out the best sites for wells long before the drill bit hits the ground. Since they are in business to make money, they normally drill the best sites first. When that sensible habit intersects with the rapid production decline rates found in fracked wells, the result is a brutal form of economic arithmetic: as the best sites are drilled and the largest reserves drained, drilling companies have to drill more and more wells to keep the same amount of oil or gas flowing. Costs go up without increasing production, and unless prices rise, profits get hammered and companies start to go broke.
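Here is a hedged sketch of that treadmill arithmetic, again with invented numbers: to hold total output steady, each year’s new wells must replace the production lost from every older well, and since the best sites go first, each new vintage of wells is assumed to be a little poorer than the last.

    # Drilling-treadmill sketch; all figures are assumptions for
    # illustration, not field statistics.
    first_year_output = 365_000.0    # bbl/year from an early, good well
    decline = 0.6                    # assumed average annual decline per well
    quality_decay = 0.9              # assumed drop in new-well quality per vintage

    wells, quality = [], 1.0
    target = 10 * first_year_output  # production level the drillers try to hold

    for year in range(1, 9):
        wells = [w * (1 - decline) for w in wells]    # age the existing wells
        shortfall = target - sum(wells)
        per_new_well = first_year_output * quality
        new = max(0, round(shortfall / per_new_well))
        wells += [per_new_well] * new
        quality *= quality_decay
        print(f"Year {year}: {new} new wells drilled, {len(wells)} wells producing")

Run it and the number of new wells needed each year creeps upward while total production never rises; that is the cost spiral described above.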

They start to go broke even more quickly if the price of the resource they’re extracting goes down as the costs of maintaining production go up. In the case of natural gas, that’s exactly what happened. Each natural gas production company drew up its projections of future prices on the assumption that ordinary trends in production would continue. As company after company piled into shale gas, though, production soared, and the harsh economic downturn that followed the 2008 housing market crash kept even plummeting natural gas prices from spurring increased use of the resource; so many people were so broke that even cheap natural gas was too expensive for any unnecessary use.

Up to that point, the fracking story followed a trajectory painfully familiar to anyone who knows their way around the economics of alternative energy. From the building of the first solar steam engines before the turn of the last century, through the boom-and-bust cycle of alternative energy sources in the late 1970s, right up to the ethanol plants that were launched with so much fanfare a decade ago and sold for scrap much more quietly a few years later, the pattern’s the same: a repeated rhythm of great expectations followed by shattered dreams.

Here’s how it works. A media panic over the availability of some energy resource or other sparks frantic efforts to come up with a response that won’t require anybody to change their lifestyles or, heaven help us, conserve. Out of the flurry of available resources and technologies, one or two seize the attention of the media and, shortly thereafter, the imagination of the general public. Money pours into whatever the chosen solution happens to be, as investors convince themselves that there’s plenty of profit to be made backing a supposedly sure thing, and nobody takes the time to ask hard questions. In particular, investors tend to lose track of the fact that something can be technically feasible without being economically viable, and rosy estimates of projected cash flow and return on investment take the place of meaningful analysis.

Then come the first financial troubles, brushed aside by cheerleading “analysts” as teething troubles or the results of irrelevant factors certain to pass off in short order. The next round of bad news follows promptly, and then the one after that; the first investors begin to pull out; sooner or later, one of the hot companies that has become an icon in the new industry goes suddenly and messily bankrupt, and the rush for the exits begins. Barring government subsidies big enough to keep some shrunken form of the new industry stumbling along thereafter, that’s usually the end of the road for the former solution du jour, and decades can pass before investors are willing to put their money into the same resource or technology again.

That’s the way that the fracking story started, too. By the time it was well under way, though, a jarring new note had sounded: the most prestigious of the US mass media suddenly started parroting the most sanguine daydreams of the fracking industry. They insisted at the top of their lungs that the relatively modest increases in oil and gas production from fracked shales marked a revolutionary new era, in which the United States would inevitably regain the energy independence it last had in the 1950s, and prosperity would return for all—or at least for all who jumped aboard the new bandwagon as soon as possible. Happy days, we were told, were here again.

What made this barrage of propaganda all the more fascinating were the immense gaps that separated it from the realities on and under the ground in Pennsylvania and North Dakota. The drastic depletion rates from fracked wells rarely got a mention, and the estimates of how much oil and gas were to be found in the various shale deposits zoomed upwards with wild abandon. Nor did the frenzy stop there; blatant falsehoods were served up repeatedly by people who had every reason to know that they were false—I’m thinking here of the supposedly energy-literate pundits who insisted, repeatedly and loudly, that the Green River shale in the southwest was just like the Bakken and Marcellus shales, and would yield abundant oil and gas once it was fracked. (The Green River shale, for those who haven’t been keeping score, contains no oil or gas at all; instead, it contains kerogen, a waxy hydrocarbon goo that would have turned into oil or gas if it had stayed deep underground for a few million years longer, and kerogen can’t be extracted by fracking—or, for that matter, by any other economically viable method.)

Those who were paying attention to all the hoopla may have noticed that the vaporous claims being retailed by the mainstream media around the fracking boom resembled nothing so much as the equally insubstantial arguments most of the same media were serving up around the housing boom in the years immediately before the 2008 crash. The similarity isn’t accidental, either. The same thing happened in both cases: Wall Street got into the act.

A recent report from financial analyst Deborah Rogers, Shale and Wall Street (you can download a copy in PDF format here), offers a helpful glimpse into the three-ring speculative circus that sprang up around shale oil and shale gas during the last three years or so. Those of my readers who suffer from the delusion that Wall Street might have learned something from the disastrous end of the housing bubble are in for a disappointment: the same antics, executed with the same blissful disregard for basic honesty and probity, got trotted out again, with results that will be coming down hard on what’s left of the US economy in the months immediately ahead of us.

If you remember the housing bubble, you know what happened. Leases on undrilled shale fields were bundled and flipped on the basis of grotesquely inflated claims of their income potential; newly minted investment vehicles of more than Byzantine complexity—VPPs, “volumetric production payments,” are an example you’ll be hearing about quite a bit in a few months, once the court cases begin—were pushed on poorly informed investors and promptly began to crash and burn; as the price of natural gas dropped and fracking operations became more and more unprofitable, “pump and dump” operations talked up the prospects of next to worthless properties, which could then be unloaded on chumps before the bottom fell out. It’s an old story, if a tawdry one, and all the evidence suggests that it’s likely to finish running its usual course in the months immediately ahead.

There are at least two points worth making as that happens. The first is that we can expect more of the same in the years immediately ahead. Wall Street culture—not to mention the entire suite of economic expectations that guides the behavior of governments, businesses, and most individuals in today’s America—assumes that the close-to-zero return on investment that’s become standard in the last few years is a temporary anomaly, and that a good investment ought to bring in what used to be considered a good annual return: 4%, 6%, 8%, or more. What only a few thinkers on the fringes have grasped is that such returns are only normal in a growing economy, and we no longer have a growing economy.

Sustained economic growth, of the kind that went on from the beginning of the industrial revolution around 1700 to the peak of conventional oil production around 2005, is a rare anomaly in human history. It became a dominant historical force over the last three centuries because cheap abundant energy from fossil fuels could be brought into the economy at an ever-increasing rate, and it stopped because geological limits to fossil fuel extraction put further increases in energy consumption permanently out of reach. Now that fossil fuels are neither cheap nor abundant, and the quest for new energy sources vast and concentrated enough to replace them has repeatedly drawn a blank, we face several centuries of sustained economic contraction—which means that what until recently counted as the ground rules of economics have just been turned on their head.
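The mismatch is easy to put in numbers. A minimal sketch, with both rates assumed purely for illustration: paper claims compounding at the 8% a year investors expect, against a real economy contracting at 2% a year.

    # Paper claims vs. real wealth; both rates are assumptions chosen
    # only to illustrate the divergence.
    claims, economy = 100.0, 100.0
    for year in range(31):
        if year % 5 == 0:
            print(f"Year {year:2d}: claims {claims:7.1f}, "
                  f"real economy {economy:5.1f}, ratio {claims / economy:5.2f}")
        claims *= 1.08     # the promised 8% annual return
        economy *= 0.98    # 2% annual contraction

Within thirty years the claims outrun the real wealth that is supposed to back them more than eighteenfold; default, inflation, or a bust are the only ways the books can be squared.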

You will not find many people on Wall Street capable of grasping this. The burden of an outdated but emotionally compelling economic orthodoxy, to say nothing of a corporate and class culture that accords economic growth the sort of unquestioned aura of goodness other cultures assign to their gods, makes the end of growth and the coming of permanent economic decline unthinkable to the financial industry, or for that matter to the millions of people in the industrial world who rely on investments to pay their bills. There’s a strong temptation to assume that those 8% per annum returns must still be out there, and when something shows up that appears to embody that hope, plenty of people are willing to rush into it and leave the hard questions for later. Equally, of course, the gap thus opened between expectations and reality quickly becomes a happy hunting ground for scoundrels of every stripe.

Vigorous enforcement of the securities laws might be able to stop the resulting spiral into a permanent bubble-and-bust economy. For all the partisan bickering in Washington DC, though, a firm bipartisan consensus since the days of George W. Bush has placed even Wall Street’s most monumental acts of piracy above the reach of the law. The Bush and Obama administrations both went out of their way to turn a blind eye toward the housing bubble’s spectacular frauds, and there’s no reason to think Obama’s appointees in the Justice Department will get around to doing their jobs this time either. Once the imminent shale bust comes and goes, in other words, it’s a safe bet that there will be more bubbles, each one propping up the otherwise dismal prospects of the financial industry for a little while, and then delivering another body blow to the economies of America and the world as it bursts.

This isn’t merely a problem for those who have investments, or those whose jobs depend in one way or another on the services the financial industry provides when it’s not too busy committing securities fraud to get around to it. The coming of a permanent bubble-and-bust economy puts a full stop at the end of any remaining prospect for even the most tentative national transition away from our current state of dependence on fossil fuels. Pick a project, any project, from so sensible a step as rebuilding the nation’s long-neglected railroads all the way to such pie-in-the-sky vaporware as solar power satellites, and it’s going to take plenty of investment capital. If it’s to be done on any scale, furthermore, we’re talking about a period of decades in which more capital every year will have to flow into the project.

The transition to a bubble-and-bust economy makes that impossible. Bubbles last for an average of three years or so; even if the bubble-blowers on Wall Street happen by accident on some project that might actually help, it will hardly have time to get started before the bubble turns to bust, the people who invested in the project get burned, and the whole thing tumbles down into disillusionment and bankruptcy. If past experience is anything to go by, furthermore, most of the money thus raised will be diverted from useful purposes into the absurd bonuses and salaries bankers and brokers think society owes them for their services.

Over the longer run, a repeated drumbeat of failed investments and unpunished fraud puts the entire system of investment itself at risk. The trust that leads people to invest their assets, rather than hiding them in a hole in the ground, is a commons; like any commons, it can be destroyed by abuse; and since the federal government has abandoned its statutory duty to protect that commons by enforcing laws against securities fraud, a classic tragedy of the commons is the most likely outcome, wrecking the system by which our society directs surplus wealth toward productive uses and putting any collective response to the end of the fossil fuel age permanently out of reach.

 

 

All these are crucial issues. Still, there’s a second point of more immediate importance. I don’t think anybody knows exactly how big the shale bubble has become, but it’s been one of Wall Street’s few really large profit centers over the last three years.  It’s quite possible that the bubble is large enough to cause a major financial panic when it bursts, and send the United States and the world down into yet another sharp economic downturn.  As Yogi Berra famously pointed out, it’s tough to make predictions, especially about the future; still, I don’t think it’s out of place to suggest that sensible preparations for hard times might be wise just now, and if any of my readers happen to have anything invested in the shale or financial industries, I’d encourage them to consider other options in the fairly near term.

 

The End of the Shale Bubble?

[John Michael Greer]

Written by testudoetlepus

March 4th, 2013 at 2:19 pm

The Center Cannot Hold

without comments

by John Michael Greer

When William Butler Yeats put the phrase I’ve used as the title for this week’s post into the powerful and prescient verses of “The Second Coming,” he had deeper issues in mind than the crisis of power in a declining American empire. Still, the image is anything but irrelevant here; the political evolution of the United States over the last century has concentrated so many of the responsibilities of government in Washington DC that the entire American system is beginning to crack under the strain.

This is admittedly not the way you’ll hear the centralization of power in America discussed by those few voices in our national conversation who discuss it at all. On the one hand are the proponents of centralized power, who insist that leaving any decision at all in the hands of state or local authorities is tantamount to handing it over to their bogeyman du jour—whether that amounts to the bedsheet-bedecked Southern crackers who populate the hate speech of the left, say, or the more diverse gallery of stereotypes that plays a similar role on the right. On the other hand are those who insist that the centralization of power in America is the harbinger of a totalitarian future that will show up George Orwell as an incurable optimist.

I’ve already talked in a number of previous posts about the problems with this sort of thinking, with its flattening out of the complexities of contemporary politics into an opposition between warm fuzzy feelings and cold prickly ones. To pursue the point a little further, I’d like to offer two unpopular predictions about the future of American government. The first is that the centralization of power in Washington DC has almost certainly reached its peak, and will be reversing in the decades ahead of us. The second is that, although there will inevitably be downsides to that reversal, it will turn out by and large to be an improvement over the system we have today. These predictions unfold from a common logic; both are consequences of the inevitable failure of overcentralized power.

It’s easy to get caught up in abstractions here, and even easier to fall into circular arguments around the functions of political power that attract most of the attention these days—for example, the power to make war. I’ll be getting to the latter a bit further on in this post, but I want to start with a function of government slightly less vexed by misunderstandings. The one I have in mind is education.

In the United States, for a couple of centuries now, the provision of free public education for children has been one of the central functions of government. Until fairly recently, in most of the country, it operated in a distinctive way. Under legal frameworks established by each state, local school districts were organized by the local residents, who also voted to tax themselves to pay the costs of building and running schools. Each district was managed by a school board, elected by the local residents, and had extensive authority over the school district’s operations.

In most parts of the country, school districts weren’t subsets of city, township, or county governments, or answerable to them; they were single-purpose independent governments on a very small scale, loosely supervised by the state and much more closely watched by the local voters. On the state level, a superintendent of schools or a state board of education, elected by the state’s voters, had a modest staff to carry out the very limited duties of oversight and enforcement assigned by the state legislature. On the federal level, a bureaucracy not much larger supervised the state boards of education, and conducted the even more limited duties assigned it by Congress.

Two results of that system deserve notice. First of all, since individual school districts were allowed to set standards, choose textbooks, and manage their own affairs, there was a great deal of diversity in American education. While reading, writing, and ‘rithmetic formed the hard backbone of the school day, and such other standards as history and geography inevitably got a look in as well, what else a given school taught was as varied as local decisions could make it. What the local schools put in the curriculum was up to the school board and, ultimately, to the voters, who could always elect a reform slate to the school board if they didn’t like what was being taught.

Second, the system as a whole gave America a level of public literacy and general education that was second to none in the industrial world, and far surpassed the poor performance of the far more lavishly funded education system the United States has today. In a previous post, I encouraged readers to compare the Lincoln-Douglas debates of 1858 to the debates in our latest presidential contest, and to remember that most of the people who listened attentively to Lincoln and Douglas had what then counted as an eighth-grade education. The comparison has plenty to say about the degeneration of political thinking in modern America, but it has even more to say about the extent to which the decline in public education has left voters unprepared to get past the soundbite level of thinking.

Those of my readers who want an even more cogent example are encouraged to leaf through a high school textbook from before the Second World War. You’ll find that the reading comprehension, reasoning ability, and mathematical skill expected as a matter of course from ninth-graders in 1930 is hard to find among American college graduates today. If you have kids of high school age, spend half an hour comparing the old textbook with the one your children are using today. You might even consider taking the time to work through a few of the assignments in the old textbook yourself.

Plenty of factors have had a role in the dumbing-down process that gave us our current failed system of education, to be sure, but I’d like to suggest that the centralization of power over the nation’s educational system in a few federal bureaucracies played a crucial role. To see how this works, again, a specific example is useful. Let’s imagine a child in an elementary school in Lincoln, Nebraska, who is learning how to read. Ask yourself this: of all the people concerned with her education, which ones are able to help that individual child tackle the daunting task of figuring out how to transform squiggles of ink into words in her mind?

The list is fairly small, and her teacher and her parents belong at the top of it. Below them are a few others: a teacher’s aide if her classroom has one, an older sibling, a friend who has already managed to learn the trick. Everyone else involved is limited to helping these people do their job. Their support can make that job somewhat easier—for example, by making sure that the child has books, by seeing to it that the classroom is safe and clean, and so on—but they can’t teach reading. Each supporting role has supporting roles of its own; thus the district’s purchasing staff, who keep the school stocked with textbooks, depend on textbook publishers and distributors, and so on. Still, the further you go from the child trying to figure out that C-A-T means “cat,” the less effect any action has on her learning process.

Now let’s zoom back 1200 miles or so to Washington DC and the federal Department of Education. It’s a smallish federal bureaucracy, which means that in the last year for which I was able to find statistics, 2011, it spent around $71 billion. Like many other federal bureaucracies, its existence is illegal. I mean that quite literally; the US constitution assigns the federal government a fairly limited range of functions, and those powers “necessary and proper” to exercise them; by no stretch of the imagination can managing the nation’s public schools be squeezed into those limits. Only the Supreme Court’s embarrassingly supine response to federal power grabs during most of the twentieth century allows the department to exist at all.

So we have a technically illegal bureaucracy running through $71 billion of the taxpayers’ money in a year, which is arguably not a good start. The question I want to raise, though, is this: what can the staff of the Department of Education do that will have any positive impact on that child in the classroom in Lincoln, Nebraska? They can’t teach the child themselves; they can’t fill any of the supporting roles that make it possible for the child to be taught. They’re 1200 miles away, enacting policies that apply to every child in every classroom, irrespective of local conditions, individual needs, or any of the other factors that make teaching a child to read different from stamping out identical zinc bushings.

There are a few—a very few—things that can usefully be done for education at the national level. One of them is to make sure that the child in Lincoln is not denied equal access to education because of her gender, her skin color, or the like. Another is to provide the sort of overall supervision to state boards of education that state boards of education traditionally provided to local school boards. There are a few other things that belong on the same list. All of them can be described, to go back to a set of ideas I sketched out a couple of weeks ago, as measures to maintain the commons.

Public education is a commons. The costs are borne by the community as a whole, while the benefits go to individuals: the children who get educated, the parents who don’t have to carry all the costs of their children’s education, the employers who don’t have to carry all the costs of training employees, and so on. Like any other commons, this one is vulnerable to exploitation when it’s not managed intelligently, and like most commons in today’s America, this one has taken quite a bit of abuse lately, with the usual consequences. What makes this situation interesting, in something like the sense of the apocryphal Chinese proverb, is that the way the commons of public education is being managed has become the principal force wrecking the commons.

The problem here is precisely that of centralization. The research for which economist Elinor Ostrom won her Nobel Prize a few years back showed that, by and large, effective management of a commons is a grassroots affair; those who will be most directly affected by the way the commons is managed are also its best managers. The more distance between the managers and the commons they manage, the more likely failure becomes, because two factors essential to successful management simply aren’t there. The first of them is immediate access to information about how management policies are working, or not working, so that those policies can be adjusted immediately if they go wrong; the second is a personal stake in the outcome, so that the managers have the motivation to recognize when a mistake has been made, rather than allowing the psychology of previous investment to seduce them into pursuing a failed policy right into the ground.

Those two factors don’t function in an overcentralized system. Politicians and bureaucrats don’t get to see the consequences of their failed decisions up close, and they don’t have any motivation to admit that they were wrong and pursue new policies—quite the contrary, in fact. Consider, for example, the impact of the No Child Left Behind (NCLB) Act, pushed through Congress by bipartisan majorities and signed with much hoopla by George W. Bush in 2002. In the name of accountability—a term that in practice means “finding someone to punish”—the NCLB Act requires mandatory standardized testing at specific grade levels, and requires every year’s scores to be higher than the previous year’s, in every school in the nation. Teachers and schools that fail to accomplish this face draconian penalties.

My readers may be interested to know that next year, by law, every child in America must perform at or above grade level. It’s reminiscent of the imaginary town of Lake Wobegon—“where all the children are above average”—except that this is no joke; what’s left of America’s public education system is being shredded by the efforts of teachers and administrators to save their jobs in a collapsing economy, by teaching to the tests and gaming the system, under the pressure of increasingly unreal mandates from Washington DC. Standardized test scores have risen slightly; meaningful measures of literacy, numeracy, and other real-world skills have continued to move raggedly downward, and you can bet that the only response anybody in Washington is going to be willing to discuss is yet another round of federal mandates, most likely even more punitive and less effective than the current set.

Though I’ve used education as an example, nearly every part of American life is pervaded by the same failed logic of overcentralization. Another example? Consider the Obama administration’s giddy pursuit of national security via drone attacks. As currently operated, Predator drones are the ne plus ultra in centralized warfare; each drone attack has to be authorized by Obama himself, the drone is piloted via satellite link from a base in Nevada, and you can apparently sit in the situation room in the White House and watch the whole thing live. Hundreds of people have been blown to kingdom come by these attacks so far, in the name of a war on terror that Obama’s party used to denounce.

Now of course that habit only makes sense if you’re willing to define young children and wedding party attendees as terrorists, which seems a little extreme to me. Leaving that aside, though, there’s a question that needs to be asked: is it working? Since none of the areas under attack are any less full of anti-American insurgents than they have been, and the jihadi movement has been able to expand its war dramatically in recent weeks into Libya and Mali, the answer is pretty clearly no. However technically superlative the drones themselves are, the information that guides them comes via the notoriously static-filled channels of intelligence collection and analysis, and the decision to use them takes place in the even less certain realms of tactics and strategy; nor is it exactly bright, if you want to dissuade people from seeking out Americans and killing them, to go around vaporizing people nearly at random in parts of the world where avenging the murder of a family member is a sacred duty.

In both cases, and plenty of others like them, we have other alternatives, but all of them require the recognition that the best response to a failed policy isn’t a double helping of the same. That recognition is nowhere in our collective conversation at the moment. It would be useful if more of us were to make an effort to put it there, but there’s another factor in play. The center really cannot hold, and as it gives way, a great many of today’s political deadlocks will give way with it.

Eliot Wigginton, the teacher in rural Georgia who founded the Foxfire project and thus offered the rest of us an elegant example of what can happen when a purely local educational venture is given the freedom to flower and bear fruit, used to say that the word “learn” is properly spelled F-A-I-L. That’s a reading lesson worth taking to heart, if only because we’re going to have some world-class chances to make use of it in the years ahead. One of the few good things about really bad policies is that they’re self-limiting; sooner or later, a system that insists on embracing them is going to crash and burn, and once the rubble has stopped bouncing and the smoke clears away, it’s not too hard for the people standing around the crater to recognize that something has gone very wrong. In that period of clarity, it’s possible for a great many changes to be made, especially if there are clear alternatives available and people advocating for them.

In the great crises that ended each of America’s three previous rounds of anacyclosis—in 1776, in 1861, and in 1933—a great many possibilities that had been unattainable due to the gridlocked politics of the previous generation suddenly came within reach. In those past crises, the United States was an expanding nation, geographically, economically, and in terms of its ability to project power in the world; the crisis immediately ahead bids fair to arrive in the early stages of the ensuing contraction. That difference has important effects on the nature of the changes before us.

Centralized power is costly—in money, in energy, in every other kind of resource. Decentralized systems are much cheaper. In the days when the United States was mostly an agrarian society, and the extravagant abundance made possible by a global empire and reckless depletion of natural resources had not yet arrived, the profoundly localized educational system I sketched out earlier was popular because it was affordable. Even a poor community could count on being able to scrape together the political will and the money to establish a school district, even if that meant a one-room schoolhouse with one teacher taking twenty-odd children a day through grades one through eight. That the level of education that routinely came out of such one-room schoolhouses was measurably better than that provided by today’s multimillion-dollar school budgets is just one more irony in the fire.

 

 

On the downside of America’s trajectory, as we descend from empire toward whatever society we can manage to afford within the stringent limits of a troubled biosphere and a planet stripped of most of its nonrenewable resources, local systems of the one-room schoolhouse variety are much more likely to be an option than centralized systems of the sort we have today. That shift toward the affordably local will have many more consequences; I plan on exploring another of them next week.

 

The Center Cannot Hold

[John Michael Greer]

Written by testudoetlepus

February 7th, 2013 at 8:02 pm

We Don’t Live In Neverland

without comments

by John Michael Greer

The return to an older American concept of government as the guarantor of the national commons, the theme of last week’s post here on The Archdruid Report, is to my mind one of the crucial steps that might just succeed in making a viable future for the post-imperial United States. A viable future, mind you, does not mean one in which any significant number of Americans retain any significant fraction of the material abundance we currently get from the “wealth pump” of our global empire. The delusion that we can still live like citizens of an imperial power when the empire has gone away will be enormously popular, not least among those who currently insist they want nothing to do with the imperial system that guarantees their prosperity, but it’s still a delusion.

The end of American empire, it deserves repeating, means the end of a system in which the five per cent of humanity that live in the United States get to dispose of a quarter of the planet’s energy and a third of its raw materials and industrial product. Even if the fossil fuels that undergird the industrial product weren’t depleting out of existence—and of course they are—the rebalancing of global wealth driven by the decline of one empire and the rise of another will involve massive and often traumatic impacts, especially for those who have been living high on the hog under the current system and will have to get used to a much smaller portion of the world’s wealth in the years immediately ahead. Yes, dear reader, if you live in the United States or its inner circle of allies—Canada, Britain, Australia, Japan, and a few others—this means you.

I want to stress this point, because habits of thought already discussed in this sequence of posts make it remarkably difficult for most Americans to think about a future that isn’t either all warm fuzzy or all cold prickly. If an imagined future is supposed to be better than the one we’ve got, according to these habits of thought, it has to be better in every imaginable way, and if it’s worse, it has to be worse just as uniformly. Suggest that the United States might go hurtling down the far side of its imperial trajectory and come out of the process as a Third World nation, as I’ve done here, and you can count on blank incomprehension or self-righteous anger if you go on to suggest that the nation that comes out the other side of this process might still be able to provide a range of basic social goods to its citizens, and might even recover some of the values it lost a century ago in the course of its headlong rush to empire.

Now in fact I’m going to suggest this, and indeed I’ve already sketched out some of the steps that individual Americans might choose to take to lay the foundations for that project. Still, it’s also worth noting that the same illogic shapes the other end of the spectrum of possible futures. These days, if you pick up a book offering a vision of a better future or a strategy to get there, it’s usually a safe bet that you can read the thing from cover to cover without finding a single reference to any downsides, drawbacks, or tradeoffs that might be involved in pursuing the vision or enacting the strategy. Since every action in the real world has downsides, drawbacks, and tradeoffs, this is not exactly a minor omission, nor does the blithe insistence on ignoring such little details offer any reason to feel confident that the visions and strategies will actually work as advertised.

One example in particular comes to mind here, because it has immediate relevance to the project of this series of posts. Those of my readers who have been following the peak oil scene for any length of time will have encountered any number of enthusiastic discussions of relocalization: the process, that is, of disconnecting from the vast and extravagant global networks of production, consumption, and control that define so much of industrial society, in order to restore or reinvent local systems that will be more resilient in the face of energy shortages and other disruptions, and provide more security and more autonomy to those who embrace them.

A very good case can be made for this strategy. On the one hand, the extreme centralization of the global economy has become a source of massive vulnerabilities straight across the spectrum from the most abstract realms of high finance right down to the sprawling corporate structures that put food on your table. Shortfalls of every kind, from grain and fuel to financial capital, are becoming a daily reality for many people around the world as soaring energy costs put a galaxy of direct and indirect pressures on brittle and overextended systems. That’s only going to become worse as petroleum reserves and other vital resources continue to deplete. As this process continues, ways of getting access to necessities that are deliberately disconnected from the global economic system, and thus less subject to its vulnerabilities, are going to be well worth having in place.

At the same time, participation in the global economy brings with it vulnerabilities of another kind. For anyone who has to depend for their daily survival on the functioning of a vast industrial structure which is not answerable to the average citizen, talk about personal autonomy is little more than a bad joke, and the ability of communities to make their own choices and seek their own futures in such a context is simply another form of wishful thinking. Many people involved in efforts to relocalize have grasped this, and believe that deliberately standing aside from systems controlled by national governments and multinational corporations offers one of the few options for regaining personal and community autonomy in the face of an increasingly troubled future.

There are more points that can be made in favor of relocalization schemes, and you can find them rehashed endlessly on pro-relocalization websites all over the internet. For our present purposes, though, this fast tour of the upside will do, because each of these arguments comes with its own downside, which by and large you won’t find mentioned anywhere on those same websites.

The downside to the first argument? When you step out of the global economy, you cut yourself off from the imperial wealth pump that provides people in America with the kind of abundance they take for granted, and the lifestyles that are available in the absence of that wealth pump are far more restricted, and far more impoverished, than most would-be relocalizers like to think. Peasant cultures around the world are by and large cultures of poverty, and there’s a good reason for that: by the time you, your family, and the other people of your village have provided food on the table, thatch on the roof, a few necessary possessions, and enough of the local equivalent of cash to cover payments to the powers that be, whether those happen to be feudal magnates or the local property tax collector, you’ve just accounted for every minute of labor you can squeeze out of a day.

That’s the rock on which the back-to-the-land movement of the Sixties broke; the life of a full-time peasant farmer scratching a living out of the soil is viable, and it may even be rewarding, but it’s not the kind of life that the pampered youth of the Baby Boom era was willing to put up with for more than a fairly brief interval. It may well be that economic relocalization is still the best available option for dealing with the ongoing unraveling of the industrial economy—in fact, I’d agree that this is the case—but I wonder how many of its proponents have grappled with the fact that what they’re proposing may amount to no more than a way to starve with dignity while many others are starving without it.

The downside to the second argument is subtler, but in some ways even more revealing. The best way to grasp it is to imagine two relocalization projects, one in Massachusetts and the other in South Carolina. The people in both groups are enthusiastic about the prospect of regaining their personal autonomy from the faceless institutions of a centralized society, and just as eager to bring back home to their own communities the power to make choices and pursue a better future. Now ask yourself this: what will these two groups do if they get that power? And what will the people in Massachusetts think about what the people in South Carolina will do once they get that power?

I’ve conducted a modest experiment of sorts along these lines, by reminding relocalization fans in blue states what people in red states are likely to do with the renewed local autonomy the people in the blue states want for themselves, and vice versa. Every so often, to be sure, I run across someone—more often on the red side of the line than on the blue one—whose response amounts to “let ‘em do what they want, so long as they let us do what we want.” Far more often, though, people on either side are horrified to realize that their opposite numbers on the other side of America’s widening cultural divide would use relocalization to enact their own ideals in their own communities.

More than once, in fact, the response has amounted to a flurry of proposals to hedge relocalization about with restrictions so that it can only be used to support the speaker’s own political and social agendas, with federal bureaucracies hovering over every relocalizing community, ready to pounce on any sign that a community might try to do something that would offend sensibilities in Boston or San Francisco, on the one hand, or the Bible Belt on the other. You might think, dear reader, that it would be obvious that this would be relocalization in name only; you might also think that it would be just as obvious that those same bureaucracies would fall promptly into the hands of the same economic and political interests that have made the current system as much of a mess as it is. Permit me to assure you that in my experience, among a certain segment of the people who like to talk about relocalization, these things are apparently not obvious at all.

By this point in the discussion, I suspect most of my readers have come to believe that I’m opposed to relocalization schemes. Quite the contrary, I think they’re among the best options we have, and the fact that they have significant downsides, drawbacks, and tradeoffs does not nullify that. Every possible strategy, again, has downsides, drawbacks, and tradeoffs; whatever we choose to do to face the onset of the Long Descent, as individuals, as communities, or as a nation, problems are going to ensue and people are going to get hurt. Trying to find an option that has no downsides simply guarantees that we will do nothing at all; and in that case, equally, problems are going to ensue and people are going to get hurt. That’s how things work in the real world—and it may be worth reminding my readers that we don’t live in Neverland.

Thus I’d like to suggest that a movement toward relocalization is another crucial ingredient of a viable post-imperial America. In point of fact, we’ve got the structures in place to do the thing already; the only thing that’s lacking is a willingness to push back, hard, against certain dubious habits in the US political system that have rendered those structures inoperative.

Back in 1787, when the US constitution was written, the cultural differences between Massachusetts and South Carolina were very nearly as sweeping as they are today. That’s one of the reasons why the constitution as written left most internal matters in the hands of the individual states, and assigned to the federal government only those functions that concerned the national commons as a whole: war, foreign policy, minting money, interstate trade, postal services, and a few other things. The list was expanded in a modest way before the rush to empire, so that public health and civil rights, for example, were brought under federal supervision over the course of the 19th century. Under the theory of government I described last week, these were reasonable extensions, since they permitted the federal government to exercise its function of securing the national commons.

Everything else remained in the hands of the states and the people. In fact, the tenth amendment to the US constitution specifically requires that any power not granted to the federal government in so many words be left to the states and the people—a principle which, perhaps not surprisingly, has been roundly ignored by everyone in Washington DC for most of a century now. Under the constitution and its first nineteen amendments, in fact, the states were very nearly separate countries that happened to have an army, navy, foreign policy, and postal system in common.

Did that system have problems? You bet. What rights you had and what benefits you could expect as a citizen depended to a huge extent on where you lived—not just which state, but very often which county and which township or city as well. Whole classes of citizens might be deprived of their rights or the protection of the laws by local politicians or the majorities that backed them, and abuses of power were pervasive. All of that sounds pretty dreadful, until you remember that the centralization of power that came with America’s pursuit of empire didn’t abolish any of those things; it simply moved them to a national level. Nowadays, serving the interests of the rich and influential at the expense of the public good is the job of the federal government, rather than the local sheriff, and the denial of civil rights and due process that used to be restricted to specific ethnic and economic subgroups within American society now gets applied much more broadly.

Furthermore, one of the things that’s rendered the US government all but incapable of taking any positive action at all in the face of a widening spiral of crises is precisely the insistence, by people in Massachusetts, South Carolina, and the other forty-eight states as well, that their local views and values ought to be the basis of national policy. The rhetoric that results, in tones variously angry and plaintive, amounts to “Why can’t everyone else be reasonable and do it my way?”—which is not a good basis for the spirit of compromise necessary to the functioning of democracy, though it makes life easy for advocacy groups who want to shake down the citizenry for another round of donations to pay for the never-ending fight.

One of the few things that might succeed in unsticking the gridlock, so that the federal government could get back to doing the job it’s supposed to do, would be to let the people in Massachusetts, South Carolina, and the other forty-eight states pursue the social policies they prefer on a state by state basis. Yes, that would mean that people in South Carolina would do things that outraged the people in Massachusetts, and people in Massachusetts would return the favor. Yes, it would also mean that abuses and injustices would take place. Of course abuses and injustices take place now, in both states and all the others as well, but the ones that would take place in the wake of a transfer of power over social issues back to the states would no doubt be at least a little different from the current ones.

Again, the point of relocalization schemes is not that they will solve every problem. They won’t, and in fact they will certainly cause new problems we don’t have yet. The point of relocalization schemes is that, all things considered, if they’re pursued intelligently, the problems that they will probably solve are arguably at least a little worse than the problems that they will probably cause. Does that sound like faint praise? It’s not; it’s as much as can be expected for any policy this side of Neverland, in the real world, where every solution brings new problems of its own.

Now in fact relocalization has at least two other benefits that tip the balance well into positive territory. One of them is an effect I haven’t discussed in this series of posts, and I haven’t seen covered anywhere else in the peak oil blogosphere yet; it will need a post of its own, and that will have to wait a week. The other, though, is a simple matter of resilience.

The more territory has to be governed from a single political center, all things considered, the more energy and resources will be absorbed in the process of governing. This is why, before the coming of the industrial age, nations on the scale of the present United States of America rarely existed, and when they did come into being, they generally didn’t last for more than a short time. In an age of declining energy availability and depleting resources, the maintenance costs of today’s sprawling, centralized United States government won’t be affordable for long.  Devolving all nonessential functions of the central government to the individual states, as the US constitution mandates, might just cut costs to the point that some semblance of civil peace and democratic governance can hang on for the long term. That probably doesn’t seem like much to those whose eyes are fixed on fantasies of a perfect world, and are convinced they can transform it from fantasy to reality as soon as everyone else stops being unreasonable and agrees with them. Still, it’s better than most potential outcomes available to us in the real world—and again, we don’t live in Neverland.

 

We Don’t Live In Neverland

[John Michael Greer]

Written by testudoetlepus

February 6th, 2013 at 12:03 am

Restoring the Commons

without comments

by John Michael Greer

The hard work of rebuilding a post-imperial America, as I suggested in last week’s post, is going to require the recovery or reinvention of many of the things this nation chucked into the dumpster with whoops of glee as it took off running in pursuit of its imperial ambitions. The basic skills of democratic process are among the things on that list; so, as I suggested last month, are the even more basic skills of learning and thinking that undergird the practice of democracy.

All that remains crucial. Still, it so happens that a remarkably large number of the other things that will need to be put back in place are all variations of a common theme. What’s more, it’s a straightforward theme—or, more precisely, would be straightforward if so many people these days weren’t busy trying to pretend that the concept at its center either doesn’t exist or doesn’t present the specific challenges that have made it so problematic in recent years. The concept in question? The mode of collective participation in the use of resources, extending from the most material to the most abstract, that goes most often these days by the name of “the commons.”

The redoubtable green philosopher Garrett Hardin played a central role decades ago in drawing attention to the phenomenon in question with his essay “The Tragedy of the Commons.” It’s a remarkable work, and it’s been rendered even more remarkable by the range of contortions engaged in by thinkers across the economic and political spectrum in their efforts to evade its conclusions. Those maneuvers have been tolerably successful; I suspect, for example, that many of my readers will recall the flurry of claims a few years back that the late Nobel Prize-winning economist Elinor Ostrom had “disproved” Hardin with her work on the sustainable management of resources.

In point of fact, she did no such thing. Hardin demonstrated in his essay that an unmanaged commons faces the risk of a vicious spiral of mismanagement that ends in the commons’ destruction; Ostrom got her Nobel, and deservedly so, for her detailed and incisive analysis of the kinds of management that prevent Hardin’s tragedy of the commons from taking place. A little later in this essay, we’ll get to why those kinds of management are exactly what nobody in the mainstream of American public life wants to talk about just now; the first task at hand is to walk through the logic of Hardin’s essay and understand exactly what he was saying and why it matters.

Hardin asks us to imagine a common pasture, of the sort that was common in medieval villages across Europe. The pasture is owned by the village as a whole; each of the villagers has the right to put his cattle out to graze on the pasture. The village as a whole, however, has no claim on the milk the cows produce; that belongs to the villager who owns any given cow. The pasture is a collective resource, from which individuals are allowed to extract private profit; that’s the basic definition of a commons.

In the Middle Ages, such arrangements were common across Europe, and they worked well because they were managed by tradition, custom, and the immense pressure wielded by informal consensus in small and tightly knit communities, backed up where necessary by local manorial courts and a body of customary law that gave short shrift to the pursuit of personal advantage at the expense of others. The commons that Hardin asks us to envision, though, has no such protections in place. Imagine, he says, that one villager buys additional cows and puts them out to graze on the common pasture. Any given pasture can only support so many cows before it suffers damage; to use the jargon of the ecologist, it has a fixed carrying capacity for milk cows, and exceeding the carrying capacity will degrade the resource and lower its future carrying capacity. Assume that the new cows raise the total number of cows past what the pasture can support indefinitely, so once the new cows go onto the pasture, the pasture starts to degrade.

Notice how the benefits and costs sort themselves out. The villager with the additional cows receives all the benefit of the additional milk his new cows provide, and he receives it right away. The costs of his action, by contrast, are shared with everyone else in the village, and their impact is delayed, since it takes time for pasture to degrade. Thus, according to today’s conventional economic theories, the villager is doing the right thing. Since the milk he gets is worth more right now than the fraction of the discounted future cost of the degradation of the pasture he will eventually have to carry, he is pursuing his own economic interest in a rational manner.

The other villagers, faced with this situation, have a choice of their own to make. (We’ll assume, again, that they don’t have the option of forcing the villager with the new cows to get rid of them and return the total herd on the pasture to a level it can support indefinitely.) They can do nothing, in which case they bear the costs of the degradation of the pasture but gain nothing in return, or they can buy more cows of their own, in which case they also get more milk, but the pasture degrades even faster. According to most of today’s economic theories, the latter choice is the right one, since it allows them to maximize their own economic interest in exactly the same way as the first villager. The result of the process, though, is that a pasture that would have kept a certain number of cattle fed indefinitely is turned into a barren area of compacted subsoil that won’t support any cattle at all. The rational pursuit of individual advantage thus results in permanent impoverishment for everybody.
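
For those who prefer their logic in executable form, the payoff asymmetry is easy to sketch in a few lines of Python. Every number in the sketch below, the herd sizes, milk values, and degradation penalty alike, is invented purely for illustration; what matters is the shape of the incentives, not the figures.

VILLAGERS = 10
MILK = 10.0    # seasonal value of one cow's milk, kept entirely by its owner
DAMAGE = 30.0  # future pasture loss caused by one cow over capacity

# One villager's ledger: the milk is all his, the damage is split ten ways.
print(MILK - DAMAGE / VILLAGERS)  # +7.0, so adding the cow looks "rational"

# The village's ledger for the same cow: the whole damage lands somewhere.
print(MILK - DAMAGE)              # -20.0, so the commons as a whole loses

# Repeat that individually rational choice and watch the pasture go.
capacity = 20.0  # number of cows the pasture supports indefinitely
cows = 20
for season in range(5):
    cows += VILLAGERS                             # every villager adds a cow
    excess = max(0.0, cows - capacity)
    capacity = max(0.0, capacity - 0.5 * excess)  # overgrazing degrades the pasture
    print(f"season {season}: {cows} cows, capacity now {capacity:.1f}")

The first figure explains why each villager keeps adding cows; the loop shows where that leads once everyone does.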

This may seem like common sense. It is common sense, but when Hardin first published “The Tragedy of the Commons” in 1968, it went off like a bomb in the halls of academic economics. Since Adam Smith’s time, one of the most passionately held beliefs of capitalist economics has been the insistence that individuals pursuing their own economic interest without interference from government or anyone else will reliably produce the best outcome for everybody. You’ll still hear defenders of free market economics making that claim, as if nobody but the Communists ever brought it into question. That’s why very few people like to talk about Hardin’s tragedy of the commons these days; it makes it all but impossible to uphold a certain bit of popular, appealing, but dangerous nonsense.

Does this mean that the rational pursuit of individual advantage always produces negative results for everyone? Not at all. The theorists of capitalism can point to equally cogent examples in which Adam Smith’s invisible hand passes out benefits to everyone, and a case could probably be made that this happens more often than the opposite. The fact remains that the opposite does happen, not merely in theory but also in the real world, and that the consequences of the tragedy of the commons can reach far beyond the limits of a single village.

Hardin himself pointed to the destruction of the world’s oceanic fisheries by overharvesting as an example, and it’s a good one. If current trends continue, many of my readers can look forward, over the next couple of decades, to tasting the last seafood they will ever eat. A food resource that could have been managed sustainably for millennia to come is being annihilated in our lifetimes, and the logic behind it is that of the tragedy of the commons: participants in the world’s fishing industries, from giant corporations to individual boat owners and their crews, are pursuing their own economic interests, and exterminating one fishery after another in the process.

Another example? The worldwide habit of treating the atmosphere as an aerial sewer into which wastes can be dumped with impunity. Every one of my readers who burns any fossil fuel, for any purpose, benefits directly from being able to vent the waste CO2 directly into the atmosphere, rather than having to cover the costs of disposing of it in some other way. As a result of this rational pursuit of personal economic interest, there’s a very real chance that most of the world’s coastal cities will have to be abandoned to the rising oceans over the next century or so, imposing trillions of dollars of costs on the global economy.

Plenty of other examples of the same kind could be cited. At this point, though, I’d like to shift focus a bit to a different class of phenomena, and point to the Glass-Steagall Act, a piece of federal legislation that was passed by the US Congress in 1933 and repealed in 1999. The Glass-Steagall Act made it illegal for banks to engage in both consumer banking activities such as taking deposits and making loans, and investment banking activities such as issuing securities; banks had to choose one or the other. The firewall between consumer banking and investment banking was put in place because in its absence, in the years leading up to the 1929 crash, most of the banks in the country had gotten in over their heads in dubious financial deals linked to stocks and securities, and the collapse of those schemes played a massive role in bringing the national economy to the brink of total collapse.

By the 1990s, such safeguards seemed unbearably dowdy to a new generation of bankers, and after a great deal of lobbying the provisions of the Glass-Steagall Act were eliminated. Those of my readers who didn’t spend the last decade hiding under a rock know exactly what happened thereafter: banks went right back to the bad habits that got their predecessors into trouble in 1929, profited mightily in the short term, and proceeded to inflict major damage on the global economy when the inevitable crash came in 2008.

That is to say, actions performed by individuals (and those dubious “legal persons” called corporations) in the pursuit of their own private economic advantage garnered profits over the short term for those who engaged in them, but imposed long-term costs on everybody. If this sounds familiar, dear reader, it should. When individuals or corporations profit from their involvement in an activity that imposes costs on society as a whole, that activity functions as a commons, and if that commons is unmanaged the tragedy of the commons is a likely result. The American banking industry before 1933 and after 1999 functioned, and currently functions, as an unmanaged commons; between those years, it was a managed commons. While it was an unmanaged commons, it suffered from exactly the outcome Hardin’s theory predicts; when it was a managed commons, by contrast, a major cause of banking failure was kept at bay, and the banking sector was more often a source of strength than a source of weakness to the national economy.

It’s not hard to name other examples of what I suppose we could call “commons-like phenomena”—that is, activities in which the pursuit of private profit can impose serious costs on society as a whole—in contemporary America. One that bears watching these days is food safety. It is to the immediate financial advantage of businesses in the various industries that produce food for human consumption to cut costs as far as possible, even if this occasionally results in unsafe products that cause sickness and death to people who consume them; the benefits in increased profits are immediate and belong entirely to the business, while the costs of increased morbidity and mortality are borne by society as a whole, provided the business’s legal team is good enough to keep the inevitable lawsuits at bay. Once again, the asymmetry between benefits and costs produces a calculus that brings unwelcome outcomes.

The American political system, in its pre-imperial and early imperial stages, evolved a distinctive response to these challenges. The Declaration of Independence, the wellspring of American political thought, defines the purpose of government as securing the rights to life, liberty, and the pursuit of happiness. There’s more to that often-quoted phrase than meets the eye. In particular, it doesn’t mean that governments are supposed to provide anybody with life, liberty, or happiness; their job is simply to secure for their citizens certain basic rights, which may be inalienable—that is, they can’t be legally transferred to somebody else, as they could under feudal law—but are far from absolute. What citizens do with those rights is their own business, at least in theory, so long as their exercise of their rights does not interfere too drastically with the ability of others to do the same thing. The assumption, then and later, was that citizens would use their rights to seek their own advantage, by means as rational or irrational as they chose, while the national community as a whole would cover the costs of securing those rights against anyone and anything that attempted to erase them.

That is to say, the core purpose of government in the American tradition is the maintenance of the national commons. It exists to manage the various commons and commons-like phenomena that are inseparable from life in a civilized society, and thus has the power to impose such limits on people (and corporate pseudopeople) as will prevent their pursuit of personal advantage from leading to a tragedy of the commons in one way or another. Restricting the capacity of banks to gamble with depositors’ money is one such limit; restricting the freedom of manufacturers to sell unsafe food is another, and so on down the list of reasonable regulations. Beyond those necessary limits, government has no call to intervene; how people choose to live their lives, exercise their liberties, and pursue happiness is up to them, so long as it doesn’t put the survival of any part of the national commons at risk.

As far as I know, you won’t find that definition taught in any of the tiny handful of high schools that still offer civics classes to young Americans about to reach voting age. Still, it’s a neat summary of generations of political thought in pre-imperial and early imperial America. These days, by contrast, it’s rare to find this function of government even hinted at. Rather, the function of government in late imperial America is generally seen as a matter of handing out largesse of various kinds to any group organized or influential enough to elbow its way to a place at the feeding trough. Even those people who insist they are against all government entitlement programs can be counted on to scream like banshees if anything threatens those programs from which they themselves benefit; the famous placard reading “Government Hands Off My Medicare” is an embarrassingly good reflection of the attitude that most American pseudoconservatives adopt in practice, however loudly they decry government spending in theory.

A strong case can be made, though, for jettisoning the notion of government as national sugar daddy and returning to the older notion of government as guarantor of the national commons. The central argument in that case is simply that in the wake of empire, the torrents of imperial tribute that made the government largesse of the recent past possible in the first place will go away. As the United States loses the ability to command a quarter of the world’s energy supplies and a third of its natural resources and industrial product, and has to make do with the much smaller share it can expect to produce within its own borders, the feeding trough in Washington DC—not to mention its junior equivalents in the fifty state capitals, and so on down the pyramid of American government—is going to run short.

In point of fact, it’s already running short. That’s the usually unmentioned factor behind the intractable gridlock in our national politics: there isn’t enough largesse left to give every one of the pressure groups and veto blocs its accustomed share, and the pressure groups and veto blocs are responding to this unavoidable problem by jamming up the machinery of government with ever more frantic efforts to get whatever they can. That situation can only end in crisis, and probably in a crisis big enough to shatter the existing order of things in Washington DC; after the rubble stops bouncing, the next order of business will be piecing together some less gaudily corrupt way of managing the nation’s affairs.

That process of reconstruction might be furthered substantially if the pre-imperial concept of the role of government were to get a little more air time these days. I’ve spoken at quite some length here and elsewhere about the very limited contribution that grand plans and long discussions can make to an energy future that’s less grim than the one toward which we’re hurtling at the moment, and there’s a fair bit of irony in the fact that I’m about to suggest exactly the opposite conclusion with regard to the political sphere. Still, the circumstances aren’t the same. The time for talking about our energy future was decades ago, when we still had the time and the resources to get new and more sustainable energy and transportation systems in place before conventional petroleum production peaked and sent us skidding down the far side of Hubbert’s peak. That time is long past, the options remaining to us are very narrow, and another round of conversation won’t do anything worthwhile to change the course of events at this point.

That’s much less true of the political situation, because politics are subject to rules very different from the implacable mathematics of petroleum depletion and net energy. At some point in the not too distant future, the political system of the United States of America is going to tip over into explosive crisis, and at that time ideas that are simply talking points today have at least a shot at being enacted into public policy. That’s exactly what happened at the beginning of the three previous cycles of anacyclosis I traced out in a previous post in this series. In 1776, 1860, and 1933, ideas that had been on the political fringes not that many years beforehand redefined the entire political dialogue, and in all three cases this was possible because those once-fringe ideas had been widely circulated and widely discussed, even though most of the people who circulated and discussed them never imagined that they would live to see those ideas put into practice.

There are plenty of ideas about politics and society in circulation on the fringes of today’s American dialogue, to be sure. I’d like to suggest, though, that there’s a point to reviving an older, pre-imperial vision of what government can do, and ought to do, in the America of the future. A political system that envisions its role as holding an open space in which citizens can pursue their own dreams and experiment with their own lives is inherently likely to be better at dissensus than more regimented alternatives, whether those come from the left or the right—and dissensus, to return to a central theme of this blog, is the best strategy we’ve got as we move into a future where nobody can be sure of having the right answers.

 

Restoring the Commons

[John Michael Greer]

Written by testudoetlepus

January 25th, 2013 at 9:54 pm

The Road Down from Empire

without comments


by John Michael Greer

Here in the Appalachians, at least, there’s something about the month of January that encourages sober thoughts. Maybe it’s the weather, which is pretty reliably gray and cold; maybe it’s the arrival of the bills from the holiday season just ended, or the awkward way that those bills usually arrive about the same time that the annual crop of New Year’s resolutions start landing in the recycle bin. Pick your reason, but one way or another it seems like a good time to circle back and finish up the theme I’ve been developing here for most of a year now, the decline and fall of America’s global empire and the difficult task of rebuilding something worthwhile in its wake.

The hard work of reinventing democracy in a post-imperial America, the subject of several of last month’s posts, is only one facet of this broader challenge. I’ve mentioned before that the pursuit of empire is a drug, and like most other drugs, it makes you feel great at the time and then wallops you the next morning. It’s been just over a hundred years now since the United States launched itself on its path to global empire, and the hangover that was made inevitable by that century-long bender is waiting in the wings. I suspect one of the reasons the US government is frantically going through the empties in the trash, looking for one that still has a few sips left in it, is precisely that first dim dawning awareness of just how bad the hangover is going to be.

It’s worth taking a few moments to go over some of the more visible signposts of the road down from empire. To begin with, the US economy has been crippled by a century of imperial tribute flowing in from overseas. That’s what happened to our manufacturing sector; once the rest of the industrial world recovered from the Second World War, manufacturers in an inflated tribute economy couldn’t compete with the lower costs of factories in less extravagantly overfunded parts of the world, and America’s industrial heartland turned into the Rust Belt. As the impact of the tribute economy spread throughout US society, in turn, it became next to impossible to make a living doing anything productive, and gaming the imperial system in one way or another—banking, investment, government contracts, you name it—turned into the country’s sole consistent growth industry.

That imposed distortions on every aspect of American society, which bid fair to cripple its ability to pick up the pieces when the empire goes away. As productive economic sectors withered, the country’s educational system reoriented itself toward the unproductive, churning out an ever-expanding range of administrative specialties for corporations and government while shutting down what was once a world-class system of vocational and trade schools. We now have far more office fauna than any sane society needs, and a drastic shortage of people who have any less abstract skill set. For the time being, we can afford to offshore jobs, or import people from other countries to do them at substandard wages; as our empire winds down and those familiar bad habits stop being possible, the shortage of Americans with even the most basic practical skills will become a massive economic burden.

Meanwhile the national infrastructure is caught in a downward spiral of malign neglect made inevitable by the cash crunch that always hits empires on the way down. Empire is an expensive habit; the long-term effects of the imperial wealth pump on those nations subjected to its business end mean that the income from imperial arrangements goes down over time, while the impact of the tribute economy at home generally causes the costs of empire to go up over time. The result can be seen on Capitol Hill day by day, as one fantastically expensive weapons system after another sails through Congress with few dissenting votes, while critically important domestic programs are gutted by bipartisan agreement, or bog down in endless bickering. The reliable result is a shell of a nation, seemingly strong when observed from outside but hollowing out within, and waiting for the statistically inevitable shove that will launch it on its final skid down the rough slope into history’s compost bin.

You may well be thinking, dear reader, that the logical response of a nation caught in a predicament of this sort would be to bite the bullet, back away from empire in a deliberate fashion, and use the last bit of income from the tribute economy to pay for the expenses of rebuilding a domestic economy of a more normal kind. You’d be right, too, but there are compelling reasons why very few empires in history have had the great good sense to manage their decline in this manner. Imperial China did it in the fifteenth century, scrapping a burgeoning maritime empire in the Indian Ocean, and of course Britain did it after 1945, though that was largely because a 500-pound gorilla named the United States was sitting on Britannia’s prostrate body, informing her politely that in future, the global empire would be American, thank you very much; other than that, examples are few and far between.

The logic here is easy to follow. Any attempt to withdraw from imperial commitments will face concerted resistance from those who profit from the status quo, while those who claim to oppose empire are rarely willing to keep supporting a policy of imperial retreat once it turns out, as it inevitably does, that the costs of that policy will include a direct impact on their own incomes or the value of their investments. Thus politicians who back a policy of withdrawal from empire can count on being pilloried by their opponents as traitors to their country, and abandoned by erstwhile allies who dislike empire in the abstract but want to retain lifestyles that only an imperial tribute economy can support. Since politicians are, after all, in the business of getting into office and staying there, their enthusiasm for such self-sacrificing policies is understandably limited.

The usual result is a frantic effort to kick the can as far as possible down the road, so that somebody else has to deal with it. Most of what’s going on in Washington DC these days can be described very exactly in those terms. Despite popular rhetoric, America’s politicians these days are not unusually wicked or ignorant; they are, by and large, roughly as ethical as their constituents, and rather better educated—though admittedly neither of these is saying much. What distinguishes them from the statesmen of an earlier era, rather, is that they are face to face with an insoluble dilemma that their predecessors in office spent the last few decades trying to ignore. As the costs of empire rise, the profits of empire dwindle, the national economy circles the drain, the burden of deferred maintenance on the nation’s infrastructure grows, and the impact of the limits to growth on industrial civilization worldwide becomes ever harder to evade, they face the unenviable choice between massive trouble now and even more massive trouble later; being human, they repeatedly choose the latter, and console themselves with the empty hope that something might turn up.

It’s a common hope these days. I’ve commented here more than once about the way that the Rapture, the Singularity, and all the other apocalyptic fantasies on offer these days serve primarily as a means by which people can pretend to themselves that the future they’re going to get isn’t the one that their actions and evasions are busily creating for them. The same is true of a great many less gaudy fictions about the future—the much-ballyhooed breakthroughs that never quite get around to happening, the would-be mass movements that never attract anyone but the usual handful of activists, the great though usually unspecified leaps in consciousness that will allegedly happen any day now, and all the rest of it. The current frenzy of meretricious twaddle in the media about how shale gas is going to make the US a net energy exporter gets a good share of its impetus from the same delusive hope—though admittedly the fact that a great many people have invested a great deal of money in companies in the fracking business, and are trying to justify their investments using the same sort of reasoning that boosted the late housing bubble, also has more than a little to do with it.

There’s likely to be plenty more of the same thing in the decades ahead. Social psychologists have written at length about what James Howard Kunstler has usefully termed the psychology of previous investment, the process by which people convince themselves to throw good money after bad, or to remain committed to a belief system even though all available evidence demonstrates that it isn’t true and doesn’t work. The critical factor in such cases is the emotional cost of admitting that the decision to buy the stock, adopt the belief system, or make whatever other mistake is at issue, was in fact a mistake. The more painful it is to make that admission, the more forcefully most people will turn away from the necessity to do so, and it’s safe to assume that they’ll embrace the most consummate malarkey if doing so allows them to insist to themselves that the mistake wasn’t a mistake after all.

As America stumbles down from its imperial peak, in other words, the one growth industry this country will have left will consist of efforts to maintain the pretense that America doesn’t have an empire, that the empire isn’t falling, and that the fall doesn’t matter anyway. (Yes, those statements are mutually contradictory. Get used to it; you’ll be hearing plenty of statements in the years to come that are even more incoherent.) As the decline accelerates, anyone who offers Americans a narrative that allows them to pretend they’ll get the shiny new future that our national mythology promises them will be able to count on a large and enthusiastic audience. The narratives being marketed for this purpose need not be convincing; they need not even be sane. So long as they make it possible for Americans to maintain the fiction of a brighter future in the teeth of the facts, they’ll be popular.

The one bit of hope I can offer here is that such efforts at collective make-believe don’t last forever. Sooner or later, the fact of decline will be admitted and, later still, accepted; sooner or later, our collective conversation will shift from how America can maintain perpetual growth to how America can hold onto what it has, then to how America can recover some of what it lost, and from there to figuring out how America—or whatever grab bag of successor societies occupies the territory currently held by the United States—can get by in the harsh new deindustrial world that grew up around it while nobody was looking. It’s a normal process in an age of decline, and can be traced in the literature of more than one civilization before ours.

It bears remembering, though, that individuals are going through the same process of redefinition all by themselves. This process differs from the five stages of peak oil, which I’ve discussed elsewhere, in that it’s not primarily about the emotional impact of loss; it’s a matter of expectations, and of the most pragmatic sort of economic expectations at that. Consider a midlevel managerial employee in some corporation or other whose job, like so many other jobs these days, is about to go away forever. Before the rumors start flying, she’s concerned mostly with clawing her way up the corporate ladder and increasing her share of the perks and privileges our society currently grants to its middle classes. Then the rumors of imminent layoffs start flying, and she abruptly has to shift her focus to staying employed. The pink slips come next, bearing bad news, and her focus shifts again, to getting a new job; when that doesn’t happen and the reality of long term joblessness sinks in, a final shift of focus takes place, and she has to deal with a new and challenging world.

This has already happened to a great many people in America. It’s going to happen, over the years ahead, to a great many more—probably, all things considered, to a large majority of people in the American middle class, just as it happened to a large majority of the industrial working class a few decades further back. Not everyone, it has to be said, will survive the transition; alcoholism, drug abuse, mental and physical illness, and suicide are among the standard risks run by the downwardly mobile. A fair number of those who do survive will spend the rest of their lives clinging to the vain hope that something will happen and give them back what they lost.

It’s a long, rough road down from empire, and the losses involved are not merely material in nature. Basing one’s identity on the privileges and extravagances made possible by the current US global empire may seem like a silly thing to do, but it’s very common. To lose whatever markers of status are respected in any given social class, whether we’re talking about a private jet and a Long Island mansion, a fashionable purse and a chic condo in an upscale neighborhood, or a pickup and a six-pack, can be tantamount to losing one’s identity if that identity has no more solid foundation—and a great many marketing firms have spent decades trying to ensure that most Americans never think of looking for more solid foundations.

That last point has implications we’ll be exploring in a later sequence of posts. For the time being, though, I want to talk a bit about what all this means to those of my readers who have already come to terms with the reality of decline, and are trying to figure out how to live their lives in a world in which the conventional wisdom of the last three hundred years or so has suddenly been turned on its head. The first and, in many ways, the most crucial point is one that’s been covered here repeatedly already: you are going to have to walk the road down from empire yourself. Nobody else is going to do it for you, and you can’t even assume that anybody else will make it easier for you. What you can do, to make it a little easier than it will otherwise be, is to start walking it before you have to.

That means, to return to a slogan I’ve used more than once in this blog, using LESS—Less Energy, Stuff, and Stimulation. The more energy you need to maintain your everyday lifestyle, the more vulnerable you’ll be to sudden disruptions when the sprawling infrastructure that supplies you with that energy starts running into serious trouble. Today, routine blackouts and brownouts of the electrical grid, and rationing or unpredictable availability of motor fuel, have become everyday facts of life in Third World nations that used to have relatively reliable access to energy. As America’s global empire unravels and the blowback from a century of empire comes home to roost, we can expect the same thing here. Get ready for that in advance, and you won’t face a crisis when it happens.

The same is true of the extravagant material inputs most Americans see as necessities, and of the constant stream of sensory stimulation that most Americans use to numb themselves to the unwelcome aspects of their surroundings and their lives. You will be doing without those at some point. The sooner you learn how to get by in their absence, the better off you’ll be—and the sooner you get out from under the torrent of media noise you’ve been taught to use to numb yourself, the sooner you can start assessing the world around you with a relatively clear head, and the sooner you’ll notice just how far down the arc of America’s descent we’ve already come.

Using LESS isn’t the only thing that’s worth doing in advance, of course. I’ve discussed elsewhere, for example, the need to develop the skills that will enable you to produce goods or provide services for other people, using relatively simple tools and, if at all possible, the energy of your own muscles. As the imperial tribute economy winds down and the United States loses the ability to import cheap goods and cheap labor from abroad, people will still need goods and services, and will pay for them with whatever measure of value is available—even if that amounts to their own unskilled labor. There are plenty of other steps that can be taken to prepare for life in a post-imperial society skidding down the far side of Hubbert’s peak, and the sooner you start taking those steps, the better prepared you will be to cope with that unfamiliar world.

Still, it may be possible to go further than that. In several of December’s posts here I raised the possibility that, in the wake of empire, the deliberate cultivation of certain core skills—specifically, clear reasoning, public speaking, and democratic process—might make it possible to kickstart a revival of America’s formerly vibrant democratic traditions. The same principle, I’d like to suggest, may apply more generally. Certain core insights that were central to pre-imperial America’s more praiseworthy achievements, but were tossed into the dumpster during the rush to empire, could be revived and put back to work in the post-imperial era. If that can be done at all, it’s going to involve a lot of work and a willingness to challenge some widely held notions of contemporary American culture, but I think the attempt is worth making. We’ll begin that discussion next week.

 

The Road Down from Empire

[John Michael Greer]

Written by testudoetlepus

January 18th, 2013 at 3:24 pm

Into an Unknown Country

without comments

by John Michael Greer

Was it just my imagination, or was the New Year’s celebration just past even more halfhearted than those of the last few years? My wife and I welcomed 2013 with a toast, and breakfasted the next morning on the traditional good-luck foods—rice and beans, corn bread, greens and bacon—that I learned to enjoy back when I was studying old-fashioned Southern folk magic. Outside our little house, though, the midnight air seemed remarkably quiet; the whoops, horns, and firecrackers of New Years past were notable mostly by their absence, and the next day’s hush seemed less a matter of hangovers than a not unreasonable dread of what 2013 might have in store for us all.

No doubt some of that was a function of the media panic about the so-called Fiscal Cliff. The New Yorker scored a palpable hit by headlining a piece on the subject “Washington Celebrates Solving Totally Unnecessary Crisis They Created,” but there’s more to it than that. What, after all, was this “fiscal cliff”? A measure that would have repealed some of the tax breaks and hikes in Federal spending put in place since 2000, and thus reduced the annual Federal deficit by a modest amount. All that yelling, in other words, was provoked by the possibility that the US government might have to take a few steps in the direction of living within its means. If the frantic struggle to avert that outcome is any measure of the kind of statesmanship we can expect from the White House and Congress in the year to come, it’s no wonder that hiding under the mattress has so much evident appeal just now.

There’s more involved in the evident lack of enthusiasm for the new year, though, than the latest clown acts playing in the three-ring circus that is today’s Washington DC. A great many of the comforting rationalizations that have played so large a role in justifying a continued reliance on the unsustainable are wearing very thin. Consider the claims, retailed by the media at ever-increasing volume these days, that recent upturns in the rate of domestic petroleum production in the US offer a conclusive disproof to the idea of peak oil, and herald the arrival of a new age of cheap abundant fuel. Courtesy of Jim Kunstler’s latest blog post, I’d like to offer a chart of US petroleum production, from 1920 to now, that puts those claims in perspective.

See the tiny little uptick in production over there on the far right? That’s the allegedly immense rise in petroleum production that drives all the rhetoric. If that blip doesn’t look like a worldchanging event to you, dear reader, you’re getting the message. It isn’t a worldchanging event; it’s the predictable and, by the way, repeatedly predicted result of the rise in oil prices from around $30 a barrel to between three and four times that, following the 2008 spike and crash. Triple or quadruple the price of any other commodity, and sources of that commodity that weren’t economically feasible to produce at the lower price will suddenly become paying propositions, too. (Yes, that’s spelled “Bakken shale” in the present tense.) If the price of oil were to triple or quadruple again over the next few years, we’d probably see another increase on the same very modest scale. That increase still won’t be a worldchanging event, though the economic impact of another round of price increases on that scale might be.
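
The price arithmetic in that last paragraph is simple enough to put into a few lines of code. In the sketch below, the resource tiers, per-barrel costs, and potential outputs are all invented for the sake of illustration; the point is merely how a price jump moves formerly uneconomic deposits into production without conjuring up a new age of abundance.

# Assumed tiers: (name, production cost in $/bbl, potential output in Mbbl/day)
tiers = [
    ("conventional onshore", 20, 5.0),
    ("deepwater offshore",   60, 0.5),
    ("tight oil (fracked)",  75, 0.8),
    ("marginal stripper",    90, 0.2),
]

def economic_supply(price):
    """Total output from every tier whose cost is covered at the going price."""
    return sum(rate for name, cost, rate in tiers if cost < price)

for price in (30, 100):
    print(f"${price}/bbl: {economic_supply(price):.1f} Mbbl/day worth producing")
# At $30 only the cheap tier pays its way; at $100 the dearer tiers join in,
# a real but modest uptick, bought at three times the old price.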

More generally, we’ve got a real shortage of worldchanging events just now. There are good reasons for that, just as there are equally—well, equally strong, if not equally good—reasons why so many people are pinning all their hopes on a worldchanging event of one kind or another. Therapists like to point out that if you always do what you’ve always done, you’ll always get what you’ve always gotten, and of late it’s become a truism (though it’s also a truth) that doing the same thing and expecting to get different results is a good working definition of insanity. The attempt to find some way around that harsh but inescapable logic is the force that drove the prophetic hysteria about 2012, and drives end-of-the-world delusions more generally: if the prospect of changing the way you live terrifies you, but the thought of facing the consequences of the way you live terrifies you just as much, daydreaming that some outside force will come along and change everything for you can be a convenient way to avoid having to think about the future you’re making for yourself.

With that in mind, and with an eye toward the year ahead of us, I’d like to attend to three New Year customs that haven’t gotten as much attention here on The Archdruid Report as they probably should. First, I’d like to go over my predictions for the year just finished, and see how well they did; second, I’d like to offer up some predictions for the year to come; and third, I’d like to make some suggestions for what my readers might consider doing about it all.

My 2012 predictions appeared in the first January post here last year. Here they are:

“I’d like to suggest that when we take a backwards look in the early days of 2013, we will most likely see that that’s what happened in 2012, too: a slow worsening across a wide range of trends, punctuated by localized crises and regional disasters. I’d like to predict, in fact, that when we take that backward look, the US dollar and the Euro will both still exist and be accepted as legal tender, though the Eurozone may have shed a couple of countries who probably shouldn’t have joined it in the first place; that stock markets around the world will have had another volatile year, but will still be trading. Here in the US, whoever is unlucky enough to win the 2012 presidential election will be in the middle of an ordinary transition to a new term of office; the new Congress will be gearing up for another two years of partisan gridlock; gas stations will still have gas for sale and grocery stores will be stocked with groceries; and most Americans will be making the annual transition between coping with their New Year’s hangovers and failing to live up to their New Year’s resolutions, just as though it was any other year.

“Official US statistics will no doubt insist that the unemployment rate has gone down…but the number of people out of work in the United States will likely set another all-time record; the number of people in severe economic trouble will have gone up another good-sized notch, and public health clinics will probably be seeing the first wave of malnutrition-caused illness in children. If you happen to have spent the year in one of the areas unfortunate enough to get hit by the hard edge of the increasingly unstable weather, you may have had to spend a week or two in an emergency shelter while the flood waters receded or the wreckage got hauled away, and you might even notice that less and less gets rebuilt every year.

“Unless that happens, though, or unless you happen to pay close attention to the things that don’t usually make the evening news, you may well look back in the first days of 2013 and think that business as usual is still ongoing. You’d be right, too, so long as you recognize that there’s been a stealthy change in what business as usual now means. Until the peak of world conventional petroleum production arrived in 2005, by and large, business as usual meant the continuation of economic growth. Since then, by and large, it has meant the continuation of economic decline.”

No countries left the Eurozone in 2012, and if malnutrition-caused illness in children has had a notable uptick in America, I haven’t yet heard of it. Other than that, I think it’s fair to say that I called it. I’d like to put on my sorcerer’s cap, furthermore, and gaze a little deeper into the mists of futurity; I thus predict that just as 2012 looked like a remake of 2011 a little further down the curve of decline, 2013 will look a good deal like 2012, but with further worsening along the same broad array of trends and yet another round of local crises and regional disasters. The number of billion-dollar weather disasters will tick up further, as will the number of Americans who have no job—though, to be sure, the official unemployment rate and other economic statistics will be gimmicked then as now. The US dollar, the Euro, and the world’s stock markets will still be in business at year’s end, and there will still be gas for sale in gas stations, groceries for sale in grocery stores, and more people interested in the Super Bowl than in global warming or peak oil, as 2013 gives way to 2014.

As the year unfolds, I’d encourage my readers to watch the fracking bubble. Yes, it’s a speculative bubble of the classic sort, one that has soaked up a vast amount of investment money over the last few years, and the glorious future of American energy independence being touted by the media has the same function, and the same relationship to reality, as the glorious future of endlessly rising house prices that got waved around with equal abandon in 2006 and 2007. I don’t expect the bubble to pop this year—my best guess at this point is that that’ll happen in 2014—but it’s already losing air as the ferocious decline rates experienced by fracked oil and gas wells gnaw the bottom out of the fantasy. Expect the new year to bring more strident claims of the imminent arrival of a shiny new future of energy abundance, coupled with a steady drumbeat of bad financial news suggesting, in essence, that the major players in that end of the oil and gas industry are well and truly fracked.
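
Those decline rates are easy to illustrate with the standard Arps formula for hyperbolic decline, q(t) = qi / (1 + b * Di * t) ** (1 / b). The parameters in the sketch below are assumptions chosen to mimic the steep early declines reported for fracked wells, not data from any actual field.

# Arps hyperbolic decline; all parameter values are illustrative assumptions.
qi = 1000.0  # initial output, barrels per day
Di = 3.0     # nominal initial decline rate, per year
b = 1.0      # hyperbolic exponent; b = 1 gives harmonic decline

def q(t):
    """Production rate after t years."""
    return qi / (1.0 + b * Di * t) ** (1.0 / b)

for year in range(4):
    drop = 1.0 - q(year + 1) / q(year)
    print(f"year {year + 1}: {q(year + 1):6.1f} bbl/day, "
          f"down {drop:.0%} from the year before")
# A well like this sheds most of its output almost at once, so aggregate
# production holds steady only while new wells are drilled faster than
# old ones fade; once the drilling slows, so does the boom.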

I’d also encourage my readers to watch the climate. The tendency to focus on predicted apocalypses to come while ignoring the reality of ongoing collapse in the present is as evident here as in every other corner of contemporary culture; whether or not the planet gets fried to a crackly crunch by some more or less distant future date, it’s irrefutable that the cost of weather-related disasters across the world has been climbing year over year for decades, and this is placing an increasingly harsh burden on local and regional economies here in the US and elsewhere. It’s indicative that many coastal towns in Louisiana and Mississippi that were devastated by Hurricane Katrina have never been rebuilt, and it’s probably a safe bet that a similar fate waits for a fair number of the towns and poorer neighborhoods hit hardest by Hurricane Sandy. As global warming pumps more heat into the heat engine we call Earth’s climate, the inevitable result is more extreme weather—drier droughts, fiercer storms, more serious floods, and so on down a litany that’s become uncomfortably familiar in recent years.

Most of the infrastructure of industrial society was built during the period of abnormally good weather we call the twentieth century. A fair amount of it, as New York subway riders have had reason to learn, is poorly designed to handle extreme weather, and if those extremes become normal, the economics of maintaining such complex systems as the New York subways in the teeth of repeated flooding start to look very dubious indeed. I don’t expect to see significant movements out of vulnerable coastal areas quite yet, but if 2011’s Hurricane Irene and 2012’s Hurricane Sandy turn out to have a bouncing baby sibling who decides to pay a visit to the Big Apple in 2013, 2014 might see the first businesses relocating further inland, perhaps to the old mill towns of the southern Hudson valley and the eastern end of Pennsylvania, perhaps further still.

That’s speculative. What isn’t speculative is that all the trends that have been driving the industrial world down the arc of the Long Descent are still in play, and so are all the parallel trends that are pushing America’s global empire along its own trajectory toward history’s dustbin. Those things haven’t changed; even if anything could be done about them, which is far from certain, nothing is being done about them; indeed, outside of a handful of us on the fringes of contemporary culture, nobody is even talking about the possibility that something might need to be done about them. That being the case, it’s a safe bet that the trends I’ve sketched out will continue unhindered, and give us another year of the ordinary phenomena of slowly accelerating decline and fall.

That, in turn, leads to the question of what my readers might do about it all.

My advice hasn’t changed. It’s a source of some amusement to me, though, that no matter how clearly I try to communicate that advice, a fair number of people will hear what they want to hear, or perhaps what they expect to hear, rather than what I’m saying. Over the course of this last week, for example, several people commenting on this post on one of the many other forums where it appears insisted with some heat that I claimed that activism was worthless, while one of the commenters here on The Archdruid Report took me to task for what he thought was a rejection of community in favor of an unworkable go-it-alone approach.

Not so. What I’m saying is that any meaningful response to the crisis of our time has to begin on the individual level, with changes in our own lives. To say that it should begin there doesn’t mean that it should end there; what it does mean is that without the foundation of personal change, neither activism nor community building nor anything else is going to do much. We’ve already seen what happens when climate activists go around insisting that other people ought to decrease their carbon footprint, while refusing to do so themselves, and the results have not exactly been good. Equally, if none of the members of a community are willing to make the changes necessary to decrease their own dependence on a failing industrial system, just what good is the community as a whole supposed to do?

A great many people like to insist that changing your own life isn’t enough, and then act as though that means that changing your own life isn’t necessary. Again, not so. If industrial society as a whole has to stop dumping excess carbon dioxide into the atmosphere, dear reader, that means among many other things that you, personally, have to stop contributing your share of that excess. Equally, if industrial society as a whole is running short of fossil fuels, that means among many other things that you, personally, are going to have to get used to living without them. That being the case, why not start with the part of the problem about which you can actually do something—your own consumption of fossil fuels and your own production of carbon dioxide—and then go from there?

Political activism, community building, and a great many other proposed responses to the crisis of our time are entirely valid and workable approaches if those who pursue them start by making the changes in their own lives they expect other people to make in turn. Lacking that foundation, they go nowhere. It’s not even worth arguing any more about what happens when people try to get other people to do the things they won’t do themselves; we’ve had decades of that, it hasn’t helped, and it’s high time that the obvious lessons get drawn from that fact. Once again, if you always do what you’ve always done…

That being said, here are some suggested New Year’s resolutions for those of my readers who are interested in being part of the solution:

1. Caulk, weatherstrip, and insulate the place where you live. Most Americans can cut between 5% and 25% of their total annual energy use by weatherizing their homes. None of the work is rocket science; your local hardware store can sell you everything you need for a very modest amount of money, and there are plenty of sources in print and online that can teach you everything you need to know. The sooner you get to work, the sooner you start saving money, and the sooner a good chunk of your share of excess carbon dioxide stops messing with the atmosphere.

2. Make at least one commute or run at least one errand a week on foot, by bicycle, or by public transit. A great many Americans don’t actually need cars at all. A good many of those who do, due to a half century of idiotic land use planning, need them a great deal less often than they think. The best way to learn this is to experience what it’s like to travel by some other means. It’s long past time to ditch the “yuppie logic” that suggests that it’s a good idea to drive a mile to the health club to get on a treadmill and get the exercise you didn’t get by walking to the health club. It’s also long past time to ditch the equally false logic that insists that getting there faster is the only thing that matters.

3. If you take a vacation, take the train. Traveling by train uses a small fraction of the fuel per mile that a plane needs, and the trip is part of the vacation rather than an ordeal to endure between one place and the next. Give it a try. If you live in the US, you might also consider supporting the National Association of Railroad Passengers, which lobbies for expanded passenger rail service and offers a discount on fares for members.

4. Buy it used. This applies to everything from cars, should you actually need one, to the cheapest of trinkets. By buying a used product rather than a new one, you save the energy cost of manufacturing the new product, and you also keep things out of the waste stream. Used computers are particularly worth your while; if you live in a tolerably large urban area in the US, you can often get more computers than you need by letting your circle of friends know that you’ll take used but working devices off their hands for free. You won’t be able to play the latest computer games on them, sure, but if you’re obsessed with playing the latest computer games, you don’t need a computer; you need a life. Speaking of getting a life…

5. Turn off the boob tube. Better still, if you can talk the people you live with into it, get rid of the thing altogether. Commercial television exists to fill your brain with emotionally manipulative imagery that lures you into buying products you wouldn’t otherwise need or want. Public television? Replace “products” with “opinions” and you’re not too far off. (Huge rapacious corporations spend millions of dollars to fund public TV programs; I hope none of my readers are naive enough to think that these corporations do this out of some vague sense of moral obligation.) You don’t need any of that stuff cluttering up your brain. While you’re at it…

6. Take up an art, craft, or hobby. Once you turn off the TV, you’re going to have the one luxury that nobody in a modern consumer society is ever supposed to have: actual, unstructured free time. It’s worth luxuriating in that for a bit, but pretty soon you’ll find that you want to do something with that time, and one of the best options is to learn how to do something interesting with your hands. Three quarters of a century ago, most people had at least one activity that gave them something creative to do in their off hours, and a good many of those activities also produced useful and valuable things. Unless you’re at least seventy years old or come from a very unusual family, you have no idea how many arts, crafts and hobbies Americans used to pursue, or how little money it takes to get started with most of them. By the way, if you think you’re too old to take up playing the guitar or doing some other seemingly complicated skill, you’re not.

7. Do without something this year. This is the scary one for most people in today’s consumer society. To be able to have something, and choose not to have it, challenges some of the deepest of modern taboos. Give it a try. The point isn’t to strike an assumed pose of ecological virtue, by the way, so don’t tell anybody what you’re doing without, or even that you’re doing without something. Nor is this about “being good” in some socially approved manner, so don’t choose something that you’re supposed to want to do without. Just quietly neglect to make something part of your life, and pay attention to your own emotional reactions. If you’re like most people in today’s America, you’ll be in for a wild ride, but the destination is worth reaching.

So there you are. As we head deeper into the unknown country of 2013, have a happy and sustainable new year!

 

A couple of notes might be worth placing here for fans of my writing. First of all, my latest peak oil book, Not The Future We Ordered: The Psychology of Peak Oil and the Myth of Eternal Progress, is available for preorder. Karnac Press, the publisher, is a specialty press publishing mostly in the field of psychology; the book is primarily intended for psychologists, therapists, and members of the healing professions, who will need to know what they’re dealing with as the psychological impacts of peak oil take their toll, but it may also be of interest to peak oil readers generally. Much of what’s covered in Not The Future We Ordered hasn’t appeared here or in any of my other books, so it may be worth a look.

I’m also pleased to announce that I’ve been offered a position as contributing editor and monthly columnist with PeakProsperity.com (formerly ChrisMartenson.com). My first column there will be appearing later this month. My working plan at this point is to head deeper into the territory I explored in my book The Wealth of Nature, with an eye toward the practical and personal implications of the end of the age of abundance. This is a paid gig, and so the meat of my monthly columns will be in the subscribers-only area, but I plan on doing my level best to make sure it’s worth the price of admission. Again, might be worth a look.

 

 

Into an Unknown Country

[John Michael Greer]

Written by testudoetlepus

January 3rd, 2013 at 5:44 pm

The Beginning of the World

without comments

by John Michael Greer

Friday was, as I’m sure most of my readers noticed, an ordinary day. Here in the north central Appalachians, it was chilly but not unseasonably so, with high gray clouds overhead and a lively wind setting the dead leaves aswirl; wrens and sparrows hopped here and there in my garden, poking among the recently turned soil of the beds. No cataclysmic earth changes, alien landings, returning messiahs, or vast leaps of consciousness disturbed their foraging. They neither knew nor cared that one of the great apocalyptic delusions of modern times was reaching its inevitable end around them.

The inimitable Dr. Rita Louise, on whose radio talk show I spent a couple of hours on Friday, may have summed it up best when she wished her listeners a happy Mayan Fools Day. Not that the ancient Mayans themselves were fools, far from it, but then they had precisely nothing to do with the competing fantasies of doom and universal enlightenment that spent the last decade and more buzzing like flies around last Friday’s date.

It’s worth taking a look back over the genesis of the 2012 hysteria, if only because we’re certain to see plenty of reruns in the years ahead. In the first half of the 20th century, as archeologists learned to read dates in the Mayan Long Count calendar, it became clear that one of the major cycles of the old Mayan timekeeping system would roll over on that day. By the 1970s, that detail found its way into alternative culture in the United States, setting off the first tentative speculations about a 2012 apocalypse, notably drug guru Terence McKenna’s quirky “Timewave Zero” theory.

It was the late New Age promoter Jose Arguelles, though, who launched the 2012 fad on its way with his 1987 book The Mayan Factor and a series of sequels, proclaiming that the rollover of the Mayan calendar in 2012 marked the imminent transformation of human consciousness that the New Age movement was predicting so enthusiastically back then. The exactness of the date made an intriguing contrast with the vagueness of Arguelles’ predictions about it, and this contrast left ample room for other authors in the same field to jump on the bandwagon and redefine the prophecy to fit whatever their own eschatological preferences happened to be. This they promptly did.

Early on, 2012 faced plenty of competition from alternative dates for the great transformation. The year 2000 had been a great favorite for a century, and became 2012’s most important rival, but it came and went without bringing anything more interesting than another round of sordid business as usual. Thereafter, 2012 reigned supreme, and became the center of a frenzy of anticipation that was at least as much about marketing as anything else. I can testify from my own experience that for a while there, late in the last decade, if you wanted to write a book about anything even vaguely tangential to New Age subjects and couldn’t give it a 2012 spin, many publishers simply weren’t interested.

So the predictions piled up. The fact that no two of them predicted the same thing did nothing to weaken the mass appeal of the date. Neither did the fact, which became increasingly clear as the last months of 2012 approached, that a great many people who talked endlessly about the wonderful or terrible things that were about to happen weren’t acting as though they believed a word of it. That was by and large as true of the New Age writers and pundits who fed the hysteria as it was of their readers and audiences; I long ago lost track of the number of 2012 prophets who, aside from scheduling a holiday trip to the Yucatan or some other fashionable spot for the big day, acted in all respects as though they expected the world to keep going in its current manner straight into 2013 and beyond.

That came as a surprise to me. Regular readers may recall my earlier speculation that 2012 would see scenes reminiscent of the “Great Disappointment” of 1844, with crowds of true believers standing on hilltops waiting for their first glimpse of alien spacecraft descending from heaven or what have you. Instead, in the last months of this year, some of the writers and pundits most deeply involved in the 2012 hysteria started claiming that, well, actually, December 21st wasn’t going to be the day everything changed; it would, ahem, usher in a period of transition of undefined length during which everything would sooner or later get around to changing. The closer last Friday came, the more evasive the predictions became, and Mayan Fools Day and its aftermath were notable for the near-total silence that spread across the apocalyptic end of the blogosphere. Say what you will about Harold Camping, at least he had the courage to go on the air after his May 2011 prophecy flopped and admit that he must have gotten his math wrong somewhere.

Now of course Camping went on at once to propose a new date for the Rapture, which flopped with equal inevitability a few months later. It’s a foregone conclusion that some of the 2012 prophets will do the same thing shortly, if only to kick the apocalypse marketing machine back into gear. It’s entirely possible that they’ll succeed in setting off a new frenzy for some other date, because the social forces that make apocalyptic fantasies so tempting to believe just now have not lost any of their potency.

The most important of those forces, as I’ve argued in previous posts, is the widening mismatch between the fantasy of entitlement that has metastasized through contemporary American society, on the one hand, and the ending of an age of fossil-fueled imperial extravagance on the other. As the United States goes bankrupt trying to maintain its global empire, and industrial civilization as a whole slides down the far side of a dizzying range of depletion curves, it’s becoming harder by the day for Americans to make believe that the old saws of upward mobility and an ever brighter future have any relevance to their own lives—and yet those beliefs are central to the psychology, the self-image, and the worldview of most Americans. The resulting cognitive dissonance is hard to bear, and apocalyptic fantasies offer a convenient way out. They promise that the world will change, so that the believers don’t have to.

That same frantic desire to ignore the arrival of inescapable change pervades today’s cultural scene, even in those subcultures that insist most loudly that change is what they want. In recent months, to cite only one example, nearly every person who’s mentioned to me the claim that climate change could make the Earth uninhabitable has gone on to ask, often in so many words, “So why should I consume less now?” The overt logic here is usually that individual action can’t possibly be enough. Whether or not that’s true is anyone’s guess, but cutting your own carbon footprint actually does something, which is more than can be said for sitting around enjoying a standard industrial world lifestyle while waiting for that imaginary Kum Ba Ya moment when everyone else in the world will embrace limits not even the most ardent climate change activists are willing to accept themselves.

Another example? Consider the rhetoric of elite privilege that clusters around the otherwise inoffensive label “1%.” That rhetoric plays plenty of roles in today’s society, but one of them pops up reliably any time I talk about using less. Why, people ask me in angry tones, should they give up their cars when the absurdly rich are enjoying gigantic luxury yachts? Now of course we could have a conversation about the total contribution to global warming of cars owned by people who aren’t rich, compared to that of the fairly small number of top-end luxury yachts that usually figure in such arguments, but there’s another point that needs to be raised. None of the people who make this argument to me have any control over whether rich people have luxury yachts. All of them have a great deal of control over whether and how often they themselves use cars. Blaming the global ecological crisis on the very rich thus functions, in practice, as one more way to evade the necessity of unwelcome change.

Along these same lines, dear reader, as you surf the peak oil and climate change blogosphere and read the various opinions on display there, I’d encourage you to ask yourself what those opinions amount to in actual practice. A remarkably large fraction of them, straight across the political landscape from furthest left to furthest right and including all stops in between, add up to demands that somebody else, somewhere else, do something. Since the people making such demands rarely do anything to pressure, or even to encourage, those other people elsewhere to do whatever it is they’re supposed to do, it’s not exactly hard to do the math and recognize that here again, these opinions amount to so many ways of insisting that the people holding them don’t have to give up the extravagant and unsustainable lifestyles most people in the industrial world think of as normal and justifiable.

There’s another way to make the same point, which is that most of what you’ll see being proposed in the peak oil and climate change blogosphere has been proposed over and over and over again already, without the least impact on our predicament. From the protest marches and the petitions, through the latest round of grand plans for energy futures destined to sit on the shelves cheek by jowl with the last round, right up to this week’s flurry of buoyantly optimistic blog posts lauding any technofix you care to name from cold fusion and algal biodiesel to shale gas and drill-baby-drill: been there, done that, used the T-shirt to wipe another dozen endangered species off the face of the planet, and we’re still stuck in the same place. The one thing next to nobody wants to talk about is the one thing that distinguished the largely successful environmental movement of the 1960s and 1970s from the largely futile environmental movement since that time, which is that activists in the earlier movement were willing to start the ball rolling by making the necessary changes in their own lives first.

The difficulty, of course, is that making these changes is precisely what many of today’s green activists are desperately trying to avoid. That’s understandable, since transitioning to a lifestyle that’s actually sustainable involves giving up many of the comforts, perks, and privileges central to the psychology and identity of people in modern industrial societies. In today’s world of accelerating downward mobility, especially, the thought of taking any action that might result in being mistaken for the poor is something most Americans in particular can’t bear to contemplate—even when those same Americans recognize on some level that sooner or later, like it or not, they’re going to end up poor anyway.

Those of my readers who would like to see this last bit of irony focused to incandescence need only get some comfortably middle-class eco-liberal to start waxing lyrical about life in the sustainable world of the future, when we’ll all have to get by on a small fraction of our current resource base. This is rarely difficult; I field such comments quite often from people sketching out a rose-colored contrast between today’s comfortable but unsatisfying lifestyles and the more meaningful and fulfilling existence that will be ours in a future of honest hard work in harmony with nature. Wait until your target is in full spate, and then point out that he could embrace that more meaningful and fulfilling lifestyle right now by the simple expedient of discarding the comforts and privileges that stand in the way. You’ll get to watch backpedaling on a heroic scale, accompanied by a flurry of excuses meant to justify your target’s continued dependence on the very comforts and privileges he was belittling a few moments before.

What makes the irony perfect is that, by and large, the people whom you’ll hear criticizing the modern lifestyles they themselves aren’t willing to renounce aren’t just mouthing verbal noises. They realize, many of them, that the lifestyles that industrial societies provide even to their more privileged inmates are barren of meaning and value, that the pursuit and consumption of an endless series of increasingly shoddy manufactured products is a very poor substitute for a life well lived, and that stepping outside the narrowing walls of a world defined by the perks of the consumer economy is the first step toward a more meaningful existence. They know this; what they lack, by and large, is the courage to act on that knowledge, and so they wander the beach like J. Alfred Prufrock in Eliot’s poem, letting the very last inch or so of the waves splash over their feet—the bottoms of their trousers rolled up carefully, to be sure, to keep them from getting wet—when they know that a running leap into the green and foaming water is the one thing that can save them. Thus it’s not surprising that their daydreams cluster around imaginary tidal waves that will come rolling in from the deep ocean to sweep them away and make the whole question moot.

This is why it’s as certain as anything can be that within a year or so at most, a good many of the people who spent the last decade or so talking endlessly about last Friday will have some other date lined up for the end of the world, and will talk about it just as incessantly. It’s that or face up to the fact that the only way to live up to the ideals they think they espouse is to walk straight toward the thing they most fear, which is the loss of the perks and privileges and comforts that define their identity—an identity many of them hate, but still can’t imagine doing without.

Meanwhile, of course, the economy, the infrastructure, and the resource flows that make those perks and privileges and comforts possible are coming apart around them. There’s a great deal of wry amusement to be gained from watching one imaginary cataclysm after another seize the imagination of the peak oil scene or society as a whole, while the thing people think they’re talking about—the collapse of industrial civilization—has been unfolding all around them for several years now, in exactly the way that real collapses of real civilizations happen in the real world.

Look around you, dear reader, as the economy stumbles through another round of contraction papered over with increasingly desperate fiscal gimmicks, the political system of your country moves ever deeper into dysfunction, jobs and livelihoods go away forever, whatever social safety net you’re used to having comes apart, towns and neighborhoods devastated by natural disasters are abandoned rather than being rebuilt, and the basic services that once defined a modern society stop being available to a larger and larger fraction of the people of the industrial world. This is what collapse looks like. This is what people in the crumbling Roman Empire and all those other extinct civilizations saw when they looked out the window. To those in the middle of the process, as I’ve discussed in previous posts, it seems slow, but future generations with the benefit of hindsight will shake their heads in wonder at how fast industrial civilization went to pieces.

I commented in a post at the start of this year that the then-current round of fast-collapse predictions—the same predictions, mind you, that had been retailed at the start of the year before, the year before that, and so on—were not only wrong, as of course they turned out to be, but missed the collapse that was already under way. The same point holds good for the identical predictions that will no doubt be retailed over the next few weeks, insisting that this is the year when the stock market will plunge to zero, the dollar and/or the Euro will lose all their value, the economy will seize up completely and leave the grocery shelves bare, and so on endlessly; or, for that matter, that this is the year when cold fusion or algal biodiesel or some other vaporware technology will save us, or the climate change Kum Ba Ya moment I mentioned earlier will get around to happening, or what have you.

It’s as safe as a bet can be that none of these things will happen in 2013, either. Here again, though, the prophecies in question are not so much wrong as irrelevant. If you’re on a sinking ocean liner and the water’s rising fast belowdecks, it’s not exactly useful to get into heated debates with your fellow passengers about whether the ship is most likely to be vaporized by aliens or eaten by Godzilla. In the same way, it’s a bit late to speculate about how industrial civilization will collapse, or how to prevent it from collapsing, when the collapse is already well under way. What matters at that stage in the game is getting some sense of how the process will unfold, not in some abstract sense but in the uncomfortably specific sense of where you are, with what you have, in the days and weeks and months and years immediately ahead of you; that, and then deciding what you are going to do about it.

With that in mind, dear reader, I’d like to ask you to do something right now, before going on to the paragraph after this one. If you’re in the temperate or subarctic regions of the northern hemisphere, and you’re someplace where you can adjust the temperature, get up and go turn the thermostat down three degrees; if that makes the place too chilly for your tastes, take another moment or two to put on a sweater. If you’re in a different place or a different situation, do something else simple to decrease the amount of energy you’re using at this moment. Go ahead, do it now; I’ll wait for you here.

Have you done it? If so, you’ve just accomplished something that all the apocalyptic fantasies, internet debates, and protest marches of the last two decades haven’t: you’ve decreased, by however little, the amount of carbon dioxide going into the atmosphere. That sweater, or rather the act of putting it on instead of turning up the heat, has also made you just a little less dependent on fossil fuels. In both cases, to be sure, the change you’ve made is very small, but a small change is better than no change at all—and a small change that can be repeated, expanded, and turned into a stepping stone on the way to bigger changes, is infinitely better than any amount of grand plans and words and handwaving that never quite manage to accomplish anything in the real world.

Turning down your thermostat, it’s been said repeatedly, isn’t going to save the world. That’s quite true, though it’s equally true that the actions that have been pursued by climate change and peak oil activists to date don’t look particularly likely to save the world, either, and let’s not even talk about what wasn’t accomplished by all the wasted breath over last Friday’s nonevent. That being the case, taking even the smallest practical steps in your own life and then proceeding from there will take you a good deal further than waiting for the mass movements that never happen, the new technologies that never pan out, or for that matter the next deus ex machina some canny marketer happens to pin onto another arbitrary date in the future, as a launching pad for the next round of apocalyptic hysteria.

Meanwhile, a world is ending. The promoters of the 2012 industry got that right, though they missed just about everything else; the process has been under way for some years now, and it won’t reach its conclusion in our lifetimes, but what we may as well call the modern world is coming to an end around us. The ancient Mayans knew, however, that the end of one world is always the beginning of another, and it’s an interesting detail of all the old Mesoamerican cosmological myths that the replacement for the old world doesn’t just pop into being. Somebody has to take action to make the world begin. It’s a valid point, and one that can be applied to our present situation, when so many people are sitting around waiting for the end and so few seem to be willing to kickstart the beginning in the only way that matters—that is, by making actual changes in their own lives. The deindustrial world of the future is poised to begin, but someone has to begin it. Shall we?

 

The Beginning of the World

[John Michael Greer]

Written by testudoetlepus

December 27th, 2012 at 4:06 pm