
THE PEOPLE’S REPUBLIC OF WALMART (Original by Leigh Phillips and Michal Rozworski. Revised Edition by Comrade Milly Graham)

From ProleWiki, the proletarian encyclopedia


THE PEOPLE’S REPUBLIC OF WALMART
Author: Original by Leigh Phillips and Michal Rozworski. Revised edition by Comrade Milly Graham
Written in: 5 March 2019. Revised: 3 August 2023
Publisher: Verso Books. Revised by USU
Type: Book
Source: https://clarion.unity-struggle-unity.org/peoples-republic-of-walmart-a-salvageable-trainwreck/


Foreword by ProleWiki

Below is an abridged version of "The People's Republic of Walmart," referred to as "The Abridged People's Republic of Walmart." ProleWiki will add the original book in time, but has added the abridged version so that people can discern the essential points of the book without being bogged down in the liberalism of the original authors, of whom Leigh Phillips is a Zionist.

Included here also is a critique of the original book by Rachel Nagant.

https://clarion.unity-struggle-unity.org/peoples-republic-of-walmart-a-salvageable-trainwreck/



AN INTRODUCTION TO ECONOMIC PLANNING, HOLD THE LIBERALISM

By Leigh Phillips and Michal Rozworski

Revised by Comrade Milly Graham

Note From the Editor

The People’s Republic of Walmart, as originally published by Jakkkobin and Verso Books, was a text rife with internal contradiction. The information about the use of planning in capitalist firms — planning as a subordinate social form within a market economy — is well researched, highly enlightening, and, I believe, eminently important for contemporary revolutionaries to understand. At the same time, however, I could not in good conscience recommend the book to anyone as it was, because its social and political commentary was — pardon my language — quite shit. Kautskyism, Bernsteinism, Lassalleanism, Narodnikism, Proudhonism — the book had it all like a bingo card of revisionism. Writing an entirely new book would be a waste of effort (and a missed opportunity to annoy the authors), so, instead, I have liberated what I believe to be a worthwhile read, particularly for developing Marxists or for a Marxist study group. I won’t claim to have perfectly picked clean every morsel off the bones of the original text, but I think you’ll find this revised edition much more valuable nonetheless. For the most part, edits of the remaining text are stylistic rather than changes to the content, but for the sake of transparency, any sections I have added or substantively altered will be marked with a red asterisk: *

Hugs and kisses,

- Comrade Milly Graham

WHY PLANNING?

There is certainly overlap between the set of all goods and services that are useful to humanity, on the one hand, and the set of all goods and services that are profitable, on the other. You likely find underwear to be a useful product (though for commandos, this is no certainty); The Gap, meanwhile, finds it profitable to produce such a product—a happy coincidence, of which there are many. But the set of all useful things and the set of all profitable things are not in perfect correspondence. If something is profitable, even if it is not useful or is even harmful, someone will continue producing it so long as the market is left to its own devices.

Fossil fuels are a contemporary example of this irremediable, critical flaw. Wonderful though they have been due to their energy density and portability, we now know that the greenhouse gases emitted by fossil fuel combustion will rapidly shift the planet away from an average temperature that has remained optimal for human flourishing since the last ice age. Yet, so long as governments do not intervene to curtail the use of fossil fuels and build out the clean electricity infrastructure needed to replace them, the market will continue to produce them. Likewise, it was not the market that ended production of the chlorofluorocarbons that were destroying the ozone layer; instead it was regulatory intervention—planning of a sort—that forced us to use other chemicals for our fridges and cans of hairspray, allowing that part of the stratosphere that is home to high concentrations of ultraviolet-ray-deflecting tripartite oxygen molecules to largely mend itself. We could recount similar tales about how the problems of urban air pollution in most Western cities or of acid rain over the Great Lakes were solved, or how car accident mortality rates or airline crashes have declined: through active state intervention in the market to curb or transform the production of harmful—but profitable—goods and services. The impressive health and safety standards of most modern mining operations in Western countries were achieved not as a result of any noblesse oblige on the part of the owners of the companies, but rather begrudgingly, as a concession following their defeat by militant trade unions.

Conversely, if something is useful but unprofitable, it will not be produced. In the United States, for instance, there is no universal public healthcare system, though healthcare for all would certainly be wonderfully useful. But because it is not profitable, it is not produced. High-speed internet in rural areas is also not profitable, so private telecommunications companies are loath to provide it there, preferring instead to cherry-pick profitable population-dense neighborhoods. And amid a growing global crisis of antimicrobial resistance, in which microbial evolution is defeating antibiotic after antibiotic and patients are increasingly dying from routine infections, pharmaceutical companies have all but given up research into new families of the life-saving drugs, simply because they are not profitable enough.

That amputation or surgery to scrape out infected areas might return as common medical responses is not a pleasant thought. But this course of action was the only one left to the doctors of nineteen-year-old David Ricci of Seattle when they surgically removed part of his leg, following repeated infections from drug-resistant bacteria—acquired in a train accident in India—that could not be treated, even with highly toxic last resort antibiotics. Each time the infection returned, more and more of the leg had to be cut off. Although Ricci has since recovered, he has lived in perpetual fear of the reappearance of the bugs that can’t be fought. As a 2008 “call to arms” paper from the Infectious Diseases Society of America (IDSA) put it, “[Antibiotics] are less desirable to drug companies and venture capitalists because they are more successful than other drugs.” Antibiotics are successful if they kill off an infection, at which point—days or weeks, or at most months, later—the patient stops taking the drug. For chronic diseases, however, patients may have to take their medicine every day, sometimes for the rest of their lives. Thus, the paper concluded, it is long-term therapy—not cures—that drives interest in drug development. Policy proposals from the likes of the IDSA, the World Health Organization and the European Union amount to begging and bribing the pharmaceutical companies to lift a finger; but even here, however unambitious the approach, it is still external to the market. (Nationalization of the pharmaceutical industry would be cheaper, and a much more rapid and effective approach, but most pundits deem it too “radical,” giving off too much of a whiff of socialism).

Beyond this one sector, we might note that basic blue-sky research in any field simply cannot be done by the private sector, because it is extremely expensive but makes no guarantee of any return on investment. Thus research is almost entirely a phenomenon characteristic of public institutions (or at least public funding). Similarly, it was not the market that got us to the moon, but rather a little ol’ public sector enterprise called NASA. Maybe you’ve heard of it?

*Besides determining what kinds of things are produced, there is another irresolvable, fundamental problem with the market: the propensity towards crisis. The “anarchy of production,” that is, the market system, compels every capitalist, as a compulsory law, to improve his means of production, to drive down the price of his commodities with the development of labor-saving technologies. But the expansion of production inevitably grows faster than the market which must absorb its products for the capitalist to realize his investment. Try as the capitalist might to expand his markets, to expand the sphere of circulation or to drive up consumption with advertising, he will forever be stuck in a tragic “boom and bust” cycle. The forces of production created by society thus overwhelm the form of exchange of society, creating a crisis, not of scarcity, but, absurdly, of overproduction. For the biggest of capitalists, these crises are merely another opportunity to improve their accumulated hoard of wealth, to buy out the failing businesses. For the rest of us in society, crisis means losing one's business, one’s job, one’s home, and all the other elements of one’s means of subsistence. It means the destruction of the means of production, or of the surplus commodities, resetting the cycle back to the start. This contradiction between the forces of production and the relations of production — the systems of exchange and ownership — cannot be resolved without social revolution, without abolishing the market!

In general, criticisms of the current way of doing things propose that the market be replaced, or at least reined in. But if allocation does not proceed via the market, then it will occur via economic planning, also known as “direct allocation”—made not by the “invisible hand” but by very visible humans. Indeed, this form of planned allocation already takes place widely in our current system, on the part of elected and unelected individuals alike, by both states and private enterprises, and in centralized and decentralized forms. Even arch-capitalist America is home not only to Walmart and Amazon, but also to the Pentagon: in spite of being incredibly destructive, the US Department of Defense is the single-largest employer in the world, and a centrally planned public sector operation. In fact, almost all countries are, to varying degrees, “mixed” economies, making use of both markets and planning.

Indeed, planning has accompanied human societies as long as they have existed. Thousands of years ago, the civilizations of ancient Mesopotamia created a nexus of economic institutions that connected the workshops and temples of the cities to peasant agricultural production in the countryside. The Third Dynasty of Ur (Ur III), which flourished around the Tigris and Euphrates Rivers near the end of the third millennium BCE, was among the first to make the breakthrough to widespread permanent record keeping. Clay tablets from Ur III include predictions of crop yields based on averages of soil quality, themselves derived from years of record keeping. Even though the economy was still at the mercy of uncontrollable weather, it could be managed at a rudimentary level. With the advent of detailed accounts, expectations and approximations—both crucial to planning— became features of economic life. Unlike the localized gift-exchange economy of prehistory, ancient Mesopotamia saw systems of centralized redistribution that mimic today’s welfare states: taxes and levies in, transfers of goods and services out.

Increasingly complex economic record-keeping, accounting and social institutions all point to early ancient civilizations producing something that cannot but be described as economic calculation and planning. This is not to say there was some Arcadia of central planning at this time, any more than it is accurate to describe hunter-gatherer society as some peaceful egalitarian Eden. The planning of the ancients was not only rudimentary and partial; it was also far from being a rational way of securing the shared benefit of all. Indeed, ancient planning was at the service of an economic system created for the benefit of a small coterie of elites who were motivated to maintain their wealth and power. Sound familiar?

There is not only a crying need for us to talk about what an alternative to the market would be, but also a great deal of confusion about what planning is and its history. To take one example: China appears to be the last man standing in the global economy; its growth rates, even if they have declined recently from eye-popping to merely gobsmacking, have been achieved through an admixture of free market mechanisms and very heavy shepherding by central planners and party-state managers. It seems even some members of the ascendant bourgeoisie in that country believe that Mao’s economic planning was less mistaken than premature. A 2018 Financial Times feature describes Jack Ma, founder of the Chinese e-commerce colossus Alibaba Group, as part of a growing movement in the People’s Republic who argue that “the fatal flaw of state planning was simply that planners did not have enough information to make good decisions.” He and his co-thinkers believe that “big data” can solve this problem. Could he be right?

In such volatile times, it cannot be ruled out that a socialist revolution might, within our lifetimes, burst forth even within the capitalist heartlands, within the belly of world imperialism. If we do not take pains to sketch out ahead of time what an alternative to the market might look like, those involved will inevitably fall back on versions of what they already know. The capitalist-realist earworm, like the Ceti eel in Star Trek II: The Wrath of Khan, remains wrapped around our cerebral cortex, foreclosing the possibility of transformation even at the moment of its realization.

The time, then, is as ripe as browning avocados on toast to uncover a very old conversation: a long-standing but largely forgotten argument over the question of planning. Our aim is not to offer a comprehensive, definitive survey of this almost century-long discussion, which economists refer to as the “economic calculation debate” (or “socialist calculation debate”)—whether it is mathematically and physically possible to plan an economy, and whether this is desirable—but to provide a plain-language, hopefully even enjoyable, introduction for the uninitiated. In the main, we aim here to bring together and make more easily comprehensible ideas and findings that have been forgotten or are otherwise jargon filled, mathematical, or computer science-oriented, or which lie buried in the pages of little-read operations research or business-management journals. Thus, we lean heavily on the work of economic historians, computer scientists and scholars of commerce. In writing a primer on planning, and on the challenge of logistics and economic calculation, we hope to take this vital debate down from moldering academy shelves and reintroduce it into the field of live political combat.

Above all, our goal with this brief text is simply to flag a rarely recognized, yet obvious, fact that in some sense makes the “calculation debate” anachronistic: it is already the case that great swaths of the global economy are planned. Walmart is a prime example. Thus the question as to whether planning can exist at large scales without crippling economic inefficiencies could be moot. There might be no single machine that we can simply take over, run with new operators but otherwise leave unchanged; but there is a foundation of planning that a more just society could surely take up and make its own.

This is not so much a book about a future society, but one about our own. We plan. And it works.

COULD WALMART BE A SECRET SOCIALIST PLOT?

Could Walmart be a secret socialist plot? This is, in effect, the question that Fredric Jameson, American literary critic, Marxist political theorist, and cheeky devil, all too briefly poses in a footnote to his 2005 volume Archaeologies of the Future, a discussion of the nature of utopia in the age of globalization. Gleefully poking at the progressive consensus that regards Walmart as a globe-barnacling chain of retail hypermarkets, the Galactus of capitalism, the beau idéal—perhaps more so even than Goldman Sachs—of everything that is wrong with everything that is wrong, Jameson wonders whether we might in fact be missing a trick about this transcontinental marvel of planning and logistics:

The literary utopists have scarcely kept pace with the businessmen in the process of imagination and construction…ignoring a global infrastructural deployment in which, from this quite different perspective, the Walmart celebrated by Friedman becomes the very anticipatory prototype of some new form of socialism for which the reproach of centralization now proves historically misplaced and irrelevant. It is in any case certainly a revolutionary reorganization of capitalist production, and some acknowledgment such as “Waltonism” or “Walmartification” would be a more appropriate name for this new stage.

But beyond these comments, the provocation is not fully developed. He lets the suggestion just hang there until the publication five years later of an essay on the subject: “Walmart as Utopia.” Here, he insists more full-throatedly that Walmart is not merely a useful institution from which, “after the revolution,” progressives could (per Lenin) “lop off what capitalistically mutilates this excellent apparatus.” It is not residual of the old society, he says, but rather something truly emergent of the new one yet to be born. Walmart is “the shape of a Utopian future looming through the mist, which we must seize as an opportunity to exercise the Utopian imagination more fully, rather than an occasion for moralizing judgments or regressive nostalgia.” *Jameson could be right in all but one way: if the seeds of a future social form are truly growing within the soil of the biggest capitalist firms, then this is no “utopia,” no arbitrary pipedream, but a very concrete sneak peek at the “real movement” of history.

Jameson compares the emergence of this novel entity to the discovery of a new species of organism, or of a new strain of virus. He delights at the apparent contradiction of how the largest company in the world, even in its full-spectrum dominance—indeed precisely because of this omnipotence—is described by admiring, horrified business writers as a boa constrictor slowly but inexorably strangling market capitalism. But even here, Jameson is still mostly interested in using Walmart as a thought experiment—a demonstration of “the dialectical character of the new reality,” and an example of the notion within dialectics of the unity of opposites: the firm as “the purest expression of that dynamic of capitalism which devours itself, which abolishes the market by means of the market itself.”

Such philosophical flourishes are more than worthwhile, but we are curious about something perhaps a measure more concrete. We want to take Jameson’s provocation beyond a footnote or a thought experiment and, in the light of what we know about Walmart’s operations, revisit a nearly century-old argument between those who favored socialism and those who asserted that capitalism offered the best of all possible worlds. For beneath the threadbare cliché of the maxim that socialism is “fine in theory, but impossible in practice,” there in fact lie claims about economic planning, and about how to calculate an egalitarian distribution of goods and services without need for markets.

Furthermore, the appearance that these claims have been settled by the defeat of the Communist bloc is merely superficial. And counterintuitive as it may seem at first, the infamously undemocratic Walmart, and a handful of other examples we will consider, offer powerful encouragement to the socialist hypothesis that a planned economy—democratically coordinated by ordinary working people, no less—is not merely feasible, but more efficient than the market.

But before we begin to explain how Walmart is the answer, we first have to ask: What is the question?

The Socialist Calculation Debate

Since the neoliberal revolution of the 1970s and its acceleration following the end of the Cold War, economic planning at scale has been widely derided from right to center-left, and planned endeavors such as public healthcare have been under attack from marketization in most countries. In most jurisdictions, the electricity systems that were once in public hands have long since been privatized; therefore governments committed to decarbonizing electricity generation have had little choice but to employ market mechanisms such as emissions trading or carbon taxation, rather than reducing greenhouse gas emissions via democratic fiat—that is, simply ordering the electricity provider to switch to non-emitting fuel sources. Almost everywhere, transportation, communication, education, prisons, policing and even emergency services are being spun off wholly or in part from the public sector and provided instead by market actors. Only the armed forces remain a state monopoly, and here only up to a point, given the rise of private security multinationals such as the notorious G4S and Blackwater (rebranded as Academi since 2011).

The handful of social democratic and liberal parties that still defend public healthcare and public education do so while making vague assertions that “government has a role to play” or that “government can be a force for good,” but they don’t really say why. Social democrats today will argue for a mixed economy, or for a mixture of state planning and the market—but again, they do so without saying why. If planning is superior, then why not plan everything? But if some goods and services are better produced by the market than by planning, then what are the attributes of these particular goods and services that make them so?

All this activity, and all this argument empty of actual argument, reflects a set of policies enacting surrender to an unchangeable status quo, the architects of which only retroactively attempt to transform such capitulation into a coherent ideology. For much of social democracy in the twenty-first century, beliefs follow from policies, rather than policies from beliefs. And while those centrists and conservatives who cheerlead the market stop short of advocating a world where everything is allocated via markets, they still do not offer arguments explaining why their preferred admixture of market and planning is superior. When challenged, they simply describe the current state of affairs: “No economy is completely planned or completely market-based.”

Well, plainly this is true. But again, this gives no explanation as to why their favored configuration is optimal.

Perhaps this is understandable. It seems, at first glance, almost manifest that the market won the Cold War and that planning lost. Yet if the market is conclusively, unassailably, incontestably the optimal mechanism for the allocation of goods and services, then why have the economies of Western nations continued to experience mismatches between what is produced and what is required—mismatches that have led to severe recessions and near-catastrophic economic crises since 1991? Why was the global economy barely (and likely temporarily) saved from a Depression-scale collapse in investment in 2008, not by market mechanisms, but as a result of (modest) Keynesian pump priming? What is the source of economic stagnation since the Great Recession? Why, after three decades of steady decreases in inequality in the West in the post-war period leading up to the 1970s, has inequality in the developed countries grown over the last forty years, triggering an explosion in popular anger, along with hard-right reaction, in country after country? And why can’t the market, left to its own devices, meet the civilizationally existential challenge of climate change? So the question of market versus planning should appear as unresolved as ever.

In the early decades of the last century, the question of whether the market or planning is the optimal mechanism for the allocation of goods and services was widely accepted as unanswered. In the 1920s and 1930s, left-wing economists influenced by Marxism, on the one hand, and right-wing economists of the neoclassical Austrian School, on the other, were engaged in a vigorous discussion—subsequently known as the “economic calculation problem”—over whether economic planning at scale was feasible. At the time, neoclassicals were not arguing from a position of ideological hegemony. The Soviet Union had recently been established, and the war efforts of both the Allies and the Central Powers were expansive exercises in central planning. By the 1930s, the Bolsheviks had rapidly launched a feudal Russia into electrified, industrial modernity, meaning economists who would criticize planning would have to counter what appeared to be substantial evidence in its favor. As a result, partisans on both sides took the idea of planning seriously, and the Austrians had to work hard to try to prove their point, to show how economic planning was an impossibility.

Ludwig von Mises, Austrian School economist and hero of latter-day neoliberals, launched the first counter-volley against the advocates of planning. In his seminal 1920 essay “Economic Calculation in the Socialist Commonwealth,” Mises posed the following questions: In any economy larger than the primitive family level, how could socialist planning boards know which products to produce, how much of each should be produced at each stage, and which raw materials should be used and how much of them? Where should production be located, and which production process was most efficient? How would they gather and calculate this vast array of information, and how could it then be retransmitted back to all actors in the economy? The answer, he said, is that the mammoth scale of information needed—for producers, consumers and every actor in between, and for every stage and location of production of the multitude of products needed in society—is beyond the capacity of such planning boards. No human process could possibly gather all the necessary data, assess it in real time, and produce plans that accurately describe supply and demand across all sectors. Therefore, any economy the size of an entire country that tried to replace the myriad decisions from the multitude of sovereign consumers with the plans of bureaucrats working from incorrigibly flawed data would regularly produce vast, chasm-like mismatches between what is demanded and what is supplied.

These inefficiencies would result in such social and economic barbarities—shortages, starvation, frustration and chaos—that even if one accepts the inevitability of inequalities and attendant myriad other horrors of capitalism, the market will still appear benign by comparison.

Meanwhile, Mises argued that the extraordinarily simple mechanism of prices in the market, reflecting the supply and demand of resources, already contains all this information. Every aspect of production—from the cost of all inputs at all times, to the locations of inputs and products, and the changing demands and taste of purchasers—is implicitly captured by price.

There is much more to the calculation debate, and we’ll briefly outline some of the additional mathematical and computational aspects later on, but for now this theoretical standoff should suffice. It is enough to know that as a result of this impasse, depending on our political persuasions, we have opted either for the information imperfections of the market, or for the information imperfections of planning, without ever resolving the debate. The stalemate could even be tweeted in less than 140 characters: “What about data imperfections leading to shortages?” “Oh yeah? Well what about data imperfections leading to injustices?”

Thus we are stuck. Or so it has seemed for a long time.

Planning in Practice

Walmart is perhaps the best evidence we have that while planning appears not to work in Mises’s theory, it certainly does in practice. And then some. Founder Sam Walton opened his first store, Wal-Mart Discount City, on July 2, 1962, in the non-city of Rogers, Arkansas, population 5,700. From that clichédly humble, East Bumphuck beginning, Walmart has gone on to become the largest company in the world, enjoying eye-watering, People’s Republic of China–sized cumulative average growth rates of 8 percent during its five and a half decades. Today, it employs more workers than any other private firm; if we include state enterprises in our ranking, it is the world’s third-largest employer after the US Department of Defense and the People’s Liberation Army. If it were a country—let’s call it the People’s Republic of Walmart—its economy would be roughly the size of a Sweden or a Switzerland. Using the 2015 World Bank country-by-country comparison of purchasing-power parity GDP, we could place it as the 38th largest economy in the world.

Yet while the company operates within the market, internally, as in any other firm, everything is planned. There is no internal market. The different departments, stores, trucks and suppliers do not compete against each other in a market; everything is coordinated. Walmart is not merely a planned economy, but a planned economy on the scale of the USSR smack in the middle of the Cold War. (In 1970, Soviet GDP clocked in at about $800 billion in today’s money, then the second-largest economy in the world; Walmart’s 2017 revenue was $485 billion.)

As we will see, Walmart’s suppliers cannot really be considered external entities, so the full extent of its planned economy is larger still. According to Supply Chain Digest, Walmart stocks products from more than seventy nations, operating some 11,000 stores in twenty-seven countries. TradeGecko, an inventory-management software firm, describes the Walmart system as “one of history’s greatest logistical and operational triumphs.” They’re not wrong. As a planned economy, it’s beating the Soviet Union at its height before stagnation set in.

Yet if Mises and friends were right, then Walmart should not exist. The firm should long since have hit its wall of too many calculations to make. Moreover, Walmart is not unique; there are hundreds of multinational companies whose size is on the same order of magnitude as Sam Walton’s behemoth, and they too are all, at least internally, planned economies.

In 1970, Walmart opened its first distribution center, and five years later, the company leased an IBM 370/135 computer system to coordinate stock control, making it one of the first retailers to electronically link up store and warehouse inventories. It may seem strange now, but prior to this time, stores were largely stocked directly by vendors and wholesalers, rather than using distributors. Large retailers sell thousands of products from thousands of vendors. But direct stockage—sending each product directly to each store—was profoundly inefficient, leading to regular over- or understocks. Even smaller retailers, who cannot afford their own distribution centers, today find it more efficient to outsource distribution center functions to a logistics firm that provides this service for multiple companies.

Logistical outsourcing happens because it would be far too expensive in terms of labor costs for one tiny store (say, a record shop) to maintain a commercial relationship with thousands of record labels, and vice versa; but that store can have a relationship with, say, five distributors, each of whom has a relationship with, say, a hundred labels. The use of distributors also minimizes inventory costs while maximizing the variety that a store can offer, at the same time offering everyone along the supply chain a more accurate knowledge of demand. So while your local shop may not carry albums from Hello Kitty Pencil Case Records, via the magic of distributors, it will have a relationship with more record labels than it otherwise could.

In 1988, Procter & Gamble, the detergents and toiletries giant, introduced the stocking technique of continuous replenishment, partnering first with Schnuck Markets, a chain of St. Louis grocery stores. Their next step was to find a large firm to adopt the idea, and they initially shopped it to Kmart, which was not convinced. Walmart, however, embraced the concept, and thus it was that the company’s path to global domination truly began.

“Continuous replenishment” is a bit of a misnomer, as the system actually provides merely very frequent restocking (from the supplier to the distributor and thence the retailer), in which the decision on the amount and the timing of replenishment lies with the supplier, not the retailer. The technique, a type of vendor-managed inventory, works to minimize what businesses call the “bullwhip effect.” First identified in 1961, the bullwhip effect describes the phenomenon of increasingly wild swings in mismatched inventories against product demand the further one moves along the supply chain toward the producer, ultimately extending to the company’s extraction of raw materials. Therein, any slight change in customer demand reveals a discord between what the store has and what the customers want, meaning there is either too much stock or too little.

To illustrate the bullwhip effect, let’s consider the “too-little” case. The store readjusts its orders from the distributor to meet the increase in customer demand. But by this time, the distributor has already bought a certain amount of supply from the wholesaler, and so it has to readjust its own orders from the wholesaler—and so on, through to the manufacturer and the producer of the raw materials. Because customer demand is often fickle and its prediction involves some inaccuracy, businesses will carry an inventory buffer called “safety stock.” Moving up the chain, each node will observe greater fluctuations, and thus greater requirements for safety stock. One analysis performed in the 1990s assessed the scale of the problem to be considerable: a fluctuation at the customer end of just 5 percent (up or down) will be interpreted by other supply chain participants as a shift in demand of up to 40 percent.

Just like the wave that travels along an actual bullwhip following a small flick of the wrist, a small change in behavior at one end results in massive swings at the other. Data in the system loses its fidelity to real-world demand, and the further you move away from the consumer, the more unpredictable demand appears to be. This unpredictability in either direction is a major contributing factor to economic crisis as companies struggle (or fail) to cope with situations of overproduction, having produced much more than they predicted would be demanded and being unable to sell what they have produced above its cost. Insufficient stock can be just as disruptive as overstock, leading to panic buying, reduced trustworthiness by customers, contractual penalties, increased costs resulting from training and layoffs (due to unnecessary hiring and firing), and ultimately loss of contracts, which can sink a company. While there is of course a great deal more to economic crisis than the bullwhip effect, the inefficiencies and failures produced by the bullwhip effect can be key causes, rippling throughout the system and producing instability in other sectors. Even with modest cases of the bullwhip effect, preventing such distortions can allow reduced inventory, reduced administration costs, and improved customer service and customer loyalty, ultimately delivering greater profits.
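To make the amplification concrete, here is a minimal simulation sketch in Python (the tiers, the four-week memory and the 1.5 safety-stock factor are all invented for illustration, not drawn from the studies cited above). Each tier treats the orders arriving from downstream as its "demand," forecasts from a short window of recent periods, and pads its own order with a buffer scaled to the swings it has seen; a wobble of roughly ±5 percent at the checkout swells into far larger swings by the time it reaches the manufacturer.

```python
import random

random.seed(42)

TIERS = ["store", "distributor", "wholesaler", "manufacturer"]
SAFETY_FACTOR = 1.5   # hypothetical: how aggressively each tier pads its orders
WINDOW = 4            # hypothetical: how many past periods each tier remembers

# history[tier] records the demand that tier faces, i.e. the orders
# arriving from the tier just downstream of it.
history = {tier: [] for tier in TIERS}

for week in range(52):
    # Customer demand wobbles around 100 units (roughly +/-5 percent).
    signal = 100 * (1 + random.uniform(-0.05, 0.05))
    for tier in TIERS:
        history[tier].append(signal)
        recent = history[tier][-WINDOW:]
        mean = sum(recent) / len(recent)
        swing = max(recent) - min(recent)
        # Each tier orders its forecast plus a safety-stock buffer scaled
        # to the volatility it observes; that padded order becomes the
        # next tier's "demand," so small swings compound up the chain.
        signal = mean + SAFETY_FACTOR * swing

for tier in TIERS:
    lo, hi = min(history[tier]), max(history[tier])
    print(f"{tier:>12}: demand seen ranged {lo:7.1f} to {hi:7.1f}")
```

Run it and the store sees demand confined to roughly 95 to 105 units, while the manufacturer, three links removed from any actual customer, faces swings several times larger: data losing its fidelity to real-world demand, exactly as described above.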

But there’s a catch—a big one for those who defend the market as the optimal mechanism for allocation of resources: the bullwhip effect is, in principle, eliminated if all orders match demand perfectly for any period. And the greater the transparency of information throughout the supply chain, the closer this result comes to being achieved. Thus, planning, and above all trust, openness and cooperation along the supply chain—rather than competition—are fundamental to continuous replenishment. This is not the “kumbaya” analysis of two utopian writers; even the most hard-hearted commerce researchers and company directors argue that a prerequisite of successful supply chain management is that all participants in the chain recognize that they all will gain more by cooperating as a trusting, information-sharing whole than they will as competitors.

The supplier, for example, is in effect telling the retailer how much the retailer will buy. The retailer has to trust the supplier with restocking decisions. Manufacturers are responsible for managing inventories in Walmart’s warehouses. Walmart and its suppliers have to agree when promotions will happen and by how much, so that increased sales are recognized as an effect of a sale or marketing effort, and not necessarily as a big boost in demand. And all supply chain participants have to implement data-sharing technologies that allow for real-time flow of sales data, distribution center withdrawals and other logistical information so that everyone in the chain can rapidly make adjustments.

*In short, one of the pitfalls of a perfectly competitive market is that information about the rest of the economy is unavailable to each link in a chain of mutually dependent firms. There’s an analogous pitfall in computer science known as “greedy algorithms.” In a greedy algorithm, one assumes that by always picking the locally optimal choice at each intermediate branch in a tree of possibilities, one will eventually reach the globally optimal value. Say, for example, I want to get home in the shortest distance possible, but I have several streets I can go down. If I always take the shortest road that I can see in front of me, without ever looking ahead to see if I might reach a dead end or a detour, I am likely to actually extend my trip home. On the other hand, if I had knowledge about the relationships and distances between all the roads, I could plan the shortest trip home even if it meant, at particular steps, choosing the longer of two roads. Like a mature adult capable of delaying gratification, I am able to make a short-term sacrifice now to yield a bigger reward later. The market, by contrast, is rather like a child that keeps failing the marshmallow test; it is a “greedy algorithm” in that it might produce locally optimal solutions (say, for a particular firm), but there is no guarantee it will produce the globally optimal solution (the solution that maximizes utility and minimizes costs for all of society). And market actors are not merely naive: lacking knowledge about the rest of the “map,” they simply have no choice but to pick the greedy method.
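The contrast is easy to demonstrate. Below is a toy version of the trip-home example in Python (the road map and distances are invented for the sketch): the greedy walker always takes the shortest street visible from the current corner and gets lured into the long detour, while a planner with knowledge of the whole map, here running Dijkstra's algorithm, accepts the longer first street and arrives home sooner.

```python
import heapq

# Invented road map: each entry lists (neighboring corner, street length).
ROADS = {
    "work":    [("alley", 1), ("main_st", 4)],
    "alley":   [("detour", 9)],   # the short street hides a long detour
    "main_st": [("home", 3)],
    "detour":  [("home", 2)],
    "home":    [],
}

def greedy_route(start, goal):
    """Locally optimal: at each corner, take the shortest visible street."""
    node, total, path = start, 0, [start]
    while node != goal:
        nxt, dist = min(ROADS[node], key=lambda edge: edge[1])
        total += dist
        node = nxt
        path.append(node)
    return total, path

def planned_route(start, goal):
    """Globally optimal: Dijkstra's algorithm over the whole map."""
    frontier = [(0, start, [start])]
    visited = set()
    while frontier:
        total, node, path = heapq.heappop(frontier)
        if node == goal:
            return total, path
        if node in visited:
            continue
        visited.add(node)
        for nxt, dist in ROADS[node]:
            heapq.heappush(frontier, (total + dist, nxt, path + [nxt]))

print("greedy: ", greedy_route("work", "home"))    # 12 blocks via the alley
print("planned:", planned_route("work", "home"))   # 7 blocks via main_st
```

The greedy walker's first choice is locally optimal (one block versus four) yet commits him to a twelve-block trip; the planner's seven-block route begins with the "worse" street. Swap corners for firms and streets for investment decisions, and this is the market's predicament in miniature.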

We hear a lot about how Walmart crushes suppliers into delivering at a particular price point, as the company is so vast that it is worth it from the supplier’s perspective to have the product stocked by the store. And this is true: Walmart engages in what it calls “strategic sourcing” to identify who can supply the behemoth at the volume and price needed. But once a supplier is in the club, there are significant advantages. (Or perhaps “in the club” is the wrong phrasing; “once a supplier is assimilated by the WalmartBorg” might be better.) One is that the company sets in place long-term, high-volume strategic partnerships with most suppliers. The resulting data transparency and cross–supply chain planning decrease expenditures on merchandising, inventory, logistics, and transportation for all participants in the supply chain, not just for Walmart. While there are indeed financial transactions within the supply chain, resource allocation among Walmart’s vast network of global suppliers, warehouses and retail stores is regularly described by business analysts as more akin to behaving like a single firm.

Flipping all this around, Hau Lee, a Stanford engineering and management science professor, describes how the reverse can happen within a single firm, to deleterious effect. Volvo at one point was stuck with a glut of green cars. So the marketing department came up with an advertising and sales wheeze that was successful in provoking more purchases by consumers and reducing the inventory surplus. But they never told manufacturing, and so seeing the boost in sales, manufacturing thought there had been an increase in demand for green cars and cranked up production of the very thing that sales had been trying to offload.

The same phenomenon occurs in retail as much as it does in manufacturing (and manufacturing is merely another link within the retail supply chain anyway), with Toyota being one of the first firms to implement intra- and inter-firm information visibility through its Walmart-like “Kanban” system, although the origin of this strategy dates as far back as the 1940s. While Walmart was pivotal in development of supply chain management, there are few large companies that have not copied its practices via some form of cross–supply chain visibility and planning, extending the planning that happens within a firm very widely throughout the capitalist “marketplace.”

Nevertheless, Walmart may just be the most dedicated follower of this “firmification” of supply chains. In the 1980s, the company began dealing directly with manufacturers to reduce the number of links within, and to more efficiently oversee, the supply chain. In 1995, Walmart further ramped up its cooperative supply chain approach under the moniker Collaborative Planning, Forecasting and Replenishment (CPFR), in which all nodes in the chain collaboratively synchronize their forecasts and activities. As technology has advanced, the company has used CPFR to further enhance supply chain cooperation, from being the first to implement company-wide use of universal product bar codes to its more troubled relationship with radio-frequency ID tagging. Its gargantuan, satellite-connected Retail Link database connects demand forecasts with suppliers and distributes real-time sales data from cash registers all along the supply chain. Analysts describe how stockage and manufacture are “pulled,” almost moment-to-moment, by the consumer, rather than “pushed” by the company onto shelves. All of this hints at how economic planning on a massive scale is being realized in practice with the assistance of technological advance, even as the wrangling of its infinities of data—according to Mises and his co-thinkers in the calculation debate—is supposed to be impossible to overcome.

Sears’s Randian Dystopia

It is no small irony that one of Walmart’s main competitors, the venerable, 120-plus-year-old Sears, Roebuck & Company, destroyed itself by embracing the exact opposite of Walmart’s galloping socialization of production and distribution: by instituting an internal market.

The Sears Holdings Corporation reported losses of some $2 billion in 2016, and some $10.4 billion in total since 2011, the last year that the business turned a profit. In the spring of 2017, it was in the midst of closing another 150 stores, in addition to the 2,125 already shuttered since 2010—more than half its operation—and had publicly acknowledged “substantial doubt” that it would be able to keep any of its doors open for much longer. The stores that remain open, often behind boarded-up windows, have the doleful air of late-Soviet retail desolation: leaking ceilings, inoperative escalators, acres of empty shelves, and aisles shambolically strewn with abandoned cardboard boxes half-filled with merchandise. A solitary brand new size-9 black sneaker lies lonesome and boxless on the ground, its partner neither on a shelf nor in a storeroom. Such employees as remain have taken to hanging bedsheets as screens to hide derelict sections from customers.

The company has certainly suffered in the way that many other brick-and-mortar outlets have in the face of the challenge from discounters such as Walmart and from online retailers like Amazon. But the consensus among the business press and dozens of very bitter former executives is that the overriding cause of Sears’s malaise is the disastrous decision by the company’s chairman and CEO, Edward Lampert, to disaggregate the company’s different divisions into competing units: to create an internal market.

From a capitalist perspective, the move appears to make sense. As business leaders never tire of telling us, the free market is the fount of all wealth in modern society. Competition between private companies is the primary driver of innovation, productivity and growth. Greed is good, per Gordon Gekko’s oft-quoted imperative from Wall Street. So one can be excused for wondering why it is, if the market is indeed as powerfully efficient and productive as they say, that all companies did not long ago adopt the market as an internal model.

Lampert, libertarian and fan of the laissez-faire egotism of Russian American novelist Ayn Rand, had made his way from working in warehouses as a teenager, via a spell with Goldman Sachs, to managing a $15 billion hedge fund by the age of 41. The wunderkind was hailed as the Steve Jobs of the investment world. In 2003, the fund he managed, ESL Investments, took over the bankrupt discount retail chain Kmart (launched the same year as Walmart). A year later, he parlayed this into a $12 billion buyout of a stagnating (but by no means troubled) Sears.

At first, the familiar strategy of merciless, life-destroying post-acquisition cost cutting and layoffs did manage to turn around the fortunes of the merged Kmart-Sears, now operating as Sears Holdings. But Lampert’s big wheeze went well beyond the usual corporate raider tales of asset stripping, consolidation and chopping-block use of operations as a vehicle to generate cash for investments elsewhere. Lampert intended to use Sears as a grand free market experiment to show that the invisible hand would outperform the central planning typical of any firm.

He radically restructured operations, splitting the company into thirty, and later forty, different units that were to compete against each other. Instead of cooperating, as in a normal firm, divisions such as apparel, tools, appliances, human resources, IT and branding were now in essence to operate as autonomous businesses, each with their own president, board of directors, chief marketing officer and statement of profit or loss. An eye-popping 2013 series of interviews by Bloomberg Businessweek investigative journalist Mina Kimes with some forty former executives described Lampert’s Randian calculus: “If the company’s leaders were told to act selfishly, he argued, they would run their divisions in a rational manner, boosting overall performance.”

He also believed that the new structure, called Sears Holdings Organization, Actions, and Responsibilities, or SOAR, would improve the quality of internal data, and that in so doing it would give the company an edge akin to statistician Paul DePodesta’s use of unconventional metrics at the Oakland Athletics baseball team (made famous by the book, and later film starring Brad Pitt, Moneyball). Lampert would go on to place DePodesta on Sears’s board of directors and hire Steven Levitt, coauthor of the pop neoliberal economics bestseller Freakonomics, as a consultant. Lampert was a laissez-faire true believer. He never seems to have got the memo that the story about the omnipotence of the free market was only ever supposed to be a tale told to frighten young children, and not to be taken seriously by any corporate executive.

And so if the apparel division wanted to use the services of IT or human resources, they had to sign contracts with them, or alternately to use outside contractors if it would improve the financial performance of the unit— regardless of whether it would improve the performance of the company as a whole. Kimes tells the story of how Sears’s widely trusted appliance brand, Kenmore, was divided between the appliance division and the branding division. The former had to pay fees to the latter for any transaction. But selling non-Sears-branded appliances was more profitable to the appliances division, so they began to offer more prominent in-store placement to rivals of Kenmore products, undermining overall profitability. Its in-house tool brand, Craftsman—so ubiquitous an American trademark that it plays a pivotal role in a Neal Stephenson science fiction bestseller, Seveneves, 5,000 years in the future—refused to pay extra royalties to the in-house battery brand DieHard, so they went with an external provider, again indifferent to what this meant for the company’s bottom line as a whole.

Executives would attach screen protectors to their laptops at meetings to prevent their colleagues from finding out what they were up to. Units would scrap over floor and shelf space for their products. Screaming matches between the chief marketing officers of the different divisions were common at meetings intended to agree on the content of the crucial weekly circular advertising specials. They would fight over key positioning, aiming to optimize their own unit’s profits, even at another unit’s expense, sometimes with grimly hilarious results. Kimes describes screwdrivers being advertised next to lingerie, and how the sporting goods division succeeded in getting the Doodle Bug mini-bike for young boys placed on the cover of the Mothers’ Day edition of the circular. As for different divisions swallowing lower profits, or losses, on discounted goods in order to attract customers for other items, forget about it. One executive quoted in the Bloomberg investigation described the situation as “dysfunctionality at the highest level.”

As profits collapsed, the divisions grew increasingly vicious toward each other, scrapping over what cash reserves remained. Squeezing profits still further was the duplication in labor, particularly with an increasingly top-heavy repetition of executive function by the now-competing units, which no longer had an interest in sharing costs for shared operations. With no company-wide interest in maintaining store infrastructure, something instead viewed as an externally imposed cost by each division, Sears’s capital expenditure dwindled to less than 1 percent of revenue, a proportion much lower than that of most other retailers.

Ultimately, the different units decided to simply take care of their own profits, the company as a whole be damned. One former executive, Shaunak Dave, described a culture of “warring tribes,” and an elimination of cooperation and collaboration. One business press wag described Lampert’s regime as “running Sears like the Coliseum.” Kimes, for her part, wrote that if there were any book to which the model conformed, it was less Atlas Shrugged than it was The Hunger Games.

Thus, many who have abandoned ship describe the harebrained free market shenanigans of the man they call “Crazy Eddie” as a failed experiment for one reason above all else: the model kills cooperation.

“Organizations need a holistic strategy,” according to the former head of the DieHard battery unit, Erik Rosenstrauch. Indeed they do. And, after all, what is society if not one big organization? Is this lesson any less true for the global economy than it is for Sears? To take just one example: the continued combustion of coal, oil and gas may be a disaster for our species as a whole, but so long as it remains profitable for some of Eddie’s “divisions,” those responsible for extracting and processing fossil fuels, these will continue to act in a way that serves their particular interests, the rest of the company—or in this case the rest of society—be damned.

In the face of all this evidence, Lampert is, however, unrepentant, proclaiming, “Decentralised systems and structures work better than centralised ones because they produce better information over time.” For him, the battles between divisions within Sears can only be a good thing. According to spokesman Steve Braithwaite, “Clashes for resources are a product of competition and advocacy, things that were sorely lacking before and are lacking in socialist economies.”

He and those who are sticking with the plan seem to believe that the conventional model of the firm via planning amounts to communism. They might not be entirely wrong.

Interestingly, the creation of SOAR was not the first time the company had played around with an internal market. Under an earlier leadership, the company had for a short time experimented along similar lines in the 1990s, but it quickly abandoned the disastrous approach after it produced only infighting and consumer confusion. There are a handful of other companies that also favor some version of an internal market, but in general, according to former vice president of Sears, Gary Schettino, it “isn’t a management strategy that’s employed in a lot of places.” Thus, the most ardent advocates of the free market—the captains of industry—prefer not to employ market-based allocation within their own organizations.

Just why this is so is a paradox that conservative economics has attempted to account for since the 1930s—an explanation that its adherents feel is watertight. But as we shall see in the next chapter, taken to its logical conclusion, their explanation of this phenomenon that lies at the very heart of capitalism once again provides an argument for planning the whole of the economy.

THE ECONOMICS OF INFORMATION

In the Depression year of 1931, Ronald Coase, a twenty-year-old British economics student, arrived in Chicago to pursue an unusual research project. His question was deceptively simple, although the economics he had been taught apparently didn’t have an answer: “Why are there these ‘islands of conscious power’?…If production is regulated by price movements [and] production could be carried on without any organization at all, well might we ask, why is there any organization?” In other words, if the market is the magic bullet to all human interaction, then even the simplest work tasks—from “stock this shelf” to “format this spreadsheet”—could theoretically be governed by prices on markets rather than by managers giving orders. Somewhat naively, Coase asked, why isn’t everything bought and sold on its own little market? Why are there so many times more Walmarts than there are Sears? Why do companies—from mom-and-pop shops to corporate behemoths—even exist?

Coase argued that markets introduce a whole web of what he called “transaction costs.” Writing a contract, setting up a market or finding the best price all take up resources and time. So long as the cost of doing all this was cheaper in house than on the market (and it was), it was only rational to keep it in house. So the “free” market isn’t really free either! Coase argued that it only makes sense that some decisions would be left to planning—a decision is made, and it is done. Planning is more efficient—though for Coase, only up to a certain point. Having completed his tour of American business and witnessed its inner workings, upon his return to Britain, he compiled his thoughts in a 1932 lecture to University of Dundee students little younger than himself, although it would be another five years before he published his results.

The resulting text, “The Nature of the Firm”, features a quote from economist Dennis Robertson in which he talks of the curiosity of the very existence of companies, unflatteringly describing them as “islands of conscious power in this ocean of unconscious cooperation, like lumps of butter coagulating in a pail of buttermilk.” But where Robertson had merely remarked upon the mystery, Coase explained it: “Those who object to economic planning on the grounds that the problem is solved by price movements can be answered by pointing out that there is planning within our economic system [that] is akin to what is normally called economic planning.”

His insight was long ignored. To this day, while hats are now tipped to Coase, and even though planning is plainly ubiquitous, taking place at heretofore unimagined scales, most economists talk very little about it. Economics textbooks offer in-depth explanations of consumer goods markets, the labor market, the money market or even the entire economy as one big market, but little to nothing about the planning inside firms. At best, economists will briefly mention planning, and then only to ridicule it. In much of mainstream economics, the firm is just a mathematical equation that consumes inputs and produces outputs. How it does this is rarely asked; its internal workings are insufficiently interesting. Or sufficiently embarrassing.

The vision of an orderly but completely unplanned market economy is nothing but fantasy. Planning exists in the market system and on a truly enormous scale. Today, the volume of transactions carried out within firms is as large as that carried out between them. Managers have always been very concerned with planning, but it is only by diving deeply into practical management texts that we can learn about its extent under capitalism. Economists have hidden it behind a tangled web of seeming disorder.

Planning the Market

At the same time that Coase was traveling about, asking corporate managers why they didn’t have markets for moving products from shelves at one end of a warehouse to ones at the other, economists elsewhere were still busy arguing whether it was necessary to have markets at all. As noted earlier, Ludwig von Mises argued in 1920 that socialist planning of an entire economy was impossible because complex economies of the kind we now have need both markets and prices. In his view, markets decentralize the vast troves of information that a single planner couldn’t compile and calculate. Prices, however, make it possible to compare vastly different things; without them, he reasoned, how would planners know the relative worth of things as disparate as a car factory and a ballpoint pen and ultimately decide how many of each there should be? The counterargument that best answered these questions, at least for a while, finally came in 1937 from Polish economist Oskar Lange.

Lange believed that the neoclassical economists’ models of the capitalist economy could be commandeered and repurposed for socialist planning. Under capitalism, when H&M makes too many skinny, off-purple corduroy trousers, its stores eventually drive down the price to entice people to buy them. Demand meets supply when the price falls—at least that’s what happens in theory. In reality, the extra pants can end up in landfills, and H&M’s production for next season can end up moving somewhere with lower wages to make ever lower prices possible. Using the equations of Léon Walras, one of the founders of the neoclassical school, Lange wrote a pamphlet in 1937 that imagined a planned economy, which imitated the market without these downsides. Lange’s fictional socialist planners would manipulate “shadow prices” on paper, rather than waiting for real prices to filter down from cash registers to production decisions. Like a UV light at a crime scene, socialist planning would make explicit all the math that only happened in the background in models of capitalism. Lange answered Mises’s challenge—that prices and markets were necessary to any economic rationality—by incorporating them into a model of “market socialism.”

The key was devising how planners would figure out which shadow prices are the right ones—those that ensure the socialist economy is making enough, but not too much, of everything. For this, Lange repurposed another idea from Walras: tatonnement. In French, Walras’s native tongue, the word means “groping toward.” Walras imagined that markets groped toward the right prices until they found the holy grail of economics: general equilibrium, where all markets are in balance and the amount supplied of every single good or service is exactly equal to the amount demanded. Add some more math, and mainstream economists will tell you that they’ve proven that everyone is also as happy as can be, living in the best of all possible worlds.

Lange imagined that these planners would not merely be simulating the market; in theory, they could actually perform this tatonnement better than markets! People under Lange’s “market socialism” would still go to (government-run) stores to buy consumer goods, signaling to planners what they wanted produced. Producers—all also publicly owned—would aim to produce what the planners translated from consumer demands as efficiently as possible, without needing to leave room for profit after covering costs. As the economy produced things and consumers bought them, central planners would run equations, figure out what there was too much of and what there was too little of, and adjust the “shadow prices” until everything was in sync. Even without all the correct information available at once, Lange expected his planners to grope toward equilibrium like markets did under capitalism, only better and faster. And it would only be a matter of time before computers came along that were powerful enough to make the process faster still. Lange spent his final years fascinated by computer science and cybernetics. In one of his last papers, he wrote: “The market process with its cumbersome tatonnements appears old-fashioned. Indeed, it may be considered as a computing device of the pre-electronic age.”
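
*To make the procedure concrete, here is a minimal sketch of tatonnement in Python. The linear demand and supply curves, the starting price, and the adjustment step are all invented stand-ins, not anything from Lange’s pamphlet; the point is only the feedback loop that nudges a shadow price in proportion to excess demand.

```python
# A toy tatonnement loop: the planner adjusts a "shadow price" in
# proportion to excess demand until the good roughly clears.
# Demand and supply curves are invented placeholders.

def demand(price):            # demand falls as the price rises
    return max(0.0, 100.0 - 2.0 * price)

def supply(price):            # supply rises as the price rises
    return 3.0 * price

def tatonnement(price=10.0, step=0.05, tolerance=0.01, max_rounds=10_000):
    for rounds in range(max_rounds):
        excess = demand(price) - supply(price)
        if abs(excess) < tolerance:          # the plan "clears"
            return price, rounds
        price += step * excess               # raise on shortage, cut on glut
    return price, max_rounds

price, rounds = tatonnement()
print(f"clearing price ~{price:.2f} after {rounds} adjustments")
```

A real plan would adjust thousands of interdependent prices at once, which is precisely the computational burden Lange hoped electronic computers would one day shoulder.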

By the time the Second World War began, many mainstream economists begrudgingly admitted that Lange’s arguments worked—at least in theory. If the socialist system of planning Lange and others described was theoretically possible, then the only question that remained was whether it was feasible. Although corporate and military planners, averse to socialism but intrigued by the power of even the simplest mathematical calculation for resource management and control, were beginning to use crude versions of formalized planning tools, it was difficult to imagine when—if ever—the computing power required for planners to solve Lange’s equations in reasonable time on an economy-wide scale would be available. With seemingly dim prospects for viable application, there was no reason to trumpet the fact that the socialists might be right.

Hayek’s Riposte

Such defeatism alarmed another Austrian economist, Friedrich von Hayek, who, following in the footsteps of Mises, was determined to prove Lange wrong. Hayek is better known today as the godfather of neoliberalism, the pro-market ideology that has come to dominate government policy around much of the world, the first incarnation of which is best exemplified by the administrations of Margaret Thatcher in the UK and Ronald Reagan in the United States during the 1980s. Hayek was explicit about wanting ideological regime change. The postwar welfare state truce between capital and labor had barely been installed when Hayek joined a small group of right-wing radicals to found the Mont Pelerin Society in 1947—a free market think tank before its time. It was integral to their task of reshaping ideology that they have at the ready a rebuke to Lange.

For someone who believed so fervently in capitalism, Hayek offered a very honest picture of the system. Maybe it was precisely because he was so ideologically committed to capitalism that he could talk about its shortcomings—all the ways it deviated from the fantasies of the neoclassical economists with their perfect humans, perfect markets and perfect information. Hayek questioned these central assumptions. People are not hyperrational—we have incomplete, imperfect ideas about the world. Markets are never quite in sync: there is always too much or too little of something. Capitalism is dynamic, a process of constant change rather than a state of equilibrium. On this last point, Hayek agreed with Marx. But as we’ll see, it would take a few decades for the mainstream of economics to embrace such notions.

Hayek was right in rejecting mainstream fantasies. In fact, it was Lange who had underestimated the problems he inherited from the economics of his time — he had merely attempted to replace “capitalist” variables in the equations of dominant neoclassical economics with “socialist” ones. In doing so, he took on all the flawed assumptions of the mainstream model.

Without this baggage, Hayek took a different tack from the silent but grudging acceptance of the mainstream. He rejected Lange’s case outright. Hayek argued that markets—incomplete, permanently off tilt, full of fallible humans—do not just aggregate and calculate information. Markets are producers of information and knowledge. Even if Lange’s market socialism allowed planners to calculate better and faster than did free markets, planning would ultimately still be impossible because planners would not have the information created by market interactions to use in their calculations. Buying and selling may not generate technical and scientific knowledge, but it still creates all that knowledge of “time and place” that is instrumental to making efficient production and distribution decisions. Hayek argued that the problem for planners was not in the “how”—the equations to use—but in the “what”—the data that goes into the equations. The copious information planners need is unavailable before markets work their magic. Decentralization creates coordination: only the market can bring together the information that is normally isolated in the heads of different individuals.

Hayek, however, was writing before the advent of “big data,” which is testing the limits of just how much granular information can be collected. It seems that he also wrote in blissful ignorance of Coase, who had shown just how flimsy the veneer of decentralized decision making really is, even under capitalist markets.

Oddly enough, despite challenging the market socialists head on, Hayek’s ideas were initially ignored, perhaps because they were critical not only of left-wing, but also mainstream economic opinion. At a time when even Richard Nixon was pronouncing that “we are all Keynesians now,” how could the maximalist rhetoric of Hayek and his allies be anything but out of step? The debate on the calculation problem continued to unfold in the pages of obscure economic journals. The world, however, had moved on.

But shortly after Nixon’s startling declaration of allegiance, the existing economic orthodoxies on both sides of the Berlin Wall were violently thrown into question. By the 1970s, “really existing socialism” was mired in economic crisis, its cracks beginning to show. The “free world” was troubled, too, experiencing its most severe economic crisis of the postwar period. Political and economic elites saw in the crisis an opening to unwind their postwar compromise with labor, a compact borne not of love, but out of their fear of revolution. It was in this context that the new heterodoxy championed by Hayek became efficacious outside the walls of the academy at last.

We’ve All Been Misinformed

Joseph Stiglitz, another winner of the Swedish National Bank’s Prize in Economic Sciences in Memory of Alfred Nobel, first made his name by furthering the critique of the assumption of human rationality while still making a case for markets. Distinct from the earlier mythology of a perfectly rational Homo economicus—nowhere to be found in reality, but for so long beloved by economists—the economics of information that Stiglitz helped launch started from the seemingly obvious idea that getting our hands on, and using, information is usually costly, and sometimes impossible. An example economists love to use is the market for private health insurance. There is only so much an insurer can do to see if a person buying insurance is relatively healthy. Developing a better and better picture costs more and more. At some point, the costs prevent further information acquisition from making sense. In the same way, hiring a mechanic to take apart and inspect the engine of a used car to find out if it is a “lemon” can cost more than the car itself. Markets can fail: some people will end up overpaying for health insurance, while others will be uninsured. Your local sketchy used car dealership isn’t likely to be the first place you’d think of as a well-functioning market.

Beyond individual markets, Stiglitz and others were asking a bigger question: What if the entire economy was something of a used car dealership? Once enough examples of failing markets accumulate, the entire system’s efficiency and justice can be called into doubt. In short, the economics of information ultimately challenges the argument that capitalism, despite its flaws, is the best of all possible worlds. However, rather than seeing information problems as a reason to explore collective, democratic decision-making alternatives that could bring people and information together, economists went to work making market theory work in spite of humanity’s imperfections. Since the ’70s, the economics of information has generated ever more crafty schemes for incentivizing people or organizations to do things—all, of course, within the bounds of capitalist markets.

Mechanism design is one such scheme. In this obscure corner of economics, economists drum up elegant, but often mathematically complex, means to compel people or companies to reveal information that they would otherwise keep secret. A new auction format created by economists in the early 1990s to help the US government sell off cell phone frequencies to telecommunications firms is an exemplary case. The auction had rules designed to force companies to reveal how much the rights to frequencies were really worth to them—lying would see them lose the rights to competitors. The design netted the government hundreds of millions of dollars more than expected and is now commonplace around the world.
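
*The FCC’s actual format was a more elaborate simultaneous ascending auction; as a simpler stand-in, the sealed-bid second-price (“Vickrey”) auction sketched below, with invented bidders and figures, shows the core trick of mechanism design in miniature: rules under which honestly revealing your valuation is the safest strategy, because the winner pays the runner-up’s bid rather than their own.

```python
# Sealed-bid second-price (Vickrey) auction: the highest bidder wins but
# pays the second-highest bid. Overbidding risks overpaying; underbidding
# risks losing a profitable win; so bidding your true valuation dominates.
# Bidder names and figures are invented for illustration.

def second_price_auction(bids):
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    price = ranked[1][1]        # winner pays the runner-up's bid
    return winner, price

bids = {"TelcoA": 120_000_000, "TelcoB": 95_000_000, "TelcoC": 80_000_000}
winner, price = second_price_auction(bids)
print(f"{winner} wins the license and pays ${price:,}")
```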

Mechanism design is a kind of planning, although a very indirect one. Economic decision making of any kind—whether outright planning or a “designed” market—needs to gather the bits and pieces of information spread between people. But information problems don’t preclude other ways of doing things. Rather than creating a complex process that ultimately benefits a few big players, governments today could choose to run a public cell phone utility, which would constitute one more step on the way to greater socialization. As things stand, however, governments make some money on the auction, but give up control over a valuable resource. This also leaves behind a market dominated by a few big players who can charge famously high prices backed by shoddy customer service.

Mechanism design is just one more example showing that the free market also has to be planned. Real-world markets must be consciously made and remade.

Speaking of Making People Do Things…

One of the few economists before Coase to look inside the black box of business was, as it happens, Karl Marx. Marx saw the firm as an instrument for extracting profit off the backs of workers. He alighted upon a simple fact: workers are paid a wage for their time, not for what they produce. *Unlike a machine, living labor-power has a special property: it produces more than it costs to keep it going. That cost, the price of labor-power, is what the worker is paid as a wage; that which they produce belongs to the capitalist; and the difference between the two (once material costs are accounted for) is the surplus value. Because, at any given time, the cost of labor-power, one’s wage, is a fixed value, the boss can extract ever more surplus value out of the worker by making them work longer hours or by increasing the intensity of their labor. Hence the longer and harder the proletarian works, the less they work for themself — to produce the value with which they are compensated — and the more they enrich those who hold power over them.
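
*A worked illustration with invented figures: suppose a worker is paid a daily wage of $120 while their labor adds $400 of new value over and above material costs. Then:

```latex
% V: new value added by a day's labor, net of material costs (invented: \$400)
% w: the daily wage, i.e., the price of labor-power (invented: \$120)
s = V - w = \$400 - \$120 = \$280,
\qquad
\text{rate of surplus value} = \frac{s}{w} = \frac{280}{120} \approx 233\%
```

Lengthening the working day or intensifying the labor raises V while w stays fixed, which is exactly the mechanism described above.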

Coase thought that firms planned simply to save costs. For Marx, planning within the firm is a consequence of the class struggle. This struggle — between the producers and the owners — determines how everything we produce is divided up between us. The manager’s exercise of central planning over his small province of tyranny is therefore not simply a better means to an end, as Coase thought, but a reflection of the essential antagonism between the classes in society. The more the capitalist gets what he wants, the less the worker does; the dictatorial disciplining of labor is not merely more efficient — though it certainly is that — but an essential method of power and control. The GPS device in the UPS driver’s truck, the call center badge that monitors washroom breaks or the white-collar worker’s app that tracks web browsing history are the “sticks” requiring that one do as one is told, keeping workers producing even when they know it does not benefit them to do so.

Yet for mainstream economists, the confrontation between workers and managers only comes up in the context of “shirking.” Shirking, however, is a very rational response for someone who has little or no say over their work, has no deeper sense of collective responsibility, and knows that the profit from what they do ends up in someone else’s pocket. Shirking is not an innate tendency toward laziness, as the capitalists condescendingly claim, but rather a symptom of the worker’s alienation under capitalism.

In response to any mention of durable human cooperation that is not mediated by markets, in particular by the undisguised incentives provided by the labor market—at their most basic, work or starve—defenders of the market system often bring up “the tragedy of the commons.” This phrase, coined by ecologist Garrett Hardin in a 1968 article in the journal Science, refers to a shared resource inevitably depleted through overuse by individuals acting in their self-interest. The prototypical commons employed to illustrate this tragedy is a plot of open, shared pastureland in a village. If farmers only look out for the cows that are theirs, rather than the entire pasture, each will allow their cows to overgraze, and the land shared in common will quickly turn to dust.

Over the course of her long career, Elinor Ostrom, the only woman to win the “not really a Nobel” prize in economics in its fifty-year existence, did much to debunk this crude story. She compiled evidence of groups stewarding common resources and found that in many cases, the commons not only survived but thrived. Rather than being overrun by unthinking self-interest, shared resources were in reality often governed by complex sets of social rules established over time. Ostrom studied actual shared pasture land in Swiss alpine villages and found it had been preserved for common use for over 500 years. Based on this and other case studies, Ostrom went on to identify conditions that helped protect common resources—among them, participation in decision making by users of the resources, the capacity for monitoring usage, meaningful social sanctions and conflict-resolution mechanisms.

Get the Machine before It Gets You

Despite all the enthusiasm about markets and choice, planning remains the modus operandi of business. What has changed is that the advent of the information technology age has permitted the capture of vast stores of information. What do Facebook and Google do? They prod us, gently and with our own collusion, to reveal information about ourselves. Their business model is the economics of information, come to life. For now, they use the accumulated data to sell ad space—who knew the epitome of high technology would be getting the right people to see ads for novelty “I have a Polish husband and I know how to use him” T-shirts?—but the possibilities are much broader.

Uber and other media darlings of the “sharing economy” combine sussing out information with finding new ways to lower transaction costs. Good capitalists that they are, they’re doing it at the expense of workers and democracy (and other capitalists, namely the venture financiers who continue to pump money into a business like Uber even though it has so far failed to turn a profit). Uber’s rapid expansion stems in large part from its army of well-paid lobbyists, who in turn cajole and threaten city governments behind closed doors into cutting regulations around taxi monopolies.

Uber’s drivers, on the other hand, are poorly paid “contractors.” No longer classified as workers (except in the UK, where courts reinstated their rights as workers), they can make below minimum wage and have few labor rights. As with more and more workers in a range of sectors, they are under constant, nigh on panoptical, surveillance via data. Uber uses a five-star driver rating system in which drivers must maintain an average rating of 4.6 stars to keep driving for the company. Uber can “suggest” certain norms for its drivers to follow (how much to smile, what kind of extra services to offer, and so on), but in reality it is the risk of even one bad rating that quickly prods them to fall into line. Yet there is no top-down rule; when businesses can constantly collect and analyze information, strict management happens from the bottom up. Uber’s business model is to use the economics of information to do more than just sell ad space. The company’s ability to make people do things without telling them explicitly is not unique and is but a refinement of capitalism’s ability to make people complicit in their own unfreedom—a refinement made possible by a greater amount of and greater control over information.
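
*Some back-of-the-envelope arithmetic, ours rather than Uber’s, shows how unforgiving such a floor is. Assume for simplicity that every ride is rated either 5 stars or 1 star:

```python
# With only 5-star and 1-star ratings, a fraction f of 5-star rides gives
# an average of 5f + 1(1 - f) = 4f + 1. Staying above a 4.6 floor thus
# requires f >= 0.9: barely one unhappy rider in ten can end a driving
# career. The all-or-nothing rating split is an invented simplification.

def min_five_star_share(floor=4.6, good=5.0, bad=1.0):
    return (floor - bad) / (good - bad)

print(f"required share of 5-star rides: {min_five_star_share():.0%}")  # 90%
```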

On the other hand, rather than the herald of dystopian workplaces everywhere, Uber is also a natural candidate for a worker co-op. All that Uber provides, after all, is an app; the company is nothing but a middleman. A cooperatively owned network of drivers using a similar app could set pay rates and work rules democratically, in the here and now. A drivers’ co-op would be far superior to the venture capital–fueled behemoth we have today. Even so, such an enterprise, while introducing more workplace democracy than is normally possible under capitalism, would remain subject to the same profit-seeking imperatives as any firm within capitalism—imperatives that prompt self-exploitation in order to compete with other enterprises, ultimately undermining those very democratic impulses.

Similarly, social networks could be run as public utilities rather than as private monopolies—remember that we created public electricity and water works after the failures of nineteenth-century robber baron capitalism. One of the big questions of the twenty-first century will be, who owns and controls the data that is quickly becoming a key economic resource? Will it be the fuel for democratic planning, or instead for a new, more authoritarian capitalism? These questions require that we recognize the immense challenges posed by data-driven twenty-first-century capitalism: How could we nationalize multinational corporations that span and disregard national borders, and often play jurisdictions off of one another? How would we ensure privacy with so much data under collective, state control?

Privately held data is making possible more efficient production, but at the same time it is enabling closer supervision, and modern corporate planning is only starting to take advantage of all this newly available information. One outcome is illusory freedom for workers. If we constantly produce information both at and outside of work, we don’t need to be supervised so directly—but the boss is still watching, and doing so more closely than has ever before been possible. Data and metrics speak for themselves: managers can see how many parts a worker assembled per minute or how many packages a driver delivered per hour.

Increasing self-management at work—ostensibly without managers, but still closely surveilled—is a symptom of bigger changes. As wages, both in the United States and across much of the global North, have grown at a glacial pace or outright stagnated since the 1970s, workers have taken on more personal debt just to keep up. At the same time, governments have cut public benefits, leaving workers more vulnerable when they are laid off or injured. Even Alan Greenspan, the former head of the US Federal Reserve, called today’s workers “traumatized.” Translated, this means that pressures to fall into line now exist outside the explicit top-down hierarchies.

Capitalism is stuck with planning even though it regularly transmogrifies its techniques of planning. Today, capitalist planning exists both in the old, hierarchical sense that Coase studied as well as in new, more roundabout ways that take cues from the economics of information.

Opening the Gates to the Future

There’s an old quip among historians of economics that a PhD-level microeconomics textbook from the 1960s could be mistaken for a textbook at the department of planning at a university in Havana. In the microeconomics textbook, the free market generates the prices that dictate how much of everything is produced and how things are distributed; in the planning textbook, a planner solves the same equations by coming up with the equivalent proportions of production and distribution. Oskar Lange’s version of socialism and the economic orthodoxy of the twentieth century shared the same flawed assumptions. Over time, as outlined in this chapter, many poked holes in these assumptions: Markets are costly, said Coase. Human beings are not infinitely powerful, all-knowing calculators, argued Stiglitz. Even Hayek was right: capitalism is dynamic, not static, and rarely in the sort of equilibrium imagined by Lange and conventional economics.

But the economics of information also challenges Hayek’s counterargument to Lange, that the market is the only means we have to produce all the information that planning would require in the first place. For markets sometimes fail to discover the right information, and that which they do reveal can be false. Also, the enormous amount of economic activity that continues to take place outside the market—within the black boxes we call Walmart or Amazon or General Motors—is evidence against Hayek. At the same time, the rise of information technology shows just how much information it is possible to have at our fingertips. Hayek describes prices as “a system of telecommunications”; today, we have telecommunication systems far more precise and powerful, able to communicate information directly without it being mediated by prices. Hayek’s arguments may have worked to disarm some of Lange’s vision for planning, but they shouldn’t stop contemporary socialists from arguing for democratic planning that is also a process of discovery. Planning is not only possible, but is already all around us, albeit in hierarchical and undemocratic forms. What a very different, democratic planning will look like is a question every contemporary revolutionary should take seriously.

MAPPING THE AMAZON

Amazon is on its way to developing psychic powers. Or at least, such was the fantasy that one could be forgiven for believing, based on the hosanna-filled, adrenalized newspaper column inches that appeared in the summer of 2014 when the online bookseller-turned-“everything store” filed a patent application for a new process it called “anticipatory shipping.” Amazon would soon know what you wanted to buy before you knew it yourself. When you placed an order for the latest John Green young adult novel for non–young adults, another jar of artisanally brined lupini beans, or that Instant Pot wonder–pressure cooker that produces pulled pork faster than the speed of light, the package would already be on its way.

As those journalists less prone to the confection of hyperbolic clickbait pointed out at the time, what this patent describes is in truth a very small step from what Amazon already does. It is a minor extension of the kind of data the company already collects and of the colossal, tentacular logistics operation it already runs. Amazon, building its retail market position on the back of the internet revolution, is the largest technology company using the fruits of modern IT to distribute consumer goods. In short, Amazon is a master planner. It is these sorts of logistical and algorithmic innovations that give the lie to the hoary free market argument that even if planning can deliver the big stuff like steel foundries and railways and healthcare, it would stumble at the first hurdle of planning for consumer items. A fortiori, Amazon offers techniques of production and distribution that are just waiting to be seized and repurposed.

What Amazon Plans

Since its late-’90s dot-com beginnings selling only books, Amazon has expanded to potentially fulfill a large part of a household’s everyday consumption. Echoing Walmart’s vertical integration, the company has even started to incorporate producers of the things it sells into its distribution network by placing its own workers at the factories and warehouses of some of its key suppliers. Under what the company calls its “Vendor Flex” program, the number of Band-Aids that Johnson & Johnson produces, for example, can depend in part on Amazon’s need. It gives the retail behemoth a role in managing production that extends beyond its own corporate borders.

Beyond simply distributing products, Amazon is, like Walmart, “pulling” demand. In fact, in its early days, Amazon headhunted so many top-level managers from Walmart for their logistics savvy that the Waltons sued. The untold billions of gigabytes of customer data that Amazon collects and the algorithmic marvels it uses to parse this data give it an incredibly detailed picture of what people want to buy, and when. Meanwhile, integrating operations with producers ensures that products can be ready in sufficient quantities. Here too, given the sheer scale of this economy, we see the fits and starts of a more integrated model of production and distribution planning, however hierarchical and servile toward its bosses it may be. We might describe Jeff Bezos as the bald, moustache-less Stalin of online retail (editor note: no we may not!).

Yet at heart, Amazon remains (for now) a giant distribution network for consumer goods. The internet age has enabled the rise of a new type of retail model for moving goods from producers to consumers, and Amazon took advantage of this opening better than any of its rivals did. Amazon now controls nearly half of total online retail in the United States. So when Amazon plans, it plans big. Some of Amazon’s planning problems are the same as those faced by other major distribution networks; other problems are entirely novel. In essence though, Amazon’s story is another tale of getting the logistics right—in other words, getting things from point A to point B as cheaply as possible. While this task sounds simple enough, it demands plans for everything from warehouse siting and product organization to minimizing the costs of delivering customers’ packages and shortening delivery routes. Wired magazine describes the company as “a vast, networked, intelligent engine for sating consumer desire.”

Add to this the fact that Amazon, as with every internet company, collects improbable amounts of data on its consumers. A conventional brick-and-mortar store doesn’t know which products you look at, how long you spend looking at them, which ones you put in your cart and then put back on the shelf before arriving at the checkout, or even which ones you “wish” you had. But Amazon does. This data tsunami not only involves consumer information, but stretches throughout the supply chain, and the company uses this data to its advantage wherever it can. Its planning problems are no longer the pedestrian optimization challenges faced by any large company before the internet age, but rather the optimization of “big data”—sets of data that are produced at such gargantuan volumes, varieties and velocities that traditional data processing techniques and software are insufficient.

Amazon’s scale—its ambition to be the “everything store”—introduces significant problems for its IT systems. It is one thing to deliver even a thousand products to a hundred or a thousand retail stores, as would a traditional seller. It is another to deliver millions of products to millions of customers.

The warehouse and transport problems mentioned above are a particular class of mathematical challenge known as “optimization problems.” In an optimization problem, we aim to do something in the best way possible, subject to a number of limits on our action, or “constraints.” Given three different possible routes through a city to deliver a package, say, which is fastest given the number of traffic lights and one-way streets? Or more realistically for Amazon, in delivering some daily number of packages, the company is limited by the schedule of delivery flights, the speed of airplanes, the availability of delivery trucks and a host of other constraints, in addition to city traffic. There are also random events, such as bad weather, that can shut down airports—and while these are sporadic, they are also more likely in some places and at certain times than others.

Every day when you commute to work, you are solving a relatively simple optimization problem. But the math behind optimization is very complex for problems with more than even just a handful of constraints. Given enough variables (conditions that can change) to be optimized and enough constraints, even the most powerful supercomputer we can currently construct, armed with the best possible algorithm we can design, would be incapable of solving some of these problems within our lifetime, and some even within the lifetime of the universe (editor's note: there’s a $1 million prize left unclaimed for anyone who can devise an efficient general solution to the century-old “traveling salesman” problem!). Many of Amazon’s problems fall squarely into such categories.
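
*To feel how quickly exact optimization blows up, consider the toy delivery problem below, with invented coordinates. Brute force checks every possible ordering of stops; a crude nearest-neighbor heuristic settles for “good enough,” which is the same trade made at vastly greater scale by real logistics planners.

```python
# Exact route optimization checks every ordering of stops: with a fixed
# depot and n stops there are (n - 1)! routes, so 20 stops already means
# 19! (about 1.2 * 10^17) orderings. Coordinates are invented.
from itertools import permutations
from math import dist, factorial

stops = [(0, 0), (2, 3), (5, 1), (6, 4), (1, 6)]   # depot first

def route_length(route):
    return sum(dist(route[i], route[i + 1]) for i in range(len(route) - 1))

# Exact answer: feasible only for tiny numbers of stops.
best = min(permutations(stops[1:]), key=lambda r: route_length((stops[0],) + r))
print("exact:  ", round(route_length((stops[0],) + best), 2))

# Heuristic: always drive to the nearest unvisited stop.
def nearest_neighbor(points):
    route, todo = [points[0]], set(points[1:])
    while todo:
        nxt = min(todo, key=lambda p: dist(route[-1], p))
        route.append(nxt)
        todo.remove(nxt)
    return route

print("greedy: ", round(route_length(nearest_neighbor(stops)), 2))
print("orderings for 20 stops:", factorial(19))
```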

So while patents for drone delivery get all the media attention, the true wonders at the heart of its operations are actually the esoteric mathematical techniques that help it manage and simplify its optimization problems. To give one example, key patented techniques help Amazon plan how best to move items between warehouse shelves and customer doorsteps. Part of solving this problem involves “load balancing”: the same way that your computer shifts tasks so as to not crash any single system, Amazon decides where to build its massive warehouses and how to distribute products between them to make sure no part of its system gets overloaded. To be clear, Amazon’s planning methods are not complete solutions to optimization problems that might take the lifetime of the universe to solve, but instead simply best approximations to get around exploding mathematical complexity. Yet Amazon still chooses to plan rather than leave optimization up to price signals from the market.
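
*Here is a minimal sketch of the load-balancing idea, as a greedy approximation with invented product volumes; Amazon’s patented methods are of course far more elaborate, but the spirit, fast and “good enough” rather than provably optimal, is the same.

```python
# Greedy load balancing: assign each product's forecast volume to the
# currently least-loaded warehouse, biggest volumes first. A classic
# approximation, not an exact solution. All data here is invented.
import heapq

def balance(products, n_warehouses):
    heap = [(0, w) for w in range(n_warehouses)]       # (load, warehouse id)
    heapq.heapify(heap)
    assignment = {}
    for name, volume in sorted(products.items(), key=lambda kv: -kv[1]):
        load, w = heapq.heappop(heap)                  # least-loaded warehouse
        assignment[name] = w
        heapq.heappush(heap, (load + volume, w))
    return assignment

products = {"books": 900, "toys": 700, "kibble": 600, "bins": 400, "caviar": 100}
print(balance(products, n_warehouses=2))
```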

Again, take the problem of shipping orders at the lowest cost. Even precisely answering the seemingly simple question of finding the lowest cost shipping method for a day’s worth of orders can quickly grow out of hand. There is no single best way to ship one order out of thousands or millions shipped on a given day, because each order’s cost depends on all the others. Will the plane from the UPS “Worldport” hub in Louisville, Kentucky, to Phoenix be full? Did your neighbor down the street order her electric toothbrush with express shipping, or can it be delivered with your book order tomorrow? The complexity ratchets up still further when Amazon considers not only all the possible alternative routes—which it controls—but also adjusts for the possibility of random events such as severe weather and tries to predict the next day’s orders. This “order assignment” optimization problem has hundreds of millions of variables, and no easy solution. The problem is so complex that there are not even approximations that can take every aspect of the problem into account.

But despite such problems, the planning process within Amazon does not fall apart. While Amazon may depend on horrible working conditions, low taxes and poor wages, it nevertheless functions. The planning problems faced by individual corporations under capitalism do have approximate, “good enough” solutions. As this book argues again and again, planning exists on a wide scale within the black box of the corporation—even if it is “good enough” rather than perfect.

That’s the trick: to find the best possible, even if partial, approximations. Amazon’s modelers work to bring intractably complex problems down to size, to build plans that neither stretch into infinite time, nor respond to all the possible random events that could happen at every step, but that simply work.

Structure amid Chaos

Describing Amazon as a big planning machine doesn’t quite match its image as an icon of “new economy” disruption. Even before Silicon Valley became a hub of global capitalism, planning was typically well hidden behind the facade of competition. Today, the facade has only become more ornate: all you see is a website and then a package at your doorstep. Behind the scenes, however, Amazon appears as a chaotic jumble of the most varied items zipping between warehouses, suppliers and end destinations. In truth, Amazon specializes in highly managed chaos. Two of the best examples of this are the “chaotic storage” system Amazon uses in its warehouses and the recommendations system buzzing in the background of its website, telling you which books or garden implements you might be interested in.

Amazon’s recommendations system is the backbone of the company’s rapid success. This system drives those usually helpful (although sometimes comical—“Frequently bought together: baseball bat + black balaclava”) items that pop up in the “Customers who bought this also bought…” section of the website. Recommendations systems solve some of the information problems that have historically been associated with planning. This is a crucial innovation for dreamers of planned economies that also manage to satisfy consumer wants. The chaos of individual tastes and opinions is condensed into something useable. A universe of the most disparate ratings and reviews— always partial and often contradictory—can, if parsed right, provide very useful and lucrative information.

Amazon also uses a system it calls “item-to-item collaborative filtering.” The company made a breakthrough when it devised its recommendations algorithm by managing to avoid common pitfalls plaguing other early recommendation engines. Amazon’s system doesn’t look for similarities between people; not only do such systems slow down significantly once millions are profiled, but they report significant overlaps among people whose tastes are actually very different (e.g., hipsters and boomers who buy the same bestsellers). Nor does Amazon group people into “segments”—something that often ends up oversimplifying recommendations by ignoring the complexity of individual tastes. Finally, Amazon’s recommendations are not based on simple similarities, such as, in the case of books, keywords, authors or genres.

Instead, Amazon’s recommendation algorithm finds links between items based on the activity of people. For example, a bicycle repair manual may consistently be bought alongside a particular bike-friendly set of Allen keys, even though the set isn’t marketed as such. The two things may not be very obviously related, but it is enough that some people buy or browse them together. Combining millions of such interactions between people and things, Amazon’s algorithm creates a virtual map of its catalog that adapts very well to new information, even saving precious computing power when compared to the alternatives—clunkier recommendation systems that try to match similar users or find abstract similarities.
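
*A toy version of the item-to-item idea, assuming invented shopping baskets: count how often two items are bought together, then normalize by each item’s overall popularity (a cosine similarity). The production algorithm is far more refined, but the principle, linking items through people’s behavior, is the same.

```python
# Toy item-to-item collaborative filtering: two items are "similar" if
# the same baskets tend to contain both. Baskets are invented.
from collections import defaultdict
from itertools import combinations
from math import sqrt

baskets = [
    {"bike repair manual", "allen keys"},
    {"bike repair manual", "allen keys", "tire levers"},
    {"bike repair manual", "novel"},
    {"novel", "reading lamp"},
]

bought = defaultdict(int)      # item -> number of baskets containing it
together = defaultdict(int)    # (item_a, item_b) -> co-purchase count
for basket in baskets:
    for item in basket:
        bought[item] += 1
    for pair in combinations(sorted(basket), 2):
        together[pair] += 1

def similarity(a, b):
    a, b = sorted((a, b))
    return together[(a, b)] / sqrt(bought[a] * bought[b])

print(similarity("bike repair manual", "allen keys"))    # high: ~0.82
print(similarity("bike repair manual", "reading lamp"))  # zero
```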

Here is how the researchers at IBM’s labs describe Amazon’s recommendations: “When it takes other users’ behavior into account, collaborative filtering uses group knowledge to form a recommendation based on like users.” Filtering is an example of an IT-based rejoinder to one of the criticisms Hayek leveled against his socialist adversaries in the 1930s calculation debate: that only markets can aggregate and put to use the information dispersed throughout society. The era of big data is proving Hayek wrong. Today’s deliberately planned IT systems are starting to create “group knowledge” (collective intelligence, or shared information that only emerges out of the interactions within or between groups of people) out of our individual needs and desires. And Amazon doesn’t just track market transactions. Beyond what you buy, the company collects data on what you browse, the paths you take between items, how long you stay on the page of each item you browse, what you place in your cart only to remove it later, and more.

Hayek could not have envisioned the vast amounts of data that can today be stored and manipulated outside of market interactions (and, to be fair, even many Marxists have assumed that the myriad capricious variables associated with faddish consumer items in particular foreclose the capacity for their socialization), although he certainly would have admired the capitalists such as Bezos who own the data and use it to pad their obscene fortunes. It is a delicious irony that big data, the producer and discoverer of so much new knowledge, could one day facilitate what Hayek thought only markets are capable of.

Really, it is not such a big step from a good recommendation system to Amazon’s patent for “anticipatory shipping.” It has a viability beyond any Silicon Valley, TED Talk–style huckster bombast or tech-press cheerleading. The reason this genuinely incredible, seemingly psychic distribution phenomenon could actually work is not a result of any psychological trickery, subliminal advertising craftiness, or mentalist power of suggestion, but is found in something much more mundane: demand estimation. With its huge data sets that measure the relationships between products and people, Amazon is already very successful in figuring out demand for particular products, down to a previously unimagined level of detail.
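
*Beneath the sci-fi gloss, anticipatory shipping reduces to demand estimation. A minimal sketch, using about the simplest forecaster there is (an exponentially weighted moving average) with invented order histories and an invented stocking threshold:

```python
# Forecast next week's regional demand from past orders; pre-position
# stock where the forecast crosses a threshold. All numbers are invented,
# and Amazon's real models are vastly more sophisticated.

def forecast(history, alpha=0.5):
    """Exponentially weighted moving average of weekly orders."""
    estimate = history[0]
    for orders in history[1:]:
        estimate = alpha * orders + (1 - alpha) * estimate
    return estimate

weekly_orders = {"Phoenix": [4, 6, 9, 14], "Duluth": [5, 4, 5, 4]}
for region, history in weekly_orders.items():
    f = forecast(history)
    action = "pre-stock nearby warehouse" if f > 8 else "hold"
    print(f"{region}: forecast {f:.1f} units/week -> {action}")
```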

In our irrational system, the ultimate purpose of product recommendations is to drive sales and profits for Amazon. Data scientists have found that rather than high numbers of customer-submitted reviews, which have little impact, it is recommendations that boost Amazon’s sales. Recommendations help sell not only less popular niche items—when it’s hard to dig up information, even just a recommendation can be enough to sway us—but also bestsellers that constantly pop up when we’re browsing.

Zooming out beyond Amazon’s corporate interests, the recommendations system is a way of managing and integrating great swaths of social labor. Many of us freely, without expectation of any reward, spend time and energy writing reviews and giving out stars to products or even just mindlessly browsing on Amazon and other technology platforms. This is work that we and others benefit from. Even over the course of one day, we may repeatedly engage in unpaid labor to rate everything from the relatively innocuous, such as call quality on Skype, to the more serious, such as posts, comments and links on Facebook and Twitter, to the potentially very impactful on individual lives, such as the “quality” of Uber drivers. Under capitalism, the social labor of many is transformed into profit for the few: the filtering may be “collaborative,” but the interests it serves are competitive and very private.

Workers Lost in the Amazon

While many of us end up using free time to perform the social labor that allows Amazon to perfect its recommendations system, Amazon’s warehouses run on paid labor that is nonunionized and frequently occurs under appalling, similarly big data–disciplined conditions. Before taking a closer view of the work itself, let’s quickly look at the workplace. The focal points of Amazon’s distribution network are its warehouses, which the company calls “fulfillment centers.” These usually take up football fields’ worth of floor space jammed with shelving units. Amazon uses a peculiar form of organization called “chaotic storage,” in which goods are not actually organized: there is no section for books or subsection for mystery fiction. Everything is jumbled together. You can find a children’s book sharing a bin or shelf with a sex toy, caviar next to dog kibble.

Once again, powerful planning is what allows Amazon to save on what turns out to be needless warehouse organization. Every item that enters a fulfillment center gets a unique barcode. Once inside the warehouse, items go in bins, each of which also has a unique code. Amazon’s software tracks both the items and the bins as they move through the warehouse. The software always knows which bin an item is in and where that bin is. Because items can always be found easily, deliveries from suppliers can be unloaded where it is convenient, rather than methodically organized and reorganized.
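
*In miniature, chaotic storage is nothing more than a pair of lookup tables, sketched below with invented identifiers: stow anything wherever is convenient, record the barcodes, and the only question that matters, where is item X right now, is always a single lookup away.

```python
# Chaotic storage in miniature: nothing is "organized," but every item
# and bin is barcoded, so software always knows where everything is.
# All identifiers are invented.

item_to_bin = {}     # item barcode -> bin barcode
bin_to_spot = {}     # bin barcode  -> (aisle, shelf)

def stow(item, bin_code, spot):
    """Drop an item into whatever bin is convenient; record both codes."""
    item_to_bin[item] = bin_code
    bin_to_spot[bin_code] = spot

def locate(item):
    bin_code = item_to_bin[item]
    return bin_code, bin_to_spot[bin_code]

stow("childrens-book-041", "BIN-7", (12, 3))
stow("caviar-tin-009", "BIN-7", (12, 3))    # caviar next to a kids' book
print(locate("caviar-tin-009"))             # ('BIN-7', (12, 3))
```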

Amazon’s chaotic storage could be a metaphor for the free market system: at first glance, it seems that the chaos organizes itself. Orders and packages zoom through the system and customers get what they want. But as with the free market, upon closer inspection we see thickets of deliberate planning at every step. Highly refined IT systems make sense of the chaotic storage, track items from the moment they arrive at a warehouse to the moment they leave, and make sure everything falls seemingly supernaturally into place. Everything is ordered, coordinated and planned, with not a market in sight to perform any of these billions of allocation decisions.

Planning is also present in the most minute details of a warehouse worker’s day. Handheld scanning devices tell workers where to go to pick items for orders. Workers are appendages of machines that lay out precisely which routes to follow between shelves and how long they should take. Here’s how a BBC undercover worker-reporter described the work: “We are machines, we are robots, we plug our scanner in, we’re holding it, but we might as well be plugging it into ourselves.” A leading UK researcher on workplace stress contacted by the same BBC investigation claimed that conditions at Amazon warehouses pose serious physical and mental health risks.

Around the start of this decade, Amazon’s top operations managers determined that its warehouses were still too inefficient, and so they themselves went shopping for something better. In 2012 Amazon bought Kiva Systems, a robotics firm, and it now uses robots to put its entire shelving system into motion. Amazon’s updated, even more automated fulfillment centers now feature shelves that move and humans who stand in place—the opposite of what a warehouse normally looks like. Flat, Roomba-like robots rove the warehouse floor along designated pathways. They can lift entire shelving units just off the ground and maneuver them along the same pathways to “picking stations.” These are small designated areas where human order pickers stand, taking items from storage bins and putting them into order bins as shelving units come and go. Frederick Taylor would be proud!

The social, physical and mental cost of a machine for delivering the right things to the right people ultimately falls on the workers who make the machine hum—regardless of whether workers are piloted around a maze of shelves by a handheld scanner or pick orders in place while robots whiz to and fro toward them. The boosters at Wired magazine are in awe of the subjugation of the Chaplins in this twenty-first-century Modern Times: “The packing stations are a whirl of activity where algorithms test human endurance.”

Other more critical reporting has been less kind to Amazon in fleshing out just what these endurance tests entail. In 2011 the Lehigh Valley, Pennsylvania, local paper, the Morning Call, investigated its nearby Amazon fulfillment center. Workers said they routinely faced impossible-to-meet targets, debilitating heat and constant threats of being fired. On the hottest days of the year, Amazon had paramedics on hand outside the warehouse to treat heat-exhausted workers—a cheap Band-Aid solution for Amazon that makes clear its low estimation of health and safety; apparently humane working conditions are not one of its algorithms’ optimization constraints. It was only after this story blew up in the national media and the revelation hurt its largely liberal-tech-and-innovation brand image that Amazon began to refurbish some warehouses with air conditioning. In fact, only one out of the twenty workers featured in the Morning Call story said Amazon was a good place to work.

Amazon workers interviewed by the media consistently report feeling the constant stress of surveillance. Being too slow to pick or pack an item, or even taking a bathroom break that is too long, results in demerit points. Amassing enough of these points can lead to being fired. Soon, this feeling of constant surveillance could become far more visceral: in February of 2018, Amazon patented a wristband that, like an episode of Black Mirror, monitors a warehouse worker’s every hand movement in real time. And Amazon pits workers not only against the clock, but also against one another. Utilizing the lumpenized and semi-proletarian masses from among the industrial reserve army, Amazon staffs warehouses with a mix of permanent and temporary workers hired by subcontractors. Permanent positions are few, but they come with some security, slightly higher pay and limited benefits; they are dangled as carrots before temporary workers, encouraging competition and overwork, further fostering a climate of uncertainty and fear.

With the help of robots, the average time to fill an order in a warehouse automated by Kiva technology has plummeted from ninety minutes down to fifteen. Working conditions, however, haven’t budged: the work remains as dull and draining as ever, warehouses remain hot, and the pace of work can be absurdly fast, regardless of the level of automation. While workers in automated warehouses stand all day and try to keep up with the robots zooming by, workers in the non-automated warehouses can expect to walk nearly double the distance a typical mail carrier covers on a daily shift. Even small things like distances to break rooms can be an obstacle—sometimes so long that going both ways can take up most of a break.

Long hours for low pay are the norm in an Amazon warehouse, but the relatively highly paid white-collar workers at Amazon also face a crushing work environment. A 2015 New York Times exposé revealed an environment of overwork and “purposeful Darwinism” that pushes many past their physical and emotional limits. Even if sophisticated planning is Amazon’s workhorse, it is implemented within the bounds of a ruling ideology of ruthless competition that breaks white-and blue-collar workers in different ways. Put differently, Amazon is doing exactly what Marx described in a passage from The Communist Manifesto that often goes underappreciated: “The bourgeoisie cannot exist without constantly revolutionizing the instruments of production, and thereby the relations of production, and with them the whole relations of society.” Our task must be to disentangle the good brought by technology from the tentacles of a social system that degrades workers and subverts more rational planning.

Amazonian Technologies beyond Amazon

Despite being a model of the new, disruptive, internet-dependent capitalism, Amazon remains a planning device as much as other companies ever have. In simplest terms, Amazon is a giant planned machine for distributing goods. It is a mechanism for forecasting, managing and meeting demand for an incredibly wide array of things we need and want. It is a collection of thousands of interlocking optimization systems that work together to carry out the deceptively simple task of moving objects from producers to consumers. Rather than the anarchy of the market, once we enter the Amazon, we are entering a sophisticated planning device—one that offers not only clues for how we could manage demand and supply of consumer goods in a society not built on profit, but also warnings to would-be planners for the public good.

Instead of optimizing the satisfaction of our needs and desires, as well as workers’ working conditions and livelihoods, Amazon’s plans are geared toward maximizing profit for its shareholders—or future profit, since Amazon keeps plowing money from sales into research, IT and physical infrastructure to squeeze out competitors. Planning for profit is in fact an example of capitalism’s web of allocation inefficiencies. The planning technologies dreamed up by Amazon’s engineers are a way of meeting a skewed set of social needs—one that ends up enriching a few, misusing substantial free social labor, and degrading workers. A democratized economy for the benefit of all will also need institutions that learn about people’s interests and desires, optimize via IT systems, and plan complex distribution networks; but they will look different, perhaps alien to the systems we have today, and they will strive toward dissimilar goals.

Three challenges should give us pause before even beginning to call the riddle of democratic planning solved.

First, there is large-scale technical feasibility. The difficulty of planning and optimizing even the isolated task of delivering Amazon’s packages demonstrates that designing systems for economy-wide planning will be anything but trivial. The algorithms that power everything from Amazon’s recommendation system to Google’s search engine are still in their infancy—they are relatively simplistic, making best-estimate guesses, and are prone to failure. Algorithms also run into systemic problems wherever their capacity for “reading” nuance is limited: for example, with working-class and poor people, who more frequently shop on shared devices, or with non-English speakers. We’ll have to storm both the barricades and the optimization problems.

Second, the planning done by Amazon and others still relies heavily on prices in interactions that take place beyond the borders of the firm itself. Amazon purchases its inputs—from the multitude of items it stocks, to the warehouse shelves they sit on, to the servers that run its database—on a market; consumers, meanwhile, also take into account the relative costs of items when deciding whether to add them to their virtual carts. Beyond the confines of the firm, a market system continues to operate. This means that it’s not simply a matter of repurposing existing technologies, lopping off the bosses and otherwise keeping everything the same. Even though there is market-less planning within corporations, it is a form of hierarchical, undemocratic planning that is very much necessary to survive and thrive in a market. Many elements of this planning apparatus, their very form and purpose, are conditioned by that undemocratic hierarchy.

Finally, while the big data collected and processed by Amazon is precisely the kind of tool that would aid in overcoming these challenges of large-scale economic calculation—and indeed it is already being used in this way by the Amazons and Walmarts (never mind the Facebooks and Googles) of the world—we have to recognize that alongside the staggering freedom-enhancing potential of the massive data sets held by both corporations and states, there also lies a staggering capacity for freedom restriction.

The story of Walmart’s major rival Target sending deals on diapers and baby food to several expectant mothers who did not yet themselves know they were pregnant, based on data mined on individual spending patterns, seems almost quaint today. Now, we are only a single Google search for “poor sleep” away from months of bombardment by mattress ads on every social media network to which one belongs. There are more insidious examples: in 2012, the short-lived “Girls Around Me” app used a mash-up of geolocation and social media data to allow individuals to find out all kinds of personal details about women in their vicinity who had used Facebook or Foursquare’s “check-in” feature. The UK’s Daily Mail called it the “Let’s Stalk Women” app, while science fiction author Charles Stross imagined a near future of other, far nastier data mash-ups—could neo-Nazis create a “Jews Near Me” app? Beyond the private sector, states across the world are also increasingly using and misusing big data. Police departments across the United States have begun to experiment with something called “predictive policing” to devise methods for predicting offenders, victims, identities, and locations of crimes. It is the arrival of “pre-crime” from the pages of Philip K. Dick’s Minority Report into the real world. Similarly, China’s “Integrated Joint Operations Platform” combines data from multiple sources, including online tracking and facial recognition-enabled CCTV cameras, as well as health, legal and banking records, in order to flag suspected political dissidents. And all this is planning, too.

The time has come for concrete, rather than abstract, proposals for the democratization of global governance, economics and planning, including around issues of geolocation, social networking, search, data mining, machine learning and ubiquitous computing. Because here’s the thing: the big data cat is out of the bag. Both the ubiquitous surveillance of corporations and the ubiquitous surveillance of the state are already here. We need a third option—one that goes beyond the state-versus-market dichotomy.

INDEX FUNDS AS SLEEPER AGENTS OF PLANNING

Even if the most perceptive of free market cheerleaders might be ready to concede that large-scale planning does indeed happen within capitalist enterprise, they remain insistent that innovation and rational economy-wide investment allocation are insurmountable stumbling blocks for any more thoroughgoing notion of planning. They double down on their original argument: that the market is simply a more efficient allocator, the only way to guarantee the “correct” incentives to invest or innovate. However, as with the mammoth scale of planning of production and distribution that takes place behind the curtain at corporate giants such as Walmart and Amazon, it is also the case that investment and innovation occur outside of market mechanisms far more than market defenders are willing to admit, or perhaps have even noticed.

Let’s start with investment. To invest is, at base, the act of putting some portion of economic activity today toward the capacity to produce more tomorrow. Here too, beyond current production and distribution, firms must make plans to allocate those goods and services that will produce yet more goods and services in the future. They must, in short, plan investment: build the factories that will make tomorrow’s gadgets, the hospitals that will house tomorrow’s patients, the rail tracks that will carry tomorrow’s trade, and the windmills, dams or reactors that will power all of them.

Investment is often presented as a sacrifice, and as a result imbued with moralism. In this story, investors are heroic scrimpers and savers, putting the future good ahead of the gratification of the moment. In reality, as we are far from the first to point out, they are owners of a disproportionate share of society’s common resources, produced not by themselves but by their workers; by dint of this daily expropriation of the value produced by workers, they hold disproportionate power over how social life is organized. *Under capitalism, workers receive less in wages than the value they produce with their labor-power — this difference is surplus value, part of which goes to the rent-seekers, part of which becomes commercial profit, and part of which goes towards re-investment for the expansion of the owner’s capital. This is why investment is no sacrifice — the conversion of surplus-value into constant capital is no loss of value, only a “loss” in liquidity. (See Capital Volume II for details.)
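To put the arithmetic schematically (the figures here are invented for illustration, not drawn from Capital): if a worker adds new value $v$ in a day and is paid wages $w$, the surplus value $s$ is the difference, which then splits between rent and interest $r$, commercial profit $p$, and re-investment $i$:

$$ s = v - w, \qquad s = r + p + i. $$

Taking, say, $v = \$400$ and $w = \$150$ gives $s = \$250$, which might divide as $60 + 90 + 100$; the $\$100$ re-invested is not consumed by the owner, but neither is it lost to him: it has merely changed form, from money into means of production.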

Going further still, it’s a common misperception that the stock market is the major source of investment funds. But in fact, the majority of US capital investment comes from retained profits *(and banks!), not from the stock market.

When times are good and profits are rolling in, the belief that things can only get better is too easily sparked among the rich and powerful. Investment surges. Bad money chases after good, overcapacity and overproduction develop—and eventually, there’s a crash as investors realize that not everyone will be able to cash in. There are two mutually exclusive rules of capitalist crises: “don’t panic,” and “panic first.” Busts thus inevitably follow booms, and the system goes through repeated cycles—at significant human cost.

Downturns, which spike unemployment and poverty, discipline workers; the sack, as the Polish economist Michal Kalecki wrote, is the key disciplining device under capitalism, and perhaps even more important a possession to business owners than is profit. It is their power to deprive the workers of their means of subsistence that gives the owner power, delivering unto them no less a whip hand than that of the slavemaster. In so doing, it gives the owner the ability to use humans as tools in the craft of their choice—as paintbrush, hammer or scythe. It is a reminder of how the system functions at the most basic level. Recessions also discipline capital, enforcing a changing of the guard and creating the conditions for new bouts of accumulation. The system as a whole regenerates and refines itself, fresh faces masking the same core social relationships.

These cycles of boom and bust are not, however, pure anarchy. Capitalism, too, has something akin to an economy-wide central planner: the financial system—the first car in the rollercoaster, managing spirits and rationing investment. Economist J. W. Mason, who has developed the idea of finance as planner in a series of articles in Jakkkobin magazine, writes: “Surplus is allocated by banks and other financial institutions, whose activities are coordinated by planners, not markets … Banks are, in Schumpeter’s phrase, the private equivalents of Gosplan. Their lending decisions determine what new projects will get a share of society’s resources.” Banks decide whether a firm will get a loan to build a new plant, a household a mortgage, or a student a loan for tuition and living expenses—and the terms on which each is repaid. Each loan is an abstract thing that masks something very concrete: work for workers, a roof over someone’s head or an education.

In rationing investment, the financial system is central to managing expectations about the future—connecting today with tomorrow. Interest rates, financial sector regulations and loan decisions are capitalism’s way of choosing between different possible economic plans. Investment today is meant to lead to profits tomorrow. Regulation defines the very terms of how resources are accounted for: what constitutes profit or how a bank’s loan portfolio functions. The financial system’s best guesses of ultimately unknowable future profitability, then, govern how concrete resources are set aside. So much, so straightforward. Yet even here, we begin to see how the capitalist economy is not as anarchic as free market proponents would have us believe.
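To make the rationing mechanism concrete, here is a minimal sketch (the project’s cost, returns and lifetime below are invented): a plan gets society’s resources only if its discounted future returns beat the going rate of interest, so a change in the rate quietly vetoes or green-lights whole categories of plans.

```python
# A minimal sketch (invented cash flows) of how the interest rate rations
# investment: a project is funded only if its discounted future returns
# exceed its cost, so raising the rate silently rejects marginal plans.

def net_present_value(cost, annual_return, years, rate):
    return sum(annual_return / (1 + rate) ** t for t in range(1, years + 1)) - cost

factory = (100.0, 9.0, 20)  # cost, return per year, lifetime -- hypothetical

for rate in (0.04, 0.08):
    npv = net_present_value(*factory, rate)
    verdict = "funded" if npv > 0 else "rejected"
    print(f"rate {rate:.0%}: NPV = {npv:+.1f} -> {verdict}")
    # At 4% the factory is built; at 8% the very same plan is abandoned.
```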

Central Bankers, Central Planners

At the fulcrum of any contemporary financial system sits the central bank, banker to the bankers. Typically, central banks are most visible during crises, when they intervene to prop up the financial system, lending when panic overtakes others. Yet even during “normal” times, central banks, through regulation and monetary policy, help set the overall pace of credit creation and, ultimately, of economic activity overall. Often presented as neutral policy makers, central banks are in fact political beings with political aims, tightly integrated with the rest of the private financial system.

Take the US Federal Reserve (aka, “the Fed”). Its leadership has been very concerned with how quickly wages are growing, what unions are doing and how the balance of power is shifting within workplaces—what socialists would call “the state of class struggle” (editor's note: “socialists,” but, apparently, not the authors?). Often in very explicit terms, the Federal Reserve has taken great interest in the relationship between workers and bosses, labor and capital, as much as any union organizer. The archives of meeting minutes dating back to the 1950s reveal central bankers talking frankly and knowledgeably about which unions are currently in bargaining and their relative strength. The auto and steel sectors received particular attention; the governors of the Fed might have been even more interested in the strategy of the United Steelworkers (USW) or United Auto Workers (UAW) than would the average shop steward.

This was true during the postwar Golden Age of capitalist growth as well. Here are Governor C. Canby Balderston’s views as described in the minutes from the March 3, 1956, meeting of the Reserve’s Federal Open Market Committee:

The [Federal Reserve] System’s actions should be decisive enough to cause businessmen to realize the danger of a wage-price spiral and not abdicate when they face wage negotiations this spring and summer the way they would if they felt they could simply increase their prices and continue to sell goods. He hoped that labor unions would appreciate the dangers of a wage-price spiral.

That summer, the Fed ended up taking decisive action, raising interest rates, as a successful steel strike pushed previously reluctant central bankers to Balderson’s side. The years 1957–58 saw a short recession precipitated in part by these higher rates. Like firefighters setting off controlled burns to prevent future, out of control wildfires, the Fed governors deliberately altered the costs of investment in order to change the climate in which capital bargained with workers. They planned, overriding what the (labor) market, left to its own devices, would otherwise have delivered.

Similarly, during the first eight months of the 1973–75 oil-shock recession, interest rates continued to rise—nicely coinciding with UAW bargaining with the Big Three automakers. When the Fed finally lowered rates to stimulate investment and counteract the slump, Fed governors argued that, unlike expansionary fiscal policy carried out by Congress and the president, presumably at the behest of the democratic will, their independent actions would be much easier to undo when the economy “overheated” again and workers started to ask for more. And undone they were—very quickly: as is widely acknowledged, in 1980, under the Carter administration’s Federal Reserve leadership of Paul Volcker, the body used sky-high interest rates to launch an assault, not only (or even primarily) on inflation, but on the remaining power of organized labor. And in the decade following the 2008 financial crisis, Fed-led monetary policy played an oversized role; indeed, distrust for state spending has, since the advent of so-called “neoliberalism” in the 1970s, consolidated itself as common sense. To manage ongoing stagnation, central banks across the global North have made massive purchases of bonds, mortgages and other kinds of equity, adding to their rate-setting and regulatory power. The irony here is that an unaccountable, undemocratic department within the state, in the form of central banks, has intervened in the economy in spite of elite consensus against state intervention in the economy.

Of course, the path is never straight between the actions of banks (central and private alike) and what happens in the wider economy. Some interventions fail. And aims and tactics will change to reflect the balance of power in the economy: in principle, planning carried out by the financial system could just as easily support, on the one hand, a high-productivity economy that more evenly distributes growth (as during the 1950s), as it could one of corporate short-termism and upward transfers of wealth (as starting in the 1980s), on the other.

The financial managers of the global economy—the vast majority working at private rather than central or other public banks—occupy a class, not a control room. They share much in terms of wealth, positions of power, education, and lunches in Davos. But as individuals they have their own histories, ideological leanings and visions for how best to achieve stability for capital. Large-scale planning is mundane, technocratic and systemic, not conspiratorial. Networks of power and ideology replicate themselves without the need for open scheming. Economy-wide planning under capitalism is so diffuse that much can get in the way of even the best-laid plans—never mind the unavoidable yet unpredictable crisis dynamics of the system itself. And so, as capitalism heaves from boom to bust, its managers switch from plans for prosperity to plans for surviving a crisis, all of them contested and imperfectly implemented.

Communism by Index Fund?

Contemporary capitalism is ever more tightly integrated through the financial system. What do we mean by integration? Well, for instance, the chance that any two firms in the broad S&P 1500 index of the US stock market have a common owner holding at least 5 percent of shares in both is today a stunning 90 percent. Just twenty years ago, the chance of finding this kind of common ownership was around 20 percent. And index funds (which invest money passively), pension funds, sovereign wealth funds and other gargantuan pools of capital bind economic actors still closer together. Passive management of such funds is a relatively novel investment strategy, involving retention of a broad swath of assets that replicates an existing index, which itself aims to replicate an entire market; in this model, limiting buying and selling still offers robust diversification, but with limited transaction costs and low management fees. Passive management is increasingly dominant, not just within equity markets but among other investment types, and it is displacing the historic but more expensive norm of active management, in which fund managers and brokers buy and sell stocks and other investment vehicles, deploying their research and knowledge in an attempt to outperform the market.
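The mechanics of passive replication are almost embarrassingly simple, which is rather the point. A minimal sketch (the tickers, prices and fund size below are invented for illustration):

```python
# A minimal sketch of capitalization-weighted index replication: no forecasting,
# no stock-picking, just holding every constituent in proportion to its market
# value. All names and figures are hypothetical.

index = {            # ticker: (share_price, shares_outstanding)
    "AIR1": (40.0, 5_000_000),
    "AIR2": (25.0, 8_000_000),
    "TEL1": (90.0, 2_000_000),
}

fund_assets = 1_000_000.0  # money to be invested passively

market_caps = {t: price * shares for t, (price, shares) in index.items()}
total_cap = sum(market_caps.values())

for ticker, cap in market_caps.items():
    weight = cap / total_cap                  # the firm's share of the index
    dollars = fund_assets * weight            # the fund's allocation to it
    shares_bought = dollars / index[ticker][0]
    print(f"{ticker}: weight {weight:.1%}, buy {shares_bought:,.0f} shares")
```

No analysts, no forecasts: the fund simply mirrors whatever proportions the market already exhibits, which is why its fees can be so low.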

This shift in recent years from active to passive investing is not news. But the implications are systemic and profound for the very notion of a competitive market.

An investor who has holdings in one airline or telecom wants it to outperform the others: to increase its profits, even if only temporarily, at others’ expense. But an investor who owns a piece of every airline or telecom, as occurs in a passively managed index fund, has drastically different goals. Competition no longer matters; the overriding interest now is squeezing the most out of customers and workers across an entire industry—no matter which firm does it. In principle, capitalist competition should unremittingly steer the total profits across a sector down, ultimately to zero. This is because even though every firm individually aims for the highest possible profit, doing so means finding ways to undercut competitors and thus reduce profit opportunities sector-wide. Big institutional investors and passive investment funds, on the other hand, move entire sectors toward concentration that looks much more like monopoly—with handy profits, as firms have less reason to undercut one another. The result is a very capitalist sort of planning.
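A toy model makes the shift in incentives concrete (the demand curve and all figures are invented; this is a sketch, not a claim about any actual market):

```python
# Two airlines face industry demand P = 10 - Q, each with unit cost 2.
# Under rivalry, undercutting drives price toward cost and profit toward zero;
# under sector-wide (monopoly-like) pricing, total profit is maximized.

def industry_profit(price):
    quantity = 10 - price          # total tickets sold at this price
    return (price - 2) * quantity  # combined profit of both airlines

competitive = industry_profit(2.0)  # price competed down to cost -> 0
collusive = industry_profit(6.0)    # joint profit-maximizing price -> 16

stake = 0.05  # a fund holding 5 percent of *both* firms
print(f"5% of both firms, competition: {stake * competitive:.2f}")  # 0.00
print(f"5% of both firms, collusion:   {stake * collusive:.2f}")    # 0.80
# A holder of only one firm can gain by undercutting its rival; a holder of
# both gains nothing from the fight and everything from the truce.
```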

This unseemly situation led Bloomberg business columnist Matt Levine to ask, in the title of a remarkable 2016 article, “Are Index Funds Communist?” Levine imagines a slow transition from today’s index funds, which use simple investing strategies, through a future where investing algorithms become better and better, until “in the long run, financial markets will tend toward perfect knowledge, a sort of central planning—by the Best Capital Allocating Robot.” For him, capitalism may end up creating its own gravediggers—except they will be algorithms, not workers.

This idea—that finance itself will socialize production—may read like clickbait provocateurism, but it isn’t actually that new. The point has been made frequently by Marxist writers for over a century, such as Rudolf Hilferding, whose Das Finanzkapital, published in 1910, postulated a shift from the competitive capitalism Marx had analyzed to something far more centralized, tending toward monopoly driven by finance and a state under its control. *Decades earlier, Marx had remarked that the banking system “possesses, indeed, the form of universal book-keeping and distribution of means of production on a social scale, but solely the form.” Indeed, under monopoly-capitalism, the socialization of production has nearly reached its highest pitch, yet appropriation remains, as before, private. The banks may possess the form of universal distribution of the means of production, but, of course, in substance it, too, has remained private. Extending (and in some places critiquing) Hilferding’s analysis in Imperialism: The Highest Stage of Capitalism, Vladimir Lenin went on to show finance-capital’s role in turning free competition into monopoly, and further identified this new capitalist epoch as a transitional form to a higher socio-economic order (i.e., socialism).

While in terms of mechanics, it may be easier to transfer into common ownership a real estate income trust that owns the title to hundreds of homes than it is to seize hundreds of homes outright—or to take over a single index fund that owns millions of shares than it is to take over hundreds of factories—politically, the task is no less difficult. Moving ones and zeros around on an electronic exchange requires class power just as much as storming barricades does. The subjective agents of revolutionary change—those who might push for and carry out a sweeping socialization of investment—are far removed from the centers of financial capitalism. On its own, an investment algorithm can no more dig capitalism’s grave today than a power loom could in the nineteenth century. History is not an automatic process, and advancements in the means of production will not automatically overturn the relations of production of society; tools are nothing without organized political forces ready to put them to more useful ends.

Incentivize This!

At this point, defenders of the market are likely to retreat to another line of defense: incentives. Even if capitalists already plan here and there (or even nearly everywhere, as we have shown), only markets can guarantee the efficiencies that come from having the right incentives. Socialist managers will simply waste investment funds as a result of “soft budget constraints”—the notion that managers can ultimately always get more resources—creating vicious cycles of excessive risk taking and false reporting.

Incentives are, however, simply another way of answering the question, “How do we make people do things without telling them directly?” The biggest incentive under capitalism is that without a job, a worker will lose their home, their belongings and ultimately starve. This is the cat-o’-nine-tails that disciplines “free labor,” the terror that forces a worker to doff her cap before every foreman or manager. This despotism lies at the very heart of the system, yet it goes unmentioned in any call to “get the incentives right” from business journalists or neoliberal reformers.

The list of socially harmful incentives is much longer. There are incentives to pay poverty wages, to maintain unsafe working conditions, to push poor people out of their neighborhoods, to produce bombs and to use them. Even stock prices, those supposedly most price-like prices, in large part reflect gambles rather than economic fundamentals. The flip side is all manner of nonmarket sanctions that exist and have existed throughout human history. Markets are not the only, nor even remotely the best, way to pursue common projects that require people and resources committed across time and space. What defenders of capitalism are afraid of is not planning, but its democratization.

Capitalist institutions affect our behavior in multiple ways, from what we do today to what we want—or have—to do tomorrow. Capitalism is not just a means for dividing up goods and services—though it is that too; it is a way of structuring society. If socialism has the capacity to transform the economy, then it is likely to transform ourselves as well. We’re very malleable creatures—biological systems constrained and shaped by our environment and by each other. We create society, but society also creates us; one of the successes of capitalism, and especially its most recent, neoliberal variant, has been to instill competition into more and more aspects of life. *But as the recently deceased socialist philosopher Andrew Collier so beautifully articulated, “To look at people in capitalist society and conclude that human nature is egoism, is like looking at people in a factory where pollution is destroying their lungs and saying that it is human nature to cough.”

Social scientists have long understood that building different institutions will also make us into different people. Will we still need incentives? In the broadest sense of being motivated to do things, of course we will. But it is a poor theory of social life that says creation or innovation can only take place with the prospect of personal monetary gain. As we argued in chapter 2, the set of all goods and services that are profitable may overlap with, but is not coincident with, the set of all goods and services that are useful to society. If something is not profitable, such as we have seen with new classes of antibiotic, no matter how beneficial, it will not be produced. Meanwhile, so long as something is profitable, no matter how detrimental, such as fossil fuels, it will continue to be produced. The problem is generalizing behavior under capitalism to all human behavior. Investments—decision making over how we divide our resources between our present and future needs—could be planned such that they are responsive to human needs rather than investors’ need for profit.

The Innovative State

But even if investment—diverting resources for future-oriented use—can be planned, what about innovation, the very discovery of those new uses? At first glance, innovation does not seem like something you can plan. But like investment, which is already subject to copious conscious planning, much, if not most, innovation today happens outside the market. The common story gives far too much credit to individuals, to spontaneous flashes of insight in the minds of great men. But most innovation is social. It proceeds in small steps, and most of it is done not because of a price signal but in spite of it: innumerable improvements are made every day by workers on assembly lines or at computer desks who get no credit, just as great discoveries are produced in research laboratories that are not only financed but often directed by the state. Steve Jobs didn’t invent the iPhone; as Italian American economist Mariana Mazzucato brilliantly points out, almost every major component is the product of state-directed innovation.

In her book The Entrepreneurial State, Mazzucato laments that while the myriad examples of private sector entrepreneurial activity cannot be denied, this is not the only story of innovation and dynamism. She asks: “How many people know that the algorithm that led to Google’s success was funded by a public sector National Science Foundation grant? Or that molecular antibodies, which provided the foundation for biotechnology before venture capital moved into the sector, were discovered in public Medical Research Council labs in the UK?” Far from the slander of the state as slow-moving and bureaucratic, and the myth of the nimble private sector, she argues that businesses are in fact ineluctably risk averse, due to the need for a relatively short-term return on investment. Instead, the reality is that the state, from the internet and personal computers to mobile telephones and nanotechnology, has instead proactively shepherded new sectors out of their most uncertain, unforeseeable periods—and in many cases even through to commercialization. And this is not a case of the state filling the gaps of the private sector, correcting market failures. The state was central: “None of these technological revolutions would have occurred without the leading role of the state. It is about admitting that in many cases, it has in fact been the state, not the private sector, that has had the vision for strategic change, daring to think—against all odds—about the ‘impossible.’”

In the United States, ostensibly the most capitalist of states, this process has largely been hidden because so much of it has occurred under the direction of the Pentagon, that part of government where even the most ardent Republican free marketeer allows him or herself to discover the joys of central planning. In fact, war and economic planning have a long history together, and the conflict-ridden twentieth century necessitated public-driven planning and innovation on vast scales.

World War II—a new, fiercer form of total war—gave rise to a comprehensive wartime planning regime, even in the capitalist heartland. In the United States, the War Production Board (WPB) was created in 1942. Its mandate was wide-ranging, encompassing everything from fixing production quotas to resource distribution and price setting. The WPB, America’s grand national experiment in economic planning, was responsible for converting peacetime industries to war production, allocating and prioritizing material distribution, rationing essential items such as gas, rubber and paper, and suppressing nonessential production. It had its successes—the war was ultimately won by the Allied camp—but its short existence was compromised by infighting between civilian and military personnel, and undermined by business that was always looking out for its own interests, jockeying to exit the war stronger.

But wartime planning sprouted up beyond just the WPB. A smaller agency called the Defense Plant Corporation was responsible for over a quarter of total wartime investment in new plant and equipment; with it, the government ended up building and controlling some of the most modern manufacturing facilities in the United States at the time. Beyond the immediate war effort, government funded and planned basic research that led to major breakthroughs. The Manhattan Project, which ultimately developed the atomic bomb, is well known, but there were other advances from such efforts that were indisputably socially good, including the mass production of the first antibiotic, penicillin.

Prior to the advent of antibiotics, mortality from pneumonia was 30 percent, and from appendicitis or a ruptured bowel, 100 percent unless you had surgery. Before Alexander Fleming’s serendipitous discovery of penicillin, patients with blood poisoning contracted from a mere cut or scratch filled hospitals, although doctors could do next to nothing for them. The first recipient of penicillin, forty-three-year-old Oxford police constable Albert Alexander, had scratched the side of his mouth while pruning roses. The scratches developed into a life-threatening infection, with large abscesses covering his head and affecting his lungs. One of his eyes had to be removed. The discovery of penicillin may have been made by a Scotsman, but in 1941, with much of the British chemical industry tilted toward the war effort and London’s defeat at the hands of Hitler a real possibility, it was clear that large-scale production of penicillin would have to be moved to the United States.

A high-priority program aiming to increase penicillin yields was placed under the direction of the Fermentation Division of the Department of Agriculture’s Northern Regional Research Laboratory (NRRL) in Peoria, Illinois, a move that proved vital to the innovations that made large-scale production of penicillin possible. Howard Florey, the Australian pharmacologist—who, along with German-born British biochemist Ernst Chain and Alexander Fleming, would go on to win the 1945 Nobel Prize for Medicine for the development of penicillin—visited a number of pharmaceutical companies to try to interest them in the drug, but he was disappointed in the results. The Committee on Medical Research (CMR) of the Office of Scientific Research and Development (OSRD)—created in June 1941 to ensure that as war approached, the appropriate amount of attention was directed toward scientific and medical research relating to national defense—convened a meeting with the heads of four drug firms to impress upon them the urgency of their involvement and assure them of government assistance. The response, however, was pessimistic. It was only during the second such conference, ten days after the attack on Pearl Harbor, that the argument was won. Crucially, the government obtained agreement for the sharing of research between the different actors through the CMR—a cooperative development that proved decisive in the scaling-up of production as each company solved different aspects of the overall problem, each in itself a problem from hell. As Pfizer’s John L. Smith characterized it, “The mold is as temperamental as an opera singer, the yields are low, the isolation is difficult, the extraction is murder, the purification invites disaster, and the assay is unsatisfactory.” Despite the successes of initial production under OSRD auspices, the manifest utility of this wonder drug to the war effort, ahead of the invasion of occupied Europe, prompted the War Production Board in 1943 to take over direct responsibility for cranking up production. The board directed twenty-one companies to participate in its aggressive expansion of penicillin production, each of which received priority on construction materials and supplies. In time of war, government leaders did not trust the private sector to be up to the task: the supply of all penicillin that was produced was controlled by the WPB, which distributed it to the armed forces and the US Public Health Service. Production soared from 21 billion units in 1943 to 1.7 trillion units in 1944 (in time for the D-Day landings at Normandy), to some 6.8 trillion units at war’s end.

With the war’s conclusion in 1945, planning was rapidly abandoned, departments were shuttered and government plants were sold off to private industry. Paradoxically, however, US corporations ended the war stronger than they began it. Elephantine contracts from the government, price supports and relaxed anti-trust laws all worked to boost profits and grow corporations. The wartime planning regime needed to get business onboard, so throughout the war, while government bureaucrats made some of the top-level decisions, business still controlled production. The war ultimately enabled a capital-friendly version of planning: production was still mainly carried out by large firms belonging to even larger cartels, albeit with a significant dose of government rationing. At the same time, the scope of economic planning carried out inside corporations increased.

The combination of bigger government and bigger corporations that emerged from World War II led even those on the right to question whether capitalism would give way to some form of economy-wide planning. Hayek’s fellow traveler Joseph Schumpeter famously thought that the replacement of capitalism by some form of collectivist planning was unavoidable. A fervent anti-socialist, Schumpeter nevertheless saw how the capitalism of his time was aggregating production and creating ever-larger institutions—not just firms but also government agencies—that planned internally on ever-larger scales. He thought it was only a matter of time before bureaucratic planning overtook, through its sheer weight, the dynamism of the market. The rise of Keynesian economic management and the experience of wartime planning convinced Schumpeter that a transition to the socialism he despised was inevitable, if not imminent.

Instead, the onset of the Cold War after 1945 produced a fervent official anti-Communism, alongside a narrow, technocratic vision of economic management. The government saw good in increasing productivity, and even in coordination between businesses; but any move to extend democracy to the economy was bad. Elite concern about growing militancy, both among rank-and-file soldiers still in Europe and among workers in the United States, meant that even as official rhetoric extolled loudly the virtues of free market capitalism, in practice, the American welfare state expanded. As with Western Europe’s emerging welfare state, elites grudgingly accepted social reform as the lesser evil to the immediate threat of social revolution. Business compromised: government would play a larger role in the economy, supporting basic innovation and ensuring that the final products and services produced by business found markets, while at the same time professing unwavering support for the free market.

The central hotbed of publicly planned innovation was the postwar Pentagon, coordinating government agencies that would prove responsible for the initial development of computers, jet aircraft, nuclear energy, lasers and, more recently, much of biotechnology. Its approach built upon the method of partnership between government and science for basic and applied research that was pioneered by the Manhattan Project of the United States, the UK and Canada during the Second World War. With the Soviet launch of Sputnik in 1957, as Mariana Mazzucato argues, senior figures in Washington were petrified that they were falling behind technologically. Their immediate response was the creation, the following year, of the Defense Advanced Research Projects Agency (DARPA), an agency that—along with allied agencies that the Pentagon viewed as vital to national security (including the Atomic Energy Commission and NASA)—would support blue-sky research, some of which might not produce results for decades. DARPA oversaw the creation of computer science departments throughout the 1960s, and in the following decade, it covered the high costs of computer chip prototype manufacture at a lab at the University of Southern California.

Mazzucato lists twelve crucial technologies that make smartphones “smart”: (1) microprocessors; (2) memory chips; (3) solid state hard drives; (4) liquid crystal displays; (5) lithium-based batteries; (6) fast Fourier transform algorithms; (7) the internet; (8) HTTP and HTML protocols; (9) cellular networks; (10) Global Positioning Systems (GPS); (11) touchscreens; and (12) voice recognition. Every last one was supported by the public sector at key stages of development.

We see a similar phenomenon within the pharmaceutical sector, but this time with respect to the crucial role played by government labs and public universities in the development of radical new drugs, known as “new molecular entities” (NMEs)—particularly those given a “priority” (P) rating —as opposed to the cheap-to-develop and therefore more profitable “me too” drugs (existing treatments with the formulas tweaked slightly, which are favored by Big Pharma). Mazzucato quotes Marcia Angell, former editor of the New England Journal of Medicine, who argued in 2004 that while the large pharmaceutical companies blame high drug prices on exorbitant research and development costs, the reality is that it was government-funded labs that were responsible for some two-thirds of the NMEs discovered in the previous decade. One must go beyond the concession that private pharmaceutical companies have been unproductive and declare that in the war against disease, they have been absent without leave for decades.

It is all reminiscent of Karl Marx’s simultaneous admiration and condemnation of nineteenth-century capitalism. How furious he was that such an incredible system, more productive than feudalism or slavery or any other previous economic structure, could also be so inexorably restricted, so bounded, so lazy with respect to what it could produce. All these possible things (whether knowns, known unknowns, or Rumsfeldian unknown unknowns) that could so benefit humanity would never be manufactured so long as they were unprofitable, or even just insufficiently profitable! This was what Marx meant when he raged against the “fettering of production.” Human progress, the expansion of our freedom, has thus far been held back by this irrational system.

NATIONALIZATION IS NOT ENOUGH

On July 5, 1948, the National Health Service Act, establishing the world’s first universal, public and free healthcare system, came into effect in the UK. Despite the Labour government’s passage of the act two years previously, the formal creation of the NHS remained deeply uncertain and a source of fractious debate until the moment of its arrival. In a speech to Parliament on February 9, 1948, Aneurin Bevan, the Labour minister for health, exhorted his colleagues:

I think it is a sad reflection that this great act, to which every party has made its contribution, in which every section of the community is vitally interested, should have so stormy a birth … We ought to take pride in the fact that, despite our financial and economic anxieties, we are still able to do the most civilized thing in the world —put the welfare of the sick in front of every other consideration.

The story of the British NHS is, however, much more than a story about caring for the sick. It is a century-long saga of the struggle for some form of democratically controlled planning under capitalism—a major reason for the tempestuousness of its birth and the conflicts it continues to engender. Radical enough, but not revolutionary, the NHS signaled the potential for a slow erosion of the market in a major sphere of life.

The story of the NHS begins not in the halls of the British Parliament at Westminster, but in the mining villages and industrial towns born of the human sweat that powered the Industrial Revolution. Before the NHS, healthcare was largely a luxury. The wealthy hired personal doctors; the rest simply did without or depended on the modicum of relief provided by churches or the state. Local governments set up rudimentary hospitals for the poor, but they were at best insufficient, at worst more akin to prisons. They often kept the sick and the infirm separated from the rest of society, rather than cure them—sweeping the unemployed and unemployable under a squalid, fetid rug and calling it charity.

As a counter to this injustice, working-class organizations of all kinds began to experiment with mutual aid. Workers formed “friendly societies,” pooling together small monthly dues from individual workers to pay doctors and run occasional free clinics. As they grew, some societies could hire full-time doctors and even build their own clinics, offering care to entire families, rather than just (mostly male) workers. This people’s healthcare was most advanced in the coal-mining valleys of South Wales, where working-class culture thrived. By the early twentieth century, even little cottage hospitals were springing up alongside the black pits.

It was this spirit of mutual aid that allowed communities to survive economic downturns. Unemployed miners were put to work doing administrative tasks such as collecting fees—themselves reduced during such times—and doctors were also forced to take a pay cut in proportion to a society’s lower income. This simple solidarity kept services intact, even when money was short. Worker-run clinics in Wales and across the UK were among the first large-scale insurance schemes for healthcare, predating both national public insurance (as in Canada or France) and private insurance (as in the United States). The working class organized itself to deal collectively with a problem that affected every individual, but with which no individual could deal on their own. It was socialized medicine in embryo.

As workers became more organized, these mutual aid–financed clinics grew still further in scale and number. Membership was opened up to entire communities, beyond just miners and their families. In turn, and through unions, workers made demands, not only on bosses for better working conditions, but also on government for radical redistribution of resources, including the establishment of healthcare as a right. In essence, this would be a public healthcare system: the same phenomenon of mutual aid extended to all of society and, crucially, requiring those with greater means to pay a greater share of the finance. Pushed to act to contain such broader demands and the spread of socialist ideas, the UK government created, in 1911, a limited national insurance scheme. This first attempt at publicly funded healthcare, however, was far from comprehensive: even after two decades, National Insurance covered just 43 percent of the population, the majority of them working-age men.

Today, doctors can be some of the strongest defenders of public healthcare, helping us recognize, for example, that vaccinations will not deliver the crucial defense of herd immunity unless an entire community is vaccinated. But at the time, it was not only the wealthy, as one might presume, but also most doctors that opposed the establishment of public healthcare. The former did not want to shoulder new taxes to pay for universal services that would disproportionately benefit the poor and working majority; the latter feared that a national scheme would not only reduce their incomes but also challenge their managerial control over what medical care looked like.

Both fears were warranted. As they expanded, worker-run schemes did indeed start to challenge the absolute power of doctors over medical care. Worker societies did not so much target individual clinical decisions—rather, they increasingly wanted a say in planning, in how resources were allocated. Would new money go into building clinics or hiring nurses—or into savings accounts held by doctors? The most forward-thinking societies advocated for doctors to become salaried workers rather than contractors—people thus invested in the expansion of medical practice, rather than that of personal fortunes. As with any other sector, medicine has its own logistic specificities. Decisions have to be made about where clinics are located, how to divide tasks between nurses and doctors, which afflictions should be prioritized, and so on. To have a say over these things goes beyond simple redistribution of resources; rather, British workers were demanding that an entire sector of the economy be democratized.

Doctor Knows Best

The barriers to change were formidable. Medical care was (and often remains) largely paternalistic: doctor knows best, and patients are to do as they are told. Doctors, as members of the petty bourgeoisie or labor aristocracy, traditionally have a degree of social power that they do not wish to part with. They decide much more than which prescription to write; they have influence over where clinics are established, which medical technology to use, and what counts as a legitimate health need and what doesn’t. Of course, within the confines of the operating or examination room, doctors are legitimate experts. They have specialized skills and knowledge furnished by years of medical training. Contrary to the claims of modern-day charlatans, the advent of medical science unquestionably represented a qualitative leap beyond the magical thinking and credulity that preceded it. The medieval notion that illness is caused by an imbalance of the four humors cannot compete with the germ theory of disease. As the lyrics of “The Internationale,” the socialist hymn, famously command: “For reason in revolt now thunders, / And at last ends the age of cant! / Away with all your superstitions, / Servile masses arise, arise!”

Even so, doctors are not the only medical experts. Although nurses were key to the provision of care in the early twentieth-century United Kingdom, nursing was seen as less valuable because it was associated with femininity and low skill. Subjugated in society as women, nurses long played a subordinate role in hospitals and had little input into the shape of a system that would quickly stall without them. At a bare minimum, democratization would have to encompass all the workers involved in producing healthcare.

But health and disease stretch far beyond the four walls of a clinic or hospital, and beyond the medical knowledge of health practitioners; they are not a single, isolated compartment of our lives. For example, whether someone contracts lung disease may depend on pollution as much as it does on the responses of the health system, as epidemiologists will be the first to remind us. Chronic disease during old age depends on a whole life history, reaching back through quality of social integration as an adult to childhood nutrition and primary education. Work-related injuries are highly dependent on the kind of work we do and the kinds of safety protections we have—from rules against asbestos to unions’ willingness to fight for them. Health researchers today call these the “social determinants of health.” *And while investments to improve these social determinants would likely reduce the social costs of healthcare, that is, by preventing illnesses rather than treating them, the private owners of medicine in our current system would be loath to give up a portion of their market for the good of society — yet another example of a road that the “greedy algorithm” will never go down!

While medicine can be a narrow field of expertise, healthcare encompasses everything we do. It is not just an individual responsibility but is deeply impacted by what society looks like and the level of its collective decision making. What, for example, counts as a legitimate health concern, and what can be dismissed? Are you depressed because of who you are, or because you spend the majority of your time enriching people who would murder your children if it would net them another dime? Is it you, or is it capitalism?

These questions go to the heart of what democratic planning looks like in practice anywhere, not just with respect to healthcare. For if we want a more egalitarian system to apply the best of human technical knowledge more effectively, without having to sit through interminable meetings or cast an unending string of votes, then we’ll have to delegate some decision-making power—whether to experts, (elected) managers or representatives. At the same time, while healthcare should be delivered by experts, it should not be run exclusively by them. The question of whether people should be passive consumers of medicine or instead its active co-creators is a common theme throughout the history of public healthcare, wherever it has emerged. Veteran British physician Julian Tudor-Hart describes the seeds of transformation later developed by the NHS: “This embryonic new economy at the heart of the NHS depends on the growth of an element it always contained, which has only recently, and slowly, been recognized: the power and necessity of patients as co-producers … Once released from deference, public expectations become an irresistible force, providing initial elements of democratic accountability can be retained and rapidly extended.” This is a call for a new system—one based on mutual accountability, democratic control over resources and input in decision making from all affected—a struggle already taken up by British workers in the early twentieth century.

The Second World War transformed everything, not least the prospects for true public healthcare. As war engulfed Europe, the British government introduced planning across major sectors of the economy. There were limits on markets, including the market for healthcare. Profit, while ever present, was at least within certain bounds, temporarily made secondary to the goal of winning the war. In this darkest of times—this “midnight in the century,” as libertarian socialist Victor Serge described the period—the whisper of new, more democratically planned institutions was a signal of what was possible. The government-run Emergency Medical Service (EMS) demonstrated to ordinary people that medical provision could be allocated according to human need—even the skewed and limited set of needs dictated by war conditions—instead of for private gain.

The British working class emerged from the war emboldened. Planning had worked. Capitalists were forced to sacrifice profit to win the war, and the system didn’t collapse. The country needed rebuilding, and the war had also shown that with enough intervention into the economy, everyone who needed a job could get one. This sense of potential propelled the Labour Party to a landslide in elections held at war’s end. Labour’s program was reformist but sweeping: the institutions of a new, extensive welfare state would hem in the market. Although it would take until 1948 to be officially established, the new National Health Service was the postwar government’s greatest achievement. Healthcare was made free at point of service, paid out of taxation and universally available. Distinct from some other public healthcare systems, hospitals were not merely publicly funded but nationalized.

Doctors, led by the British Medical Association (BMA), protested ferociously at the coming public system, afraid of losing their privileges. They called Bevan a “medical Führer,” and the NHS “creeping Nazism.” They threatened to paralyze the new system. But with medicine still a lucrative profession and public opinion firmly against them, the doctors’ threats were mostly hollow.

The BMA did, however, win on one point. Labour had resurrected the old demand of the friendly societies—that doctors become salaried public servants, rather than independent small-business people that contracted with the state—but the BMA insisted doctors remain an independent power, formally beyond the remit of immediate democratic direction. In the face of the BMA’s dogged opposition, Bevan ultimately conceded that family doctors, unlike those in nationalized hospitals, would remain independent contractors—“stuffing their mouths with gold,” in his words. Within a few months of the NHS being established, the vast majority of doctors, however reluctantly, signed up. Public planning won out over private interests.

How the NHS Planned

The first task of the early NHS was turning an inadequate patchwork of clinics, hospitals and other services into a functioning, properly joined-up and universal public healthcare system. Early planning was rudimentary. In 1948, while the UK, like much of Europe, was still recovering from the bombardments, demolitions and ruination of the war, detailed statistics were effectively nonexistent. The world’s first truly universal (as opposed to specialist) computing device, the Small-Scale Experimental Machine (SSEM) at Manchester University, ran its first program on June 21 of that year. By the end of 1949, the world was home to still just four similar devices, and even these were in stubbornly tentative operation. Widespread computer use was still decades away. The Ministry of Health set budgets and priorities but planned little else. Annual budgets for hospitals were very simple: take the previous year’s numbers and increase them by however much the entire NHS budget was increasing. The NHS did grow, but this method of annual, proportional increases locked in and perpetuated inequalities that existed on the eve of its creation.
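A few lines of arithmetic show why (a minimal sketch of incremental budgeting; the regions and figures are invented, not actual NHS data):

```python
# "Last year plus a uniform uplift" budgeting: every region's budget grows,
# but the inherited inequality between regions never budges.

budgets = {"London": 100.0, "South Wales": 40.0}  # hypothetical 1948 allocations

for year in range(1949, 1954):
    uplift = 1.05  # whole-NHS budget growth, applied uniformly
    budgets = {region: b * uplift for region, b in budgets.items()}
    ratio = budgets["London"] / budgets["South Wales"]
    print(f"{year}: London/South Wales ratio = {ratio:.2f}")  # stays 2.50 forever
```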

Much like today, where less populated regions suffer a lack of high-speed internet because telecommunication companies cherry-pick the most profitable areas to service (and let the rest of a country rot, for all they care), great chunks of the country came into the era of the NHS without hospitals, or at best with hospitals in poor shape, a situation that would not be corrected for years. The first major planning initiative at any serious scale would not come till the 1960s. Its aim was precisely to tackle these inequalities by building more and better hospitals, especially in poorer areas. The 1962 Hospital Plan of the then–Conservative government was a grand promise, but it almost immediately ran into chronic underfunding—presaging much of the history of the NHS to come.

A decade later, however, under another Labour government, meaningful planning appeared to be on the horizon. In policy documents, the aspirational goal of the NHS was now “to balance needs and priorities rationally and to plan and provide the right combination of services for the benefit of the public.” In practice, three changes pointed to the potential for more thoroughgoing, democratic planning.

First, the NHS expanded the horizons of health. A reorganization in 1974 created “Area Health Authorities,” whose boundaries neatly coincided with those of local governments. AHAs were intended to better integrate healthcare into local planning of other kinds, whether this meant sewers, roads, community centers or schools. The potential was, in principle, enormous: healthcare could be more than just a reaction to illness and begin to have bearing on those broader social determinants of health.

The same 1974 reform changed how healthcare was managed. New local management teams integrated the three parts of the NHS that had been run independently since 1948: hospitals, family medical clinics, and community health centers for the elderly and those with severe mental health difficulties. For better or for worse, these teams made decisions by consensus (extending something that had been part of the NHS since its founding in the three-person consensus boards, consisting of a doctor, a manager and a nurse, that ran individual hospitals). Working alongside these consensus management teams were “Community Health Councils.” Local organizations representing seniors or the disabled were given the right to elect one-third of each CHC’s members. When created, CHCs had no direct decision-making authority, but they held genuine promise to democratically transform the NHS. With community representation, CHCs showed that it was possible to open the opaque NHS hierarchy to bottom-up voices of patients and citizens.

Finally, in 1976, the NHS committed to distributing resources in line with health needs, a potentially radical transformation. Taking into account regional differences in age and morbidity, the Resource Allocation Working Party (RAWP) greatly expanded upon past faltering attempts to correct inherited inequalities from the pre-NHS era. Regions with bigger needs (which were often poorer as well) would now receive bigger budgets. By identifying key areas of spending, the politicians and managers who ran the service could finally wean doctors from some of their vestigial power.
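In spirit (though not in detail: the actual RAWP weightings for age and morbidity were more elaborate, and the figures below are invented), the change amounted to something like this:

```python
# An illustrative needs-weighted allocation: regions receive shares of the
# national budget in proportion to population scaled by a need index, rather
# than in proportion to whatever they happened to receive last year.

regions = {  # region: (population_millions, need_index from age/morbidity data)
    "North": (3.0, 1.20),       # older, sicker population -> need_index > 1
    "South East": (5.0, 0.90),
}

national_budget = 800.0  # hypothetical, in millions

weighted = {r: pop * need for r, (pop, need) in regions.items()}
total = sum(weighted.values())

for region, w in weighted.items():
    allocation = national_budget * w / total
    # The North gets more per head than a raw population split would give it.
    print(f"{region}: {allocation:.0f}m")
```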

The reforms of the 1970s maintained a naive faith in top-down technocrats, reinforcing the paternalistic notion, shared by both Labour and Conservative politicians and in part animating the creation of the NHS itself, that expertise can overrule democracy. Many of these reforms simply created new layers of citizen-phobic, under-democratic bureaucracy. But these reforms also carried in them the seeds of a more radical remaking of the NHS. Rather than planning only how much healthcare there was, and where—the important questions that the 1960s planners had to tackle first—these reforms could also have laid the groundwork for planning that tackled how healthcare was produced and, most importantly, who participated in decision making.

However, instead of ratcheting up democracy within the system, most of the 1970s reforms failed in the face of brewing economic crisis. The oil shock of the early 1970s saw both prices and unemployment spike at the same time—something that economists of all mainstream stripes had said was no longer supposed to happen. The regime of boom and bust was supposed to have been solved by Keynesianism, delivered by the postwar compromise between capital and labor. In response to the new crisis, throughout the 1970s and early 1980s, elites in the UK (as in the United States and much of the West) launched an assault on the postwar economic settlement that had guaranteed higher wages and expansive public services for workers in exchange for high growth rates and high profits for business. With profits threatened, higher wages and expanding public services came under attack from the right in the UK and across the global North. UK unions launched one last major strike wave, which reached its height in 1979. It wasn’t enough. Worker expectations for more and for better were firmly in the crosshairs when the most right-wing Conservative government since the war, led by Margaret Thatcher, came to power that same year. The tide had turned against the welfare state; capital had decided it was time to break the postwar compact with labor.

The reforms of the 1970s fell, one by one, to the Right’s vision for healthcare. Norman Fowler, Thatcher’s secretary of state for health, scrapped the area health authorities in 1982, before they even had a chance to integrate with local governments. A year later, Fowler eliminated management by consensus and reinstated individual responsibility for managers, calling the policy “general management.” Community health councils outlasted AHAs by two decades—scrapped in England only in 2003—but even as they were allowed to limp on, they remained, more than anything, a vague protest body. The RAWP formulas stayed, but the principles behind them were soon transformed, by New Labour this time rather than by Conservatives. Under Labour Prime Minister Tony Blair, metrics that had been aids in planning slowly transmogrified into performance targets for managers. Over the course of the 1980s, a business ethos crept into the NHS. It didn’t come out of nowhere: the right’s once-marginal ideologues had long blamed all NHS shortcomings on misspent budgets and a lack of “choice” by patients. While the problems of poor services and long wait times were real, fears about “out-of-control” budgets were largely manufactured. The NHS had been massively underfunded. Spending on health as a percentage of GDP had started out at a measly 3 percent of GDP in 1948, growing only to around 6 percent by the 1980s. At the time, France was spending about 9 percent of GDP on healthcare, and Germany 8 percent; thus, the NHS was and remains a relative bargain.

Even in 2014, the UK spent just over 9 percent of GDP on healthcare, still below the average for countries in the global North. By comparison, the market-based system in the United States consumes nearly double that figure, 17 percent of GDP, while still denying care to millions—a paragon of economic inefficiency. The right’s counterargument—that any budget, no matter how big, would never be enough—falls flat. Health budgets have remained relatively stable, except in the one country in the Organisation for Economic Co-operation and Development (OECD) that maintains a mostly private system.

But even 6 percent of GDP is still a big slice of the economy that holds relatively little opportunity for profit. Right-wing hand-wringing about cost control provided cover to the healthcare corporations that would gain, even if only part of the NHS were sold off. The barrier to overt privatization was that the NHS regularly topped polls of the most trusted institutions among the British electorate. Famously, even neoliberal revolutionary Margaret Thatcher had to promise that “the NHS is safe in our hands” in a speech to her own Conservative Party convention in 1983. But by 1988, when Thatcher announced a major review of the NHS, nearly a decade of hard-right rule and a much longer ideological battle against the welfare state had left these words increasingly hollow.

Three years later, Thatcher’s successor as prime minister, John Major, introduced the biggest reform in the history of the NHS: the “internal market.” Although the Conservatives couldn’t put the NHS onto the market, they found a way to put the market into the NHS, with an end result that was neither fish nor fowl. The big change was termed the “purchaser-provider split.” Before this reform, a doctor would refer a patient to a local hospital or clinic for any further service, such as a blood test, hip replacement or liver transplant. The NHS paid the doctor and funded the hospital, so no money explicitly changed hands between the two. Under the internal market, akin to the Sears debacle described earlier in the book, hospitals and community care clinics “sell” services. They are the providers. Doctors, local health authorities or other NHS agencies are purchasers who in turn “buy” these services in the name of their patients.

Over the course of the 1990s, a Labour-Conservative consensus around the allocative efficiency of markets and competition replaced the postwar consensus around planning and public service. Margaret Thatcher reportedly called Tony Blair—elected in 1997 as the first Labour prime minister since the 1970s—her greatest achievement. Nominally center-left, his business-friendly, pro-market New Labour government worked to expand the Conservatives’ market reform (although at this point only within the NHS in England, as Scotland, Wales and Northern Ireland were given more autonomy and largely turned away from market reform). In the English NHS, purchasers, now called “commissioners,” became fully independent of the NHS hierarchy, thus attenuating voter accountability. Alongside more markets, New Labour also created new institutions, such as Monitor and the Care Quality Commission, to act as market regulators. In almost every case, such independent “expert” bodies were formally public bureaucracies rather than market actors, not unlike independent central banks or the European Commission; nevertheless, they represented ever more impaired accountability to voters, despite their location within the state. The state now oversaw a fragmented system rather than planning a more unified one.

With the door to wholesale market transformation cracked, David Cameron’s post-2010 coalition of Conservatives and Liberal Democrats pushed it wide open. Their 2012 Health and Social Care Act extended access to explicitly for-profit providers and introduced competition over commissioning contracts themselves—a contract for who gets to sign other contracts. By this time, even the British Medical Association—the same doctors’ organization that had initially fought to maintain space for private business and professional privileges—was standing up to reforms that would be a gateway for healthcare corporations first to cherry-pick, then to take over, large sectors of the NHS. In the years immediately following this overhaul, over 10 percent of total NHS spending already went to for-profit providers.

Against the Market

The story of the NHS since the 1990s is not just one of a conflict between planning and markets; it is also a reminder that markets need to be made and sustained, a point well understood by the neoliberals who set out to do just this. Markets are human creations, not natural and inevitable; indeed, Adam Smith’s prehistory of plucky Neolithic humans getting by through “truck and barter” is as inaccurate as the creationist Eden where humans rode dinosaurs. The NHS internal market is a perfect example of such conscious effort going into the creation of something that is ultimately antidemocratic—where the strength of your voice is the size of your wallet—not to mention anarchic and often irrational. Three decades on, the central NHS is increasingly a rudderless vehicle for handing out money, as system-wide planning has eroded away. Competition was supposed to make the NHS more efficient, increase the quality of services and give patients a voice. On all counts, however, it has done little; instead it has undermined the basic values of the NHS—that healthcare be universal, accessible and free.

Market reforms introduced plenty of new costs. Ostensibly about slimming down government bureaucracy, the dense jungle of contracts between providers and purchasers in fact required armies of new bureaucrats. Even by 1994, three years into the internal market, the NHS had hired 10,000 new managers. While administration costs made up just 5 percent of the total NHS budget in the 1980s, by 2005 they had nearly tripled, to 14 percent of the total. On these simple measures, planning was several times as efficient as the market. A 2014 report from the UK’s Center for Health and the Public Interest put the cost of just running the internal market itself at an estimated £4.5 billion per year—enough to pay for dozens of new hospitals. A public, universal health service can amalgamate costs. In this system, hospitals do not have to charge for individual procedures (or their components, like anesthesia); instead, costs are absorbed into a common budget from which surgeons are hired and supplies purchased. Resource control can still occur without the mediation of internal pricing: for example, through simple service prioritization. The complexity of modern medicine (and the increase in preventative care) means that isolating costs is not only difficult, but largely arbitrary. But despite it being hard to draw a straight line between small packets of health spending and health outcomes, the internal market requires that services be divided into such “products” to be priced. Administering the resulting network of contracts is not only inefficient; it cuts against the socializing tendencies of public healthcare.

Markets in healthcare are not only costly, but also far from the simple models described in economics textbooks. What economists call “costs of entry” are very high: building new hospitals is an option available only to the state or to the few large healthcare corporations. And without the state, these corporations end up dominating the market, leading to scant competition but widespread waste and duplication. Consultants and marketers, for example, have flourished under the NHS internal market. Socialists have long pointed out that marketing is a major waste of resources and human energy under capitalism, but it’s especially jarring in healthcare: resources that could go toward saving lives or curing diseases end up wasted on enticing doctors to pick one clinic over another for a referral.

Have all of these additional costs created new benefits? At best, it’s hard to tell. Every patient comes into treatment with their own personal history, including all the social determinants of health, making comparison very difficult. On an aggregate level, recall that as England moved further along the market path, Scotland decided in the late 1990s to return to a more public NHS. Since then, the Scottish NHS has improved more rapidly on important indicators, such as wait time for a hospital bed or an ambulance. On other measures, like life expectancy, the gap between relatively poorer Scotland and its southern cousin has remained steady.

Difficulties in gauging quality haven’t stopped market boosters from pretending it’s simple. As part of its reforms, New Labour even created a three-star rating system—like Uber driver reviews but for hospitals. This went about as badly as you’d expect. For example, under the star system, cardiac surgeons in London hospitals were less willing to perform high-risk but life-saving operations because they could damage their hospital’s rating. The free market fanatics, who complained that perverse incentives let quality languish under planning, thus created perversions all their own.

So, if competition cannot claim to be more efficient or to deliver higher quality, can it at least give patients that elusive “voice”? In fact, it turns out that having a choice in one’s medical provider is a fairly low priority. In a recent UK survey, 63 percent of people ranked fairness as their most important value in healthcare. Choice in services, however, was last. What’s more, where conditions become more life threatening and treatments more technologically advanced, people demonstrate even less desire to give input into medical decisions. And surveys have also found that people would rather have a greater say over the kind of treatment they receive than over who delivers it. People clearly desire a voice in healthcare decision making, but realizing this requires different and deeper democratization than that provided by the shallow market version. Involving patients by treating them as if they were consumers choosing shampoo at the drugstore is very different to giving patients more informed autonomy over their own health.

The Planned, Democratic Alternative

Today, after nearly three decades of market reform, each year the NHS manages healthcare less, while managing competition more. It plans by proxy. Less room for strategic planning means decisions are made by smaller, independent units that are enmeshed in growing webs of contracts. Of course, before the 1990s, the NHS still planned too little, and planning was not democratic enough. And it was also chronically underfunded. The slow extension of the internal market to healthcare—Margaret Thatcher’s dictum that “there is no alternative” in action—was one way out of the impasse at the end of the 1970s: an antidemocratic one, and inefficient for the system, but lucrative for private providers.

But there were, and are, alternatives. Deeper, democratic planning would unite healthcare workers with patients, and entire communities, as active co-producers of health and collective owners of a healthcare service. Even as undermined and dismantled as the NHS is, we can, perhaps, still catch a glimpse of a better, possible future, like fish trying to make out the distorted world beyond the surface of the water. A public, universal system—free at the point of service and paid out of taxes (as was Bevan’s goal), but embodying humanist, bottom-up democracy, rather than paternalistic, technocratic state charity—is also one that builds its own constituency and creates a different kind of people—more willing to cooperate and to see their own destinies cooperatively tied up with those of others.

*The example of the NHS demonstrates that nationalization within bourgeois society will never work for the working class. Public sector workers, it is true, do not produce surplus value; their labor is consumed unproductively (from the perspective of the capitalists). But they still remain wage laborers producing goods and services for a market. Nationalization is not the same as decommodification, and so long as the bourgeoisie maintain their class dictatorship, so long as it is they alone who, in the final analysis, hold real ownership over the means of production, a nationalized industry will always remain a capitalist enterprise. The supposed “truce” between capital and labor was always destined to end one way or another; next time we shall have to ensure that it ends in our interests.

TOWARDS COMPUTERIZED PLANNING

The term “input-output analysis,” marking out one of the most important branches of economics, was conceived during the Second World War to describe the work of Russian-born Harvard economist Wassily Leontief and the Bureau of Labor Statistics, work for which Leontief would later earn one of those (sort of) Nobel prizes for economics. An input-output table offers a simplified representation of the flows of inputs and outputs among industries, and ultimately consumers. It is, in effect, a spreadsheet: each horizontal row represents how a particular industry’s output is used as an input by another industry and consumers, while each vertical column represents all the inputs used by any one industry. The table demonstrates quantitatively the dependence of each industry on all other industries. An increase in Lego output requires an increase in input of plastic, and hence an increase in plastic production.

Such tables are used by companies, and departments within companies, to plan production to meet output targets, and to analyze what the effects on outputs would be with changes to various inputs (and vice versa). The table allows the calculation of the quantity of a particular commodity ‘A’ that is required to produce one unit of commodity ‘B.’ Leontief described his work this way: “When you make bread, you need eggs, flour, and milk. And if you want more bread, you must use more eggs. There are cooking recipes for all the industries in the economy.”

Although he published the first input-output table in a 1936 paper, Leontief himself said that more rudimentary versions of such tables had been produced in the nineteenth century by economist Léon Walras, or even in the eighteenth century by François Quesnay (his Tableau économique)— and, indeed, by Marx. One of Leontief’s major breakthroughs was to convert Walras’s equations into linear algebra. This advance is what drove uptake of input-output analysis after the Second World War in the United States and, subsequently, internationally.
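To make the linear-algebra formulation concrete, here is a minimal sketch in Python, using a toy three-sector economy with invented coefficients rather than data from any real table: the technical-coefficient matrix A records how much of each input is needed per unit of each output, and the gross outputs x required to deliver a final demand d satisfy x = Ax + d, hence x = (I - A)^-1 d, the so-called Leontief inverse.

```python
import numpy as np

# Toy technical coefficients (invented): A[i, j] is how much of sector i's
# output is consumed to make one unit of sector j's output.
# Sectors: 0 = plastic, 1 = energy, 2 = Lego sets.
A = np.array([
    [0.10, 0.00, 0.40],   # plastic needed by plastic, energy, Lego production
    [0.30, 0.05, 0.20],   # energy needed by plastic, energy, Lego production
    [0.00, 0.00, 0.00],   # Lego sets are a final good; no industry consumes them
])

d = np.array([10.0, 25.0, 100.0])   # final consumer demand per sector

# Gross output must cover intermediate use plus final demand:
#   x = A @ x + d   =>   x = (I - A)^-1 @ d   (the Leontief inverse)
x = np.linalg.solve(np.eye(3) - A, d)
print(x)   # raising Lego demand raises plastic and energy output too
```

Reading the result back against the table: an increase in the demand for Lego propagates down the columns, pulling up the required gross output of plastic and energy exactly as the prose example describes.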

Akin to the sometimes-silly battle between Newton and Leibniz over who had been the one to invent calculus, throughout the Cold War and after, a great deal of effort was expended on assessing the origins of input-output analysis to decide whether it was an American or Soviet innovation (and even, within the USSR, whether it was a Bolshevik or a Menshevik innovation!). What is interesting though—and recent post–Cold War scholarship suggests this is undeniable—is that the early efforts of the Soviet Union to “grope in the dark,” to use Mises’s term, left an impression on a younger Leontief.

In 1925, some twenty Soviet economists under the direction of P. I. Popov developed a fairly crude national economic accounting balance—focusing on six main branches of the economy and a number of subsectors—akin to how bookkeepers prepare a balance sheet. The innovation here is the mental leap of viewing the national economy as a sort of giant, single firm. That same year, Leontief published a review of the work on national balance sheets by Popov and his colleagues. Even earlier, economist Alexander Bogdanov had proposed an iterative procedure to steadily ratchet upward the granularity of national economic tables, and Nikolai Bukharin drew on Bogdanov’s work to devise a mathematical formalization of Marx’s economic tables for expanded reproduction, which in turn laid the groundwork for Popov and his team.

During the Stalin years, little developed at Gosplan beyond these national material balances. As contemporary economic historian Amanar Akhabbar argues, most of the economists of the 1920s would go unpublished or untaught in universities until their revival during the Khrushchev Thaw. At the end of the ’50s, input-output analysis, which appeared to economists amid the Thaw as a rigorous, statistical technique with considerably more precise forecasts than the crude economic sketches they had up till then been depending upon, was “imported” from corporate America back into Russia by Soviet economist and mathematician Vasily Nemchinov. Nemchinov, stressing, and perhaps exaggerating, its Soviet origins, is credited with the introduction of mathematical methods to central planning and with establishing, in 1958, the first group in the country to study mathematical economics, which would later become the Central Economic Mathematical Institute.

Conversely, throughout his career in the United States, Leontief would insist that his work did not really rely on Soviet economics. After World War II, Leontief quickly lost governmental and US Army financial support following accusations that federal funds were being used to develop “Communist technology,” and again during the height of McCarthyism for the same reasons. It is an irony that it was only as a result of interest from private companies—notably Westinghouse Electric Corporation, who saw utility in his technique—that he was able to continue his research.

This Cold War performance of dressing up American economics in Soviet drag, and vice versa, even to the point of it taking a vast and venerable American conglomerate to rescue a Soviet economic technique, entirely out of self-interest, is a trope that we will see repeated over and over.

The initial development of linear programming, a branch of mathematics today available to an undergraduate in any discipline with a couple years’ worth of math, was heavily influenced by input-output analysis. Simply put, linear programming explores methods to find the best outcome given a series of constraints. It would go on to be adopted widely within microeconomics and within corporations in the West to plan production, transportation, technology and indeed any tasks that involve multiple variables and that aim at maximization of profits while minimizing costs and resources.

Firms routinely use linear programming tools to solve complex decision problems involved in supply chain logistics, production scheduling, transportation, or any form of resource allocation. Linear programming was first developed in the Soviet Union by Leonid Kantorovich, who published it in a 1939 booklet, Mathematical Methods of Organizing and Planning Production; the discovery followed a request from a plywood factory that wanted to optimize production. The technique, by taking data from input-output matrices, offered a way to solve a whole class of similar conundrums.
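Here is a minimal sketch in the spirit of Kantorovich’s plywood problem, using Python’s scipy.optimize.linprog; the machines, rates and limits are invented for illustration, not drawn from his booklet. The task: choose how many hours to run each of two peeling machines so as to maximize plywood output, subject to a shared log supply and per-machine time limits.

```python
from scipy.optimize import linprog

# Decision variables: hours to run peeling machine 1 and machine 2.
# Machine 1 peels 3 sheets/hour, machine 2 peels 2 sheets/hour.
# linprog minimizes, so we negate the objective in order to maximize output.
c = [-3.0, -2.0]

# Shared log supply (invented): machine 1 consumes logs twice as fast.
A_ub = [[2.0, 1.0]]
b_ub = [100.0]

# Per-machine time limits: 0 <= x1 <= 40 hours, 0 <= x2 <= 60 hours.
bounds = [(0, 40), (0, 60)]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
print(res.x, -res.fun)   # optimal hours per machine, total sheets produced
```

The solver reports the best split of work between the two machines (here, 20 and 60 hours, for 180 sheets), which is exactly the kind of answer the plywood trust was asking for.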

Kantorovich’s technique was first applied during the Second World War to solve military supply problems, but it was subsequently forgotten about, or rather repressed. The main problem, among many, was that Kantorovich counterposed “mathematical economics” to conventional Soviet “political economy.” Opponents sniffed something un-Marxist. In a 2007 mathematical-biographical sketch of Kantorovich, his student A. M. Vershik talks of an “internal veto”—a self-censorship not only of economic matters, but even of the mathematical aspect upon which they were built—that lasted until 1956.

Largely independently of Kantorovich, Dutch American mathematician and economist Tjalling Koopmans devised a similar method for the analysis of the optimum allocation of resources. The pair of them would be awarded another economics Nobel in 1975 for their joint discovery. A third individual, US mathematician George Dantzig, again independently of the other two but slightly later, just after the war, developed a formulation of linear programming to solve planning problems for the US Air Force. In 1947, he devised the “simplex method,” or simplex algorithm, within linear programming. It would quickly be adopted by industries for their internal planning, and it remains in use today; New Scientist magazine recently called this American twist on the question of Soviet optimization “the algorithm that rules the world.”

Mirroring the American arch-capitalists who saved the work of Leontief, in the Soviet Union it was military specialists who were the first to delve into linear programming, as they were the only ones with access to foreign texts on the subject, translated into Russian though not yet published domestically. Their interest was not the broader question of economic planning, but systems control, itself a subset of the topic of distribution of resources, which is in the end of course the alpha and omega of economics. Not a single colonel or general had heard of Kantorovich. Vershik recalls visiting a Ministry of Defense research institute in Moscow in 1957 and telling them about his mentor Kantorovich’s work. “For them, who had just started to study the American literature on linear programming, this was a revelation.” At this time, a broader rehabilitation of cybernetics was occurring, and the urgency of introducing computers into the army had increased. Kantorovich was invited to give a public lecture on his pet subject. The military specialists, who up until this point had only been using American sources obtained through secret channels, were thrilled to find that it was one of their own who had been a pioneer in this field. Kantorovich wrote:

I discovered that a whole range of problems of the most diverse character relating to the scientific organisation of production (questions on the optimum distribution of the work of machines and mechanisms, the minimisation of scrap, the best utilisation of raw materials and local materials, fuel, transportation, and so on) lead to the formulation of a single group of mathematical problems (extremal problems)…But the process of solving them with which one is faced is practically completely unusable, since it requires the solution of tens of thousands or even millions of systems of equations for completion.

Kantorovich’s idea was for the planners to assess optimal pricing, a scheme in which objectively determined valuations or “shadow prices”—a notional number assigned to items in place of a price—would be calculated from opportunity costs without the need for the “total information awareness” that the likes of Mises and Hayek said would be demanded for planning to work. Contra Mises and like Lange, Kantorovich demonstrated that rational economic calculation outside of market mechanisms was, in principle, possible.
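Kantorovich’s objectively determined valuations correspond to what modern solvers call dual values. Continuing the hypothetical plywood sketch above (the ineqlin attribute belongs to scipy’s HiGHS backend in recent versions; treat the exact name as an assumption of this sketch), the shadow price of the log supply falls out of the same computation, with no market required:

```python
# Continuing the plywood sketch: the dual value ("marginal") attached to the
# log-supply constraint is its shadow price: how many extra sheets one more
# unit of logs would yield at the optimum.
shadow_price = -res.ineqlin.marginals[0]   # sign flipped: we minimized -output
print(shadow_price)                        # here, 1.5 sheets per extra log unit
```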

Remember that economic planning can be useful to both capitalist firms and socialist economies. The difference lies in their objective function (the goal) and how it is determined. In the capitalist firm, the technique is put in the service of maximizing profit for the gain of the owners, and indeed, most linear programming textbooks and software manuals assume profit as the objective. In the socialist society, the objective function may still be an increase in wealth, but that of the society as a whole; that is, mathematically akin to profit-maximization, but socially determined. The steady expansion of leisure time might be another objective function, as might the maximization of ecosystem services and minimization of their disruption. In this way, we see that while the replacement of market allocation with economic planning may be a necessary condition for the realization of socialism, it is not a sufficient one: it must be married to democracy.
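In code, the point is almost trivial: the production constraints stay as they are and only the objective changes. A sketch, again with invented numbers: minimize total machine-hours (a crude stand-in for labor time) while still meeting a mandated output floor of 150 sheets.

```python
from scipy.optimize import linprog

# Same production possibilities as the plywood sketch, new objective:
# minimize total machine-hours while delivering at least 150 sheets.
c_social = [1.0, 1.0]                        # minimize x1 + x2, hours worked
A_ub = [[2.0, 1.0],                          # log supply, as before
        [-3.0, -2.0]]                        # output floor: 3*x1 + 2*x2 >= 150
b_ub = [100.0, -150.0]
res = linprog(c_social, A_ub=A_ub, b_ub=b_ub,
              bounds=[(0, 40), (0, 60)], method="highs")
print(res.x, res.fun)                        # hours per machine, total hours
```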

The Yugoslav Centrifuge

There is a simple squaring of the circle here, partisans of the notion of “market socialism” maintain. Capitalism uses the market to allocate resources; therefore, there can be no capitalism without the market. But there can be a market without capitalism.

Under “market socialism,” there is no private ownership of industry, but allocation of goods and services still occurs via the market. Workers own their own enterprises, in the form of cooperatives, which in competition with each other sell their wares, and survive, expand or fail depending on the demand for them. Due to the vagaries of the market, as in social democracy, some key sectors, such as healthcare, may still be held by the public sector, but it remains a market society. Such a system benefits from the alleged efficient allocation of the market, avoiding bureaucratic sclerosis while eliminating the owning class, the bourgeoisie (editor’s note: never mind that workers who own a stake in a co-op are, themselves, petty bourgeois!). Further, there are no bosses, and the workplace is democratically managed.

But partisans of market socialism have to set aside the reality that the goods and services produced in markets, even “socialist markets,” will still only be those that can turn a profit. And, as we have discussed, the set of things that are beneficial overlaps only in part with the set of things that are profitable. New classes of antibiotic, rural high-speed internet, and crewed spaceflight would all be as difficult to deliver under a socialist market as under a capitalist one, without significant, planned intervention into the market. Meanwhile, items that are profitable but actively harmful, such as fossil fuels, would still likely be produced.

The anarchy of the market also inevitably suffers from duplication and overproduction, and their concomitant manufacture of economic crisis. Just as capitalist markets run on profit, under “market socialism,” use of the price signal would also generate excess revenues for the more efficient firms (even if transformed into worker cooperatives) and losses for the unlucky ones. Market “socialists,” then, have to explain how this system would redistribute “profits” equitably among the population. More importantly, how would their solution ensure that the profit motive—one that squeezes more work out of workers and creates incentives to overproduce—does not reemerge? Scaled up, the market and the profit motive create economy-wide cycles of boom and bust that hurt people and waste resources. By their very nature, markets produce inequalities—inequalities that, so long as a market exists, are only ameliorable, not eradicable. And it has consistently been inequality that has driven extra-economic conflict throughout history.

This is no abstract discussion. After World War II, Yugoslavia under Marshal Tito embraced a variation of market socialism. The Stalin/Tito split of 1948 sent leaders of the young multinational republic off to seek an alternative path to the bureaucratic Soviet model for the construction of socialism, leading them to experiment with what they called “workers’ self-management,” or radničko samoupravljanje. Under this system, while factories remained formally under state ownership, the workers directed production (again, admittedly not with full control) at their workplace, the commodities produced were sold on the market, and then the workers at a particular enterprise kept the surplus revenue themselves.

As the role of market forces steadily expanded under Tito, particularly with the abolition of central determination of wages and the advent of personal income’s dependence upon the success or failure of a particular enterprise, competition between enterprises increased, and inequality grew between workers, skill categories, workplaces, sectors and, most ominously, regions. Inevitably, some factories will be superior to others at producing commodities, or have the luck of being located in a more developed region, with higher education levels, better transport infrastructure or any number of advantages. The state tried to balance this out through redistribution: regionally preferential policies such as the taxation of more profitable enterprises to fund the industrialization of less developed regions or to support agricultural areas. But this in turn provoked regionally based contestation of policies and investment decisions. University of Glasgow economic historian Vladimir Unkovski-Korica has argued that particular workplaces tended to identify less with the polity as a whole than with the interests of their enterprise management or their regional government. The first labor strike in the young country occurred as early as 1958, in an older mine in the wealthy republic of Slovenia, driven by resentment at the channeling of what workers viewed as their wealth into the amelioration of regional inequality. But this was not merely better-off workers getting the hump about high taxes; any effort to balance out inequality, necessarily a centralized endeavor, risked being seen as a return to Serb hegemony.

As if it were not enough to be caught between the twin dangers of an egalitarian centralism viewed as Serbian chauvinism, on the one hand, and a revival of regional nationalism, on the other, Yugoslavia also faced the challenge of a rising balance of trade deficit, and as much as a third of inward investment being dependent on foreign aid. Worse still, while initially this aid had come in the form of grants, by the ’60s, these grants had turned into loans. The government responded with a greater orientation toward exports, which in turn benefitted some factories and regions more than others. The strategy of integrated development of the whole country was ultimately abandoned in 1963 via the dissolution of the Federal Investment Fund under regionalist pressure, its funds distributed to local banks, which only accelerated the centrifugation of Yugoslavia while undermining economies of scale and a rational, regionally appropriate division of labor. The market logic of enterprise competing against enterprise predictably drove the reestablishment of workplace hierarchies, as well as ever-greater emphasis on financial shenanigans and marketing skills at the expense of production—the latter largely historically viewed by socialists as a wasteful carbuncle that squanders otherwise useful resources, the quintessence of capitalist irrationality. Wasteful investment and unsustainable loans proliferated as underperforming enterprises attempted to improve their position in the market. To service these onerous debts, the reestablished managerial hierarchy, aided by a withered self-management apparatus, did what any normal capitalist manager does: squeeze wages and conditions. Unemployment made its return to the land. And all this before the global economic crises and oil shocks of the 1970s.

*Does this mean there is no room for “market socialism” or cooperatives in any conception of a socialist society? Broadly speaking, yes — though it will depend on the particular material conditions of a country undergoing socialization. Both the USSR and China, for example, had to industrialize a backwards, semi-feudal economy before complete socialization was feasible. As the agricultural sector was primarily occupied by peasants, and not proletarians, agricultural coops, AKA communes or collective farms (“kolkhoz” in Russian), were maintained as a transitional, subordinate form to the nationalized economy. And as Stalin explained in Economic Problems of Socialism in the USSR:

In order to ensure an economic bond between town and country, between industry and agriculture, commodity production (exchange through purchase and sale) should be preserved for a certain period, it being the form of economic tie with the town which is alone acceptable to the peasants.

In this limited, special circumstance, we might consider this “market socialism” to the extent that a portion of the economy produces commodities for exchange in the market. But this remains a subordinate economic form within the socialized economy, which must eventually transition from collective ownership to ownership by the whole society.

Planning in Practice (Again)

The practical algorithm Kantorovich offered, in an appendix to his 1960 work on the subject, could be carried out with paper and pencil, but it was only tractable for problems of limited scale. When it came to solving more complex problems, Kantorovich recommended an approximative technique of aggregating similar production processes and treating them as a single process. At this time, in the USSR as in the United States, such exercises were largely performed by human “computers.” While Kantorovich’s ideas were met with varying levels of enthusiasm, computing power at the time was too limited to employ the technique for detailed economy-wide planning, and it was instead used for drawing up plans for particular enterprises, or at most, sectors.

After the fall of the Soviet Union, the debate naturally became something of an academic discussion, rather than a live controversy, and certainly a discourse that was lost to those engaged in day-to-day social struggle.

But in the 1990s, two “progressive” computer scientists, Paul Cokkkshott at the University of Glasgow and his kkkollaborator, ekkkonomist Allin Cottrell at Wake Forest University, began to argue in a series of academic papers that improved algorithmic techniques had once again made the question worth exploring (editor’s note: Cokkkshott’s understanding of Marxism is as extensive as his tolerance for transsexuals. That is to say — nonexistent. Deficiencies notwithstanding, his contributions to computerized planning, unfortunately, remain relevant). In their 1993 book, Towards a New Socialism, a text that in places reads less like a left-wing polemic than a university programming textbook, Cokkkshott and Cottrell argue against the idea that planning is destined to fail, employing new knowledge from the world of computer science: “Modern developments in information technology open up the possibility of a planning system that could outperform the market in terms of efficiency (in meeting human needs) as well as equity.”

Computers are better than markets—so went the argument. All the worries of Mises and Pareto—that while in theory, socialist economic calculation is no different from market calculation, it remains impractical—were being made moot by technological change. However, they contend, while the project is made easier by some level of technical sophistication, it is not so much the availability of superfast central computers that has been the major constraint. A distributed planning network of quite modest personal computers, linked by an economy-wide telecommunications system and employing a standardized system of product identification and computer databases, would be sufficient. It would, however, require universal access to computers and the free flow of information.

According to Cokkkshott, if you take a large economy and use standard input-output techniques, you can represent it as a huge matrix, with a column for every industry and rows recording how much of each other industry’s output it consumes. So in, say, the steel industry’s column, the bottom will say how much steel is produced, while the rows will indicate how much coal, how much iron ore, or how much limestone it uses.

Now, in principle, the number of steps in this matrix calculation to reach a certain mixture of final output will grow as the cube of the size of your matrix; so if you have a matrix covering, say, 10 million products, it will appear that to come up with an answer, the number of steps required will be 10 million to the power of three. But this is only if you choose to write it out as a full matrix—because if you did that, you’d find almost all the entries in the matrix would be zero, since you don’t use, say, limestone in the making of a book. Most things aren’t used in making most other things. Therefore, most products require only a small number of inputs.

“The conception that everything affects everything,” says Cokkkshott, “is not true. You can disaggregate many aspects of the economy.” Through experimentation, Cokkkshott and his colleagues suggest that this disaggregation allows the number of steps to grow logarithmically rather than exponentially, enormously simplifying the complexity of the problem. In essence this means that at first there is a rapid increase in the number of steps, followed by a period where the growth slows. But the growth nonetheless keeps going, as opposed to a case where the number of steps begins slowly and then increases very rapidly as you go on.

Cokkkshott explains: “You say: ‘I only want to get an answer to three significant figures, because how many businesses really can plan their output to more than this?’ Because you don’t want an exact solution, but an approximation to a certain number of significant figures.” This rougher requirement for the calculation also limits the number of iteration steps you have to run on the algorithm. “So when you actually look at it in terms of a practical problem in terms of how the data is really structured, what the real world demands, you find you’re dealing with something very much simpler than the abstract algebra would suggest.” This is something that is now relatively well known in computer science. *A similar kind of “disaggregation” is employed, for example, by game engines performing collision detection. A naive implementation of collision detection would have you check the distance from each game object to every other game object—causing the run-time to increase quadratically (very bad!) with the number of game objects. But we only need to check whether two objects are touching if they are, roughly, nearby. Hence, one of the ways some game engines optimize collision detection is by breaking up levels into “cells,” and then only checking for collisions between objects within each “cell,” placing a practical cap on the number of necessary comparisons.
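A toy version of that cell trick in Python (the cell size and object format are invented, and real engines handle objects that straddle cell borders more carefully):

```python
from collections import defaultdict
from itertools import combinations

CELL = 64  # cell width in world units; assume no object is larger than a cell

def candidate_pairs(objects):
    """Broad-phase collision detection: bucket objects by grid cell and only
    test pairs sharing a cell, instead of comparing all O(n^2) pairs."""
    grid = defaultdict(list)
    for name, (x, y) in objects.items():
        grid[(int(x) // CELL, int(y) // CELL)].append(name)
    pairs = []
    for members in grid.values():
        pairs.extend(combinations(members, 2))   # only nearby objects compared
    return pairs

print(candidate_pairs({"crate": (10, 10),        # shares a cell with "barrel"
                       "barrel": (30, 50),
                       "far_rock": (500, 500)}))  # far away: never compared
```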

Cokkkshott has pushed the debate from the realm of theory to experimentation. It’s very difficult to do practical research in planning for obvious reasons, but after testing his ideas with a modestly advanced departmental computer costing around £5,000, he claims to have solved such optimizing equations for an economy roughly the size of Sweden in about two minutes. He projects that if he had used the sort of computers used by his university’s physics department or any weather-forecasting center, then it would be a very simple matter for larger economies, with the cycle time for computation on the order of hours, rather than months or years or millions of years.

“It’s relatively easy to show that these algorithms are tractable. They’re polynomial or sub-polynomial. They’re in the best tractability class. They’re easily amenable to industrial-scale economies with a fraction of the processing power that Google has.”
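We have not replicated Cokkkshott’s experiments, but the shape of the claim is easy to demonstrate on a laptop: a hypothetical sparse input-output system with 100,000 products, around five inputs each, solved by plain fixed-point iteration until the answer stops changing at roughly three significant figures. All parameters here are invented.

```python
import numpy as np
import scipy.sparse as sp

rng = np.random.default_rng(0)
n = 100_000                                  # products in the toy economy
A = sp.random(n, n, density=5 / n, random_state=rng, format="csr")
A = A * (0.5 / A.sum(axis=1).max())          # cap row sums so iteration converges

d = rng.random(n)                            # final demand for every product
x = d.copy()
for step in range(1000):
    x_new = A @ x + d                        # gross output = intermediate use + demand
    if np.max(np.abs(x_new - x)) < 1e-3 * np.max(x_new):
        break                                # roughly three significant figures
    x = x_new
print(f"converged after {step} iterations for {n} products")
```

Because the matrix is sparse, each iteration touches only the few hundred thousand nonzero coefficients rather than the ten billion cells of the full table, which is the whole of Cokkkshott’s point about disaggregation.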

The question, then, turns to the collection of the right information. But this too is becoming easier, as products are increasingly tracked using barcodes, and purchasers and suppliers share vast databases containing information monitoring every aspect of production, the ordering of components, calculating of costs, and so on.

Now, all of this is an extraordinary claim. Cokkkshott’s methodology and results need to be interrogated and replicated by other researchers. But some of this replication has already happened right under our noses. The colossal multinational corporations and financial institutions already engage in planning internally, but on a worldwide scale, coordinating economic activities continents apart. Cokkkshott points to air transport as the first industry to be subject to comprehensive computerized planning, under the Boadicea airline booking system that launched in the 1960s. Shipping clerks are also long since a thing of the past.

To be clear: a non-market economy is not a question of unaccountable central planners, or equally unaccountable programmers or their algorithms making the decisions for the rest of us. Without democratic input from consumers and producers, the daily experience of the millions of living participants in the economy, planning cannot work. Democracy is not some abstract ideal tacked on to all this, but essential to the process.

And most importantly, computer-assisted, decentralized, democratic economic decision making will not arise as a set of technocratic reforms of the system that can simply be imposed. First there must be a fundamental transformation of the relations and structures of society, including the confection of new networks of interdependence—frameworks that the masses of people will have to fight for, build and ultimately sustain. While such a system can and must be built from the ground up, to reach the scale of what is realistically required both to construct a just economy and to deal with the ecological crisis, this system will have to be global and thoroughgoing in its demands for both human liberation and technological advance.

CHILE’S SOCIALIST INTERNET

The story of Salvador Allende—president of the first and only democratic-socialist regime, who died when General Augusto Pinochet overthrew his barely three-year-old administration in a US-backed coup on September 11, 1973—is well known and lamented among progressives. For much of the Left, the crushing of the Allende administration represents a revolutionary road not taken, a socialism unlike that of the Soviet Union or China, committed to constitutional democracy, the rule of law and civil liberties, even in the face of fascist paramilitary terror. *For us Marxists, it is a tragic cautionary tale about the folly of liberal reformism. The litany of human rights horrors committed under Pinochet and tales of los desaparecidos, or “the disappeared”—a euphemism for the more than 2,000 of Pinochet’s secretly abducted victims whose fate the state refused to acknowledge—have until recently eclipsed a bold and pioneering experiment in cybernetic economic planning that was initiated under Allende.

The project, called Cybersyn in English and Proyecto Synco in Spanish, was an ambitious effort to network the economy, and indeed, society. It has been described in the Guardian, not without reason, as a “socialist internet”—an endeavor decades ahead of its time.

Largely unknown for decades, it has finally received its due. Around the time of the fortieth anniversary of Pinochet’s coup, a suite of articles appeared in the mainstream media, from the New Yorker to the popular podcast 99% Invisible, many drawing on the extensive research and interviews with the architects of Cybersyn performed by electrical engineer and technology historian Eden Medina for her 2011 volume on the triumphs and travails of the Cybersyn team, Cybernetic Revolutionaries. The flurry of interest in Cybersyn today, and the recovery of its story, is due in part to its remarkable parallel to the US military’s Advanced Research Projects Agency Network (ARPANET)—the precursor of the internet—and the revelation, like something out of an alternate universe, that an internet-like structure may first have been developed in the global South. The attraction to the tale of Chile’s socialist internet is likely also due to the raft of lessons for today offered by this artifact from Allende’s democratic revolution—“flavored with red wine and empanadas,” as he put it—on privacy and big data, the dangers and benefits of the Internet of Things, and the emergence of algorithmic regulation.

Our interest here, though, is primarily to consider Cybersyn in terms of its success or otherwise as an instrument of non-centralized economic planning. Freed from the Cold War’s constraints, we can today consider Cybersyn more objectively and ask whether it might serve as a model for leaping over both the free market and central(ized) planning.

Cybernetics as Herding of Cats

In 1970, the newly elected Popular Unity coalition government of Salvador Allende found itself the coordinator of a messy jumble of factories, mines and other workplaces that in some places had long been state-run, in others were being freshly nationalized, while some were under worker occupation, and others still remained under the control of their managers or owners. The previous centrist administration of Christian Democrat Eduardo Frei had already partially nationalized the copper mines, the source of the country’s largest export. Frei’s government had also developed a massive public housebuilding program and significantly expanded public education, all with substantial assistance from the United States. Washington was fretful that if it did not pay for social reforms, it would witness social revolution within the hemisphere it viewed as its own. Thus, substantial sections of Chile’s relatively small economy were already in the public sector when the socialists took over, stretching the bureaucracy’s management capability to its limit. A more efficient strategy of coordination was required.

The then-29-year-old head of the Chilean Production Development Corporation, Fernando Flores, responsible for the management of coordination between nationalized companies and the state, had been impressed with the prolific writings on management cybernetics of a British operations research scientist and management consultant named Stafford Beer. Flores had studied industrial engineering at the Catholic University, but in doing so, he had also trained in operations research, that branch of applied mathematics in search of optimal solutions to complex decision-making problems. It’s a salmagundi of a discipline, drawing on modeling, statistical analysis, industrial engineering, econometrics, operations management, decision science, computer science, information theory, and even psychology. In the course of his studies and early work for the Chilean railways, Flores had come across Beer’s texts on cybernetics. While Beer’s work, for which he had gained a substantial international reputation, focused on more efficient management techniques, according to Medina’s interviews with Flores, the latter was captivated by how the “connective, philosophical foundation” of Beer’s management cybernetics could serve Allende’s vision of an anti-bureaucratic democratic socialism in which workers participated in management and that would defend individual civil liberties. Management cybernetics, Flores reasoned, could assist the young government in “herding the cats” of the public and worker-managed sectors.

The term “cybernetics” today has something of a naively techno-utopian aura, or even a body-horror, dystopic dread about it. But at its fundament, the field of cybernetics simply investigates how different systems—biological, mechanical, social—adaptively manage communication, decision making and action. The first edition of Beer’s 1959 book on the subject, Cybernetics and Management, does not even make reference to computers, and, as Medina is keen to stress, Beer himself was an intransigent critic of how business and government deployed computers. Cybernetics is not management by algorithm. It is not digital Taylorism.

During World War II, MIT mathematician Norbert Wiener and his engineering colleague Julian Bigelow were tasked with developing ways of improving the targeting of enemy aircraft. Following consultations with an early neuropsychologist, the two developed an apparatus that automatically helped the human gunner to correct their aim through what they called feedback, a circular method of control through which the rules governing a process are modified in response to their results or effects. Today, this may seem obvious (and its very obviousness is likely a product of how influential cybernetic notions have become in our culture; this is where the word “feedback” comes from), but at the time, when linear, “if this, then that” control systems dominated, it was a revelation. As Richard Barbrook recounts in his 2007 history of the dawn of the computer age, Imaginary Futures, despite the military engineering origins of the field, Wiener would go on to be radicalized by the Cold War and the arms race, not only declaring that scientists had a responsibility to refuse to participate in military research, but asserting the need for a socialist interpretation of cybernetics. “Large corporations depended upon a specialist caste of bureaucrats to run their organisations,” Barbrook notes. “They ran the managerial ‘Panopticon’ which ensured that employees obeyed the orders imposed from above. They supervised the financing, manufacture, marketing and distribution of the corporation’s products.” Wiener, and later Beer, on the other hand, conceived of cybernetics as a mechanism of domination avoidance: a major challenge that the managers of any sufficiently complex system face, according to Beer, is that such systems are “indescribable in detail.”

Echoing this concern, three years before the Prague Spring uprising was crushed by Soviet tanks in 1968, two Czechoslovak authors, Oldrich Kýn and Pavel Pelikán, published Kybernetika v Ekonomii (Cybernetics in Economics), a book that challenged the top-down central planning system. In it, they focused on the key role of accurate information in the coordination of economic activities, whether via the market or through planning, arguing that the human capacity to receive and process information is inherently limited. A high degree of centralizing hierarchy requires that the top-level decision makers have a large information processing capability. At the same time, in addition to the poor quality of decision making resulting from the inability of an individual or even a small group of humans to process more than a certain amount of information, overcentralization can also result in the costs of transmitting and processing information being “many times higher than the most pessimistic estimates of loss that could occur with an effective reduction of information and a decentralization of a large part of the decision-making.” Instead, Kýn and Pelikán proposed that the amount of information be gradually reduced along the hierarchy, with each place in the hierarchy enjoying a certain degree of freedom for independent decisions: “Not all the information collected below can arrive at the highest places. The problem, of course, is how to reduce information without losing what is essential for making decisions.”

Conversely, as Beer was aware, too much decentralization and autonomy could produce chaotic results that undermine the well-being of the system as a whole, producing either debilitating overproduction or shortages. Thus his model aimed to promote a maximum of self-organization among component parts via redundant, lateral, multi-node communication networks, while retaining some channels of vertical control to maintain systemic stability and long-term planning. Instead of the abstract dichotomy of centralization versus decentralization, he asked: What is the maximum degree of decentralization that still permits the system to flourish?

Allende was attracted to the idea of rationally directed industry, and upon Flores’s recommendation, Beer was hired to advise the government. Beer, for his part, frustrated at only ever seeing partial implementation of his ideas by the firms he advised, was attracted to the possibility of putting his full vision into practice, and on a much wider scale than he had yet attempted.

And that vision would involve the linking-up of a real-time communication network, connecting factory floor to factory floor, and upward to the Production Development Corporation (CORFO), rapidly dispatching data both laterally and vertically and thus allowing quick responses at all points in the system to changing conditions. The data collected would also be crunched by a mainframe computer to produce statistical projections about likely future economic behavior. In addition, the system would involve a computer simulation of the Chilean economy as a whole, which Beer and his colleagues termed “CHECO” (CHilean ECOnomic simulator). However, upon his first visit to Chile, Beer was confronted with the reality of the country’s limited computer resources: just four low- to mid-range mainframes owned by the National Computer Corporation (ECOM), which were already largely locked up with other tasks. At most, ECOM could offer processing time on one such device, an IBM 360/50. As Medina puts it, Beer would be building a computer network of one computer.

But the key was the network, not the type of machine doing the networking. And so, as a solution, Beer suggested connecting to the single IBM mainframe a communications network of telex machines, which were common enough in Chile and at the time even more reliable than telephones. Initially, Beer thought he was working on a project to develop a more accountable, more responsive communications and control system between government-appointed factory managers, or “interventors” to use the Chilean terminology of the time, and CORFO. He envisaged that the interventors at each enterprise would use the telex machines to transmit production data to the telex machine at the National Computer Corporation. Computer operators there would then translate this information into punch cards that would be fed into the mainframe, which would in turn use statistical software to compare current data with past performance, seeking anomalies. If such an anomaly were discovered, the operators would be notified, and they would then notify both the interventor concerned and CORFO. CORFO would then give the interventor a brief period to sort out the anomaly on their own, offering the enterprise a certain degree of autonomy from higher decision making while also insulating those government decision makers from what could otherwise be a tsunami of data by transmitting only what was crucial. Only if the interventor could not sort out the problem would CORFO step in. In this way, instead of all production decisions being made in a centralized top-down fashion, there would be an iterative “roll-up” process, as Beer described it, with policies transmitted downward to factories and factories’ needs transmitted upward to government, continually adapting to new conditions. Beer, a severe critic of Soviet bureaucracy, also believed that the statistical comparisons produced centrally would reduce the ability of factory managers to produce false production figures, as happened in the USSR, and enable much faster discovery of bottlenecks and other problems. The aim was real-time economic control—in this period a staggering ambition, socialist or otherwise—or as close to it as possible. Up until this point, Chile’s conventional economic reporting methods involved extensive, lengthy printed documents detailing information collected on a monthly or even yearly basis.
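Medina’s history does not preserve the statistical software itself, so the following is only an illustration of the exception-flagging logic described above, not a reconstruction of Cybersyn’s code: compare today’s telexed figure against the recent past and raise an alert only when it strays outside a control band.

```python
import statistics

def flag_anomaly(history, today, threshold=2.0):
    """Return True if today's production figure deviates from the recent past
    by more than `threshold` standard deviations; only then is anyone alerted."""
    mean = statistics.fmean(history)
    stdev = statistics.stdev(history)
    return abs(today - mean) > threshold * stdev

daily_output = [102, 98, 101, 97, 103, 99, 100]   # telexed daily figures (invented)
if flag_anomaly(daily_output, today=71):
    print("notify the interventor, then CORFO if unresolved")
```

The filtering is the point: most days nothing is transmitted upward, which is how the scheme shields planners from the tsunami of data while preserving enterprise autonomy.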

Even before the election of Allende’s six-party Popular Unity coalition government, the United States had spent millions on propaganda efforts against the Left and to support the Christian Democrats. Upon the nationalization of the copper industry, Chile’s primary export (a move that enjoyed the unanimous support of even the opposition Christian Democrats), the United States cut off credits, and the multinational companies that had been the owners of the mines worked to block exports. Factory and land owners took to the courts to try to block reforms, and sections of the Right openly called for a military coup, an option supported by the CIA. While substantial wage increases for manual and white-collar workers had initially slashed unemployment and contributed to strong economic growth of 8 percent a year, this de facto blockade soon crippled the economy and limited the availability of consumer items. With wage increases chasing fewer items, shortages and crippling inflation appeared, in turn provoking accusations of middle-class hoarding. Allende’s Popular Unity government—very much believed by the working class to be their government—was being threatened internationally and domestically. The workers and peasants were radicalizing; society as a whole was sharply polarizing.

The circumstances of a government under threat forced Beer’s team to work under a crash schedule. The project was challenged on a number of fronts that were not eased by the acceleration of the timetable, but the difficulties were less technological than they were social. Operations research scientists had to perform studies of every nationalized company and establish which production indicators the software would need to track and which ones it should ignore. This was no simple task, even for a simplified model that was intended not to represent the full complexity of the Chilean economy, but simply to uncover the key factors that had the biggest impact on outputs. Nevertheless, the CHECO model was to go beyond production factors (productivity and demand) to consider the currency supply, investment and inflation. But the team was having difficulty simply getting hold of the necessary information to test the model. Mining data was two years old. Agricultural data was sparse. In some enterprises, advanced information collection processes did not even exist. In the end, while CHECO was able to run experimental models exploring inflation, foreign exchange and national income, as well as simplified models of the whole economy and of a handful of sectors, the team viewed these efforts only as a testing ground, not to be used to develop policy.

In addition, for all of Beer and Flores’s desire and Allende’s insistence that the project achieve a participative, decentralizing and anti-bureaucratic system, the role of workers on the factory floor was sometimes negligible, with Cybersyn engineers tending to speak first to enterprise upper management, then to middle management, and then finally to factory production engineers. Medina’s history of the project is careful not to romanticize the results. The engineers did consult with workers’ committees, but not on a regular basis. On top of this, modeling individual factories required postsecondary training in operations research, and Chile at this time had a very limited pool of graduates with such training. The team faced resistance from factory managers, whose class position made them less sympathetic to the project, or who simply did not understand its purpose. Despite direction to factory engineers that they work with workers’ committees, class divisions again posed a barrier: engineers were condescending to workers and preferred talking to management. Medina, in her research, found very little evidence that rank-and-file workers played much of a role in shaping the modeling process.

But one can also imagine the same system being used in a very different way, arming instead of disarming workers. Indeed, even in an embryonic form, the Cybersyn communications network helped groups of workers to self-organize production and distribution during what would otherwise have been a crippling trucking strike in 1972, mounted by conservative business interests and backed by the CIA. In so doing, it offered the struggling Allende administration a brief stay of execution.

Cyber Strikebreaking

It was during the strike that Cybersyn came into its own. The network allowed the government to secure immediate information on where scarcities were most extreme and where drivers not participating in the boycott were located, and to mobilize or redirect its own transport assets in order to keep goods moving. But this was not simply a top-down operation, directed from La Moneda Palace by the president and his ministers. The strike had forced public sector operations that were near to each other to work together in “cordónes industriales”—literally, “industrial belts”—in order to coordinate the flow of raw materials and manufactured products. The cordónes in turn worked with local community organizations, such as mothers’ groups, to assist with distribution. The autonomous operation of these cordónes mirrored forms of spontaneous worker and community self-direction that appear to pop up regularly during times of revolutionary upheaval, or otherwise at times of crisis or natural disaster, whether we call them “councils,” “comités d’entreprises” (France), “soviets” (Russia), “szovjeteket” (Hungary) or “shorai” (Iran). Liberal commentator Rebecca Solnit describes in her social history of the extraordinary communities that emerge at such extreme moments, A Paradise Built in Hell, how, far from the chaotic, Hobbesian war of all against all of elite imagination, it is calm, determined organization that on the whole prevails. She repeatedly found that the remarks of those who survived earthquakes, great fires, epidemics, floods and even terrorist attacks reflect how truly alive, full of common purpose and even joyful they felt, despite the horrors they experienced. This is the distilled version of the economic calculation debate: relatively flat hierarchies seem perfectly capable of democratically coordinating production and distribution for a limited number of goods and services, for a small number of people and over a limited geography. But how could the myriad products needed by a modern, national (or even global) economy—with its complex web of crisscrossing supply chains, thousands of enterprises and millions of inhabitants (billions, if we consider the global case)—be produced without vast, metastasizing and inefficient bureaucracies? How are the interests of the local production node integrated harmoniously with the interests of society as a whole? What may be in the interest of a local enterprise may not be in the interest of the country.

What happened in Chile in October of 1972 may not be the definitive answer to these questions, but it hints at some possibilities.

On October 15, Flores suggested to the director of the CHECO project that they apply what they had learned from their experimentation to battling the strike. They set up a central command center in the presidential palace, connected via the telex machines to a series of specialized operational units focusing on different key sectors: transport, industry, energy, banking, supply of goods and so on. This network allowed the government to receive minute-by-minute status updates directly from locations across the country, and then just as quickly to respond, sending orders down through the same network. A team at the palace analyzed the data coming in and collated them into reports, upon which government leaders depended to make decisions. If one factory was short of fuel, spare parts, raw materials or other resources, this data flowed through the network to another enterprise that could help. Information was also shared on which roads were clear of oppositionists, allowing the trucks that remained under public control to redirect themselves and avoid blockades. Medina notes how some historians emphasize, instead, the role of popular mobilization from below in breaking the strike, but she argues this is an unnecessary dichotomy. While it did not eliminate vertical hierarchy, the network did connect the government command center to the horizontal activities on the ground. Medina writes: “The network offered a communications infrastructure to link the revolution from above, led by Allende, to the revolution from below, led by Chilean workers and members of grassroots organizations, and helped coordinate the activities of both in a time of crisis.” She argues that Cybersyn simply faded into the background, “as infrastructure often does.” The system did not tell the workers what to do; the workers and their representatives in government simply used the system as a tool to aid them in doing what they wanted to do.
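The rerouting described above is, at bottom, a simple matching problem: pair each reported shortage with a nearby enterprise reporting a surplus of the same resource. A toy sketch of that logic follows; in reality the matching was done by human analysts at the palace reading telex reports, and every name, resource, and coordinate below is invented:

```python
from math import hypot

# Hypothetical telex reports: (enterprise, resource needed, grid location)
shortages = [("Textile Plant A", "fuel", (10, 4)),
             ("Metalworks B", "spare_parts", (3, 7))]

# Hypothetical surpluses: resource -> list of (enterprise, grid location)
surpluses = {"fuel": [("Depot C", (8, 5)), ("Reserve D", (1, 1))],
             "spare_parts": [("Workshop E", (2, 6))]}

for enterprise, resource, (x, y) in shortages:
    candidates = surpluses.get(resource, [])
    if not candidates:
        print(f"{enterprise}: no source of {resource} on the network")
        continue
    # Dispatch from the closest enterprise reporting a surplus
    supplier, _ = min(candidates,
                      key=lambda s: hypot(s[1][0] - x, s[1][1] - y))
    print(f"{enterprise} is short of {resource} -> dispatch from {supplier}")
```

The heavy lifting remained human judgment at the command center; a sketch like this only makes the information flow legible.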

The reality of Chileans directing a technology rather than the other way round should assuage potential concerns that our hypothesis—that contemporary processing power and telecommunications networks can work to overcome the economic calculation challenge—is a technocratic solution; that we are arguing that we offload the responsibility for constructing the democratic, marketless society to an algorithm. This gets it absolutely backwards.

Meanwhile, Flores’s strategy proved a success, shaving the edges off the shortages. Government data showed food supplies were maintained at between 50 and 70 percent of normal. Distribution of raw materials continued as normal to 95 percent of enterprises crucial to the economy, and fuel distribution at 90 percent of normal. Economic reports now relied on data that had been collected and delivered from across the country just three days earlier, where previously such government assessments had taken up to six months to produce. By the end of the month, the strike was all but broken, and it had clearly failed to achieve its goal of paralyzing the country. Chile still functioned. A minister told Beer that if it had not been for Cybersyn, the government would have collapsed on the night of October 17.

The result inspired Beer to envision still-wider applications of cybernetics to support worker participation. This former international business consultant had moved in an almost anarcho-syndicalist direction: “The basic answer of cybernetics to the question of how the system should be organised is that it ought to organise itself.” Science and technology could be tools used by workers to help democratically coordinate society, from the bottom up, leaping over the centralization/decentralization dichotomy. Instead of having engineers and operations researchers craft the models of factories, programmers would be under the direction of workers, embedding their deep knowledge of production processes into the software. Instead of the Soviet model of sending large quantities of data to a central command point, the network would distribute, vertically and horizontally, only that amount of information that was needed for decision making. For Beer, Medina writes, Cybersyn offered “a new form of decentralised, adaptive control that respected individual freedom without sacrificing the collective good.”

But for us, more than four decades later, we have a few outstanding questions, not least of which is whether a system used in emergency, near–civil war conditions in a single country—covering a limited number of enterprises and, admittedly, only partially ameliorating a dire situation—can be applied in times of peace and at a global scale.

After the strike, the government continued to use the network and had plans for its extension, but we will never know whether it all would have worked. On September 11, 1973, the Chilean armed forces finally initiated the coup against Allende that the United States had long sought. According to most assessments, including a 2000 report on the matter by the US Intelligence Community, the plotters proceeded with an implicit nod from Washington. At seven o’clock that morning, the Chilean navy rebelled, seizing the seaport of Valparaíso. Two hours later, the armed forces controlled most of the country. At noon, the general of the air force, Gustavo Leigh, ordered Hawker Hunter jets to bomb the presidential palace, while tanks attacked from the ground. When Allende learned that the first floor of La Moneda had been taken, he ordered all staff out of the building. They formed a queue from the second floor, down the stairs and toward the door that opened to the street. The president moved along the line, shaking hands and thanking everyone personally.

President Salvador Allende then walked to Independence Hall on the northeast side of the palace, sat down, and placed a rifle that had been given to him by Fidel Castro between his legs, setting its muzzle beneath his chin. Two shots tore off the top of his head.

The military regime of General Augusto Pinochet immediately halted work on Project Cybersyn, physically destroying much of what had been constructed, although the most important documentation survived due to the rapid actions of key figures involved. By 1975, in addition to murdering, disappearing and torturing thousands, forcing thousands of others to flee as political refugees to places such as Canada, the junta had also implemented the world’s first experiment in what would come to be known as neoliberalism, prescribed by economists, most of whom had studied at the University of Chicago under Milton Friedman, who would go on to advise Republican US President Ronald Reagan and Conservative UK Prime Minister Margaret Thatcher. The junta followed the recommendations of these “Chicago Boys” to the letter: shock privatization of much of the public sector, slashed public spending, mass civil servant layoffs, wage freezes and economy-wide deregulation.

Variations on this neoliberal theme have since been adopted, with varying degrees of zeal or reluctance, by almost all governments the world over, producing a yawning inequality across much of the West—admittedly not always accompanied by CIA-trained death squads shoving trade unionists out of helicopters mid-flight or cutting off fingers and tongues of left-wing guitar-playing folk singers. Reigniting the dream of planning from the bottom up today means first undoing the harms, including in the world of ideas, of the neoliberal half century.

*YOU CAN HAVE YOUR BURGER AND EAT IT TOO

The contradiction between town and country, between agriculture and industry, was historically quite troublesome for early 20th century Communism to navigate. In the present day, and especially here in the heart of empire, we are, perhaps, particularly well-suited to tackle this dilemma. Vast swaths of agriculture are owned, not by small and medium peasants, but by massive agribusiness monopolies and a shrinking class of petty bourgeois owner-operators (who can be subdivided into part owners and pure tenants). A 2017 report from the United States Department of Agriculture offers these key insights about farmland ownership:

• Over the five years since 2012, the total number of farms decreased by 3.2 percent, down to 2 million.

• Medium-sized farms are disappearing; over the same timespan, only the smallest farms (less than 10 acres) and the largest farms (2,000 acres or larger) increased in number.

• The largest 4 percent of U.S. farms controlled 58 percent of all farmland (up 8 percent from 1997).

• In 2017, farms operated by part owners accounted for just under one fourth of all farms but 56 percent of all farmland (with small tenants making up a minority of both total farmland and total farms).

The single largest share of farmland is held by the middle class of owner-operators, though this is steadily decreasing over time. At the same time, the share of farmland owned by large owners is growing. Furthermore, the majority of our actual agricultural laborers are proletarian or semi-proletarian, and, whether inmates or migrant laborers, usually nationally oppressed. If you’ll recall the discussion from chapter 7 about the peasantry and agriculture in the USSR and China, it should therefore be clear that the historical obstacles prohibiting the complete socialization of agriculture are not present in the modern American Empire, nor anywhere else in the imperialist metropoles. In all likelihood, then, we would not require a subordinate, commodity-producing sector of small-scale agricultural co-ops; we would be able to advance straight to socialized agriculture (not to imply that this would be so simple) — a precondition, of course, for planned agricultural production.

With greater immediate pertinence to the topic of economic planning, the digitalization of industrial farm equipment has already developed the means for both producing and distributing data about agricultural production — which will be necessary for planners to coordinate production. If, like me, you are a metropolitan urbanite who has never so much as held a hoe (and if you are reading this, chances are good that you are), you may be surprised to learn exactly how “high-tech” modern farm equipment has become. But behind the backs of us city slickers, there’s apparently been a veritable revolution in “precision agriculture.”

As it turns out, litter boxes, cars, and doorbells aren’t the only gadgets that now sport “smart” features. The modern tractor now comes equipped with a central processor, torque sensors on the wheels that measure soil density, humidity sensors on the undercarriage that measure soil moisture, location sensors on the roof that plot density and moisture, GPS tracking, and, evidently, satellite network connectivity. Among the cutting edge of agricultural tech is Verdant Robotics, a company whose robotic tractors sport an array of instruments so advanced you’d think it was science fiction. According to their marketing materials, the robots use high-resolution cameras and advanced computer vision algorithms to automatically spray each and every plant with either fertilizer or pesticides, which they boldly claim reduces chemical use by 95% compared to traditional spraying techniques. What’s more, they claim that these robots create digital maps of the entire field, with the geolocation data of every single plant. According to Curtis Garner, founder and CCO of Verdant Robotics, “One thing to understand about our robot and how it interacts with plants is that it’ll actually remember every single plant…We’re really farming at the centimeter level basis here, so this is allowing us to farm every single plant individually.”
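None of this software is public, but the kind of per-plant “digital field map” being described is easy to picture as a data structure. Here is a hypothetical sketch, with every field name and reading invented for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class PlantRecord:
    lat: float            # geolocation of the individual plant
    lon: float
    soil_moisture: float  # e.g., from undercarriage humidity sensors
    soil_density: float   # e.g., inferred from wheel torque sensors
    treatments: list = field(default_factory=list)  # history of sprays

# plant ID -> PlantRecord: the "map" of every single plant in the field
field_map = {}

def log_pass(plant_id, lat, lon, moisture, density, treatment):
    """Record one machine pass over one plant, updating its sensor readings."""
    rec = field_map.setdefault(plant_id, PlantRecord(lat, lon, moisture, density))
    rec.soil_moisture, rec.soil_density = moisture, density
    if treatment:
        rec.treatments.append(treatment)

log_pass("row12-plant0487", 41.5868, -93.6250, 0.23, 1.40, "fertilizer 2ml")
log_pass("row12-plant0487", 41.5868, -93.6250, 0.21, 1.42, None)  # later pass, no spray
print(field_map["row12-plant0487"].treatments)  # the robot "remembers" this plant
```

Aggregated across millions of such records, this is precisely the kind of dataset an economic planner could use to project yields.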

While these advancements are not the technocratic silver bullet to world hunger that capitalists like Mr. Garner would have us believe, they are, no doubt, groundbreaking and disruptive advancements. With these networked machines, it is becoming possible to not only maximize crop efficiency beyond precedent, but, critically, to predict and project crop yields with increasing accuracy. Undoubtedly, these developments will be essential for the future of planned agricultural production. In the meantime, however, the question every revolutionary should be asking themselves is: who exactly owns all this data, and how is it being used?

Hint: it’s not great.

The Right To Repair

Predictably, these innovations in precision agriculture have opened up new avenues of conflict between those who manufacture and own these advanced technological behemoths and those who use them. Ominously, for example, Verdant Robotics isn’t actually in the business of selling robots. According to them, they’re selling “robotics-as-a-service,” which they don’t explain, but which I can only assume means they actually lease the machines. Sinister as that is, we should at least commend their relative honesty compared with their competitors; after all, despite claiming to sell tractors, the reason John Deere equips their tractors with a kill switch is because they maintain that, actually, farmers do not own their tractors at all — they license them!

Apparently when you buy a tractor, you don’t purchase the software loaded up inside of it, you lease it, and are therefore also beholden to ongoing terms of service dictated by the copyright holders. According to John Deere’s brief submitted to the Copyright Office, rather than properly owning the tractor, the supposed vehicle owner actually “receives an implied license for the life of the vehicle to operate the vehicle,” which farmers have said undermines the very notion of private property. Under the pretense of protecting copyright, the terms of service restrict what the vehicle “owner” is allowed to do with “their” property. According to Cory Doctorow, a journalist who writes about intellectual property, “Those terms specify that even if a farmer repairs their own tractor, swapping a broken part for a working one, they must pay hundreds of dollars and wait for days for an authorized Deere technician to come out to the end of their lonely country road to key in an unlock code.”

If you’re familiar with the conspiracy behind why McDonald’s ice cream machines are always broken, it’s a similar scheme. For the uninitiated: McDonald’s franchise owners are required to buy and equip their kitchen with a specific ice cream machine model from the Taylor company, a McDonald’s partner. This particular model is especially prone to breaking down, and can only be fixed by calling in one of Taylor’s specially authorized repair technicians. Allegedly — so the conspiracy goes — Taylor pays off McDonald’s to force their franchisees to buy a machine that’s designed to break in order to collect repair fees (a whopping 25% of Taylor’s revenue comes from their repair service!).

John Deere is perhaps even more perverse: Rather than equipment designed to fail, John Deere instead simply reserves the right to remotely disable their customer’s tractors if they void their contract by doing self repairs. The impact has been that farmers, from America to Ukraine, have had to “jailbreak” their farm equipment to get around these lock-outs and to avoid paying a fortune to John Deere.

Naturally, John Deere also maintains that it owns all the data collected by the sensors on their tractors. But because they’re not complete sleazeballs, they are at least willing to sell the data back to the farmers. How generous! As Doctorow further elaborates:

Deere originally bundled that data with an app that came with seed from Monsanto (now Bayer), its preferred seed vendor. The farmers generated the data by plowing their fields with their tractors, but Deere took the position that the farmers weren’t the owners of that data—Deere was.

And that’s not all, either. Not only do the farmers not own the data they collect with their tractor, but that data doesn’t even actually go towards maximizing the efficiency of the farm. According to Doctorow, it’s used to bet against the farmers:

Deere aggregates all the soil data from all the farms, all around the world, and sells it to private equity firms making bets in the futures market. That’s far more lucrative than the returns from selling farmers to Monsanto. The real money is using farmers’ aggregated data to inform the bets that financiers make against the farmers.

Once again, we see a story of technology that could be used to fulfill the needs of humanity being narrowly used for private gain, actively in opposition to the interests of the rest of society. What makes this uniquely frustrating is how clearly this technology could be used for good, if only the people who made it were the same people using it. How trivial it would be, indeed, to socialize this hardware and software, and to use the data collected in a social plan to meet the needs of everyone in society.

Revolutionaries should not only be asking themselves how to organize future society. They must understand, here and now, what the state of the class struggle looks like for each sector of the population. What can we contribute to the “right to repair” movement, in which small landowners struggle against the dual tyranny of Big Tech and Big Agribusiness? Could these small-holding tenant-operators be potential class allies? What can we add to our program to entice them, to convince them of the merits of socialism? And so on. As Lenin argued in What is to be Done?, socialist organizers must “go among all classes of the population” in order to understand the full landscape of the class struggle in society. And per Mao, we must go down to the countryside to practice social investigation — in other words, to become more intimately familiar with what’s going on down there.

I Will Not Eat The Bugs!

No investigation into the state of agriculture would be complete without analyzing the production of meat. Factory farming, the source of much of the meat consumed in the West, is cruel, inhumane, unhygienic, a major contributor to greenhouse gas emissions, and uses disproportionate amounts of water. Just under 40% of our global agricultural land is used to feed livestock, and just under 30% of our ice-free land is used for grazing. Here in the US specifically, 44% of our farmland is specialized for cattle and dairy production. Worse, there’s a net loss of calories in the raising of livestock, requiring anywhere from 5 to 25 calories of feed to produce 1 calorie’s worth of meat (depending on the animal). Surprisingly, raising a 1,200 lb steer only to get around 500 lbs of edible meat is not an efficient process! And while climate change threatens to decrease our already limited arable land, it is clear that something here needs to change — whether that’s our consumption or our production — if we are going to feed everyone without destroying the ecosystem in the process. So give it to me straight comrade: will I have to eat the bugs?

With profound conviction, I’m here to tell you: maybe! It depends. Specifically, it depends on the feasibility of synthetic (or “cultured”) meat production, which is still in its infancy. The industrial scalability of this novel field of production is still unclear, but the potential benefits are highly promising. Basically, the process involves directly growing cloned muscle tissue in a vat without the need to grow the rest of the animal — although some animals would still need to be raised so that we can clone their juicy bits. Researchers in the field promise that the taste, appearance, and nutritional content of natural and “in vitro” meat are indistinguishable. And while all of that sounds a bit unsettling, like some kind of nightmarish bovine matrix, I’d like to think we can agree that the (mostly) cruelty-free sci-fi meat is preferable to mandatory veganism or total climate collapse. If it can deliver on its promises, you won’t have to choose between the environment and maintaining your gains — you’ll get to have your burger and eat it too!

Let’s get our hooves dirty in some data. Even as undeveloped as the production process is, artificial meat growth is already substantially faster than natural meat production, producing an equivalent amount of edible meat in just a couple months compared to what livestock would grow in a year (or more). One extensively cited study by Tuomisto and de Mattos (2011) further claims that:

In comparison to conventionally produced European meat, cultured meat involves approximately 7-45 % lower energy use (only poultry has a lower energy use), 78-96 % lower GHG emissions, 99 % lower land use, and 82-96 % lower water use depending on the product compared.

Additionally, because it would be an industrial process in a factory resembling something like a brewery, our meat could be produced directly within the metropolitan centers, cutting out substantial transportation costs and further closing some of that divide between town and country. Skeptics of sustainable, artificial meat tend to fall into two camps. The first maintains that artificial meat has no future because consumers won’t want it due to the “ick” factor, and because it’s currently more expensive per pound than natural meat, leaving no prospect for profitable investment. Unlike a sexy new electric vehicle, most consumers aren’t likely to spend more money on an equivalent product just because it’s more “ethical” or “green.” Evidently these gentlemen have been mentally neutered by market ideology. Yes indeed, if artificial meat is ever to be viable as a sustainable model of meat production, it will not be the market that delivers it to us. Further research and development is needed to make the process economically viable, and with the need to compete with natural meat, the capital investment is undoubtedly too risky for the private sector. Even if this were overcome, however, it is certain that capital would still prefer to be invested elsewhere. In every capitalist society, the development of agriculture lags behind the development of industry, and this tendency is all the more amplified by capitalist-imperialism, where investment into the less developed foreign countries presents a much higher return on investment. In Imperialism: The Highest Stage of Capitalism, Vladimir Lenin described this tendency like so:

It goes without saying that if capitalism could develop agriculture, which today is everywhere lagging terribly behind industry, if it could raise the living standards of the masses, who in spite of the amazing technical progress are everywhere still half-starved and poverty-stricken, there could be no question of a surplus of capital…The need to export capital arises from the fact that in a few countries capitalism has become “overripe” and (owing to the backward state of agriculture and the poverty of the masses) capital cannot find a field for “profitable” investment.

Hence, regardless of the particular technology in question, the profit motive will inevitably foreclose the possibility of qualitatively transforming or substantively “upgrading” our domestic agricultural sector. Wherever the forces of production become increasingly efficient, wherever, in other words, the “organic composition of capital” (ratio of machine to labor) increases, the rate of profit drops, providing ever more incentive to exploit the less developed nations instead. And don’t forget: much of the meat we consume doesn’t even come from domestic farms, but from super-exploited labor of the global south!

Furthermore, our capitalist public sector is just as unlikely to cough up the dough, because artificial meat would encroach on the territory of the meat lobby, which will inevitably have an outsized influence with “our” representatives. Unless, of course, the development of synthetic meat proves necessary to win a war. Barring that particular circumstance, however, synthetic meat will simply never be viable if it has to compete with the natural meat industry — whether directly in the market or for influence in the bourgeois state. It will, in the main, have to replace natural meat production. It should be understood that these barriers are specifically capitalist barriers; hence, if planned production is to deliver us sustainable meat, it will have to be socialist planning in particular.

The second camp has more technical concerns. They criticize the overly optimistic projections about scale, arguing that the techniques used to culture meat cells may never scale up, and that the theoretical ceiling on efficiency means it will never be cheaper (or even as cheap) to produce as natural meat. It’s not a wholly unreasonable concern; because “knowledge production” in capitalist society, like all production, must be profitable, there is an incentive for researchers seeking funding to exaggerate their claims and to fudge their numbers. This is not the idle conspiracy of a microplastic-poisoned Luddite: I have witnessed this first-hand! Indeed, some skepticism is well warranted. Summarizing the Good Food Institute’s (GFI) techno-economic report and the subsequent criticism by Paul Wood, a pharmaceutical researcher with a PhD in immunology, The Counter reported that:

Up until this point, GFI’s imagined production line looks somewhat like what you might encounter in a present-day vaccine-manufacturing plant. The Oxford-Astrazeneca and Johnson & Johnson Covid-19 vaccines, for instance, are produced using a related method… But GFI’s version assumes an additional step that would further process the cells into human food. The large, stirred-batch reactor will be harvested three times to fill four smaller perfusion reactors… Each perfusion reactor would ultimately deliver a total of 770 kilograms of cultivated meat, slightly more than the weight of a single live steer before slaughter—this time without the bones and gristle. It’s a complex, precise, energy-intensive process, but the output of this single bioreactor train would be comparatively tiny. The hypothetical factory would need to have 130 production lines like the one I’ve just described, with more than 600 bioreactors all running simultaneously… If cultured protein is going to be even 10 percent of the world’s meat supply by 2030, we will need 4,000 factories like the one GFI envisions… All of those facilities would also come with a heart-stopping price tag: a minimum of $1.8 trillion, according to Food Navigator. That’s where things get complicated. It’s where critics say—and even GFI’s own numbers suggest—that cell-cultured meat may never be economically viable, even if it’s technically feasible.
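The quoted figures can be sanity-checked with some simple arithmetic. The sketch below uses only the numbers in the excerpt (the cadence of harvests isn’t specified there, so no annual throughput is derived):

```python
# Back-of-envelope check of the figures quoted from The Counter
factories = 4_000              # plants needed for ~10% of world meat supply by 2030
total_cost = 1.8e12            # minimum price tag, USD (per Food Navigator)
print(f"Capital cost per factory: ${total_cost / factories / 1e6:,.0f} million")  # ~$450M

lines_per_factory = 130        # production lines per hypothetical factory
reactors_per_line = 1 + 4      # one stirred-batch reactor feeding four perfusion reactors
print(f"Bioreactors per factory: {lines_per_factory * reactors_per_line}")  # 650: "more than 600"

kg_per_perfusion_harvest = 770  # output of one perfusion reactor, about one live steer's weight
print(f"Meat per line, per cycle: {4 * kg_per_perfusion_harvest:,} kg")  # ~3 tonnes
```

Each factory, in other words, would be a roughly half-billion-dollar plant, and we would need four thousand of them just to displace a tenth of the world’s meat supply.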

It’s not looking good for synthetic meat’s viability. However, we’d do well to recognize two things. First, R&D into synthetic meat has been chronically underfunded, by both the private and public sectors. The fact that it’s technically feasible should give us some hope: it cannot yet be ruled out that a superior production pipeline could be developed with greater social investment and research. Second, if you’ll recall the struggle over penicillin production during WWII from chapter 5, before the intervention of planned production it was similarly conceived as a dead end. Only planned production and research that centered penicillin’s potential utility over its market viability was able to bring it from “technically feasible” to “economically viable.” Indeed, many of the same concerns about the mold being temperamental, difficult to isolate, bearing low yields, etc., are essentially the same concerns being raised about the viability of synthetic meat. And since the same essential technology for cell cultivation comes from the pharmaceutical industry, we can surmise, perhaps naively, that economic planning could once again work its magic.

To be sure, I’m no biologist, nor am I an industrial engineer. Nevertheless, the quest to bridge the gap between what is technically possible and what is realistically viable remains, in many ways, one of the central themes of this book. Therefore, I remain cautiously optimistic about the potential for socialist planning of synthetic meat production. Without dipping too far into the depths of speculation, we can conjecture about how this would be organized in a future socialist society; for example, we can imagine that a secondary natural meat sector would still remain, and not purely to provide synthetic meat with the cells it needs, either. While synthetic meat could replace the majority of muscle protein in our diet, it’s entirely plausible, in other words, that we could still enjoy natural meat as a luxury product, provided that it became a minor part of our diets, and provided that these animals were either “free range” or hunted. We need not indulge in maximalist “all or nothing” proposals without at least considering how changes in production will affect what is environmentally sustainable to consume. And if I am wrong…Well, perhaps it’s best to get used to tofu.

PLANNING THE GOOD ANTHROPOCENE

What is profitable is not always useful, and what is useful is not always profitable. This, one of the principal themes of this book, applies on scales both granular and grand. As we have seen, no matter how beneficial new classes of antibiotic may be, they are insufficiently profitable, so they will not be produced. Meanwhile, many other commodities, such as fossil fuels, that undermine human flourishing or even threaten our existence, remain profitable, and so without regulatory intervention, companies will continue to produce them. The market’s profit motive—not growth or industrial civilization, as some environmentalists have argued—caused our climate calamity and the larger bio-crisis. The market is amoral, not immoral. It is directionless, with its own internal logic that is independent of human command.

It would be very useful to wind down our species’ combustion of fossil fuels, responsible as it is for roughly two-thirds of global greenhouse gas emissions. It would be useful, too, to increase input efficiency in agriculture, which, together with deforestation and land-use change, is responsible for most of the remaining third.

We know how to do this. A vast build-out of dependable baseload electricity from nuclear and hydroelectric plants, supported by more variable renewable energy technologies such as wind and solar, could replace nearly all fossil fuels in short order, cleaning up the grid and delivering enough clean generation to electrify transport, heating, and industry. Decarbonizing agriculture is more complicated, and we still need better technology, but we understand the overall trajectory. Unfortunately, wherever these practices do not create profit, or do not create enough profit, companies will not put them in place.

We hear regular reports claiming that investment in renewable energy is now outpacing investment in fossil fuels. This is good, though it is often the result of subsidies for market actors, themselves typically derived from hikes in the price of electricity that hit working-class communities, rather than from taxes on the wealthy. Even if, in relative terms, more money is going toward wind and solar than toward coal, the absolute increase in combustion from India and China, among other nations, will likely push us past the two-degree-Celsius limit most governments have agreed is necessary to avoid dangerous climate change.

Simply put, the market is not building enough clean electricity, nor abandoning enough dirty energy, nor doing either quickly enough. The relatively simple directive to “clean up the grid and electrify everything” that resolves the fossil fuel part of the equation doesn’t work for agriculture, which will require a far more complex set of solutions. Here too, as long as a particular practice rakes in money, the market will not abandon it without regulation or public sector replacement. Liberals and greens argue that we should include the negative impacts of fossil fuel combustion (and its agricultural corollaries—some suggest a nitrogen tax) in fuel prices. In their estimation, once these externalities increase the carbon price to $200 or $300 per tonne (or as much as $1000 per tonne, according to the US National Association of Manufacturers), the market—that efficient allocator of all goods and services—will resolve the problem. Leaving aside the grotesque inequalities that would result from steadily ratcheting up flat taxes, even as working-class and poor people spend a larger proportion of their income on fuel, carbon-tax advocates have forgotten that their solution to climate change—the market—is the very cause of the problem.

Think Bigger

How will a carbon price build a network of electric vehicle fast-charging stations? Tesla only builds them in those areas where it can rely on profits. Like a private bus company or an internet service provider, Elon Musk won’t provide a service where it doesn’t make money (or at least, one that doesn’t convince investors that it will someday make money; Tesla is currently a loss-making black hole for venture capital). The market leaves the public sector to fill the gap. (Editor’s note: While electric vehicles do consume energy more efficiently, and while a decarbonized energy grid would eliminate their emissions from charging, the electrification of personal vehicles is no solution to climate change when one considers their much higher life-cycle/embodied energy cost compared to combustion vehicles. These costs are still borne even with a decarbonized energy grid. Therefore, socialization of transportation will very likely have to be the dominant policy, perhaps with a minor, collectively owned fleet of shared automobiles. I am only keeping this section here for its argument against the market.)

This is no abstract argument. Norway provides free parking and charging for electric vehicles, allows these cars to use bus lanes, and recently decided to build a nationwide charging network. Thanks to this interventionist policy, electric vehicles accounted for over 50 percent of total new car sales in the country as of January 2018, more than anywhere else. For comparison, barely 3 percent of cars in eco-friendly but market-enthralled California are electric.

The up-front costs of some of these changes pose one important obstacle. Take, for instance, nuclear power. From a system-wide perspective, conventional nuclear power still represents the cheapest option thanks to its mammoth energy density; it also boasts the fewest deaths per terawatt hour and a low carbon footprint. The only energy source with a lower carbon footprint is onshore wind. But, like large-scale hydroelectric projects, construction costs are considerable. The Intergovernmental Panel on Climate Change notes that while nuclear energy is clean and non-intermittent, and has a tiny land footprint, “without support from governments, investments in new…plants are currently generally not economically attractive within liberalized markets.” Private firms refuse to begin construction without public subsidies or guarantees. This explains why the most rapid decarbonization effort so far occurred before European market liberalization wrapped its fingers around the neck of its member-state economies. The French government spent roughly a decade building out its nuclear fleet, which now covers almost 40 percent of the nation’s energy needs.

Similarly, to integrate intermittent renewables to their maximum potential, we would need to build load-balancing, ultra-high-voltage, smart transmission “super-grids” that span continents or even the entire globe, so as to smooth out their volatile swings as much as possible. (Editor’s note: or, at the very least, we would need substantial investment into energy storage.) While the wind might not be blowing and the sun not shining in one region, there is always somewhere else on the planet where the wind and the sun are doing what we want them to do when we want them to do it. We need to plan this project on the basis of system reliability, i.e., need. A patchwork of private energy companies will only build out what is profitable. And the up-front costs here are immense. China has its eyes set on precisely this through its Global Energy Interconnection initiative. The price tag for a worldwide electricity grid? $50 trillion.

The Regulatory Limit

Many greens call for a retreat from scale, a return to the small and local. But this, too, misdiagnoses the source of the problem. Replacing all multinationals with a billion small businesses would not eliminate the market incentive to disrupt ecosystem services. Indeed, given small businesses’ gross diseconomies of scale, disruption would only intensify.

At a minimum, we need regulation, that toe-dipping exercise in economic planning. A government policy that requires all firms that manufacture a particular commodity to use a nonpolluting production process would undermine the advantages gained by high polluters. This is the social-democratic option, and it has a lot going for it. Indeed, we should remember how fruitful regulation has been since we gained a deeper understanding of our global ecological challenges. We patched our deteriorating ozone layer; we returned wolf populations and the forests they inhabit to central Europe; we relegated the infamous London fog of Dickens, Holmes and Hitchcock to fiction, although coal particulates still choke Beijing and Shanghai (editor’s note: not so much anymore).

Indeed, much of the climate challenge we face comes from an underdeveloped global South rightly seeking to catch up. But regulation only temporarily tames the beast, and it often fails. Capital so easily slips its leash. So long as a market exists, capital will try to capture its regulatory masters. Everyone, from pipeline-exploding eco-terrorists to Paris Agreement–drafters, recognizes that this fundamental barrier stalls our attempts to curb greenhouse gas emissions: if any one jurisdiction, sector, or company undertakes the level of breakneck decarbonization needed, their goods and services will instantly be priced out of the global market.

Thus, only a global, democratically planned economy can completely starve the beast. Climate change and the wider bio-crisis reveal that multiple local, or regional or continent-wide, decision-making structures are obsolete. No jurisdiction can decarbonize its economy unless others do as well. For even if one country figures out how to capture and store carbon, the rest of the world will still face an acidifying ocean. Similar truths hold for nitrogen and phosphorus flows, closing nutrient-input loops, biodiversity loss, and freshwater management.

Moving beyond environmental questions, we could say the same about antibiotic resistance, pandemic diseases, or near-Earth asteroids. Even in less existential policy areas, like manufacturing, trade, and migration, too many interlinked nodes tie our truly planetary society together. One of capitalism’s great contradictions is that it increases the real connections between people at the same time as it encourages us to see each other as monadic individuals.

All this demonstrates both the horror and marvel of the Anthropocene. Humanity so fully commands the resources that surround us that we have transformed the planet in mere decades, on a scale that leviathan biogeophysical processes took millions of years to accomplish. But such awesome capability is being wielded blindly, without intent, in the service of profit, rather than human need.

The Socialist Anthropocene

Climate researchers sometimes talk about a “good Anthropocene” and a “bad Anthropocene.” The latter describes the intensification, and perhaps acceleration, of humanity’s unintended disruption of the ecosystems on which we depend. The former, however, names a situation in which we accept our role as collective sovereign of earth and begin influencing and coordinating planetary processes with purpose and direction, furthering human flourishing.

Such an attempt at dominion over the earth system may appear, at first glance, to be the ultimate in anthropocentric hubris; but this is in fact precisely what we argue when we say that we want to stop climate change, even if we don’t realize that’s what we’re saying. Because why would Planet Earth care about the particular temperature that has predominated since the last ice age, a highly unusual period of global temperature stability? Life on this rock, since it first emerged nearly four billion years ago, has experienced much higher average global temperatures than even the worst projections of anthropogenic global warming. The late paleontologist, socialist and committed environmentalist Stephen Jay Gould once pooh-poohed all suggestions that we need to “save the planet.” “We should be so powerful!” he responded. “The earth will be perfectly fine. It is humanity that needs saving!” Even making very simple, unobjectionable statements such as “global warming will increase extreme weather events and so we should try to avoid that,” we are inescapably embracing an anthropocentric stance: that we aim to stabilize an optimum temperature for the sake of humanity.

We cannot reach this worthy goal without democratic planning and a steady overcoming of the market. The scale of what we must do—the biogeophysical processes we must understand, track, and master in order to prevent dangerous climate change and associated threats—is almost unfathomable in its complexity. We cannot trust the irrational, unplanned market with its perverse incentives to coordinate the earth’s ecosystems. Counteracting climate change and planning the economy are projects of comparable ambition: if we can manage the earth system, with all its variables and myriad processes, we can also manage a global economy. Once the price signal is eliminated, we will have to consciously perform the accounting that, under the market, is implicitly contained in prices. Planning will have to account for the ecosystem services implicitly included in prices—as well as those that the market ignores. Therefore, any democratic planning of the human economy is at the same time a democratic planning of the earth system. Global democratic planning is not merely necessary for the good Anthropocene; it is the good Anthropocene.

CONCLUSION: PLANNING WORKS

Planning exists all around us, and it clearly works; otherwise capitalists would not make such comprehensive use of it. That’s the simple message of this book and one that strikes at the heart of the dogma that “there is no alternative.” Today, this Thatcherite slogan is already wilting under the pressure of its own success. It has created an anti-social compact: a world of rising inequality and widespread stagnation. But it is under attack from within as well. From Amazon’s warehouses to Foxconn’s factories to all major branches of industry, the capitalist system operates internally without price signals and markets. It plans—and it plans well.

However, if the good news is that planning works, the bad news is that it currently works within the confines of a profit system that restricts what can be produced to what is profitable; and so long as something is profitable, the system allows it to continue being produced, however harmful. Profit pushes capitalist planning to achieve remarkable efficiencies in resource use and human labor. But nothing stops long hours at poverty wages, climate-busting production methods and fossil-fueled transportation from being inputs into plans—in fact, a host of economic incentives encourage just this. Amazon is as much a complex planning mechanism based in human ingenuity as it is an inhuman place to work. Some 150 years later, we have much the same reaction of awe and terror at the contradictions of twenty-first-century capitalism as Marx had in the face of its Victorian antecedent.

Our world, of course, is very different from his—a world in which a haphazard quest for profit was driven by steam power and colonial expansion. Ours is a time of ubiquitous computing and increasingly sophisticated predictive algorithms, layered on top of centuries of accelerated technological and social change.

And, here, Thatcher was wrong on another count. She didn’t just say that there was “no alternative,” but went further, claiming, “there is no such thing as society.” Silicon Valley’s slogans about bringing people together may appear corny, but in that corn, there hides a kernel of truth that disproves this second Thatcherite dictum. Capitalism brings us closer together, now more than ever. Our individual actions rely on globe-spanning chains of activities of others. It takes hundreds, if not thousands, of workers to make one gadget and all its components. Many of these links are invisible to us: from the miners in Africa digging up rare earth metals, to the workers in Vietnam manufacturing OLED displays, to the millions putting phones together in Foxconn’s factories that resemble small cities, much of this labor is performed in conditions little different from those of the mills and mines of nineteenth-century Britain—dangerous, overcrowded, and demanding an inhuman pace. And all that work relies on a second, even more hidden economy of household production, whose uncompensated weight is still largely borne by women. All we see is that final anonymous, yet also indispensable, individual, often a retail worker on minimum wage, who hands over a box.

The accuracy of a Google search result or a recommended product on Amazon is built from the unpaid labor of millions of others across the globe, clicking and liking, sending untold numbers of tiny packets of information—that supposed stumbling block to large-scale planning—around the globe. The glimmers of hope for a different way of doing things are foreshadowed in the sophisticated economic planning and intense long-distance cooperation already happening under capitalism. If today’s economic system can plan at the level of a firm larger than many national economies, and produce the information that makes such planning ever more efficient, then the task for the future is obvious: we must democratize and expand this realm of planning—that is, spread it to the level of entire economies, even the entire globe.

The foundations for this alternative mode of production have, in many senses, already been laid; we already carry, in our pockets, access to more information and computing power than could have been dreamed by any of the protagonists of past debates about the possibilities of planning.

At the same time, it is not enough to say, “Nationalize it!” Planning on its own is no synonym for socialism. It is the precondition, certainly, but it is not a sufficient condition. We must be cautious not to fall for bourgeois abstractions about society and the state that do not account for its class basis. Friedrich Engels, in Socialism: Utopian and Scientific, warned against the notion that capitalist nationalization is synonymous with socialism:

Of late, since Bismarck went in for State-ownership of industrial establishments, a kind of spurious Socialism has arisen, degenerating, now and again, into something of flunkyism, that without more ado declares all State-ownership, even of the Bismarkian sort, to be socialistic. Certainly, if the taking over by the State of the tobacco industry is socialistic, then Napoleon and Metternich must be numbered among the founders of Socialism. If the Belgian State, for quite ordinary political and financial reasons, itself constructed its chief railway lines; if Bismarck, not under any economic compulsion, took over for the State the chief Prussian lines, simply to be the better able to have them in hand in case of war, to bring up the railway employees as voting cattle for the Government, and especially to create for himself a new source of income independent of parliamentary votes—this was, in no sense, a socialistic measure, directly or indirectly, consciously or unconsciously. Otherwise, the Royal Maritime Company, the Royal porcelain manufacture, and even the regimental tailor of the army would also be socialistic institutions, or even, as was seriously proposed by a sly dog in Frederick William III’s reign, the taking over by the State of the brothels.

We shouldn’t suggest planning is simply a matter of “taking over the machine”; still less “the government” taking it over and otherwise leaving the machine as it is. Since so much of our social world—our rules and customs, habits and preconceptions, these very systems of planning—has been influenced by the logic of the market, it is not simply a world that must be taken over but one that must be transformed. Likewise, we cannot risk creating a new society run by technocratic planners; we want a democratized society of citizen-planners.

Humans have long relied on planning, from the simple distribution carried out by the first settled civilizations, to the complex calculations that undergird today’s corporate behemoths, to those rare instances, like war or disaster, when the rules of today’s complex economy are temporarily suspended and planning takes over on the grandest scales. It is our hope that today’s revolutionaries, and indeed society as a whole, can recapture the ambition to make such planning a beacon for its long-term vision. To do so, we need to study how it works today, design transitional demands to expand its reach, and dream of transforming its workings completely to deliver a future realm of true liberation.

Planning is already everywhere, but rather than functioning as a building block of a rational economy based on need, it is woven into an irrational system of market forces driven by profit.

Planning works, just not yet for us.