
Global Tech Supply Chains Are as Complex as a Circuit Board

Top Gun: Maverick skims the surface of a real-life macro-political tangle around microchips.

Words: Brendan O'Connor
Pictures: Hamza Nouasria

A swirling, buzzing, screaming hive — a deadly collective, men without faces or histories absorbed into an impersonal, whirling system of machines capable of delivering death and destruction at range. A distributed, tentacular automaton. This is how “Top Gun: Maverick” opens: with scenes of American fighter jets launching from a US aircraft carrier, their rumbling takeoffs rocking an Abraham Lincoln bobblehead doll shown in profile against a golden sunset. This is the only hint of specificity available to the layman: we are orbiting the USS Abraham Lincoln, a carrier in the Pacific Fleet — key enforcer of American hegemony, mobilized repeatedly to expand, consolidate, and defend US reach and influence over and within East Asia and Oceania.

Where would the Japanese, South Korean, Singaporean, and Taiwanese economies be without the omnipresent watch of American carrier groups? Where, indeed, the Chinese economy? The latent presumption of American hegemony, and the dependence of any and all “elsewheres” is the fabric the opening moments are woven into — the film’s plot is submerged in these presuppositions as the aircraft carrier, the masked sailors and pilots, and the jets screaming across the sky make their way in an open and featureless waterscape. American military power — which relies on, congeals, and accelerates American economic power — is nowhere and everywhere; that is to say, it can be deployed anywhere.

Having been tenuously situated in this floating, shaking, explosive world, the viewer is transported to a place with a name: the Mojave Desert, California. Another golden sun, now rising, backlights another recognizable silhouette — this one not an American president but an American celebrity, Tom Cruise. (Perhaps a future president?) He’s at work on a P-51 Mustang, once a piece of cutting-edge military technology, now an antique. This kind of thing is a recurring motif throughout “Maverick”: whereas the original “Top Gun” celebrates American dominance, “Maverick” is much more anxious and ambivalent: Cruise’s character may be aging, but he still has the charisma and swagger to bed Jennifer Connelly; he may be less physically fit than younger pilots, but he has the skills to out-pilot them and to teach them. But it’s always a question, and right from the beginning it is made clear that obsolescence is inevitable.

When we are reintroduced to Cruise as Maverick, he is not flying active-duty missions or even teaching younger pilots at the Top Gun academy but helping lead a research program in another apparently featureless environment (not the ocean but the desert), rocketing through the upper atmosphere at previously unreachable speeds. The program, however, is about to be shut down — ostensibly because Maverick’s team hasn’t reached their goal of hitting Mach 10, but really because an admiral known as the “Drone Ranger,” played by Ed Harris, “wants our budget for his unmanned program.” Bucking authority, as he is wont to do, Maverick goes ahead with the day’s test flight, during which he is only supposed to hit Mach 9, before the program is officially shut down because — well, because he is a maverick. But he is also self-sacrificing: he knows he will be punished, maybe even court-martialed, for this last flight, but he goes ahead anyway because hitting Mach 10 could save his friends’ jobs.

As he takes off, Maverick addresses the plane, dubbed Darkstar: “All right sweetheart. One last ride.” The Drone Ranger pulls up to base as the Darkstar screeches overhead, literally left in Mav’s dust. Once in flight, moving at several times the speed of sound, the screaming Darkstar turns silent. Beeps and boops sound in the cockpit; a Lockheed Martin logo makes a subtle appearance. (The film’s producers consulted with Lockheed Martin on the design of the Darkstar, which is reportedly modeled on the SR-72, a rumored hypersonic successor to the SR-71, developed by Lockheed’s Skunk Works team. Ironically, the “real” SR-72 would be unmanned.) Speed (and external temperature) continue to rise. “We’re feeling good,” Maverick says. At Mach 9, he becomes “the fastest man alive,” according to a technician on the ground. This is when Maverick begins pushing his and the plane’s limits: “Come on sweetheart, just a little more,” he grunts. “Just a little.”

The camera flips between closeups on Maverick’s sweating face, the fiery exhaust of the engine, the Machmeter ticking towards 10. He hits the target speed, and the control center erupts in cheers; Harris scowls. But Maverick can’t help himself: he decides to push for more, beyond 10. “You got some balls, stick jockey, I’ll give you that,” the Drone Ranger growls. And then: the Machmeter clicks over to 10.3, and alarm bells start buzzing, lights flashing. The limit has been hit: the power of the engine begins to tear the plane apart; its structure cannot direct force into thrust. As Maverick puts it: “Oh, shit.” The control room falls silent; the camera returns to the ground, exterior, facing up toward the sky: the Darkstar tears itself to pieces, burning up in the highest reaches of the atmosphere, a portent in the desert sky.

Maverick, of course, survives the explosion: covered in sweat and dirt and dust, disoriented, looking simultaneously like a spaceman fallen to earth and a miner emerging from the underworld, he walks into a diner and gulps down water. “Where am I?” he asks. “Earth,” a freckly, red-headed kid replies. He returns to base, escorted by military police, and marched into his reprimand by the Drone Ranger. “These planes you’ve been testing, captain, one day, sooner rather than later, they won’t need pilots at all — pilots that need to sleep, eat, take a piss. Pilots that disobey orders,” Admiral Cain growls. “All you did was buy some time for those men out there. The future is coming, and you’re not in it.”


The anxiety over obsolescence, supersession, irrelevance, aging, and death structures “Top Gun: Maverick.” Whether the American superpower’s imperial hegemony can be renewed through the infusion of youthful talent alone, no matter how bright-eyed and bushy-tailed, is unclear. The USS Theodore Roosevelt — which features prominently in the film’s central conflict, wherein a mission must be flown against an unnamed “rogue nation” — was deployed in 2021 to the South China Sea during a period of heightened tensions between China and Taiwan, just a few days after President Joe Biden’s inauguration.

The carrier and its strike group were there “to ensure freedom of the seas, build partnerships that foster maritime security,” according to a statement released by US Indo-Pacific Command. “After sailing through these waters throughout my 30-year career, it’s great to be in the South China Sea again, conducting routine operations, promoting freedom of the seas, and reassuring allies and partners,” the commander of the strike group, Rear Adm. Doug Verissimo, said. “With two-thirds of the world’s trade traveling through this very important region, it is vital that we maintain our presence and continue to promote the rules-based order which has allowed us all to prosper.” That reassurance, basically, is that the United States is still prepared to throw its weight around in the region wherein China is testing political and economic boundaries.

Microchips and Macro Politics

Taiwan, in particular, has reason to be concerned: the island is home to the most advanced microchip manufacturer in the world, the Taiwan Semiconductor Manufacturing Company (TSMC), on which the biggest and most profitable American companies depend. It is also home to a government that the Chinese Communist Party considers to be in rebellion, ruling over territory that the mainland state claims as its own by sovereign right.

In the simplest terms, microchips are the technology that makes all other technology possible — the substrate of the 21st century. They’re made up of parts whose names are sometimes used to refer to the whole: semiconducting material like silicon or germanium that allows for control over the flow of an electric current, regulated by transistors, which can turn the current on or off, or amplify it. More transistors — packed onto chips of silicon or germanium — mean more power. They’re in everything; the scale of the industry that makes them is mind-boggling: “Last year, the chip industry produced more transistors than the combined quantity of all goods produced by all other companies, in all other industries, in all human history. Nothing else comes close,” Chris Miller writes in his book “Chip War: The Fight for the World’s Most Critical Technology.” As massive as it is, the chip industry is also highly consolidated: approximately 37% of the world’s new computing power every year runs on chips manufactured in Taiwan; a critical piece of technology necessary for the manufacture of the most cutting-edge chips is made exclusively by one Dutch company. “Even a partial blockade [of Taiwan] by Chinese forces would trigger devastating disruptions,” Miller writes. “A single missile strike on TSMC’s most advanced chip fabrication facility could easily cause hundreds of billions of dollars of damage once delays to the production of phones, data centers, autos, telecom networks, and other technology are added up.”

“A single missile strike on TSMC’s most advanced chip fabrication facility could easily cause hundreds of billions of dollars of damage.”

Taiwan was able to secure its strategic position in the capitalist world system — that is to say, to secure American protection — by taking advantage of a period of transition in the development of the microchip industry. Initially, chips were designed by hand and manufactured through a combination of human labor, simple tools, and high-tech machinery. A chip designer might lay out the design for a new, specialized integrated circuit using pencil and paper, and it would be pieced together by an assembly line of workers. This was adequate for chips with merely hundreds or even thousands of transistors, but not those with millions. What is more, chip designers had to keep in mind the specific production processes and capacities of the particular manufacturing facilities where their chips would actually be made. In order to keep pushing the limits of scale and complexity, chip design needed to be standardized so that the process could be more thoroughly and efficiently automated and divisions of labor between chip designers and chip manufacturers more clearly delineated. This was the microchip’s “Gutenberg moment,” Miller writes.

This moment was further realized with the founding of TSMC in 1987, effectively as a joint venture between the Taiwanese government and the Dutch electronics firm Philips. Taiwan had provided lots of cheap labor for the low end of the semiconductor supply chain since the late 1960s, but it was American firms designing and producing the most cutting-edge chips that were making the real money. In the 1980s and 1990s, China began to integrate itself into the global economy, competing with Taiwan for basic manufacturing and assembly work by offering an even lower-wage labor force eager for factory jobs. TSMC, led by former Texas Instruments executive Morris Chang, promised to solve this problem and give Taiwan the edge over China, catapulting the island to the top of the value chain and providing it with monopoly power in the industry. Chang’s plan, in short, was to sever chip design from chip production once and for all: if chip firms didn’t actually have to make the chips — that is, if they didn’t have to invest in building and staffing fabrication facilities, either in the United States or outside of it — start-up costs would be drastically reduced.

While Johannes Gutenberg himself had failed to establish a monopoly over printing, the same was not true in the chip industry. As Miller puts it: “By lowering startup costs, Chang’s foundry model gave birth to dozens of new ‘authors’ — fabless chip design firms — that transformed the tech sector by putting computing power in all sorts of devices. However, the democratization of authorship coincided with a monopolization of the digital printing press. The economics of chip manufacturing required relentless consolidation. Whichever company produced the most chips had a built-in advantage, improving its yield and spreading capital investment costs over more customers.”

So it was for TSMC, whose business soared during the 1990s, putting it in a position to dominate manufacturing in the industry for decades to come. While each generation of technological development made production more expensive, consolidating manufacturing in a small number of firms around East Asia made it easier to bear these costs: “A foundry like TSMC could fabricate chips for many chip designers, wringing out efficiencies from its massive production volumes that other companies would find difficult to replicate.”

The internationalization of chip production is what has allowed the tech sector to remain profitable and continue growing. But as supply chains have extended across the Pacific, they have been stretched thin, concentrating control of production in the hands of just a few firms. To continue growing — to produce chips at both the necessary quantities and level of quality required to sustain profit rates — the industry must continue consolidating, as the cost of manufacturing the most advanced chips continues to rise. This contradictory process is reflected across scales, from the international organization of labor within the industry to the design of nanoscopic microchips themselves.

The Transistor Labor Market

The concept for the transistor (the base component of computing power) was theorized by the physicist Bill Shockley at Bell Labs as early as 1945, but it would take about two decades for the theory to be proven in practice and, just as importantly, to develop a cost-effective production process. “The science of transistors was broadly clear,” Miller writes, “but manufacturing them reliably was an extraordinary challenge.” The breakthrough came at Fairchild Semiconductor, a firm begun by eight engineers who fled Shockley’s management regime, which was infamously dictatorial. At their new firm, they developed a technique for fabricating chips in which holes were etched as needed into a layer of silicon dioxide coating slabs of silicon, protecting the base material from impurities while allowing multiple transistors to be built on the same chip without any freestanding wires. These would come to be known as “integrated circuits,” and they were significantly more reliable than any comparable device. They were also more readily miniaturized, which meant that they would require less electricity to work. The Fairchild founders had a revelation: “Miniaturization and electric efficiency were a powerful combination: smaller transistors and reduced power consumption would create new use cases for their integrated circuits.”

This combination, they discovered, would also allow chipmakers to pack more and more transistors into the same limited space. In 1961, Fairchild announced the Micrologic, a silicon chip with four transistors on it. Soon, the firm was making chips with a dozen transistors, then a hundred. In 1965, one of the co-founders, Gordon Moore, predicted that the number of components that could be fit onto a chip would double annually for the next decade, anticipating ten years of exponential growth in computing power that would make possible all kinds of personal electronic devices like wristwatches and portable telephones. He was both right and wrong: computing power did grow exponentially for the next decade — and another four decades thereafter. This prediction has now been naturalized as “Moore’s Law.”

However, giving Moore’s prediction the appearance of physical law wasn’t only a matter of shrinking the size of transistors, Miller notes. It also required an enormous supply of cheap labor, people who could be driven to greater and greater levels of productivity. This would be the role played by Charlie Sporck, who arrived in California to join Fairchild as a manager, after being run out of a unionized GE factory in Hudson Falls, New York. (Workers burned him in effigy.) In the Santa Clara Valley, on the other hand, the labor movement was weak, and Sporck fought any effort to change that. While most of the people who designed the chips were men, the workers who actually assembled them were largely women — including many immigrant women — who had been working on assembly lines in the Santa Clara Valley for decades. Miller writes:

“Chip firms hired women because they could be paid lower wages and were less likely than men to demand better working conditions. Production managers also believed women’s smaller hands made them better at assembling and testing finished semiconductors. In the 1960s, the process of attaching a silicon chip to the piece of plastic on which it would sit first required looking through a microscope to position the silicon onto the plastic. The assembly worker then held the two pieces together as a machine applied heat, pressure, and ultrasonic vibration to bond the silicon to the plastic base. Thin gold wires were attached, again by hand, to conduct electricity to and from the chip. Finally, the chip had to be tested by plugging it into a meter — another step that at the time could only be done by hand.”

Incredibly difficult, tedious work, in other words. And as the market for chips grew, so did the need for labor to do this work.

But even the non-union, immigrant women of the Santa Clara Valley demanded high enough wages that costs threatened to creep back up. Industry executives sought solutions within the continental US, opening facilities in Maine and on a Navajo reservation in New Mexico, but before long, they began to look overseas: specifically, to the British colony of Hong Kong, where the average wage of 25 cents an hour was among the highest in Asia but just a tenth of the American average. Fairchild would continue making silicon wafers in California but began shipping semiconductors to Hong Kong for final assembly. Low labor costs also meant that Fairchild could hire trained engineers to run the assembly lines, which led to higher production quality. Fairchild opened its Asian operation in 1963; within a decade, almost all of the other chipmakers had opened assembly facilities overseas as well. They had to, if they were to keep up with Fairchild. “The semiconductor industry was globalizing decades before anyone had heard of the word, laying the grounds for the Asia-centric supply chains we know today,” Miller writes. “Managers like Sporck had no game plan for globalization. He’d just as happily have kept building factories in Maine or California had they cost the same. But Asia had millions of peasant farmers looking for factory jobs, keeping wages low and guaranteeing they’d stay low for some time.”

But not forever. Moore’s Law is not a natural law, but a prediction based on production capacities, capital flows, and the availability of highly exploitable labor. There are limits: political and economic as well as physical. “At some point, the laws of physics will make it impossible to shrink transistors further,” Miller warns. “Even before then, it could become too costly to manufacture them.” Already it is proving more difficult to keep costs down: the extreme ultraviolet lithography machines needed to print the smallest and most advanced chips cost more than $100 million apiece. (And only one company in the world makes them.) And yet, Miller notes, startups focused on designing chips for artificial intelligence and other highly complex and specialized logic chips have raised billions of dollars in funding, while the big tech firms like Google, Amazon, Microsoft, Apple, Facebook, and Alibaba are pouring funds into their own chip design arms. “There’s clearly no deficit of innovation,” he writes. The question, Miller argues, isn’t whether Moore’s Law has hit its limit, “but whether we’ve reached a peak in the amount of computing power a chip can cost-effectively produce. Many thousands of engineers and many billions of dollars are betting not.” In other words, they are betting that if they throw enough money at the problem, they’ll be the ones to break through the limit — and release untold profits and productivity on the other side.


Since the 1970s, the heart of American capitalism has moved from the Midwest to the Santa Clara Valley. The tech sector, built on computing power and non-union labor distributed around the Pacific, has provided the American military with cutting-edge weapons and the American public with mind-numbing toys, all packed with a growing number of transistors. But why did this happen in Northern California? How did the Santa Clara Valley become Silicon Valley? “California’s culture mattered just as much as any economic structure,” Miller offers. “The people who left America’s East Coast, Europe, and Asia to build the chip industry often cited a sense of boundless opportunity in their decision to move to Silicon Valley. For the world’s smartest engineers and most creative entrepreneurs, there was simply no more exciting place to be.” This does not really answer the question though. Miller’s “Chip War” is an excellent and detailed history of semiconductors as a commodity and the people who made them — well, their bosses, anyway. But it falls short as a history of place and power.

Building the Systemic Inequality of the Tech Industry

The history of Silicon Valley begins long before it was called such, as shown in Malcolm Harris’s “Palo Alto: A History of California, Capitalism, and the World.” “Unlike so much of the world, California did not see capitalist economics evolve step-by-step out of feudal property relations. Capital hit California like a meteor, alien tendrils surging from the crash site,” Harris writes. More attentive to racial divisions of labor than Miller, Harris begins his history of Palo Alto with the California gold rush and the “whiteness cartel” that developed in order to organize and allocate profits therefrom. The gold rush fundamentally and irrevocably changed prevailing relations to land and property in California, he argues: while Indigenous societies lived off the land efficiently, communally, and in concentrated territories, the surface miners drawn to the west coast by the promise of gold moved like locusts, exhausting territory and moving on as quickly as possible. “Instead of cycling with the seasons, mining moved linearly, exponentially, cumulatively. There is no such thing as enough gold.”

Nobody would make any money if everyone was constantly stealing and robbing from each other, though. “Crude protocols for collective governance” were developed, but they were not universal: “This was Anglo-Californian self-government, and that hardly described the mass of miners… Excluding foreigners and Indians from gold claims became a raison d’être for the miner councils and then for the Golden State government itself.” This was a regime built on racial violence, Harris argues: the state in and of California does not only manage racial violence but directs it, encourages it, and organizes it.

And so it would remain, even as the regime of accumulation in the state evolved over time. Before long, surface mining was replaced by more productive forms of extraction. Pans were replaced with rockers, rockers with sluice boxes, and sluice boxes with hydraulickers, which could wash away entire hillsides in search of veins of gold. “The more efficient the model, the more investment capital was required — for research into claims, for engineers and construction, for expensive field provisions, and for employees,” Harris writes. “The frontier community of free white gold miners with nothing on their backs disintegrated as specialists such as engineers and managers took over operations on behalf of clean-handed investors.” The era of settlement had passed: “Now the state’s economic life reorganized under capitalist auspices, and settlers became workers.”

Many of those settler-cum-workers (white and otherwise) were absorbed into the burgeoning railroad industry, which was developed in part by robber barons like Leland Stanford, who had made his first fortune during the mining boom. This was an epochal shift, Harris writes:

“With the advent of the integrated world system, in which the transcontinental line was, along with the Suez Canal, a decisive link, investment flows determined the shape of what was to come. Capital’s ravenous hunger for higher returns carved a new physical and social geography out of the earth. It figuratively flattened space, blowing holes in some mountains as well. But contrary to some progressive expectations, it failed to dissolve barriers between peoples. Instead it formalized new ones. Capitalists used racial segmentation to generate wage differentials, and legal, economic, social, and civic exclusion fell together in a dialectical tumble, each determining and determined by the others.”

Through a combination of luck, cunning, and financial ingenuity, Stanford was able to use the opportunities afforded by the railroad to grow his wealth beyond imagining. And what did he do with that wealth? Like a lot of rich guys, he got into horses. This was no mere hobby, however. By 1870, heavily agricultural California was home to three times as many draft animals per farm as the national average; finding a way to increase productivity meant reducing the cost of horses — making them better, faster, stronger, more durable, more productive. “He saw himself as engaged in a serious scientific campaign regarding the improved performance of the laboring animal — hippology, or equine engineering,” Harris writes. “If he could master the production of better horses, then he could improve the country’s capital stock… Stronger, more durable horses led faster carriages and bigger plows for longer, which reduced the costs of production and increased social circulation in unimaginable ways.”

The logic of the Palo Alto System that Stanford developed in breeding horses — that is, the logic of capital, articulated as eugenics — structured the organization of the new university.

After his son died at a young age, Leland and his wife Jane Lathrop Stanford founded Stanford University in his honor, endowed generously by the family. Following Stanford Sr.’s death, the university president David Starr Jordan engaged in a power struggle with Jane for control of the future of the school. Jane was poisoned twice following her husband’s death; circumstantial evidence points to Jordan. In any case, Jordan outlasted Jane Stanford, and was able to shape the future of the university to his (and his benefactor’s) interests: namely, racial hygiene and the science of evolution. The logic of the Palo Alto System that Stanford developed in breeding horses — that is, the logic of capital, articulated as eugenics — structured the organization of the new university. With Asia across the ocean to the west and Mexico to the south, California was “the frontier of Anglo white dominion, and it became a laboratory of racial classification.” Capitalists drew workers in, segregating, exploiting, and expelling them as necessary, producing and re-inscribing difference through law and practice. Growers became particularly adept at this strategy, pedaling the state’s nonwhite labor “like a bicycle,” as Harris puts it: “When they pushed one group down, another rose to replace it, and the whole contraption moved a little farther down the road.”

If California was a laboratory of racial classification, the “eugenic university” provided the lab techs: “Stanford itself was a self-consciously eugenic project: Administrators believed they were selecting for and promoting not simply the best young men and women but also the best genes.” Bill Shockley Jr., son of a Stanford engineering professor, was one of those young men: “Bill Jr. was promising, but at 129 his IQ ranked slightly subgenius.” Still, the subgenius Shockley carried the lessons of his highly cultivated childhood into adult life. In 1939, when America began to assume a war footing, Shockley was working at Bell Labs, a rising star within the firm’s physics research group even though his model for a semiconducting transistor was (apparently) yet to work in real life. As the war progressed, he was recruited closer and closer to the front lines — or, at least, researching what was happening across the front lines with a squadron of actuaries armed with paper and pencil. “Together they created the field known as operations research, a term Shockley invented over the course of their work. By breaking problems down to mathematical questions, they reduced the war to a series of brain-teasers.” His job was to end the war as efficiently as possible, Harris writes:

Shockley knew that a society’s central resource was its citizenry; everything could be reduced to months of generic labor. When he did the math, he found that the bombing of Germany hadn’t really been so effective: in terms of man-months, building the bombs cost the Brits around a third of the damage they did to the Nazis. The numbers in the Pacific were even worse. But Shockley retrained the radar bombing crews, and in the spring of 1945 they began night raids with napalm and white phosphorus munitions, burning Japan’s cities. These attacks included the March Tokyo sortie that torched half the city to the ground in the war’s single deadliest night.

Shockley wasn’t read in on the Manhattan Project — at least not officially, Harris notes — but he and J. Robert Oppenheimer et al. were operating according to the same logic of efficiency. Two weeks after Shockley sent a memo titled “Proposal for Increasing the Scope of Casualty Studies,” in which he projected that a successful invasion of the Japanese mainland would require the death of 5 to 10 million Japanese and one dead American soldier for every ten Japanese killed, the United States dropped an atomic bomb on Hiroshima.

“Shockley is the founder of Silicon Valley the way a pile of excrement is the founder of a garden.”

After the war, having been given the country’s highest civilian award, the Medal for Merit, Shockley struck out on his own, working his military, industrial, and academic connections for funding for his own semiconductor manufacturing firm. This proved more difficult than might have been expected, because Shockley, while brilliant, was a menace to work with or for. Finally, one of his old mentors, Arnold Beckman, founder of Beckman Instruments, brought him into the fold, allowing him to start his own lab as part of Beckman Instruments. In 1956, Shockley won the Nobel Prize along with the two scientists who were able to prove his transistor theory. Shockley Semiconductor Laboratory would turn out to be a complete failure, but Shockley himself, Nobel Prize winner and war hero, was able to bring together some of the country’s most brilliant young engineering talent in California — if only to alienate them and motivate them to start up their own ventures.

Eight of them would jump ship, starting their own firm with capital from Sherman Fairchild, the heir to one of IBM’s co-founders. Within a year, Fairchild Semiconductor was selling chips to IBM, initially for $150 each — chips that cost the firm only 13 cents to make. The founders of the new firm became fabulously wealthy and powerful, pillars of the Silicon Valley to come. And as for the subgenius who set the whole operation in motion? “Shockley is the founder of Silicon Valley the way a pile of excrement is the founder of a garden,” Harris quips.

Globalizing the Supply Chain

Still, there was more money to be made: of the 13 cents it took to actually make chips, only three went to materials, leaving an entire dime for labor. The labor process is delicate and complicated: silicon wafers need to be prepared for other components — itself a complex chemical process — sliced into pieces and mounted on circuit boards, tested, and packaged. In the early days, Fairchild oversaw all these steps directly in manufacturing facilities located in the Bay Area: “The firm hired women to do the assembly work and men to supervise them, echoing the gender-segregated division of labor in the orchards and canneries. Fairchild had three main ways to reduce labor costs: reduce the amount of labor per chip via automation, find a way to reduce the cost of the labor per chip, or, preferably, both.” Automation, however, was high risk: what if you invested a whole bunch of money in expensive new machinery, only for someone to figure out a way to do the same task more cheaply using human labor? That machine — or, heaven forbid, expensive new manufacturing facility — is already obsolete, even before it has earned back the money put into it.

How else might one reduce the cost of labor per chip? The going rate for assembly line work in the Bay Area at the time was around $2.50 an hour, while in Hong Kong — where Fairchild opened shop in the early 1960s — it was only 10 cents. This was, as much as any design tweak, the central innovation of the computer age: the splitting apart of “high-cost engineering and design from low-cost assembly work,” which Taiwan and TSMC would capitalize on so shrewdly in the decades to come. The American tech industry and the American state stood in uneasy and ambivalent relation to each other over the course of the latter half of the 20th century, but each came to the other’s aid when necessary: the state guaranteeing tech’s profit rates and tech guaranteeing the state’s military dominance: “The First World’s Cold War arsenal created the production enclaves where capital could count on low wages, freeing semiconductor firms and ultimately US industry in general from domestic wage-price inflation. Besides, putting production in East and Southeast Asia kept electronics firms near their biggest customer: the US military.”

From a certain angle, Harris muses, the geography carved into the surface of the earth as a result of this symbiosis itself began to take on the appearance of a microchip: “Like the components in an integrated circuit, America acted to isolate nations from the international current of socialist revolt and connect them in a precise pattern of capital investment, labor exploitation, and profit flow.”


As value in motion, capital is always seeking to shift from one form to another: circulating capital becomes fixed, and fixed capital circulates. It is precisely in the transformation that value is produced. This circulatory process is not a closed loop, however, but a constantly expanding spiral. Capital pushes beyond its boundaries: “Every limit appears as a barrier to overcome,” as Marx puts it in the Grundrisse. It creates the frontier and then leaps beyond it, pushing upward and outward but also drilling down, into the earth but also into itself. Ultimately, this is what “Moore’s Law” is describing: the need to divide and subdivide, partition and repartition, to make more space from which to extract profit and generate power to spread outward further and faster, only to divide and subdivide again and again.

This tendency is reflected across scales, from the microchip to the nation-state. The end of empire and the end of communism gave birth to a slew of new countries midway through the 20th century and again at its close, Quinn Slobodian argues in “Crack-Up Capitalism: Market Radicals and the Dream of a World Without Democracy,” but in the 21st century the nation-state has been joined by a new territorial entity: the zone. “What is a zone? At its most basic, it is an enclave carved out of a nation and freed from ordinary forms of regulation. The usual powers of taxation are often suspended within its borders, letting investors effectively dictate their own rules,” Slobodian writes. “At one end of the socioeconomic spectrum, zones can be nodes in the networks of cross-border manufacturing. Often ringed by barbed wire, these are sites for low-wage production. At the other end, we can see a version of the zone in the tax havens where transnational corporations secrete away their earnings.”

Using the metaphor of “perforation” to describe the “decades-long effort to pierce holes in the social fabric, to opt out, secede, and defect from the collective,” Slobodian argues that while the zone’s promoters have cast it as freeing capital from the shackles of the state, in reality these zones not only cannot exist without a strong state but constitute yet another chain constraining democracy and liberation. For some of the zone’s advocates, of course, this is explicitly a draw. At the beginning of the 21st century, Peter Thiel announced his plan to escape his obligations to the democratic state, with all its taxes and regulations. “I no longer believe that freedom and democracy are compatible,” he wrote, infamously. “The great task for libertarians is to find an escape from politics in all its forms.” And how would that be possible? “If we want to increase freedom,” he argued, “we want to increase the number of countries.” To this end, Thiel would fund the Seasteading Institute for many years, supporting its effort to establish sovereign states, modeled on corporations, floating in the open ocean.

The Seasteading Institute was founded by Patri Friedman, grandson of the libertarian Milton Friedman, who himself long sought ways to replicate the example of Hong Kong, which he saw as typifying a state where freedom for capital could be secured without needing to bother with the demands of popular sovereignty or the complications that stem from democratic governance — namely, demands for social programs, health care, public education, and environmental protections. Friedman and his friends sought to create a “Portable Hong Kong,” as Slobodian puts it, one without contradiction or conflict: “a mobile template, untethered from place and freed for realization elsewhere. As a model zone, Hong Kong held out the prospect of an escape from the dilemmas and pressures of midcentury democracy.” But it is Hong Kong’s specificity that makes it what it is: not only its physical geography, shielded from typhoons by the unique mountain range surrounding Kowloon Bay, but its political and cultural history — that is, its history as a British colony, as China’s airlock, its “role as the switchboard and front shop for the mainland factory, a cockpit for the Chinese boom.”

Another postcolonial British territory in Asia, Singapore, occupied a similar space in the emergent neoliberal imaginary of the period. In 1972, Singapore became the second country in Asia to retool its port for shipping containers, becoming the fourth-busiest port in the world nearly overnight and quickly moving up the value chain. The Dutch electronics firm Philips had come to the island in the early years after the British left. By 1969, Texas Instruments had opened a plant there. Apple followed suit in 1981. Singapore had positioned itself well for the age of the microchip:

The first place to call itself a “smart city,” Singapore attempted to wire the country with broadband and put a computer in every home with the Intelligent Island initiative in the 1990s. Not only did the country produce the literal hardware at its semiconductor foundries, but its laws, when exported to places like coastal China, were referred to as “software.” Singapore chimed with the idea of cut and paste, the notion that a government’s operating system could be duplicated and realized elsewhere.

This idea bounced back and forth between the core and the periphery, the (former, declining) metropole and the (former, rising) colony: Thatcherite neoliberals in the United Kingdom looked to Singapore for inspiration, finding in the postcolonial city-state a potential model for the flagging British economy — a notion revived decades later, after Britain left the European Union.

But the Thatcherites misrecognized what was happening; or, rather, they saw in Singapore what they wanted to see: not the careful combination of state planning and provision (and repression) that actually was, but a laissez-faire paradise — an image that the Singaporean state was careful to cultivate to attract foreign capital. “The argument over the meaning of Singapore is part of a larger argument over the future of capitalism,” Slobodian writes. “Will the race to the bottom based on low taxes, low wages, and light regulation continue, or will it be replaced by a race to the top based on high wages and heavy investment? Either way, the vision is marred by blind spots.” The largest of these is the question of labor: “the sand in the machinery of globalization.” There are no Singaporean solutions to Britain’s problems, Slobodian argues, for the simple reason that Singapore’s problems and Britain’s problems are not so different: both are marked by an aging population desperately clinging to eroding social entitlements and reliant on young migrant workers to keep the lights on.

Recently, a new generation of unhinged libertarians (or “Neo-reactionaries,” as they call themselves) is looking again to the postcolonial semi-periphery for inspiration. For Curtis Yarvin, another beneficiary of Thiel’s largesse, Hong Kong, Singapore, and now Dubai prove that “politics is not necessary to a free, stable, and productive modern society.” By the early 2000s, Dubai’s population was made up of roughly 95% foreign nationals. To Yarvin and his co-thinkers, this has freed the state from the bonds of citizenship, transforming the primary social relation into that of the customer: “Abstract ideas of civic belonging or obligation had no place in Dubai.”

The overwhelming majority of that population, however, was not businessmen, entrepreneurs, or other members of the international capitalist class, but migrant workers drawn to higher wages than those on offer in their home countries (mostly across South Asia) and denied the rights or benefits of Emirati citizenship. “While foreign residents from richer countries (known as expats rather than migrants) enjoyed all-you-can-drink brunches and the creature comforts of the West, manual workers were kept in barbed-wire encampments in the desert to minimize flight risk and costs of upkeep.” Unlike nineteenth-century treaty ports in China, where different laws applied to different people, in Dubai, different laws applied to different parts of the territory — jurisdictions organized by function: tech and aviation manufacturing, healthcare, higher education, and finance, for example. “Dubai captured the three qualities of the millennium’s global city: verticality, novelty, and exclusivity,” Slobodian writes. “To someone arriving by air across the vast tan plain of the desert interrupted by islands of desalination plants, grand estates, and industrial bunkers, Dubai’s zones appear like ‘computer motherboards.’ This is how the emirate presented itself to investors, too, a flat space where ‘multinationals can plug in their regional operations.’”

The most significant of these is Jebel Ali, “a vast free trade zone and, at sixty-six berths, the world’s largest man-made port… a formally extraterritorial space, five thousand acres of land paved, wired, and ready for construction.” It was also, by the 2000s, the US Navy’s busiest port of call.


Since the outbreak of the Covid-19 pandemic, an intense anxiety has bubbled to the surface of the national consciousness, swirling around supply chains generally and semiconductors and computer processors specifically. The fear, sometimes boiling over into a panic, is that the globalized supply chains American consumers (and firms) rely on for all manner of goods are stretched too thin, have become too vulnerable, and ought to be reeled back; a parallel fear, even more liable to panic, is that America is losing its technological edge to China. In some ways, it was the 2016 election that broke the dam of neoliberalism holding back these fears, but it is the Biden administration that has assembled and passed massive legislative packages ostensibly seeking to address them with a new industrial policy. (That is to say nothing of Biden not only keeping Trump’s China tariffs but also further restricting China’s access to the most advanced chip-making technology.) Whether the CHIPS and Inflation Reduction Acts represent more of the same or the beginning of a new regime of accumulation is yet to be seen — if there is a break or discontinuity, just how drastic is it?

To defeat the nameless, faceless, flagless antagonist of “Top Gun: Maverick,” our hero must descend from the upper atmosphere, down to just a few hundred feet above the ground, to pass by enemy radar unnoticed while navigating treacherous terrain to accomplish a nigh-impossible bombing run, thereby denying the identity-less enemy nuclear capacities. Maverick and his acolytes pull it off, of course, though not without incident. It’s a fun romp, as propaganda goes, and curiously attentive to the fears and anxieties of late imperialism. But the lingering image is not the dramatic dogfights or the young guns’ glistening torsos but that of Maverick taking the Darkstar for one last ride, trying to squeeze a little bit more juice from the engines before going up in flames. To those on the ground he appears as “the fastest man alive,” but careening across the commanding heights Mav is barely in control. He is propelled by forces beyond him, strapped to a machine with which he can only plead: “Come on sweetheart, just a little more.”

Brendan O'Connor

Brendan O'Connor is a writer in New York and the author of "Blood Red Lines: How Nativism Drives the Right."
