This Article examines how monopoly power warps incentives to innovate within the largest tech companies across history. Technology monopolies face competing incentives: to innovate and to maintain the status quo. Sitting at the center of their markets, monopolists have the resources and capacity to generate tremendous disruptive innovation. However, disruptive innovation threatens to destabilize the profitable market structure that a monopolist sits atop. We find that technology monopolists do not fail to innovate, but that they instead restrict that innovation from being released to the market or release the innovation in a diminished way, yoked to the existing technology over which they have monopoly power. We refer to this pattern as “Captured Innovation.” We examine Captured Innovation, and the actions that can break the dam on technology development that it represents, in three case studies of the development of general-purpose technologies at IBM, AT&T, and Google. We find that in these cases where Captured Innovation has taken hold, competition enforcement actions or competition itself can unleash a wave of innovative development.

I. Introduction

Technical innovation drives human development. It also drives markets: a new invention can rapidly and completely change the way business is done, and which firms dominate a sector. For firms in technology markets, constant innovation is typically table stakes for doing business. Competitors constantly update their products and sometimes invent entirely new product categories, so any company that fails to participate in this cycle of innovation quickly becomes irrelevant. For this reason, tech companies devote a significant portion of their resources to research and development, because this work is the very core of their enterprises. This is particularly true of the largest firms, as Joseph Schumpeter famously noted more than 80 years ago.1 However, transformative innovation can present dominant firms with a unique dilemma: what to do when their scientists and engineers invent something that challenges their business model, or worse, threatens to make it obsolete? The benefits of technological development to society are clear, but the benefits to a dominant firm that believes it has already captured all the value in a market are not. Monopolists in technology sectors thus have dual incentives towards innovation, on one hand, and stasis, on the other. How do companies in such a position respond to these competing incentives? We examine this question through three case studies of IBM, AT&T, and Google, finding that these three dominant firms responded to these conditions with a two-pronged strategy. First, they withheld the innovation from the marketplace for as long as possible. Second, when they did release the innovation to the marketplace, they ensured that the innovation was yoked to their existing product or service, even when the innovation was better suited to a new product category. We refer to this phenomenon as Captured Innovation.

These three case studies show that different patterns of events can rupture the dam of Captured Innovation, allowing welled-up innovation to reach markets and consumers. Our first two case studies of IBM and AT&T show that in the aftermath of enforcement actions, innovation flourishes in these and adjacent markets. In the final case we study, of Google and the market for transformer-based large AI models, we find a similar surge of welled-up innovation is set off by the entrance of a competitor to a fallow market. At least in part, these blooms of innovation occurred because there was underutilized, widely applicable technology waiting to be unleashed. These cases show competition is essential to tear down market-power-based barriers that prevent other market participants from using those new innovations. And they show that legal intervention and structuring of markets to affirmatively promote competition can kickstart the innovation cycle in uncompetitive technology markets.

To ground our case studies, we posit some commonalities of technology-driven markets. First, they are innovation driven. Firms in these markets can rise and fall on the invention of new widely applicable technologies that make older developments obsolete. However, dominant firms sometimes struggle to adapt to disruptive innovation for non-technological reasons.2 Experts also find that incumbents are less likely to bring new products to market than upstarts, for fear of cannibalization.3

Second, the need for constant innovation requires comparatively high investment in research and development (R&D).4 For monopolists in technology markets, this appears to be favorable: they have excess capital to fund large R&D departments and can afford to invest in basic science knowing that some will eventually be very, very profitable, even if most will not be.5

Finally, R&D is expertise-driven, meaning that these firms’ competitive advantage resides in their top-performing scientists and engineers. This is why firms today frequently seek to restrict this category of employees’ ability to leave with non-competes and other agreements.6 Here, monopolists again have some advantages, because they can nurture and retain the best talent with both top pay and the freedom to pursue whatever work is most interesting, regardless of immediate profitability.7

These factors together mean that many widely applicable basic technologies have been invented by scientists working for large monopolies. This has included general-purpose computers at IBM, a basket of signal-transmission technologies and devices at AT&T, and the word embeddings and transformer structures that make large language models possible at Google.

Despite inventing these world-changing technologies, dominant firms often fail to fully explore their potential in the marketplace. We are not the first to observe that innovative firms, once dominant, resist innovation; Tim Wu has written on the topic of large companies eating their young, so to speak.8 The pattern we observe is slightly different, however: IBM, AT&T, and Google did not avoid creating innovation. Instead, they resisted using the innovations they themselves developed to their full capacity. The monopolist is often first on the scene with the underlying technological innovation that eventually disrupts its dominance. But the incentive to preserve current market structures drives monopolists to deploy innovation only in service of their existing monopoly, rather than exploring new greenfield uses that may create whole new markets.9 In IBM’s case, this was the lease and service-oriented static mainframe model instead of general-purpose computing made pliable with independent software. In AT&T’s case, it was “plain old fashioned telephone service” over copper wires instead of high-bandwidth signal transmission and the internet. And for Google, it is search, instead of answers.

Many of the business and marketing practices these companies employed to maintain their monopolies earned scrutiny from competition enforcers. Some believe that competition enforcement of this sort stifles innovation, especially in technology markets.10 Setting aside whether competition enforcement helps or hinders innovation, this conversation generally treats “innovation” as an atmospheric good, as if technical innovation, once created, is immediately available to consumers and the market as a whole. Intellectual property law rests on the contrary assumption that innovators hold back if they do not have sufficient economic incentives to make their advances public. Through the lens of history, this Article studies the facts of competition enforcement against technology monopolists and finds the opposite: competition enforcement is critical to promote and unleash innovation when dominant firms hold it back.

II. United States v. IBM: 1969–1982

The story of IBM’s long clash with competition laws was foundational to the way the modern digital economy developed. The core of this story is the Department of Justice case against IBM from 1969 to 1982. Scholars have spilled boundless ink on this trial, and the prevailing wisdom for much of this history was that the IBM trial was a morass and an embarrassment for the government—one of the lowest points in antitrust and competition enforcement. More recent work has recontextualized this history as a qualified success: something less than perfect but far from an ineffective effort to vindicate the goals of competition laws. We proffer no further judgment of the case or its management. Our focus is on IBM’s actions in response to the threat of the case and other private antitrust suits. We find that in direct response to the threat of competition enforcement, IBM took actions with profound pro-competitive effects that gave birth to the entire software industry as we know it today.

A. Pre-1960s Litigation & Technical History

To tell the story properly requires some history of IBM’s anticompetitive behavior. In 1913, Thomas Watson—the man who would become CEO of IBM—was criminally convicted of violating the antitrust laws.11 In the 1930s, IBM’s primary commercial products were tabulating machines and punch cards, which mechanically computed mathematical results,12 and the company dominated this technology area, controlling roughly 85% of the tabulation market by the early 1930s.13 From this position, IBM engaged in a number of anticompetitive practices, including tying14 and refusing to sell machines (only leasing them15). In 1932, the DOJ brought a complaint against IBM for the illegal tying of tabulating machines and punch cards under Clayton § 3,16 and again in 1952, winning marginal victories. In January 1956, IBM ultimately agreed—without a court decision or admission of guilt—to a consent decree in which it would license its patents and spin off its service business.17 The decree also required IBM to sell its machines in addition to leasing them, to eliminate any further leverage IBM had through its lease terms. Part of the reason for this consent decree was no doubt the force of the government case, but it certainly benefited from an IBM that was not so dearly attached to its tabulating business as it had been a decade earlier. By the early 1950s, the technology for computation was beginning a radical shift. By the time of the 1956 consent decree, the transition away from punch card tabulating machines was the clear trend. Electronic digital computing devices would be the critical technology of the next several decades of IBM’s business.

The commercial power and standing IBM had gained in tabulation made it the clear dominant player in this emergent electronic computing field. However, early competition for this market was fierce: Honeywell, Sperry Rand, and General Electric were active players throughout the 1950s and 60s.18 Electronic computing devices during this early period were generally not true general-purpose computers as we know them today. These systems lacked operating systems, and few were Turing complete. These devices were still being sold in the same way that earlier tabulating machines were—single purpose ‘business machines.’ The shift to electronic computation had made these devices easier, faster, and cheaper to manufacture, but had not yet fundamentally changed what they did. In this dynamic nascent market, it was not enough to simply lock up contracts. Technology was changing rapidly enough, even for core components, that patents did not serve as the same guarantee of market control that IBM had previously enjoyed. Because these devices were self-contained rather than reliant on a punch card ecosystem, there was less outside leverage even a dominant player like IBM could bring to bear to prevent defection.

B. General Purpose Technology: General Purpose Computers (The System/360)

This competitive landscape pushed IBM to develop one of the most transformative products in its history: the System/360 (also known as the S/360), introduced in 1964. The S/360 was revolutionary in many ways: it was the first computer sold with an operating system that controlled access to resources like memory and storage. It was also the first computer with a system-independent software architecture. This made it expandable, so that customers could add additional memory or processors after their initial purchase if their needs grew. It also lowered IBM’s long-term costs of development and maintenance, because IBM developed a single architecture that could support a wide range of machines, from very small to very large. In many ways, it was the first commercially offered general purpose computer that we would recognize as such today.

IBM was initially motivated to build the S/360 by the need to manage its own development and maintenance costs.19 But by the early 1960s, computation itself became the task that customers needed, as opposed to a single business task executed by a computer. Many businesses in a variety of fields wanted to run econometric models, and the need for scientific computing was growing by leaps and bounds with both government and private sector investments in technology development. In this environment, the S/360 was a near-instant success and massively profitable.

A fact that is not widely understood outside of technical circles is that the S/360 was a success not just because of its innovative hardware design, but also because of its massive leaps forward in software.20 Many innovative software features (such as emulators) and design practices (such as the concept of pilot projects) were created during the development of the S/360. “The Mythical Man-Month,” still widely regarded as the Bible of software engineering, was written by Fred Brooks, who managed the development of the S/360 and its operating system, to share the development practices he created while building software at IBM.21 These leaps forward didn’t just benefit IBM’s customers—they were massively profitable for IBM. The flexibility and compatibility of the S/360 allowed it to be sold for a variety of purposes. Its general-purpose nature also created huge cost savings for IBM over its lifetime by reducing maintenance costs. Between 1964 and 1969, IBM’s net annual earnings more than doubled, from $431 million to $934 million.22

Given this success, it is somewhat puzzling at first glance that, after delivering OS/360, the operating system shipped with the S/360, IBM largely retreated from significant software development for nearly a decade. This was despite the fact that IBM researchers continued to research and propose path-breaking new algorithms and structures that the company simply refused to develop into products for many years. Inside IBM, the attitude was that the company could, and should, never again build a product so disruptive to its own marketplace. IBM president Thomas Watson Jr. said, “There is no question that we cannot go through another announcement like 360 where we obsolete virtually our entire installed revenue base at one time and where we commit a very substantial portion of our total production to a new technology.”23 He made it a policy “never to announce a new technology which will require us to devote more than 25% of our production to that technology.”24 And so, the drive toward innovation which had saved the company from obsolescence was set aside.

Through the late 1960s, IBM set about consolidating its hold on the market for mainframe computing instead. By this point, the market was waking up to the true potential of electronic computers. The first patent for software was awarded in 1968 to Applied Data Research for a data sorting program.25 The filer said that he did not initially intend to sell it (although ADR eventually did), but merely wished to prevent IBM from copying it and selling it on its own machines.26 The concept that code could be a durable good, and that code written for one computer could be run on another, took a few years to truly take hold, but once it did its power was undeniable. Feeling competitors nipping at its heels, IBM employed the strategies it had used to dominate the market for tabulation machines: bundling hardware, software, training, and system support at a flat price to lock in customers.27 It quickly took over and then maintained over 80 percent market share of mainframe computers used in the United States.28

C. IBM Reacts to Enforcement Pressure by Unbundling

Throughout the 1960s, the sense that software development had the potential to be a separate market grew. The first antitrust suit in this era against IBM was a private one, filed in 1969 by Control Data Corporation (CDC), another supercomputer manufacturer. The initial CDC complaint alleged 37 violations of competition laws, including hardware and software monopolization.29 It also sought (unsuccessfully) to enforce in computer markets the 1935 and 1956 consent decrees binding IBM’s behavior in tabulating machine markets.30 CDC was the first of many private enforcers.31 Beyond bringing their own cases, IBM competitors also complained to the DOJ that “IBM had achieved and maintained its dominant market position not because of better products, good marketing, solid sales practices, or quality support, but principally through IBM’s willingness to provide whatever level of support and services the client wanted, without charging directly for these services.”32

The DOJ case, when it was filed, served as the anchor to enforcement against IBM from 1969 until 1982.33 The case covered an expansive set of practices. While it was ultimately dismissed, it was an open declaration that the government was again heavily scrutinizing IBM. And more important than that general effect, it signaled specific practices that were suspect and allegedly anticompetitive. The original complaint announced scrutiny of:

1. A “pricing policy whereby [IBM] quote[d] a single price for hardware, software and related support [which allowed:]

        a. [cross subsidizing support to inhibit] the entry or growth of competitors; and

        b. Limit[ing] the development and scope of activities of an independent software and computer support industry.

2. [Bundling] accumulated software and related support to preclude [IBM’s] competitors from effectively competing for various customer accounts;

3. [I]ntroducing selected computers, with unusually low-profit expectations, in those segments of the market where competitors had or appeared likely to have unusual competitive success, and by announcing future production of new models for such markets when it knew that it was unlikely to be able to complete production within the announced time; and

4. [G]ranting exceptional discriminatory allowances in favor of universities and other educational institutions [to dominate the educational market.]34

These actions were obviously corrosive to other mainframe computer manufacturers who couldn’t compete with IBM’s army of support and service workers. However, they were also preventing the yet-to-be-born software industry from existing at all. The transition to true, general-purpose computing had happened in hardware, but not yet in the minds of many customers. The concept of software itself was revolutionary, both in the flexibility it allowed and in its portability between systems. And since buyers of IBM mainframes could simply turn to IBM for any software they needed at no additional charge, it was virtually impossible for an independent software company to build a business.

IBM was aware of these competition concerns before any lawsuit was filed. These business practices and others were the subject of numerous conversations with its business rivals and of investigative demands from the DOJ for years prior to the CDC suit. IBM’s lawyers concluded its tying practices, in particular, would not survive a court challenge.35 In an (ultimately failed) attempt to preempt a suit, IBM announced in late 1968 that it would unbundle the services that had been, until this point, included with hardware sales.36 IBM’s decision to unbundle hardware and software had a profound impact on the American economy, effectively giving birth to the modern software industry. The impact of this decision was not merely to create economic incentives; it also created more technical opportunities for competitors to build products that could work with IBM’s hardware.

To explain why this is the case, we must briefly digress to explain the development of the practice of modularization of software. Anyone who has watched a Windows PC boot up in the past 30 years has watched several layers of the ‘software stack’ used by the computer load in sequence. After turning on the computer, first the BIOS loads, which directly controls the physical components of the computer, such as the CPU and system memory. Then, the operating system is loaded, in this case, Windows. Then, the user may launch another application, such as an internet browser, to do actual work. These different layers have defined responsibilities and interfaces. When an application needs to interact with the local computer’s resources, perhaps by saving a file or printing an image, it hands off the operation to the operating system rather than directly interacting with the resource. Early computing devices did not have this modularization of functionality, and borders between layers of this ‘stack’ didn’t really exist in the early period of the development of digital computers. By the late 1960s, however, the growing physical complexity of computer hardware and the complexity of the tasks being done by software called for new solutions. Modularization was proposed as a strategy to contain complexity and standardize how different components of a computer worked together. Modularization would go on to be key in allowing software built by different parties to work with each other and co-exist on the same device.
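For readers who want a concrete picture of this layering, the following sketch shows, in miniature, how an application can rely on a defined operating-system interface rather than touching the hardware directly. It is purely illustrative: the class and method names are hypothetical and do not correspond to any IBM product or real operating system.

```python
# Illustrative sketch only: a toy "stack" showing how modularization separates
# layers behind defined interfaces. All names here are hypothetical.

class DiskDriver:
    """Lowest layer: directly controls the (simulated) physical device."""
    def __init__(self):
        self.blocks = {}

    def write_block(self, block_id, data):
        self.blocks[block_id] = data


class OperatingSystem:
    """Middle layer: owns the hardware and exposes a stable interface upward."""
    def __init__(self, driver):
        self.driver = driver
        self.next_block = 0

    def save_file(self, name, contents):
        # Applications never touch the driver directly; they call this interface.
        self.driver.write_block(self.next_block, (name, contents))
        self.next_block += 1


class ReportApplication:
    """Top layer: application code written against the OS interface only."""
    def __init__(self, os_layer):
        self.os_layer = os_layer

    def run(self):
        self.os_layer.save_file("quarterly_report.txt", "Sales were up.")


# Because each layer depends only on the documented interface below it,
# a different application (or a different driver) can be swapped in
# without rewriting the rest of the stack.
ReportApplication(OperatingSystem(DiskDriver())).run()
```

Because each layer talks only to the documented interface of the layer below it, any piece can be replaced, or sold separately, without rewriting the rest: precisely the property IBM needed once hardware, software, and services had to be priced and delivered as separate products.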

When IBM was faced with needing to unbundle their hardware, software, and services offerings, they turned to modularization as the technical means to execute this strategy.37 IBM had to better define and better modularize the software it developed for customers so that it could sell those individual products and services separately. A delivered mainframe computer would include hardware, system control code, and a programming language. After unbundling, other needed items such as peripherals, application code, and code to control peripherals would be sold separately. Separate teams were working on all these different product silos, and in order for these products to work together as individual items, they needed well-documented, consistent interfaces.38 These technical adjustments necessitated by unbundling also made it easier for customers and competitors to expand, adapt, and innovate on the base products IBM delivered, because those base products were better documented and more self-contained, with well-defined interfaces.

By creating more intensive areas of scrutiny, public and private enforcement efforts forced IBM to fundamentally change its business practices. This happened even though many of the cases did not ultimately succeed. One IBM historian noted, “the filing and prosecution of the antitrust case affected IBM’s business behavior for the next twenty years.”39 This pressure, regardless of the outcome of the cases, fundamentally shaped IBM’s decisions. Scholars credit much of IBM’s decreased antagonism to the “‘antitrust phobia’ resulting from being the subject of a Sherman Act case designed to break up the company.”40

D. Competition Enforcement Preserves the Unbundling

The decision to unbundle software and services from hardware sales could very well have been a blip, however. The decision was intended initially to preempt litigation, but once that failed, IBM might have reversed course. Instead, it concluded that it had to go forward with the plan if it had any hope of winning the private or government suits. Throughout the 1970s, IBM was under near constant pressure to defend its business practices in court across multiple cases. The private litigation was a key driver of IBM’s behavior. As courts at the time recognized, a private antitrust plaintiff sues “not only on its own behalf, but as a ‘private attorney general’ representing the public interest. Congress established the private remedy to enlist the public as enforcers of the antitrust laws. The courts should encourage this function.”41 Alongside the major public enforcement action, private settlements, victories, and even losses reshaped the competitive landscape.

For example, in January 1973, Control Data Corporation settled its five-year suit with IBM for the right to purchase IBM’s Service Bureau Corporation at a heavily discounted rate of $16 million—roughly a quarter of its market value.42 The Service Bureau Corporation sold computing as a service for all of the major business computing needs of the 1960s and early 1970s.43 This amounted effectively to a divestiture, the type one might see as a consequence of later successful monopolization cases. It removed a critical tool in IBM’s repertoire for influencing the developing computer industry.

Several other cases involving similar claims went to trial throughout the 1970s across a number of different domains. Symbolic Control Inc. sued IBM in November of 1971 for bundling critical software for commercial machining into a larger hardware and services package,44 and the case remained alive for a decade.45 California Computer Products and Transamerica Computer Company both sued IBM in separate cases beginning October 1973 for monopolization of peripheral products.46 Neither suit was ultimately successful, but both stretched on to the end of the decade.47 Cases like Telex Corp v. IBM and Memorex Corp v. IBM likewise went to trial on claims that IBM monopolized peripheral hardware. Telex brought suit in January 1972, and initially won a trial court award of $259.5 million the following year before it was overturned in 1975.48 Memorex likewise brought a case in December 1973 claiming IBM bundled its disk drives to its mainframes and used predatory pricing to monopolize peripherals.49 The court declared a mistrial and granted a directed verdict to IBM, which was ultimately upheld on appeal in 1980.50 The important element of these cases, successful or not, was that they filled out a rich and credible set of guardrails that IBM was required to at least be attentive to. This is the backdrop against which IBM felt an “antitrust phobia”—not just a single big case from the DOJ.

The results of this were substantial.

E. Unbundling Result: The Modern Software Industry

After IBM announced its unbundling project at the end of 1968, competitors were quick to dive into the newly existent software industry. At the time of the announcement, to our knowledge no independent software companies existed in the United States. By 1980, there were more than 6,000. These new enterprises were possible because there were now thousands of customers whose computers shared identical interfaces on which any new software could run. Gone were the days when every type of computer, or even every individual machine, needed custom-written code. By 1980, sales of packaged software (excluding custom-written programs) exceeded $2.7 billion.51 Despite the difficult economic conditions the 1970s presented, this was a fantastic growth rate from zero sales by zero companies at the start of the decade.52

Many important software companies were founded during this period, including many by ex-IBMers who now saw opportunities for success outside of ‘Big Blue.’ Gene Amdahl, the Chief Architect of the IBM S/360, left the company in 1970 to launch his own company, which made IBM-compatible mainframes.53 In 1972, five German software engineers in IBM’s nascent artificial intelligence division left to form their own company, SAP, when their project was killed; SAP would go on to become the world’s largest non-American software company.54 The company that would go on to be Oracle was founded in 1977 to build database software based on a concept published by an IBM researcher. Big Blue did not pursue the concept commercially until a decade after the publication of the academic paper proposing it. These companies were not the exception: the software industry roared to life in the 1970s.

This explosion of innovation was not an accident. It happened because the technical changes IBM made to implement unbundling and satisfy competition enforcers created a technical stack with multiple, consistent entry points for packaged software on a large installed base of IBM 360 computers. The multiple entry points that were necessary for true unbundling meant that this new software industry was not limited to a single category such as consumer applications like spreadsheets or word processors. Software for managing physical peripherals, software performing intermediate functions such as sorting and searching files, business applications such as databases, and many more were all made possible at the same time. And the large installed base of IBM machines created a market—companies could make a software package and sell it to many customers.

III. United States v. AT&T: 1974–1984

In this Section we explore how AT&T developed, and then failed to use, nearly all of the most important signal transmission technologies and devices of the 20th century. With the help of its regulators, AT&T was able to control the market for signal transmission for decades despite its refusal to use the innovations its own scientists were creating. Unlike IBM, AT&T did not primarily exert power through technological tying and bundling—though examples of this are not hard to find.55 Instead, it influenced regulators to preserve its integrated business arrangements.

A. Pre-1960s Litigation & Regulatory Posture

The American Telephone and Telegraph Company was founded in 1885 as an outgrowth of Alexander Graham Bell’s Bell Telephone Company established roughly a decade earlier.56 Shortly thereafter, the company adopted a corporate slogan, “One Policy, One System, Universal Service,” that reflected its drive to hold a permanent regulated monopoly—providing service in exchange for unchallenged control of communication.57

Over the next six decades, AT&T accumulated power by acquiring local phone companies and building relationships with the Federal Communications Commission (FCC) and state regulators. After years of anti-competitive behavior, the DOJ sued AT&T and Western Electric for monopolizing the market for telephones and telephony equipment in 1949.58 In a soon-to-be familiar template, the complaint alleged that AT&T gamed the system for regulatory rate setting, excluded rival equipment makers, and delayed the introduction of more efficient, improved telephony technology.59 The proposed relief was an existential threat: breaking up the Bell System.

The AT&T response to this suit is instructive. AT&T did not respond with a legal argument; instead, it mobilized other arms of government against the legal process. On February 28, 1952, AT&T counsel met with Attorney General J. Howard McGrath to request a formal postponement of any antitrust action for the duration of the Korean War, arguing that a case would disrupt the important work Bell Labs and Western Electric did for national defense.60 AT&T met with senior Department of Defense (DOD) leadership on the same day, urging the DOD to intercede to protect AT&T.61

In March of 1952, the Defense Secretary, with signoff from the Secretaries of the Army, Navy, and Air Force, wrote a letter urging the Attorney General to indefinitely postpone the antitrust case.62 The DOJ rejected the request, but the campaign by AT&T and the Department of Defense continued. Eventually, AT&T changed tactics and solidified a message: instead of delaying an enforcement action, the DOJ should shelve it altogether. In 1953, AT&T met with Stanley Barnes, the Assistant Attorney General (AAG) for Antitrust, and left a memorandum urging the DOJ to drop the antitrust case because AT&T’s monopoly power was a matter of “legislative and regulatory policy” that the DOJ should not seek to put before the courts, a position that was supported by the DOD.63

The question of how to resolve AT&T’s anti-competitive actions continued to be argued behind the scenes between the FCC, DOJ, and DOD, with AT&T as a de facto equal player influencing and directing communications between the other parties on several key occasions.64 Finally, in late 1955, while the AAG of the Antitrust Division was out of town, DOJ leadership met and rapidly moved to settle the case without divestiture. None of the staff on the case supported settlement without divestiture of Western Electric, and several explicitly and separately wrote to oppose it.65 Nonetheless, at the end of 1955, DOJ leadership informed AT&T that it would consider a consent decree without divesting Western Electric. AT&T drafted and submitted that decree, and the government accepted it on January 24, 1956.66 The decree made some meaningful contributions in requiring the licensing of Bell-owned IP, but the fundamental market power was unchanged, and ultimately protected by government. It was part of a trend. As the former CEO of American Airlines noted, “the AT&T monopoly survived until the 1980s not because of its naturalness but because of overt government policy.”67

B. General Purpose Technology: Modems

Signal transmission devices for purposes other than voice communications have a surprisingly long history. The core innovations we are concerned with in this Section are the devices that eventually became the backbone of the modern internet: modems.

The legal story for modems begins, strangely enough, with a plastic box on a rotary dial phone. Since 1921, the Hush-A-Phone Corporation had sold a plastic funnel for phone microphone receivers to help muffle outside sound.68 To control the phone system, AT&T contractually banned any equipment it didn’t own from connecting to its systems, including what it called “foreign attachments” like the noise cancelling funnels made by Hush-A-Phone.69 AT&T discovered the product in the 1940s and began telling phone subscribers that AT&T would disconnect phone service to anyone using the product.70 Hush-A-Phone, facing an existential threat to its business, petitioned the FCC on December 22, 1948, to invalidate AT&T’s prohibition on these connections.71 Initially, the FCC found no “physical harm of any consequence” from Hush-A-Phone.72 However, AT&T also argued that because Hush-A-Phone affected the sound of phone calls (which was, indeed, the point of the device), it created ‘degraded quality’ of the call. The FCC credulously agreed and dismissed the complaint on December 21, 1955.73 Hush-A-Phone appealed to the federal courts, which found the whole matter somewhat ridiculous.74 The court overturned the FCC decision in 1956, holding that AT&T may not put up “unwarranted interference with the telephone subscriber’s right reasonably to use his telephone in ways which are privately beneficial without being publicly detrimental.”75

This right was expanded by the Carterfone litigation, which began in 1966.76 The Carterfone was a device for connecting a telephone to a two-way radio system, intended for use by oil workers and ranchers while they were in the field. AT&T again prohibited these devices from its network.77 Carterfone sued under the Sherman Act.78 The court referred the matter back to the FCC, believing it had primary jurisdiction.79 By this point, the FCC was changing somewhat, attempting to reorient the regulatory landscape to allow computers to connect to the telecommunications network. In 1968, the FCC agreed that AT&T’s restriction was “unreasonable in that it prohibits the use of interconnecting devices which do not adversely affect the telephone system.”80

AT&T lifted the foreign attachment ban but began selling and requiring the use of “protective devices” that customers who wished to connect equipment to the network could buy for $10 plus $2 a month ($90 and $18 in 2024 dollars).81 The plan, literal rent seeking, was a bold step after such a recent legal rebuke. Nonetheless, connection was now open.

These cases were ultimately not about noise cancellation or phone lines for oil fields. They were the starting gun for modems and the modern internet. Modems were invented by Bell Labs engineers in 1958 for the SAGE air defense program. The Bell 101, the first commercial modem, went on sale in 1959. Nearly 10 years later, modems outside AT&T’s control could finally attach to the network and transmit data beyond telephone service. Frustratingly though, under the new data access arrangement, non-AT&T modems could only access the network through the “protective device,” and users of these devices were forced to pay hundreds of dollars annually for being outside the AT&T ecosystem.82 This friction was tremendous, and the FCC was unwilling to resolve it for years despite complaints right out of the gate.83

Other manufacturers like Novation and Vadic sold modems, but these devices were severely limited by this mode of connection.84 The FCC eventually created its Part 68 regulations on the sale of electronically connected modems and other terminal equipment in 1975.85 However, AT&T sued, and the legal battle lasted another two years, until 1977.86 In all, AT&T was able to stave off the interconnections that enabled modems for about 30 years. For almost 20 of those years, AT&T itself was sitting on modem technology and preventing widespread uptake by others because it simply saw no use for it within the phone system it ran.

After 1977, however, the floodgates of transmission device development were unleashed. Almost immediately thereafter, Hayes Microcomputer Products was founded.87 The company would go on to release a series of transformational modems including the Hayes Smartmodem in 1981, the first modem that could be directly dialed by a user’s computer and the more powerful Smartmodem 1200 the following year.88

C. General Purpose Technologies: Signal Transmission Media

The U.S. signal transmission network combines multiple types of transmission technologies for different environments. AT&T owned and deployed nearly the entire copper transmission infrastructure for telephone and telegraph. But beyond that, Bell Labs invented or significantly contributed to nearly every major modern signal transmission technology: microwave, fiber optic, cellular, and satellite. In 1947, AT&T built the first commercial microwave relay line from New York to Boston.89 The next commercial microwave relay for voice calls had to be built by MCI several decades later. The 1956 consent decree required liberal licensing of Bell-owned patents, but the legal battle to actually deploy competing microwave systems was lengthy. Signal transmission over fiber optic cable combined the invention of a room-temperature semiconductor laser by Bell Labs scientists and glass suitable for optical transmission by Corning, both in 1970.90 The first commercial single-mode fiber optic transmission cable was also built by MCI, in 1983.91 The concept of cellular networks was invented at Bell Labs in 1970,92 although the first commercial cellular network wasn’t built until 1979, by NTT in Japan. Famously, communication satellites were first proposed by Arthur C. Clarke, but the first practical proposal came in a 1955 article in the journal Jet Propulsion written by a Bell Labs scientist. After Sputnik in 1957, the federal government built on this Bell proposal by creating ComSat in 1962.

If Bell Labs invented all these valuable technologies, why were they so rarely the first to commercialize them? Vastly expanded signal transmission capacity was clearly needed, and the business opportunities were readily apparent. However, AT&T did not see its business as competitive signal transmission. “Universal, adequate service” for voice calling was the business, and it was hugely profitable. Instead, other parties, especially MCI, pushed aggressively to offer innovations that AT&T created but did not deploy.

MCI planned to enter the market as many challengers to monopolists do, through a side door. The initial proposal was modest: cheaper interoffice and interplant communications for small businesses.93 However, this would ultimately become cheap, plentiful mass signal transmission to enable the internet and a smartphone in every pocket.94 MCI’s microwave relay networks could provide an alternate path to the copper-wire transmission offered by AT&T. But MCI would still need to connect at a local level to the AT&T-owned transmission network at a switchboard or some equivalent to give customers access to this capacity. This would not supplant AT&T for mass communication at the outset, but AT&T immediately perceived that it could become a cheaper alternative for some private businesses and open the window for wider viability. As a threshold matter though, any signal transmission competitor to the Bell System had to seek approval from the FCC to operate.

On December 31, 1963, MCI filed initial applications for licenses to build microwave relay facilities from Chicago, Illinois to St. Louis, Missouri.95 From February 24 to April 10, 1964, every relevant company in the Bell System, as well as other signal transmission incumbents like Western Union, filed petitions to deny MCI’s application.96 They alleged, among other things, that there was simply “no demonstrated need” for more signal transmission capacity, that new capacity would be “wasteful duplication,” and that MCI was not sufficiently qualified to operate these microwave relays (in large part because it was not an existing incumbent).97

AT&T wanted signal transmission to be used nearly exclusively by land lines serving mostly local calls. This was the market AT&T most tightly controlled, and it wanted to pause history there, imagining signal transmission as a static utility, much like water and electricity delivery: an unchanging utility it would sit atop forever. MCI and those like it envisioned a different world, where the means and methods of communication would continue to evolve and grow.98 Eventually, regulators sided with MCI, agreeing in August 1969 to grant MCI’s licenses.99

As modest as this approval was, it was nonetheless a landmark moment for signal transmission that encouraged other companies to apply for similar approval.100 By June of 1970, the FCC had 37 applications for 1,713 private line microwave relay stations across the country,101 and so released a more general rule that broadly approved new applications for specialized point-to-point microwave relays like MCI’s.102 The attempts to delay competition in this small submarket through FCC process had bought the Bell System about eight years, but the significant question of connection to AT&T-controlled local networks was still undecided.103

For the next few years of negotiations, AT&T tried to deny MCI connection to the local networks needed to deploy its relays.104 MCI raised funds and built capacity for a national offering, believing the FCC order would ultimately resolve AT&T’s intransigence. MCI hired an experienced legal negotiating team to resolve the dispute through 1972–73 and appealed to the FCC to address negotiating delays.105 AT&T meanwhile denied connection, charged excessive prices when connection was allowed, and intentionally delayed or improperly executed installation and maintenance of interconnection facilities.106 When these methods were exhausted, AT&T simultaneously filed defective interconnection plans with 49 state utility commissions, forcing MCI to contend with AT&T’s legal department in dozens of jurisdictions at once.107 These interconnection plans urged the most restrictive possible interpretation of AT&T’s obligations, refusing to allow MCI to use AT&T’s facilities.108

This was a deadly threat to MCI and microwave transmission. At the time, private lines with the type of interconnection that AT&T was denying to MCI were a $400+ million annual market.109 MCI had borrowed $64 million to build its network and was sitting on only about $7 million in cash while burning about $2.4 million every month.110 At the same time, in September 1973, AT&T CEO John deButts argued to the state utilities “the case for the common carrier principle and thereby, by implication, to oppose competition, espouse monopoly.”111 AT&T was asserting a right to a government-sanctioned monopoly.

In response, MCI could only beg the FCC to enforce its order and prevent delay tactics. To its credit, on October 4, 1973, the FCC rejected AT&T’s state filings and reasserted exclusive jurisdiction.112 It called for “no delay in honoring requests of specialized carriers for interconnection facilities required by such carriers.”113 AT&T still did not provide those interconnections. By December 13, 1973, the FCC initiated proceedings against AT&T for refusal to interconnect MCI.114 Independently, a court ordered AT&T to provide those interconnections weeks later.115

After years of fighting to connect to AT&T’s network, in March of 1974 MCI filed a private monopolization suit against AT&T alleging predatory pricing, denial of interconnections, negotiation in bad faith, and unlawful tying.116 The next month, on April 15, the Third Circuit vacated the court order against AT&T and pushed the issue exclusively to the FCC.117 AT&T took this as a legal blessing of its monopoly, or at least a sign of immunity until the FCC decision. Over a single weekend between the Third Circuit and FCC rulings, AT&T ordered its local companies to rip out any AT&T-affiliated connections for MCI’s customers.118 This was a last-ditch attempt to kill MCI: the physical damage over those three days could cost more time, capital, and customer goodwill than MCI could afford. It was especially bold because the FCC had said disconnecting MCI’s customers would be a violation of the law.119 In response, on April 23, the FCC finally ordered AT&T to provide the interconnections, including the ones it had ripped out that connected through AT&T switching systems.120

Unlike IBM, to this point AT&T had primarily enforced its monopoly through government power, generally through the FCC, with support or a blind eye from others. However, if there was a tipping point when this support began to change, it was this. After years of watching AT&T’s anticompetitive behavior, on November 20, 1974, the DOJ finally joined the fray, filing a monopolization suit against AT&T.121 The AAG at the time said the words AT&T had been fighting for years, “I am fully aware of the service that the Bell System has provided. Nevertheless, I believe the law must be enforced.”122 The rest of government was not blind to the growing problem of AT&T and the FCC’s capture either. In 1974 Senator Hart’s Antitrust and Monopoly Subcommittee held hearings on the need to change this regulatory structure. In one, the Director of the White House Office of Telecommunications Policy testified that the FCC was “in the posture of ‘permitting competition,’ a posture that is entirely antithetical to our basic traditions. The burden and the benefit of regulation have shifted: The would-be provider of a new communications service, rather than the monopolist, is now required to justify his existence, and the monopolist, rather than the would-be customer of that new service, receives the benefits of the regulatory machinery” and that the “regulatory apparatus has become a barrier to competition and innovation required for the future direction of communications. The end result is that innovation is discouraged.”123

Nonetheless, the AT&T battle against signal transmission competition sprawled for the remainder of the decade. For example, in 1975 MCI offered Execunet, a novel long-distance service, to the general public in 15 cities. Customers could, in effect, dial in to a private line and then, from that private circuit, make a long-distance call that would be fulfilled by MCI.124

The playbook was familiar. In the first half of 1975, AT&T met extensively with FCC staff to lobby them informally on why MCI’s Execunet should not be permitted.125 Weeks later, the FCC denied MCI’s application to deploy Execunet without ever holding a formal hearing, and MCI appealed this decision to federal court.126 When the court granted a stay, the FCC immediately sought to return the matter to its own jurisdiction rather than that of the courts; the court obliged, and the FCC would hold a formal hearing.127

In July 1976, the FCC again denied MCI’s right to challenge AT&T with Execunet, and again, MCI appealed this decision.128 The D.C. Circuit agreed a year later that “nowhere in that decision can justification be found for continuing or propagating a monopoly” and especially not just because “AT&T got there first.”129 This back-and-forth between the FCC and the courts looped two more times until April 1978, when Judge Skelly Wright batted down the FCC’s “strikingly unfair” decision “in direct and explicit contradiction to our Execunet decision.”130 He held that AT&T must interconnect MCI and that “neither the Commission nor AT&T is now free to choose to ignore the answer given by this court, in lieu of one more favorable to their position.”131

Alongside this regulatory saga, antitrust enforcement began to land. The private MCI suit went to a jury trial from February to June of 1980. The jury found AT&T liable and awarded MCI $600 million, trebled to $1.8 billion.132 Likewise, the long-delayed DOJ case finally started in 1981. By this point, MCI had already proved much of the DOJ case. On January 8, 1982, the writing was on the wall; AT&T agreed to a consent decree requiring it to divest the companies that provided local telephone service.133 On August 11, 1982, Judge Greene finalized the decree noting that it would:

“[a]llow AT&T to become a vigorous competitor in the growing computer, computer-related, and information markets. Other large and experienced firms are presently operating in these markets, and there is therefore no reason to believe that AT&T will be able to achieve monopoly dominance in these industries as it did in telecommunications. At the same time, by use of its formidable scientific, engineering, and management resources, including particularly the capabilities of Bell Laboratories, AT&T should be able to make significant contributions to these fields, which are at the forefront of innovation and technology.” 134

The breakup formally took effect on January 1, 1984.135 For decades AT&T had been able to shape the legal landscape of signal transmission to protect its monopoly on copper wire telephony. With that power broken for the time being, wholly new uses and capacity for signal transmission would emerge. That new landscape would create the modern internet. And after the Bell System’s breakup, the AT&T long distance company finally began pursuing several infrastructure projects using these and other new technologies in earnest.

D. Aftermath: Growth of Packet-Switched Data Transmission Networks

Cheaper and better functioning devices like modems, alongside higher-bandwidth signal transmission advancements, allowed data transmission to grow in the 1970s and 1980s. As AT&T’s power waned during this period, new data transmission networks combined these resources under a new paradigm for data transmission: packet switching. Historically, an entire dedicated circuit would carry the signal for the duration of a phone call—this is circuit switching. Packet switching instead breaks data up into distinct chunks, called packets, and sends them sequentially.136 This allows multiple streams of data to use a circuit over the same time window, and allows for greater fault tolerance. Packet switching was transformational in the development of data networks. It also contradicted the business model of AT&T, which involved leasing dedicated circuits and charging by the minute for access to them. According to Paul Baran, one of the early pioneers of packet switching, “the one hurdle packet switching faced was AT&T. They fought it tooth and nail at the beginning. They tried all sorts of things to stop it. They pretty much had a monopoly in all communications. And somebody from outside saying that there’s a better way to do it of course doesn’t make sense.”137
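To make the contrast concrete, the toy sketch below splits two messages into small numbered packets, lets them share one link over the same time window, and reassembles each stream at the far end. It is purely illustrative: the field names and packet size are invented and do not correspond to any real protocol used by AT&T or the ARPANET.

```python
# Illustrative sketch only: a toy model of packet switching. Field names and
# sizes are hypothetical and do not correspond to any real protocol.
from itertools import zip_longest

PACKET_SIZE = 8  # characters of payload per packet

def packetize(stream_id, message):
    """Break a message into small, individually addressed packets."""
    chunks = [message[i:i + PACKET_SIZE] for i in range(0, len(message), PACKET_SIZE)]
    return [{"stream": stream_id, "seq": n, "payload": chunk}
            for n, chunk in enumerate(chunks)]

def interleave(*packet_lists):
    """Simulate many streams sharing one circuit in the same time window."""
    shared_link = []
    for group in zip_longest(*packet_lists):
        shared_link.extend(p for p in group if p is not None)
    return shared_link

def reassemble(packets, stream_id):
    """Collect one stream's packets and restore the original order."""
    mine = sorted((p for p in packets if p["stream"] == stream_id),
                  key=lambda p: p["seq"])
    return "".join(p["payload"] for p in mine)

a = packetize("A", "Packets from one caller")
b = packetize("B", "Packets from another caller")
link = interleave(a, b)             # both conversations share the same line
print(reassemble(link, "A"))        # -> "Packets from one caller"
print(reassemble(link, "B"))        # -> "Packets from another caller"
```

The point of the exercise is the economic one described in the text: once traffic is broken into packets, a single circuit no longer needs to be dedicated to a single conversation billed by the minute.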

The first U.S. packet-switched network, ARPANET, was conceived in 1966 and set up between UCLA and Stanford in 1969. It was instantly popular and grew to nine sites in 1970 and 57 in 1975. But unlike the regulated telephone carriers in Europe that set up packet switching networks in the late 1960s, Bell Labs avoided this area, refusing to participate in ARPANET research (although AT&T did lease the lines it ran on).138 AT&T even declined an offer from the government in 1971139 to take over the management of ARPANET and lease back usage to ARPA members as part of its regular service, saying that it was incompatible with its business strategy.140 AT&T’s tariff structure made the growth of ARPANET difficult as well. Even though it was a packet-switched network for data, ARPANET still ran over leased AT&T lines, and the company’s rules for the underlying leased communications lines did not permit shared usage.141 This meant that ARPANET could never grow beyond ARPA contractors and “authorized users.”142 As demand grew, people developed other data networks that were less restricted in their potential membership.

In 1972, Bolt, Beranek and Newman (BBN), the company that built the packet routers that ARPANET relied on, created a subsidiary to commercialize data networking for a wider audience.143 The former project manager of ARPANET, Larry Roberts, left in 1973 to become the president of the new venture, Telenet. By the end of that year, Telenet applied to the FCC to be the first regulated, public packet-switched network. To actually offer a public service, though, Telenet would need permission from AT&T. Thankfully, in 1974, after approving Telenet’s application, the FCC also found AT&T’s regulations against sharing leased lines overly restrictive and required it to revise its tariffs to allow “Composite Data Carriers” such as Telenet. Others would soon follow, including Tymnet and CompuServe. This growth was further fueled in 1976 by the publication of an international standard for packet-switched networks, X.25.144 This predecessor to TCP/IP made it substantially easier for new entrants to launch new public or private data networks, and dozens of such networks were launched between 1976 and 1980.

By the late 1970s, the rise of packet-switched networks was undeniable, even for AT&T. AT&T applied to the FCC for permission to launch its own packet-switched network in 1978 and was swiftly approved. However, instead of launching a competitor to Telenet or the other networks, AT&T spent several years envisioning and attempting to build a packet-switched network that could eventually be a replacement for the entire Bell network.

AT&T’s Advanced Communication System was eventually launched as Net 1000. The system was a voice-over-IP and cloud computing system several decades before its time.145 It was designed to allow packet-switched voice calls to be made that could also be passed to a traditional circuit-switched network.146 It would also be a distributed computing system that would allow data customers to access computing and storage resources located anywhere in the network. AT&T still saw itself as a provider of end-to-end communication, so the idea of an offering that was designed for “foreign attachments” was antithetical.

As with IBM, when faced with an innovative, disruptive technology, AT&T attempted to use the technology only for its existing product offerings, rather than offering customers the new technology itself. Net 1000 launched in 1982, and, unfortunately for AT&T, it was buggy, unreliable, and not what customers wanted. It didn’t offer simple X.25 connections for the mainframes and personal computers that businesses already had and wanted to connect, it had terrible uptime performance, and it was several times more expensive than Telenet or any other packet-switched network’s connections. AT&T ceased offering new Net 1000 connections in 1984, the same year the breakup was finalized, and shut down the system entirely in 1986.147

In parallel to this failed entry, other research and academic institutions that AT&T’s tariffs had excluded from ARPANET began to develop their own alternatives. By 1986, the National Science Foundation established NSFNET, a research network that would be open to all academic and non-commercial researchers.148 The new system ran on the TCP/IP protocol over lines leased from MCI at reduced rates.149 This system became the backbone on which our current internet was built, Ship-of-Theseus-style. The removal of the final components of this old non-commercial network in 1995 created the commercialized internet we have today.150

IV. Google and Large Language Models: 2017–2024

A. Technical Background: A Brief History of Deep Learning & Language Models

Advancing to the modern day, we face another in a series of general-purpose technological inflection points: transformer-based neural network AI systems. To understand transformers, we must first understand neural networks. Neural networks are a structure for machine learning that has existed for many decades. In a neural network, each ‘node’ is a function, a small piece of code that takes an input and yields an output. Today most neural networks are ‘deep’ and consist of at least three layers of nodes: an input layer, one or more hidden layers, and an output layer. In this structure, each node is responsible for only a tiny amount of work, and the functionality of the network is a combination of the functions of the individual nodes and the structure of the network. The general trend in deep learning has been toward ever-larger models, driven by advances in hardware and in techniques for indexing and storing data.
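To make this structure concrete, the following short sketch (written in Python with the NumPy library; the weights, sizes, and input values are invented solely for illustration and do not describe any real system) shows a tiny three-layer network in which each node applies a weighted sum and a simple nonlinearity to its inputs.

import numpy as np

# A toy three-layer network: 4 inputs -> 8 hidden nodes -> 3 outputs.
# The random weights stand in for values a real network would learn.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 8))   # input layer to hidden layer
W2 = rng.normal(size=(8, 3))   # hidden layer to output layer

def relu(x):
    return np.maximum(0, x)    # each hidden node applies this simple function

def forward(x):
    hidden = relu(x @ W1)      # every hidden node: weighted sum plus nonlinearity
    return hidden @ W2         # output layer combines the hidden nodes

print(forward(np.array([1.0, 0.5, -0.2, 0.3])))   # three output values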

The current cycle of innovation and investment traces to the early 2000s, when computer scientists began using NVIDIA Graphical Processing Units (GPUs) to train neural networks.151 GPUs are specialized computing hardware designed to perform many simple operations in parallel. They were initially developed for fast rendering of video game graphics. However, computer scientists eventually realized that neural networks are similarly made up of small, simple functions that require scale rather than complexity of operation, meaning GPUs are very well suited to training deep learning models. When NVIDIA saw the research demonstrating the promise of its product in the AI market, it reoriented around this burgeoning market instead of its prior sole focus on gaming. NVIDIA not only developed new chips with AI applications specifically in mind, but also developed a software computing platform, CUDA,152 to help AI developers train models as efficiently as possible on the new GPU architecture. By the 2010s, academic researchers were using neural networks trained with CUDA and GPUs to win high-profile machine learning competitions, sometimes with enormous improvements in results.153 Clearly, the new paradigm of using GPUs to train deep learning models was showing tremendous success.

Understanding human language, often for the purpose of translating from one language to another, was one of the earliest tasks set for deep learning models. Attempts to do this predate neural networks, but the earliest approaches created results that were more comical than accurate. The significant breakthrough that led to models good enough for consumer-facing language tools like Google Translate is relatively recent, coming from advances in strategies for word vectorization, also known as word embeddings. These methods emerged only in the last decade or so. Word embeddings are a relatively simple concept: they are a numerical representation of the relationships between different words. Words in a document often have multiple possible meanings, and those meanings depend on context. A word’s meaning is informed by its position in a sentence, the other words in the sentence, the position of the word in the entire document, and the words in earlier sentences. Hundreds of relationships may be needed to understand even a single document. But all those relationships can be represented numerically, and they can be learned from observing a large enough corpus of documents.
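As a toy illustration of the idea that relationships between words can be captured as numbers, the sketch below (in Python; the three-dimensional vectors are invented for this example, whereas real embeddings have hundreds of dimensions learned from a corpus) compares hypothetical word vectors using cosine similarity, a standard measure of how closely two vectors point in the same direction.

import numpy as np

# Invented, three-dimensional "embeddings" for three words.
embeddings = {
    "king":  np.array([0.8, 0.6, 0.1]),
    "queen": np.array([0.8, 0.6, 0.4]),
    "apple": np.array([0.1, 0.9, 0.7]),
}

def cosine_similarity(a, b):
    # 1.0 means the vectors point the same way; values near 0 mean unrelated.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # relatively high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # lower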

A practical breakthrough for word embeddings was the development by computer scientists at Google of word2vec, an algorithm and toolkit for learning word embeddings from a corpus of documents via a relatively simple neural network. The algorithm reads documents sequentially and measures how often words appear and which other words they frequently appear near. It then trains a machine learning model to learn those word adjacency and frequency probabilities. The work was made publicly available and became widely used, significantly lowering the barrier to entry for training language models (although Google also patented the method).
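To give a sense of how little code this requires today, the sketch below uses gensim, a widely used open-source library that reimplements the word2vec approach (this is not Google’s original toolkit, and the tiny corpus and parameter values are chosen purely for illustration), to learn small embeddings from word co-occurrence.

from gensim.models import Word2Vec  # open-source reimplementation of word2vec

# A toy corpus; real training uses millions of documents.
corpus = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "rug"],
    ["a", "cat", "and", "a", "dog", "are", "pets"],
]

# Learn 16-dimensional embeddings from which words appear near which others.
model = Word2Vec(sentences=corpus, vector_size=16, window=2, min_count=1, epochs=50)

# Words that appear in similar contexts end up with similar vectors.
print(model.wv.most_similar("cat", topn=3))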

B. General Purpose Technology: Transformers and the Rise of Large Models

Transformers are a more recently introduced neural network architecture for processing training data in machine learning models. First proposed in a paper from Google researchers in 2017, transformers use a mechanism called attention, paired with positional encodings that record each word’s location when the training data is fed into the model. This additional positional data allows learning to be done in parallel, rather than iteratively. Machine learning models are often trained through millions of cycles, and in general, using more data requires training for more cycles. The ability to parallelize learning over many different GPUs meant a model could be built in the same amount of clock time with vastly more training data. While transformers were initially proposed as an architecture for training language models specifically, the technology is flexible enough to use on many other types of data, like images, video, or sound. In essence, transformers provide the ability to train a model in human-scale time on internet-scale data.
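The heart of this mechanism fits in a few lines. The sketch below (a minimal NumPy version of scaled dot-product attention; the random vectors stand in for learned word representations) shows why the computation parallelizes: every position in a sequence attends to every other position through a pair of matrix multiplications, rather than being processed one word at a time.

import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Compare every position with every other position in one step.
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    # Softmax turns the scores into attention weights that sum to one.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each position's output is a weighted mix of all positions' values.
    return weights @ V

rng = np.random.default_rng(0)
seq_len, dim = 5, 8                         # a toy five-word sequence
Q = K = V = rng.normal(size=(seq_len, dim))
print(scaled_dot_product_attention(Q, K, V).shape)   # (5, 8)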

Word embeddings were a powerful tool that allowed the relationships between words in a document to be mapped, and they became widely used for that reason. However, they had a significant limitation: they were slow to train, and training time scaled with the volume of data the model was trained on. This sounds like a fundamental limitation, but it often isn’t. In many cases, the work of training a model can be distributed among hundreds or even thousands of computers, so that it can be done in significantly less real-world clock time. However, this wasn’t possible with word-embedding models (because of their sequential nature) until the invention of transformers.

The mechanism that transformers use to parallelize the learning of word relationships has an additional benefit. To minimize training time, prior models had tracked only relationships between words a few words apart, or within the same sentence. By encoding a word’s position in the training data, relationships between words much further apart could be efficiently learned. This allowed a word’s context to be inferred not just from the few words that immediately preceded it, but from the entire document. These two related benefits, parallelization and the ability to map relationships and context across an entire document, allowed models to go beyond the relationships between words and begin to encompass the way language is used differently by different people in different contexts with a wide range of stylistic differences. Of course, to unlock useful insights, these models require truly vast amounts of data. Thankfully, that data does not need to be annotated by humans, as many machine learning training sets do.154 Transformers remain effective without annotation if what needs to be learned isn’t the meaning of the data, but the relationships among all the data in the corpus and the patterns of its use. This is also, incidentally, one reason why transformer models can and do “hallucinate” compelling but wrong results: they are built to replicate structure, not to understand meaning.

Transformers were immediately recognized as a breakthrough with a wide range of applications. Within a year, researchers began applying transformers to computer vision and image processing tasks, with seminal models for these released in 2020. One of the first models to take advantage of the new architecture was BERT, launched by Google in 2018. In 2020, Google trained but did not publicly release LaMDA, a successor model that it continued to iterate on internally. OpenAI has released a series of models for tasks that take text as an input and output text or images, most notably the models GPT-1 (2018) through GPT-4 (2023). Meta began work on Llama in 2022 and released it in 2023. Consumer-facing products based on these models only began coming onto the market in 2022.

On the image side, Midjourney, a transformer-based image generation model, was released in 2021, the same year that OpenAI released the similar tool DALL-E. Image generation models were likely the first products to market because humans are more tolerant of errors in image-based communication than in text. GitHub Copilot, an AI tool that helps programmers write code, was released by Microsoft and OpenAI in 2021, but did not attract widespread attention because its audience was confined to people who write software. The tool that has attracted the most public attention by far has been ChatGPT, which was released in late 2022 by Microsoft-backed OpenAI.

Transformers are a new paradigm in computing. They abruptly changed what kind of computer architecture is needed, introduced new principles of software design, and made a new class of problems possible to solve. The solutions involve gargantuan datasets and compute, but they exist. Transformers also rapidly shifted the optimal architecture for training machine learning models. GPUs, which were originally optimized for graphics processing by trading a limited instruction set for the ability to execute many more operations in parallel, are vital for training transformer-based models. This makes access to top hardware uniquely important again. Across these impacts from transformers, access to data, access to compute, and relatedly access to GPUs and other parallel processing hardware have become defining factors for operating effectively at scale in machine learning.

C. Google’s Position

Today, the diversity of the early internet has given way to platform dominance.155 Great gatekeepers sit astride the major vectors of people’s usage of the internet. Meta owns most of social media,156 Amazon e-commerce,157 Apple and to a lesser extent Google supply the two dominant smartphone operating systems,158 Microsoft and to a lesser extent Apple likewise supply the two dominant PC operating systems,159 and Amazon, Microsoft, and Google split the cloud market.160 New use cases crop up from time to time, like TikTok or Vine before it, but the competition in these digital markets is for the market itself—they end up with one or two companies in control of the market.161

Google is perhaps the most dominant of all the major tech companies of today. The Google story is still being written, so we have less of the clarity of hindsight to tell exactly what innovation has yet to flourish. As basic background, however, the markets and technologies Google sits atop as of this writing are search and advertising. The infrastructure required to dominate those markets also happens to include key components for building LLMs and other machine learning products. This means Google may sit at critical chokepoints for the future of AI.

For Google, running a dominant search and advertising business means it has a huge simultaneous advantage in data and chips. Additionally, because its core business is so profitable, like IBM and AT&T before it, Google has for many years been able to afford to invest in all of the elements it needs for many different facets of AI, from basic research to hard tech to data streams.

The type of data that powers Google’s core business is also the foundation on which the product category of general-purpose foundational models sits. In particular, the data that Google has collected from the open internet, and from the users who navigate it with the company’s products including Chrome and Android, gives Google a tremendous advantage in training large models.

This data flow is unparalleled by any other single company. Moreover, unlike even other search or advertising providers, Google has unique additional data advantages. Google owns YouTube and has sole power over access to that video data. It also uses its substantial resources to buy access to more data. Google’s deal to be the exclusive default search provider on Apple products, for which it pays roughly $20 billion per year,162 is most commonly discussed as a way to maintain access to users so Google can maintain access to an advertising market, but user searches and what users click on after their searches are also extremely valuable data. Likewise, Google made a $60 million payment to Reddit to access human-moderated content for training.163

On the chips side, much public attention has focused on NVIDIA, and not without reason. Much of the industry relies on NVIDIA’s chips, and it is the single largest producer of GPUs that people can buy to run AI models. However, this focus ignores Google’s huge in-house offering in this category. Google has been building its own, purpose-built AI chips for the past decade.164 Dubbed “Tensor Processing Units,” Google’s chips have often bested NVIDIA’s on energy efficiency or machine-learning-specific specifications.165 The important point, though, is not the comparison itself. It is that Google is now, quietly, one of the world’s leading chip designers. It shipped 2 million chips in 2023 alone.166 The catch is that Google does not allow anyone to purchase its chips; it only allows access to the compute through Google Cloud. And even with this limitation, major players like Apple have signed up to develop their AI products on these chips, mediated through Google.167

These assets create very substantial advantages for Google in AI markets and may yet create leverage points against fully deployed innovation in the future. This market is very young, and it is too early to tell what that future will look like. However, history may be a guide. The DOJ filed two monopolization suits against Google in recent years. The first argued that “Google’s grip . . . thwarts potential innovation” in search because “new search models are denied the tools to become true rivals: effective paths to market and access, at scale, to consumers, advertisers, or data.”168 The second complaint argues from another angle “that today’s internet would not exist without the digital advertising revenue that, as a practical matter, funds its creation and expansion” and that Google is the “behemoth” that dominates “all aspects of the digital advertising marketplace.”169 This complaint, like the search complaint, argues that “Google has thwarted meaningful competition and deterred innovation in the digital advertising industry.”170 In the first case, the court held that “Google is a monopolist, and it has acted as one to maintain its monopoly.”171

D. Stasis at Google

At first glance, it may be surprising that Google, an early developer of the building blocks of language models and of the breakthrough architecture of transformers, was not among the first to release products using this technology. This is especially true since the other ingredient necessary to harness transformers to their full potential, very large volumes of data, is another area where Google has a unique advantage. In 2018, Google was one of only two or three companies that already possessed enough data to train a successful large language model without needing to source datasets from elsewhere.172 So why did Google not release any products?

News reporting suggests that, internally, Google leadership was concerned that releasing an LLM-based chatbot would harm the company’s reputation.173 In addition, the new technology was difficult to tie properly to the company’s existing search and advertising product markets, which deepened Google’s incentive to withhold it from the market.174 In fairness, this is an economically rational concern. For Google, innovating in search or search-adjacent products presents very little opportunity to take additional market share from a competitor—it already holds the vast majority of the market.

This is a predicament similar to the one AT&T faced in the decades prior to the antitrust lawsuits filed in the 1970s. At both companies, engineers created incredibly effective technologies that executives declined to release into the marketplace, on the likely justified belief that doing so might be harmful to the company’s core monopoly.

Google’s refusal to introduce an AI-backed product to the marketplace was clearly frustrating to the engineers who worked on LaMDA. In 2021, the year Google publicly disclosed it was working on LaMDA, the lead engineers on the project quit in frustration and began working on similar projects at other companies instead. By 2023, every author of the seminal paper that had proposed transformers had left Google, many for AI startups. In doing so, they followed in the footsteps of Gene Amdahl and many other ex-IBMers who left to found new ventures after growing frustrated at IBM’s unwillingness to bring new products to market after the launch of the S/360.

E. Competition Unblocking the Dam: Innovation and Downstream Effects

Google, although dominant in many areas, does not yet have a monopoly in large model development. This means that it was vulnerable to competition, which came in the form of ChatGPT on November 30, 2022. Developed by Microsoft-backed OpenAI, ChatGPT is a chatbot backed primarily by GPT-3.5, a proprietary LLM trained on several datasets comprising, in effect, much of the public internet. The chatbot works by repeatedly predicting the next word in a document, until the next ‘predicted word’ is the end of the document. In this design, the initial ‘prompt’ from the user serves as the beginning of the document.
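In schematic form, that loop can be written in a few lines of Python. The sketch below is purely illustrative, not OpenAI’s code: the toy “model” picks the next word at random from a tiny invented vocabulary, where a real LLM would score every possible next token given everything generated so far.

import random

END = "<end>"                               # special token meaning "document over"
VOCAB = ["the", "quick", "brown", "fox", END]

def toy_predict_next(tokens):
    # Stand-in for the model: a real LLM conditions on all prior tokens.
    return random.choice(VOCAB)

def generate(prompt, max_tokens=20):
    tokens = prompt.split()                 # the user's prompt begins the document
    for _ in range(max_tokens):
        nxt = toy_predict_next(tokens)      # predict the next word
        if nxt == END:                      # stop when the model predicts the end
            break
        tokens.append(nxt)
    return " ".join(tokens)

print(generate("Once upon a time"))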

It is reported that doing this required Microsoft to spend hundreds of millions of dollars building custom GPU-backed infrastructure just for the initial training.175 When it was released, ChatGPT attracted a firestorm of interest from the media and the general public. It is well understood that Microsoft is not a legitimate challenger to Google in search. However, the LLM market is only adjacent to search. The data advantage that comes from running a search engine at scale is helpful for training the models but is neither necessary nor sufficient to deploy a commercially successful LLM. As such, LLMs are something of a green field: a market Microsoft could contest in a way that it cannot contest search.

Inside Google, the reaction to ChatGPT was that the company could no longer delay releasing LLM-backed consumer products. Senior executives viewed ChatGPT as an existential threat and called in the company’s cofounders, Sergey Brin and Larry Page (who had stepped back from day-to-day involvement at the company), to kickstart the effort to build a competing consumer product on the back of their existing LLM infrastructure.176 The company rushed to develop and release its own chatbot, Bard (later renamed Gemini), in only four months, releasing it in March of 2023. A little over a year later, in May of 2024, Google introduced Gemini-backed image content search to its Photos product and began incorporating “AI Overviews,” Gemini-generated summaries of search results, in response to some search queries on its most valuable property: core Google search.177 At the same press conference, it promised to integrate Gemini into many more of its existing products in the near future.

It’s interesting to note that while the first of these new LLM-backed products introduced by Google was a chatbot, in what appeared to be a direct response to ChatGPT, the company quickly followed that up not by developing new products or categories, but by integrating Gemini into its other existing products. Competition changed Google’s incentives to deploy its innovation, but that change to incentives only extends so far. The improvements that Google has made to its existing products are certainly innovative and offer benefits to consumers. But while OpenAI has used ChatGPT as an advertisement for new product categories, such as its new platform for fine-tuned models for a range of specialized tasks, Google has (so far) only iterated on its current offerings. Like IBM and AT&T before it, when faced with a new general-purpose technology, Google chose to tie it to its existing products rather than develop new products or services that might disrupt its existing market power.

In the last year, the release of new transformer-based models has rapidly accelerated. In addition to the closed models from OpenAI and Google that backed the initial, attention-grabbing consumer-facing applications, open-source models have proliferated at a rapid rate. It is worth noting that open models (such as BERT and RoBERTa) were the norm until OpenAI’s then-unusual choice to release information about and API access to GPT-4 without releasing the model itself. But since then, several more companies have released foundational models that are competitive on leaderboards of top performance across a range of benchmarks.178

First launching in February of 2023 as Llama and most recently updated in April of 2024 as Llama 3, Meta’s open-source LLM is widely considered the foundational model of choice for many fine-tuned text applications because it currently outperforms all other open-source models.179 This is due in part to its being the first truly competitive open-weight model to launch after GPT-4, but also to its liberal license structure, which welcomes fine-tuning use cases. More recent open-weight entrants include Mistral’s models, beginning with Mistral 7B in September of 2023, with newer versions such as the Mixtral series launching every few months. Most recently, Databricks launched DBRX in March of 2024. In a notable shift, Databricks disclosed that it spent only $10 million to train the model, which nonetheless has been performing well on major model benchmarks.180

We are in a period of rapid development not only of foundational models, but also of their applications. DBRX is primarily being monetized by Databricks as a tool to sell its larger suite of ML pipeline tools. OpenAI is attempting to build a platform and marketplace for consumer applications that can be used by app developers who want to sell LLM-backed tools and toys to consumers. Google is primarily using Gemini to power its own products and services, transforming how search results are delivered.

V. Conclusion

The recent history of general-purpose technological innovation tells us that a broad antimonopoly approach ensures that innovation sees the light of day. The structural incentives of monopoly power in technology markets encourage Captured Innovation: dominant firms produce transformative innovations but restrain those innovations from release, or limit them by yoking them to existing dominant business lines. Captured Innovation does not rest on the ability to suppress rivals, even if it is sometimes accompanied by that behavior as well.

The broader process of technological development through innovation is not unlike natural selection. Many competitors try different approaches to a problem; many will be wrong, but one or two will be right. When an innovation is brought to market, it demonstrates to other competitors a new horizon of possibility, something rival teams of scientists and engineers will attempt to surpass, often in an entirely different way. But this core process of scientific and technical innovation through iteration and competition can be disrupted by a leviathan that chooses simply not to release the innovation it creates when it feels that innovation is a threat to its dominance.

The history of IBM demonstrated the value of effective public and private enforcement of the antitrust laws to release the Captured Innovation of software development on general-purpose computers. AT&T showed the importance of whole-of-government alignment to promote competition: regulators and enforcers fostering competition enabled the devices and transmission technology that undergird the modern internet. Finally, the story of Google and large language models, still unfolding today, cautions that we continue to see Captured Innovation even among modern tech giants. The willingness of enforcers and regulators to affirmatively promote competition is vitally important to ensure innovation can reach its full potential.

A. Lessons for Competition Law

Enforcers have long sought to strike a balance between promoting the competition envisioned by the law and ensuring that the mechanisms of their enforcement do not cripple firms, which would itself reduce competition. This balance echoes the academic debate between the Joseph Schumpeter school, which argues that monopolies promote innovation, and the Kenneth Arrow school, which insists that competition does instead.181 Scholars have done thoughtful work to resolve this debate on its own terms by arguing that, on the whole, competition increases innovation more than monopoly does.182 These analyses are valuable, but they often treat innovation as a fixed and quantized good without interrogating the actual impact of innovation in particular markets.183 Historical analysis of innovation in high-tech markets provides a more nuanced answer for competition enforcers about how best to promote innovation that can be deployed to its full potential.

The historical pattern is that vigorous application of competition law leads to surges of technological innovation that get deployed in markets. This can give context for enforcers, regulators, courts, and policymakers looking to make tangible the more abstract arguments about innovation harms in technology markets. There is substantial quantitative and theoretical evidence that competition law enforcement can spur innovation.184 Looking out for conduct that fits these patterns can help us prevent monopoly power from stifling high-quality innovation.185 Our analysis clarifies that conduct which stifles innovation is both noticeable and addressable in near real time if one knows where to look. With past analogous examples, courts and enforcers can be more confident in drawing on direct evidence of harm in addition to the often-compelling indirect evidence of innovation harm that has long been developed in the economic literature. This context can also be a guide for sectoral regulatory bodies with competition equities looking to preserve innovation going forward.

In antitrust law enforcement, both by public enforcers and by private litigants, we have shown that the threat of injunctions, breakups, and treble damages has proved to be an effective corrective to exclusionary conduct. Many assessments have shown the wide array of ways a monopoly can exclude rivals. These include raising rivals’ costs,186 foreclosing inputs in adjacent markets,187 vertically integrated suppliers discriminating against rivals downstream,188 self-preferencing generally,189 creating requirements for multi-level entry or artificially creating cluster markets,190 or investment kill zones created by the fears of these practices.191 These exclusions, by shutting out rivals, can of course also shut out the innovations a rival may otherwise bring to market to compete.

Our case studies demonstrate, however, that the innovation harms from monopoly are not just those that result from exclusionary conduct like this. Even absent such exclusion, there are instances where the harm is unilateral, more akin to a monopolist charging monopoly prices. Monopolists simply don’t need to deploy many of the innovations they create. The refusal of a dominant firm to deploy the innovations it generates is just as much a harm to the marketplace as raising the prices of the goods it sells. Captured Innovation contains both elements: the unilateral harm of holding innovation back, and the exclusionary harm of tying products to existing dominant markets.

In either case, the threat is relevant for enforcers and regulators. The capacity to fully hold back innovation may rightly be considered a form of relevant direct evidence of market power, for example in a § 2 assessment. Meanwhile, tying of that innovation has historically been used as evidence of the exclusionary conduct required in a § 2 case.192 These considerations can be relevant to merger analysis as well. Mergers that enable or incentivize the merged firm to hold back innovation, or to yoke it to existing dominant markets, should be understood to substantially lessen competition or tend to create a monopoly.

Likewise, regulators with competition or innovation equities should note how critically important their role has historically been in allowing innovation to flourish. Regulation can be actively harmful to innovation when it ossifies markets and entrenches dominant technology. The story of AT&T shows especially how the short-term protection of existing monopoly and oligopoly market structures can delay the emergence of hugely important innovations. But that same history shows that affirmatively regulating to promote competition can be essential to fulfilling an agency’s public interest mandate. In some cases this can serve the public interest even more than a primary focus on price, convenience, or safety would.

Government action has a demonstrable track record of righting markets where innovation was not being fully deployed in ways that delivered transformative new technology to Americans. Looking to the future, we do not take an explicit position on which types of regulatory intervention best serve competition in a technology market. Our case studies show that different tools of competition enforcers, or competition itself, can be effective in different circumstances. When considering how to promote competition in the burgeoning AI market, some scholars have recommended an approach to AI that includes industrial policy such as grants, loans, subsidies, and tax incentives; NPU law such as structural separation, nondiscrimination, rate regulation, and interoperability; direct public provision of goods and services; and the encouragement of cooperative governance structures in the marketplace.193 Some of these may be appropriate while others may be impracticable with current tools and resources in other technology markets.

Regardless of the direction of intervention, we are left with the fact that technology markets—from signal transmission to self-driving cars to smartphones and beyond—benefit from attentive enforcement of competition law, ensuring that dominant firms cannot capture innovation. Technical innovation, driven by vibrant competition, has been one of the most powerful forces for advancing human flourishing throughout history, and certainly a key ingredient in the American success story. But innovation has no power when it sits on a shelf, unused. The drafters of our antitrust laws knew this, and it is our hope that our competition regulators continue their recent efforts to ensure that American technology markets remain the most innovative and competitive in the world.

  • 1Joseph A. Schumpeter, Capitalism, Socialism and Democracy 106 (1942) (“What we have got to accept is that [the large company] has come to be the most powerful engine of progress.”).
  • 2See Clayton M. Christensen, The Innovator’s Dilemma: When New Technologies Cause Great Firms to Fail 22 (2013).
  • 3See Mitsuru Igami, Estimating the Innovator’s Dilemma: Structural Analysis of Creative Destruction in the Hard Disk Drive Industry, 1981–1998, 125 J. Pol. Econ. 798 (2017).
  • 4See Einar H. Dyvik, Ranking of the Companies with Highest R&D Spending Worldwide, Statista (July 4, 2022), https://perma.cc/CC7E-E6Q2  (noting that tech companies make up 9 of the top 10 global spenders on R&D).
  • 5See Richard Carson et al., The Risk of Caution: Evidence From an R&D Experiment 1 (Nat’l Bureau of Econ. Rsch., Working Paper No. 26847, 2020), https://perma.cc/Y9V4-JVJA  (“[P]rojects with greater uncertainty have a lower probability of bearing fruit but may also generate more path breaking innovations if successful.”).
  • 6See Brad Stone, Biden Executive Order on Non-Competes Could Roil Tech, Bloomberg (July 12, 2021), https://perma.cc/8B3H-G7VF.
  • 7See, e.g., Cade Metz, Tech Giants Are Paying Huge Salaries for Scarce A.I. Talent, N.Y. Times (Oct. 22, 2017), https://www.nytimes.com/2017/10/22/technology/artificial-intelligence-experts-salaries.html.
  • 8See Tim Wu, The Master Switch: The Rise and Fall of Information Empires, 20 (2011) (discussing the Kronos Effect).
  • 9This is a rather direct recontextualization of Schumpeter’s famous formulation that monopoly markets increase incentives to innovate. Where incentives to innovate may increase, we find that the shape of innovation deployment changes dramatically.
  • 10See, e.g., Daniel Spulber, Unlocking Technology: Antitrust and Innovation, J. Competition L. & Econ. 3 (2008) (“To avoid disrupting incentives for innovation, public policy makers should exercise more forbearance than usual in markets for technology”); Andrew Stirling, Precaution in the Governance of Technology, in The Oxford Handbook of Law, Regulation, and Technology 573, 577–78 (Roger Brownsword, Eloise Scotford & Karen Yeung eds., 2017).
  • 11United States v. Int’l Bus. Machines Corp., 69 Civ. 200 (S.D.N.Y. 1969) (Complaint as reproduced in Folded Spindled and Mutilated Appendix at 355); see 13 F. Supp. 11, aff’d 298 U.S. 131; see also Anthony Sampson, Shenanigans in the Market Place, 326 Nature Vol. 30 (Apr. 1987).
  • 12Martin Campbell-Kelly, Punched Card Machinery, Encyclopedia of Computer Science 1489 (2003) (“Until the advent of commercially available stored-program computers in the 1950s, punched card machines represented the most technologically advanced information processing capability that was routinely available. The leading punched card machine supplier was IBM which dominated the industry.”).
  • 13IBM (International Business Machines), Encyclopedia.com, https://perma.cc/ED5C-96KT.
  • 14United States v. Int’l Bus. Machines Corp., 13 F. Supp. 11, 15, 20 (S.D.N.Y. 1935).
  • 15See United States v. Int’l Bus. Machines Corp., 69 Civ. 200 (S.D.N.Y. 1969) (Complaint as reproduced in Folded Spindled and Mutilated Appendix at 355) (quoting Int’l Bus. Machines Corp., 13 F. Supp. 11 (alleging that IBM conspired “[t]o lease only and not sell tabulating machines; To adhere to minimum prices for the rental of tabulating machines as fixed by IBM; and To require customers to purchase their card requirements from the lessor or pay a higher price for the rental of machines.”)).
  • 16Int’l Bus. Machines Corp., 13 F. Supp. 11.
  • 17United States. v. IBM Corp., No. 72-344 (S.D.N.Y. Jan. 25, 1956).
  • 18Robert Garner, Early Popular Computers, 1950–1970, Engineering and Technology History Wiki (Jan. 9, 2015), https://perma.cc/7NAS-FHDF.
  • 19The IBM System/360, IBM, https://perma.cc/K7WM-P3QM.
  • 20Id.
  • 21Frederick P. Brooks Jr., The Mythical Man-Month: Essays on Software Engineering (1975).
  • 22Chuck Boyer, The 360 Revolution, IBM 37, https://perma.cc/V7YP-3GQM; Gene Smith, I.B.M. 1969 Net Up, But Quarter Lags, N.Y. Times (Jan. 17, 1970), https://www.nytimes.com/1970/01/17/archives/ibm-1969-net-up-but-quarter-lags-company-cites-relatively-modest.html.
  • 23James W. Cortada, IBM: The Rise and Fall and Reinvention of a Global Icon 214 (2019).
  • 24Id.
  • 25U.S. Patent No. 3,380,029 (filed Apr. 9, 1965).
  • 26Richard Sandomir, Martin Goetz, Who Received the First Software Patent, Dies at 93, N.Y. Times (Oct. 21, 2023), https://www.nytimes.com/2023/10/21/technology/martin-goetz-dead.html.
  • 27United States v. Int’l Bus. Machines Corp., 69 Civ. 200 (S.D.N.Y. 1969) (Complaint as reproduced in Folded Spindled and Mutilated Appendix at 355).
  • 28IBM, Britannica Money (Oct. 8, 2024), https://perma.cc/Z6BV-4M37.
  • 29See Control Data Corp. v. Int’l Bus. Machines Corp., 306 F. Supp. 839, 842 (D. Minn. 1969).
  • 30Id. at 843.
  • 31See, e.g., Sanders Assocs., Inc. v. IBM, Civ. No. 75-14 (D.N.H. 1975); Mira-Pak, Inc. v. IBM, Civ. No. 73-11-677 (S.D. Tex. 1973); Memorex v. IBM, Civ. No. 73-2239 (S.D. Cal. 1973); Telex Corp. v. IBM, 72-C-18, 72-C-89 (N.D. Okla. 1972); Symbolic Control, Inc. v. IBM, Civ. No. 71-2207 (N.D. Cal. 1971); Greyhound Computer Corp. v. IBM, 70 Civ. 1944 (N.D. Ill. 1970); Applied Data Research, Inc. v. IBM, 69 Civ. 1682 (S.D.N.Y. 1969).
  • 32Burton Grad, A personal recollection: IBM’s unbundling of software and services, IEEE Annals of the History of Computing 64 (Aug. 7, 2002).
  • 33This is when the case was ultimately voluntarily dismissed by Ronald Reagan’s AAG for antitrust, Bill Baxter, in the first year of his tenure.
  • 34See United States v. Int’l. Bus. Machines Corp., 66 F.R.D. 223, 225 (S.D.N.Y. 1975).
  • 35Grad, supra note 32, at 64–71.
  • 36Id. at 65.
  • 37Id.
  • 38Id. at 64–71.
  • 39Don Waldman, IBM, in Market Dominance: How Firms Gain, Hold, Or Lose it and the Impact on Economic Performance 140 (David Ira Rosenbaum, ed., 1998).
  • 40See, e.g., Tim Wu, Tech Dominance and the Policeman at the Elbow (Columbia Public Law Research Paper No. 14-623, 2019).
  • 41See Memorex Corp. v. Int’l Bus. Machines Corp., 555 F.2d 1379, 1383 (9th Cir. 1977) (quoting Javelin Corp. v. Uniroyal, Inc., et al., 546 F.2d 276, 279 (9 Cir., 1976, as amended January 13, 1977)).
  • 42A Settlement for IBM, TIME (Jan. 29, 1973), https://web.archive.org/web/20081214072759/http:/www.time.com/time/magazine/article/0,9171,903788,00.html.
  • 43See Clements Auto Co. v. Serv. Bureau Corp., 298 F. Supp. 115, 119 (D. Minn. 1969), aff’d in part, rev’d in part, 444 F.2d 169 (8th Cir. 1971) (Service Bureau Corporation “sells data processing services in the following areas: payroll, personnel records, accounts receivable, billing, sales accounting, marketing studies, cost accounting, inventory records, budgets, and general accounting.”).
  • 44Symbolic Control, Inc. v. Int’l Bus. Machines Corp., 643 F.2d 1339, 1341 (9th Cir. 1980) (“The theory of Symbolic’s case was that because of the importance of software in general and APT processors in particular to the sale of large computers, IBM had a policy of giving the computer program, documentation and instructions to use the program, and maintenance of the program free of charge to computer users.”).
  • 45Symbolic Control, Inc. v. Int’l Bus. Machines Corp., No. C-71-2207 AJZ, 1975 WL 810, at *1 (N.D. Cal. Dec. 31, 1975), rev’d, 643 F.2d 1339 (9th Cir. 1980).
  • 46See California Computer Products, Inc. v. IBM, 613 F.2d 727, 731 (9th Cir. 1979); Transamerica Computer Company, Inc., Plaintiff and Appellant, v. International Business Machines Corporation, Defendant and Appellee, 573 F.2d 646 (9th Cir. 1978).
  • 47See California Computer Products, Inc., 613 F.2d at 727; Transamerica Computer Co. v. Int’l. Bus. Machines Corp., 698 F.2d 1377 (9th Cir. 1983).
  • 48Telex Corp. v. Int’l Bus. Machines Corp., 367 F. Supp. 258, 267 (N.D. Okla. 1973), rev’d, 510 F.2d 894 (10th Cir. 1975), and disapproved of by Memorex Corp. v. Int’l Bus. Machines Corp., 555 F.2d 1379 (9th Cir. 1977).
  • 49Memorex Corp, 555 F.2d at 1380.
  • 50ILC Peripherals Leasing Corp. v. Int’l Bus. Machines Corp., 458 F. Supp. 423, 442 (N.D. Cal. 1978), aff’d sub nom. Memorex Corp. v. Int’l Bus. Machines, 636 F.2d 1188 (9th Cir. 1980).
  • 51Michael L. Rustad, Software Licensing, Cloud Computing Agreements, Open Source, And Internet Terms Of Use: A Practical Approach To Information Age Contracts In A Global Setting § 1.02, 19 (2016–2017 ed.).
  • 52See generally Martin Campbell-Kelly, From Airline Reservations to Sonic the Hedgehog: A History of the Software Industry (2004).
  • 53Timeline of Computer History, Computer History Museum https://perma.cc/9CCX-VYU7.
  • 54The Early Years: 1972–1980, SAP, https://perma.cc/7BKQ-DDYD.
  • 55See, e.g., discussion of acoustic couplers and data access arrangements below.
  • 56See The Story of Ma Bell, CNN Money (July 9, 2001), https://perma.cc/6ZSH-Q3J3.
  • 57Adam B. Thierer, Unnatural Monopoly: Critical Moments in the Development of the Bell System Monopoly, 14 Cato J. 267, (1994).
  • 58See United States v. Am. Tel. & Tel. Co., 552 F. Supp. 131, 135 (D.D.C. 1982) (“January 14, 1949, the government filed an action in the District Court for the District of New Jersey against the Western Electric Company, Inc. and the American Telephone and Telegraph Company, Inc. (Civil Action No. 17–49).”).
  • 59Report of the Antitrust Subcomm. (Subcomm. No. 5), of the H. Comm. on the Judiciary, 86th Cong., Report on Consent Decree Program of the Dept. of Justice 31 (Comm. Print 1959).
  • 60Id.
  • 61Id. at 45.
  • 62Id. at 47.
  • 63Id. at 47, 55–56.
  • 64See id. at 74–75, for example, when FCC staff prepared a memo to the DOJ that it was not well suited to assess whether the market was competitive and needed antitrust enforcement, AT&T individually met with each FCC commissioner to ensure that the final memo instead said the FCC had the expertise and power to resolve any concerns that might emerge.
  • 65See id. at 84 (Victor Kramer, the DOJ staff attorney in charge of the case, wrote a memorandum on why a settlement would be inconsistent with the oath of office to “support and defend and protect the Constitution and the laws of the United States.” The chief DOJ trial attorney, Walter Murphy, wrote a similar memorandum objecting to the settlement. As did W.D. Kilgore, chief of the Antitrust Division’s Judgement and Judgement Enforcement Section.).
  • 66Anthony Lewis, A.T.&T. Settles Antitrust Case; Shares Patents; U.S. Hails Consent Decree as Major Victory--Company Calls Terms “Stringent” One of “Most Important”, N.Y. Times (Jan. 25, 1956).
  • 67Robert W. Crandall, After the Breakup: U.S. Telecommunications in a More Competitive Era (1991).
  • 68In the Matter of Hush-A-Phone Corp. & Harry C. Tuttle, Complainants Am. Tel. & Tel. Co., et al., Defendants, 20 F.C.C. 391, 394 (1955).
  • 69Id. at 413–14.
  • 70Id. at 415.
  • 71Id. at 391.
  • 72Id. at 398.
  • 73Id. at 427.
  • 74Hush-A-Phone Corp. v. United States, 238 F.2d 266, 268 (D.C. Cir. 1956).
  • 75Id. at 269 (emphasis added).
  • 76Carter v. Am. Tel. & Tel. Co., 250 F. Supp. 188 (N.D. Tex.), aff’d, 365 F.2d 486 (5th Cir. 1966).
  • 77Re Use of Carterfone in Message Toll Tel. Serv., 77 P.U.R.3d 417 (F.C.C. June 26, 1968).
  • 78Carter, 250 F. Supp. 188.
  • 79Id.
  • 80Re Use of Carterfone in Message Toll Tel. Serv., 77 P.U.R.3d 417 (F.C.C. June 26, 1968) (citing Hush-A-Phone Corp. v. United States, 238 F2d 266, 269 (1956)).
  • 81The Carterfone Case Again—It May Be Too Early to Rejoice, Datamation, 71 (Oct. 1968).
  • 82Editorial, Wrongful Use of Power?, 6 Computerworld 29 (“The monthly charge (penalty fee?) that Bell collects for a DAA from every customer who refuses to lease Bell modems represents an automatic price advantage to Bell in competing with the independent modem manufacturers. Any businessman would love to have a market advantage like that especially when the whole thing has the tacit approval of a federal agency, in this case the Federal Communications Commission.”).
  • 83See id.; see also Interview with Strassburg, Computer History Museum 8 (May 1998), https://perma.cc/4ZEU-NY3D  (“[E]verybody raised hell about that.”).
  • 84Id. (“We shouldn’t have to rent something from the telephone company, because when the telephone company sells or offers a competing piece of equipment, the customer doesn’t have to have this PCA, so we’re at a competitive disadvantage.”).
  • 85In the Matter of Proposals for New or Revised Classes of Interstate & Foreign Message Toll Tel. Serv. (MTS) & Wide Area Tel. Serv. (WATS), 56 F.C.C.2d 593, 615 (1975) (“Terminal equipment may be directly connected to the telephone network.”).
  • 86N. Carolina Utilities Comm’n v. F.C.C., 552 F.2d 1036, 1040 (4th Cir. 1977).
  • 87Victoria Shannon, The Rise and Fall of the Modem King, N.Y. Times (Jan. 7, 1999).
  • 88InfoWorld, Vol. 5, No. 17, at 90–92 (1983).
  • 89Alton Dickieson, The TD2 Story, Bell Laboratories Record (Nov. 1967), https://perma.cc/V92V-9S8X.
  • 90I. Hayashi et al., Junction Lasers Which Operate Continuously at Room Temperature, 17(3) Applied Physics Letters 109–111 (1970).
  • 91See Fiber Optics, Hagley (2016), https://perma.cc/LFD3-FD8Y.
  • 92U.S. Patent No. 3,663,762, (filed Dec. 21, 1970).
  • 93In Re Applications of Microwave Commc’ns, Inc., 18 F.C.C.2d 953, 953 (Aug. 13, 1969).
  • 94MCI Commc’ns Corp. v. Am. Tel. & Tel. Co., 369 F. Supp. 1004, 1015 (E.D. Pa. 1973), vacated, 496 F.2d 214 (3d Cir. 1974) (“The various MCI applications propose to provide ‘customized’ communications channels, tailored to the exact requirements of subscribers needing interoffice and intracompany communications, to meet newly developing data and specialized communications needs of the public at significantly low cost. The channels would accommodate transmission of data, facsimile, control, remote metering, voice and other forms of communication.”) (emphasis added).
  • 95Memorandum Opinion and Order Designating Applications For Consolidated Hearing on Stated Issues, 31 Fed. Reg. 2666, n.1 (Feb. 11, 1966).
  • 96Id. at 2666. (American Telephone & Telegraph Co. filed on February 24, 1964; Illinois Bell Telephone Co. filed on February 25, 1964; Southwestern Bell Telephone Co. filed March 11, 1964; General Telephone Co. of Illinois filed on March 12, 1964; the Western Union Telegraph Co. filed on April 10, 1964).
  • 97Id. at 2667.
  • 98See, e.g., In the Matter of Establishment of Pol’ys & Procs. for Consideration of Applications to Provide Specialized Common Carrier Servs. in the Domestic Pub. Point-to-Point Microwave Radio Serv. & Proposed Amends. to Parts 21, 43 & 61 of the Commission’s Rules, 24 F.C.C.2d 318, 324 (1970) (“MCI claims its proposal would provide the benefits of competition in the specialized communications field, stimulate the development of new lines of equipment, introduce new ownership interests in the communications industry, and pioneer new types of communications.”).
  • 99In Re Applications of Microwave Commc’ns, Inc. for Constr. Permits to Establish New Facilities in the Domestic Pub. Point-to-Point Microwave Radio Serv. at Chicago, Ill., St. Louis, Mo., & Intermediate Points, 18 F.C.C.2d 953, 955 (1969) (“[A]s modified . . . we adopt the hearing examiner’s findings and conclusions.”).
  • 100In fact, a main argument by the carriers was that the FCC was “injecting competition for its own sake and without any showing that the public.” In Re Applications of Microwave Commc’ns, Inc. for Constr. Permits to Establish New Facilities in the Domestic Pub. Point-to-Point Microwave Radio Serv. at Chicago, Ill., St. Louis, Mo., & Intermediate Points, 21 F.C.C.2d 190, 193–94 (1970).
  • 101In the Matter of Establishment of Pol’ys & Procs. for Consideration of Applications to Provide Specialized Common Carrier Servs. in the Domestic Pub. Point-to-Point Microwave Radio Serv. & Proposed Amends. to Parts 21, 43 & 61 of the Commission’s Rules, 24 F.C.C.2d 318, 350–52 (1970).
  • 102Id. at 338 (1970) (“[I]t appears that additional competition is reasonably feasible in this burgeoning market and that the entry of new carriers may be expected to benefit the public by providing new and differentiated services.”).
  • 103MCI Commc’ns Corp. v. Am. Tel. & Tel. Co., 708 F.2d 1081, 1095 (7th Cir. 1983) (“The FCC’s Specialized Common Carriers decision was hardly a model of clarity. The decision did not define the specialized services to which it referred, nor did it define the corresponding obligations that the FCC expected the general carriers (primarily AT&T) to assume in order to assist the new carriers.”).
  • 104Id. at 1096–98.
  • 105Id. at 1096.
  • 106Id.
  • 107Id. (“By filing interconnection tariffs with the state commissions rather than with the FCC, AT & T made it more difficult for MCI to oppose the tariffs, since, in the words of one AT & T official, the interconnection ‘controversy would spread to 49 jurisdictions.’”).
  • 108See MCI Commc’ns Corp. v. Am. Tel. & Tel. Co., 369 F. Supp. 1004, 1010–11 (E.D. Pa. 1973), vacated, 496 F.2d 214 (3d Cir. 1974).
  • 109Id. at 1017.
  • 110Id.
  • 111Peter Temin & Louis Galambos, The Fall of the Bell System: A Study in Prices and Politics 96 (1987).
  • 112MCI Commc’ns Corp. v. Am. Tel. & Tel. Co., 708 F.2d 1081, 1097 (7th Cir. 1983).
  • 113MCI Commc’ns Corp. v. Am. Tel. & Tel. Co., 369 F. Supp. 1004, 1016 (E.D. Pa. 1973), vacated, 496 F.2d 214 (3d Cir. 1974).
  • 114In the Matters of Bell Sys. Tariff Offerings of Loc. Distribution Facilities for Use by Other Common Carriers; & Letter of Chief, Common Carrier Bureau, Dated Oct. 19, 1973, to Laurence H. Harris, Vice President, MCI Telecommunications Corp., 44 F.C.C.2d 245, 251 (1973) (“A.T. & T. IS DIRECTED TO SHOW CAUSE (a) why filing Bell tariffs for domestic satellite interconnection facilities with state commissions, rather than exclusively with this Commission, should not be considered as noncompliance.”).
  • 115MCI Commc’ns Corp. v. Am Tel. & Tel., 369 F. Supp. 1004, 1029 (E.D. Pa. 1973).
  • 116MCI Commc’ns Corp., 708 F.2d at 1092.
  • 117MCI Commc’ns Corp. v. Am. Tel. & Tel. Co., 496 F.2d 214, 224 (3d Cir. 1974).
  • 118MCI Commc’ns Corp., 708 F.2d at 1097 (“AT & T ordered its local operating companies to disconnect MCI’s customers on twenty-four hours’ notice.”).
  • 119Id. (noting “FCC warnings that disconnection of MCI’s customers would violate the Communications Act”).
  • 120Bell System Tariff Offerings of Local Distribution Facilities for Use by Other Common Carriers, 46 F.C.C.2d 413, aff’d sub nom. Bell Telephone Co. v. FCC, 503 F.2d 1250, 1286 (3d Cir. 1974), cert. denied, 422 U.S. 1026 (1975) (holding that FX and CCSA interconnections were covered in the scope of the rule in Docket 18920).
  • 121United States v. Am. Tel. & Tel. Co., 461 F. Supp. 1314, 1317 (D.D.C. 1978).
  • 122U.S. Dep’t of Just., Press Release (Nov. 20, 1974), https://perma.cc/2C2L-VNJ2.
  • 123The Industrial Reorganization Act: Hearing on S. 1167 Before the S. Subcomm. on Antitrust and Monopoly of the Comm. of the Judiciary, 93rd Cong. (1974).
  • 124See In the Matter of MCI Telecommunications Corp. Investigation into the Lawfulness of Tariff FCC No. 1 Insofar as It Purports to Offer Execunet Serv., 60 F.C.C.2d 25, 33 (1976).
  • 125Id. at 30 (MCI “was concerned about the staff’s handling of AT&T’s written allegations as well as about the matter of ‘lobbying’ at the Commission. MCI claimed that AT&T had used ‘sophisticated presentation techniques . . . to spur some members of the Commission’s staff to pre-judge the matter before MCI had had even the chance to present its side of the story.’ . . . [so] it asked for ‘equal time’ with those persons to whom AT&T had presented its arguments.”); see also MCI Telecommunications Corp. v. F.C.C., 561 F.2d 365, 368 (D.C. Cir. 1977) (“Apparently AT&T representatives approached individual commissioners and various Commission staff personnel with this complaint and even held a demonstration of Execunet in the Commission’s offices.”).
  • 126MCI Telecommunications Corp., No. 75-799 (F.C.C. July 2, 1975) (letter order).
  • 127See In the Matter of MCI Telecommunications Corp. Investigation into the Lawfulness of Tariff F.C.C. No. 1 Insofar as It Purports to Offer Execunet Serv., 57 F.C.C.2d 271, 271 (1975) (“MCI petitioned the United States Court of Appeals for the District of Columbia for review of the Commission’s Order. MCI Telecommunications Corp. v. FCC and U.S., No. 75–1635. In a motion filed with the Court seeking a judicial stay of the Commission’s Order, MCI presented several new arguments or elaborations on arguments that it had presented to the Commission. In order to afford us the opportunity to address those arguments in the first instance before judicial review, Commission counsel asked the Court to hold the review proceeding in abeyance to permit MCI to present its arguments to the Commission. By order released October 24, 1975, the Court granted our motion. Its order directed that any further proceedings ‘be conducted and concluded expeditiously.’”); see also MCI Telecommunications Corp. v. F.C.C., 561 F.2d 365, 369 (D.C. Cir. 1977) (“MCI immediately filed a petition for review in this court and sought a stay of the Commission’s order . . . The request for a stay was granted. Subsequently the Commission, which had previously refused to allow MCI any kind of hearing, moved to have the proceedings remanded so that it could consider matters more fully than it had previously. This motion was granted, although jurisdiction was retained.”).
  • 128In the Matter of MCI Telecommunications Corp. Investigation into the Lawfulness of Tariff FCC No. 1 Insofar as It Purports to Offer Execunet Serv., 60 F.C.C.2d 25 (1976).
  • 129MCI Telecommunications Corp. v. F.C.C., 580 F.2d 590, 595, 596 (D.C. Cir. 1978) (Judge Skelly Wright took pains to reiterate that the FCC “is not free to propagate monopoly for monopoly’s sake.”).
  • 130MCI Telecommunications Corp. v. F.C.C., 580 F.2d 590, 595–96 (D.C. Cir. 1978).
  • 131Id. at 600.
  • 132Id.; see also MCI Commc’ns Corp. v. Am. Tel. & Tel. Co., 708 F.2d 1081, 1101 (7th Cir. 1983).
  • 133United States v. Am. Tel. & Tel. Co., 552 F. Supp. 131, 141 (D.D.C. 1982).
  • 134Id. at 223.
  • 135Andrew Pollack, Bell System Breakup Opens Era of Great Expectations and Great Concern, N.Y. Times (Jan. 1, 1984), https://www.nytimes.com/1984/01/01/us/bell-system-breakup-opens-era-of-great-expectations-and-great-concern.html.
  • 136Lawrence Roberts, The Evolution of Packet Switching, 66 Proceedings of the IEEE 1307 (1978).
  • 137Keenan Mayo and Peter Newcomb, The Birth of the World Wide Web: An Oral History of the Internet, Vanity Fair (Jan. 7, 2009), https://www.vanityfair.com/news/2008/07/internet200807.
  • 138Keith Uncapher & Vinton Cerf, The ARPANET: A User Perspective, 3 (Oct. 17, 1974), https://perma.cc/J69B-DGNE.
  • 139Janus Rose, AT&T Could Have Bought the Internet in 1971, Vice (Jan. 17, 2012), https://perma.cc/SP8V-LU2R.
  • 140S. L. Mathison, L. G. Roberts & P. M. Walker, The History of Telenet and the Commercialization of Packet Switching in the U.S., 50 IEEE Communications Magazine no. 5, 28–45, (May 2012).
  • 141Id.
  • 142Id. at n.7.
  • 143Id.
  • 144Short History of Study Group 17, ITU (June 15, 2023), https://perma.cc/U8Q8-BNZE.
  • 145S. L. Mathison, L. G. Roberts & P. M. Walker, supra note 140, at 30, 42.
  • 146Colin Berkshire, How the Bell System Missed the Internet 1, TalkingPointz (Mar. 25, 2013), https://perma.cc/89J2-FFTS.
  • 147Janet Guyon, AT&T Abandoning Visionary Network of Computers After Spending $1 Billion, Wall St. J., Jan. 22, 1986.
  • 148Birth of the Commercial Internet, U.S. National Science Foundation, https://perma.cc/R87D-MTZN.
  • 149Office of Inspector General National Science Foundation, OIG Review of NSFNET (Mar. 23, 1993), https://perma.cc/W7ZY-D9H2.
  • 150Karen D. Frazer, NSFNET: A Partnership for High-Speed Networking 1987-1995, NSFNET 40–42 (1995), https://perma.cc/NE6F-TQSC.
  • 151D. Steinkraus, I. Buck & P. Y. Simard, “Using GPUs for Machine Learning Algorithms,” Eighth International Conference on Document Analysis and Recognition (ICDAR’05), Seoul, Korea (South), 2005, pp. 1115–1120 Vol. 2.
  • 152CUDA Toolkit—Free Tools and Training, NVIDIA Developer (last visited Oct. 16, 2024) https://perma.cc/UY7L-YBG5.
  • 153Stanford Vision Lab, ImageNet Large Scale Visual Recognition Competition 2012 (ILSVRC2012), https://perma.cc/F9UN-H84H.
  • 154Some consumer-facing products require training on labeled data when meaning is particularly important, such as harmful or fraudulent content, but this is not a technical requirement of the model.
  • 155Majority Staff of the Subcomm. on Antitrust, Commercial and Admin Law of the H. Comm. on the Judiciary, 116th Cong., Report on the Investigation of Competition in Digital Markets 29 (2020) (“As Amazon, Apple, Facebook, and Google have captured control over key channels of distribution, they have come to function as gatekeepers. A large swath of businesses across the U.S. economy now depend on these gatekeepers to access users and markets.”).
  • 156Id. at 111. (“Facebook has monopoly power in the market for social networking.”).
  • 157Id. at 213. (“Amazon is the dominant online marketplace.”).
  • 158Id. at 90. (“Together, Android and iOS account for 99 percent of the smartphone operating systems in the United States.”).
  • 159Id. at 213. (“Amazon is the dominant online marketplace.”).
  • 160Felix Richter, Amazon Maintains Cloud Lead as Microsoft Edges Closer, Statista (May 2, 2024), https://perma.cc/Z2HH-JGA9.
  • 161Chicago Booth Stigler Ctr. For The Study Of Econ. & State, Stigler Comm. On Dig. Platforms 29 (2019) (“The market structure and antitrust report begins by discussing the characteristics of digital markets. These markets often have extremely strong economies of scale and scope due to low marginal costs and the returns to data. Moreover, they often are two-sided and have strong network externalities and are therefore prone to tipping. If so, the competitive process shifts from competition in the market to competition for the market.”); Majority Staff of the Subcomm. on Antitrust, Commercial and Admin Law of the H. Comm. on the Judiciary, 116th Cong., Report on the Investigation of Competition in Digital Markets 37 (2020) (“[T]echnology markets ‘tip’ in favor of one or two large companies.”) (available at https://perma.cc/9W46-YPCU).
  • 162Leah Nylen, Google’s Payments to Apple Reached $20 Billion in 2022, Antitrust Court Documents Show, Bloomberg (May 1, 2024), https://perma.cc/9W46-YPCU.
  • 163Google Strikes $60 Million Deal with Reddit, Allowing Search Giant to Train AI Models on Human Posts, CBS News (Feb. 23, 2024), https://perma.cc/8UKV-XLCG.
  • 164TPU Transformation: A Look Back at 10 Years of Our AI-Specialized Chips, Google Cloud Blog (July 31, 2024), https://perma.cc/8E6W-FNSJ.
  • 165See, e.g., TPU v4 vs. NVIDIA A100: Unraveling the AI Supercomputing Showdown, Santa Barbara Computer Repair “PC Mechanic” (Apr. 5, 2023), https://perma.cc/N3RP-PG4B.
  • 166Alfonso Maruccia, Google Is Now the World’s Third-Largest Data Center Processor Designer, TechSpot (May 22, 2024), https://perma.cc/Q4SR-A5DT.
  • 167Max A. Cherney, Apple Used Google’s Chips to Train Two AI Models, Research Paper Shows, Reuters (July 30, 2024), https://www.reuters.com/technology/apple-says-it-uses-no-nvidia-gpus-train-its-ai-models-2024-07-29/.
  • 168Complaint at 5, United States v. Google LLC, No. 1:20-cv-03010 (D.D.C. Oct. 2020).
  • 169Complaint at 1–2, United States v. Google LLC, No. 1:23-cv-00108 (E.D. Va. Jan. 2023).
  • 170Id. at 3.
  • 171United States v. Google LLC, No. 20-CV-3010 (APM), 2024 WL 3647498 (D.D.C. Aug. 5, 2024).
  • 172Daisuke Wakabayashi, Google Dominates Thanks to an Unrivaled View of the Web, N.Y. Times (Dec. 14, 2020), https://www.nytimes.com/2020/12/14/technology/how-google-dominates.html.
  • 173Nico Grant & Cade Metz, A New Chat Bot Is a ‘Code Red’ for Google’s Search Business, N.Y. Times (Dec. 21, 2022), https://www.nytimes.com/2022/12/21/technology/ai-chatgpt-google-search.html; Aaron Mok, ChatGPT, the Scary-Smart AI Chatbot Generating Buzz Around the Internet, May Pose a Threat to Google’s Ad Business, Says Former Exec, Business Insider, https://www.businessinsider.com/chatgpt-may-hurt-googles-ad-business-former-exec-says-report-2022-12; Parmy Olson, Google Faces a Serious Threat From ChatGPT, Bloomberg (Dec. 7, 2022), https://www.bloomberg.com/opinion/articles/2022-12-07/chatgpt-should-worry-google-and-alphabet-why-search-when-you-can-ask-ai.
  • 174See id. This interpretation turns less on the factual proximity of LLMs to search than on the extent to which Google perceived LLMs as near enough to search to pose a threat, even if only a hypothetical one.
  • 175Emma Roth, Microsoft Spent Hundreds of Millions of Dollars on a ChatGPT Supercomputer, The Verge (Mar. 13, 2023), https://www.theverge.com/2023/3/13/23637675/microsoft-chatgpt-bing-millions-dollars-supercomputer-openai.
  • 176Nico Grant, Google Calls in Help From Larry Page and Sergey Brin for A.I. Fight, N.Y. Times (Jan. 20, 2023), https://www.nytimes.com/2023/01/20/technology/google-chatgpt-artificial-intelligence.html.
  • 177Sundar Pichai, Google I/O 2024: An I/O for a New Generation, Google Inside (May 14, 2024), https://perma.cc/3VAZ-BW7M.
  • 178LLM Leaderboard: Compare GPT-4o, Llama 3, Mistral, Gemini & Other Models, Artificial Analysis (last visited July 30, 2024), https://perma.cc/V5KD-RQZD.
  • 179The Fine-tuning Index, Predibase (Apr. 2024), https://perma.cc/9HVX-6E5C.
  • 180LLM Leaderboard, supra note 178.
  • 181Jonathan B. Baker, Beyond Schumpeter vs. Arrow: How Antitrust Fosters Innovation, 74 Antitrust L.J. (2007).
  • 182See id. at 602 (“It is time to move beyond the ‘Schumpeter vs. Arrow’ debate and to embrace antitrust as essential for fostering innovation. The benefits of antitrust rules and enforcement extend beyond lower prices, greater output, and higher product quality; they also include increased innovation.”).
  • 183For a representative example, see OECD, Competition and Innovation: A Theoretical Perspective, OECD Competition Policy Roundtable Background Note (2023), https://perma.cc/2Z4G-VD4F (“[R]eviewing the most recent developments in the thinking about the relationship between competition and innovation, analy[zing] the many factors that drive innovation, such as firm-specific characteristics and external factors that impact firms’ ability and incentives to innovate, and how these factors interact with competition.” But primarily understanding innovation through the “two of the most commonly used variables to measure innovation . . . R&D expenditure and patent activity,” with little further consideration of the historical texture of what marks a disruptive innovation.).
  • 184Martin Watzinger et al., How Antitrust Enforcement Can Spur Innovation: Bell Labs and the 1956 Consent Decree, 12 Am. Econ. J.: Econ. Pol’y no. 4, 328–59 (2020); see also I. Saglam, Incentives of a Monopolist for Innovation Under Regulatory Threat, 24 Econ. Gov. 41–66 (2023); Jonathan B. Baker, supra note 181; OECD, Competition and Innovation: A Theoretical Perspective, supra note 183.
  • 185Assessing innovation effects qualitatively also captures the nuance that not all technological changes bearing the moniker of innovation are desirable. For example, AT&T’s acoustic couplers and other “protective” devices were indeed new, but they were designed to protect a monopoly moat rather than to contribute to the market.
  • 186Morten Hviid & Matthew Olczak, Raising Rivals’ Fixed Costs, 23 Int’l J. Econ. Bus. 1–18 (2015).
  • 187Steven C. Salop, Analysis of Foreclosure in the EC Guidelines on Vertical Restraints, Annual Proceedings of the Fordham Corporate Law Institute 195 (2001) (“In some cases, a combination of input and customer foreclosure can permit a vertically integrated firm to entrench market power by raising barriers to entry.”).
  • 188See generally T. Randolph Beard, David L. Kaserman & John W. Mayo, Regulation, Vertical Integration and Sabotage, 49 J. Indus. Econ. no. 3 (2001).
  • 189Daniel Hanley, How Self-Preferencing Can Violate Section 2 of the Sherman Act, Competition Pol’y Int’l Antitrust Chron. 4 (June 2021).
  • 190See, e.g., U.S. Dep’t of Justice & Fed. Trade Comm’n, Merger Guidelines (2023) (“Barriers to Entry and Exclusion of Rivals. The merged firm may benefit more from limiting access to dependent rivals or potential rivals when doing so excludes them from the market, for example by creating a need for the firm to enter at multiple levels and to do so with sufficient scale and scope (multi-level entry).”); see also Herbert Hovenkamp, Digital Cluster Markets, Colum. Bus. L. Rev. 1 (2022) (“Clustering contributes to market power when . . . entering into competition with the cluster is difficult.”).
  • 191Sai Krishna Kamepalli, Raghuram Rajan & Luigi Zingales, Kill Zone, NBER Working Paper (June 2022), https://perma.cc/V9LH-8PTJ.
  • 192See, e.g., United States v. Microsoft, 253 F.3d 34 (D.C. Cir. 2001).
  • 193See Tejas N. Narechania & Ganesh Sitaraman, An Antimonopoly Approach to Governing Artificial Intelligence 42–50 (Vanderbilt L. Research Paper No. 24-8, Jan. 17, 2024).