Tech regulation may be the only thing on which a polarized Capitol Hill can agree. “We should be suing Google and Facebook and all that, and perhaps we will,” President Trump recently declared. Senator Elizabeth Warren, a Democratic presidential candidate, has made the breakup of tech companies a central plank of her campaign. Even Silicon Valley-friendly contenders like Pete Buttigieg have called for curbs on the industry’s power.
If Americans buy into the idea that the tech industry is an entrepreneurial, free-market miracle in which government played little part, then the prospect of stricter regulation is ominous. But that isn’t what actually happened. Throughout the history of the tech industry in the United States, the government has been an important regulator, funder and partner. Public policies — including antitrust enforcement, data privacy regulation and rules governing online content — helped make the industry into the innovative juggernaut that it is today. In recent years, lawmakers pulled back from this role. As they return to it, the history of American tech delivers some important lessons.
Advocates of big-tech breakup often point to precedent set by the antitrust cases of the twentieth century. The three biggest were Microsoft in the 1990s, IBM from the 1950s through the 1980s, and the moves that turned AT&T into a regulated monopoly in 1913 and ended with its breakup seven decades later. Yet neither Microsoft nor IBM was broken up, and even AT&T’s dissolution happened partly because the company wanted the freedom to enter new markets.
What made these cases a boon to tech innovation was not the breaking up — which is hard to do — but the consent decrees resulting from antitrust action. Even without forcing companies to split into pieces, antitrust enforcement opened up space for market competition and new growth. Consent decrees in the mid-1950s required both IBM and AT&T to license key technologies for free or nearly free. These included the transistor technology foundational to the growth of the microchip industry: We would have no silicon in Silicon Valley without it. Microsoft dominated the 1990s software world so thoroughly that its rivals dubbed it “the Death Star.” After the lawsuit, it entered the new century constrained and cautious, giving more room for new platforms to gain a foothold.
Bill Gates recently observed that a “winner take all” dynamic governs tech, encouraging a single product — IBM mainframes, Microsoft Windows, the Apple iPhone — to monopolize its market. History shows that he’s right, and that the actions of government have been a critical countervailing force.
Enforcement, however, needs to be savvy about the technology itself. When Congress first took up the issue of computer privacy in the 1960s, its focus was on the information-gobbling mainframe computers of the federal government. Lawmakers paid little attention to what private industry was doing, or could do, with personal data. And they had little inkling of what could happen when such databases became part of a networked communications system.
Had they paid closer attention to some of the experts on computing at the time, they might have acted differently. Paul Baran, a pioneering computer scientist, warned at a 1966 hearing about the dangers of networked computers. “Even a little information improperly used can do irrevocable harm,” he said. “Information is readily counterfeited. It can be quickly reproduced and widely transmitted very cheaply.” Baran knew what he was talking about; a few years later, he helped design the internet. The detailed prescriptions he offered that day — basic encryption of all files, random external audits, mechanisms to detect abnormal information requests — could have altered the trajectory of the online world.
Technology will always move faster than lawmakers are able to regulate it. The answer to this dilemma is to listen to the experts at the outset, and to be vigilant in updating laws to match current technological realities.
Which leads to a final point: The rules governing the internet made sense in the dot-com era. They don’t anymore. Today’s online world was built largely in the early 1990s, when the government opened up the internet to commercial activity and wrote rules governing its infrastructure. This included Section 230 of the Communications Decency Act of 1996, which said that no internet provider or platform could be considered the publisher or speaker of any information placed on its site by a third party.
The measure came out of another intensely partisan moment. Bill Clinton was in the White House; Newt Gingrich was speaker of the House. Internet policy was close to the only thing on which the rivals could agree. This was thanks, in large part, to the persuasive power of Silicon Valley, then the outsider, which lobbied Washington lawmakers to protect its electronic frontier from the greedy designs of large cable and telecom companies. Don’t regulate us, dot-com leaders cried. Let us regulate ourselves. Lawmakers agreed.
Before then, the American government had tightly regulated other communications media like radio, television and telephony. But Congress chose not to hold the small, still-developing 1990s-era internet to the same standard. That was a wise move then. Google and Facebook didn’t exist when Section 230 went into effect. Amazon’s website had been up less than a year. Now that online platforms have become more powerful than all other media, it is time for policymakers to step back in. But they should do so with care, and with history in mind.
Silicon Valley’s story isn’t just one of freewheeling entrepreneurs and farsighted technologists. It’s about laws and regulations that gave the men and women of the tech world remarkable freedom to define what the future might look like, to push the boundaries of what was technologically possible, and to make money in the process.
Washington’s hands-off approach ultimately permitted a marvelous explosion of content and connectivity on social media and other platforms. But neither the people writing the rules of the internet nor the people building its tools reckoned with the ways bad actors could exploit the system, or had much inkling of how powerful, and exploitable, their creations would become.
The tech world likes to look forward, not backward. But reckoning with its past is essential in mapping out where it goes next.
Margaret O’Mara is a professor of history at the University of Washington and the author of The Code: Silicon Valley and the Remaking of America, from which this essay is adapted.