
The 3 most important people and inventions in the history of the Internet

Ada Lovelace

The iPhone’s simplicity conceals 200 years of extraordinary invention. In hindsight, many of its features seem obvious, but they are only possible thanks to a few scientific geniuses who turned imagination into hardware.

I sat down with Walter Isaacson, the famous author of the Steve Jobs biography, to discuss his new book, The Innovators, a timeline of how humanity arrived at the digital revolution.

NOTE: We’ll be interviewing Isaacson this week on VentureBeat’s weekly podcast, “What to Think.” Got questions for Isaacson? Ask them in the comments below or tweet them to @dylan20.

Isaacson is certain about the three most important inventions in computer history.

“The first is a computer, then the notion of a microchip that allows it to be personal, and a packet-switched network that allows us to tie it together around the world,” he says.

Calculators are more than numbers

“When it comes to the computer,” he says, “historically you go back to the patron saint, who is Ada Lovelace.” The Countess of Lovelace, born in 1815, was the first to believe that a calculator could evaluate more than just numbers; numbers, she realized, were just symbols, like musical notes or logical statements.

Lovelace was every bit the stereotype of a socially awkward inventor. She obsessed over the technological possibility of a computer, writing countless letters to inventor Charles Babbage, who, at the time, had designed the most advanced number calculator in the world:

Supposing, for instance, that the fundamental relations of pitched sounds in the science of harmony and of musical composition were susceptible of such expression and adaptations, the engine might compose elaborate and scientific pieces of music of any degree of complexity or extent.

Like many inventors, Lovelace was not recognized as a genius during her life, and it would be many years until her ideas would influence the world.

Making computers small

Kilby’s solid circuit

Before the 1960s, anyone who didn’t own a warehouse wasn’t on the list of potential computer customers: the machines ran on massive banks of vacuum tubes operated by teams of people.

Then a small team at AT&T’s Bell Labs invented the transistor, a semiconductor device that could switch and amplify electrical signals at a fraction of a vacuum tube’s size. That was the breakthrough the Nobel Prize-winning engineer Jack Kilby needed to pack many transistors onto a single chip.

“That allowed computers to become personal,” Isaacson notes.

In 1958, when Kilby unveiled his discovery, the team instantly knew they were seeing a glimpse of the future. “Everybody broke into broad smiles,” the Washington Post’s T.R. Reid reported. “A new era in electronics had begun.”

Connected devices

Isaacson notes that the unsung hero of the Internet was J.C.R. Licklider, who pioneered the concept of decentralized networks inside the obsessively hierarchical military labs in which he worked. “I started to wax eloquent on my view that the problems of command and control were essentially problems of man-computer interaction,” Licklider is quoted as saying in the book.

Licklider acted primarily as a product manager at the government research labs where he worked, setting his team’s philosophy. His insistence on collaboration, both among his growing team and between the computers they built, would eventually nudge his engineers toward ARPANET, the early prototype of the Internet. “Licklider inspired me with his vision of linking computers into a network,” said Larry Roberts, who worked on the original ARPANET.

“So, he, to me, with all due respect to Al Gore, is the main father of the Internet,” concludes Isaacson.

For those interested in learning more about the history of computing, pick up Isaacson’s book, The Innovators.
