How the world became data-driven: Advent of the internet
“History of the Net” series — post #1
JANUARY 11, 2023
When Apple and other makers came out with their personal computers in the late 1970s, followed by IBM in 1981, few understood how these desktop machines (which some skeptics dismissed as expensive paperweights) were going to become the main instruments of an imminent data deluge. Typewriters and spreadsheets were already in standard use; television, radio, and newspapers brought us the news; and the U.S. Postal Service and UPS handled our deliveries. What else could possibly be necessary?
The relationship of the early PCs to the internet parallels the story of the Vienna-to-Venice rail line, built through the Alps in the 1840s, before train engines were powerful enough to climb mountains. Austria and Italy hired 20,000 workers and laid the tracks anyway, because they knew that someday an engine would be built that could climb over the pass. About 10 years later, the new Engerth locomotives were powerful enough to make the trip a reality.
Likewise, IBM, Apple, and a few other companies built the tracks (a network of in-place but unconnected PCs), knowing that useful locomotives (applications) would follow. People who bought PCs in the late 1970s and early '80s believed their investments would eventually bring new functionality, and they certainly did. But they had no idea that their PCs were also laying the groundwork for the internet, which arrived about 15 years later — bringing with it an explosion of digital information the world had never seen. This was the inception of the big data era.
It's hard to imagine a time before the internet now, isn't it? But think about this: the U.S. Census went fully digital for the first time in 2020. Analyzing petabytes of data is no easy task.
The internet is nearing 50 years of age
May 2024 will be the 50th anniversary of the first publication of the description of what we know today as the internet.
In September 1973, Vinton G. Cerf and a colleague, Robert Kahn, wrote a paper, “A Protocol for Packet Network Intercommunication,” for the May 1974 edition of IEEE Transactions on Communications. The paper described how packets of digital data would be able to move from one computer node to another, then to another, then to many others, using new protocols and standard phone networks.
The major purpose of the internet is all about data: loading, moving, using, and storing it. One of the protocols for transporting it, designed and written that same year, was TCP/IP, short for Transmission Control Protocol/Internet Protocol. It became a standard in 1983 and remains the key data-movement protocol of the internet. Another of those protocols, FTP, or File Transfer Protocol, enables users to log on to a remote computer, list the data files on that computer, and download them.
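The reliable, in-order delivery that Cerf and Kahn's protocol design provides is still what every TCP connection does today. As a minimal sketch (not from the original article — the echo behavior and names are illustrative assumptions), the following Python script uses the standard socket module to open a TCP connection over localhost and exchange a few bytes:

```python
# Minimal illustration of TCP's reliable byte-stream delivery: a tiny echo
# server and client talking over localhost. Everything here is standard
# library; the echo-and-uppercase behavior is purely illustrative.
import socket
import threading

def run_echo_server(server_sock):
    conn, _ = server_sock.accept()      # wait for one TCP connection
    with conn:
        data = conn.recv(1024)          # read the bytes TCP delivered in order
        conn.sendall(data.upper())      # echo them back, transformed

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))           # port 0: let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]

t = threading.Thread(target=run_echo_server, args=(server,))
t.start()

# The client side: TCP guarantees the bytes arrive intact and in order.
with socket.create_connection(("127.0.0.1", port)) as client:
    client.sendall(b"hello, internet")
    reply = client.recv(1024)

t.join()
server.close()
print(reply.decode())                   # HELLO, INTERNET
```

The same connect/send/receive pattern, scaled up across routers and phone (now fiber) networks, is what the 1974 paper made possible.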
Cerf, 78, now serves as vice president and chief Internet evangelist for Google. He was in the room when the internet was turned on using TCP/IP and FTP in 1983, and he is regarded as one of the fathers of the network: he helped write its code and was influential in many of the biggest milestones in its history.
“It started out as a bunch of geeks who basically thought it would be really cool if every network in the world, every computer in the world would be interconnected in some very informed way, and wouldn’t that be amazing if they could share information in a very fluid and flexible way?” Cerf told me in an interview.
“For a very long time, it was the property of the scientific and military community, but in about 1989, the commercial services came along, and not very long after that, Tim Berners-Lee’s invention (of the World Wide Web) becomes visible in 1989, then Netscape's Marc Andreessen and Eric Bina with Mosaic (the first widely used graphical browser, released in 1993), suddenly, the general public comes onto the net. At that point, we have a sea change.”
And what a sea change it was. Nothing in information technology, arguably, has equaled it since, although the introduction of cloud computing (the genre was called "application service providers" at first) in the late 1990s came awfully close.
Creator of the World Wide Web explains why he did it
The growing amount of data on the internet was beginning to require new tools to make it work more efficiently for users. That's when the World Wide Web came into being, and it couldn't have come at a better time.
“On March 12, 1989, I distributed a proposal to improve information flows: ‘A web of notes with links between them,’” Berners-Lee, who was working at the CERN laboratory at the time, said on the 25th anniversary of the WWW in 2014. “Though CERN, as a physics lab, couldn’t justify such a general software project, my boss Mike Sendall allowed me to work on it on the side. In 1990, I wrote the first browser and editor. In 1993, after much urging, CERN declared that WWW technology would be available to all, without paying royalties, forever.”
Those first pieces formed the foundation on which tens of thousands of people began working together to build the Web, Berners-Lee said. “Now, about 40 percent of us are connected and creating online. The Web has generated trillions of dollars of economic value, transformed education and health care, and activated many new movements for democracy around the world. And we’re just getting started.”
Now it’s time to celebrate and to “think, discuss — and do” in regard to the future of the web as we know and use it, Berners-Lee said. “Key decisions on the governance and future of the Internet are looming, and it’s vital for all of us to speak up for the web’s future. How can we ensure that the other 60 percent around the world who are not connected get online fast? How can we make sure that the web supports all languages and cultures, not just the dominant ones?” he said.
“How do we build consensus around open standards to link up the Internet of Things? Will we allow others to package and restrict our online experience, or will we protect the magic of the open web and the power it gives us to say, discover, and create anything? How can we build systems of checks and balances to hold the groups that can spy on the net accountable to the public? These are some of my questions — what are yours?”
Other key individuals made their marks in this massive connectivity movement, including Sun Microsystems' Dr. James Gosling (leader of the company's Java programming language development team); John Chambers (CEO of Cisco Systems, which took leadership of the internet networking business in 1992 and never let go); Oracle's Larry Ellison (whose company produced the first reliable parallel database); Andy Grove (co-founder of Intel, which led the way in producing processors for the era's computers); John McAfee (network and device security pioneer); and a long list of others.
Convergence of PCs, the internet, and WWW protocols made the network come alive
The internet was designed by scientists with trust in mind; users were expected to be cordial to one another and to respect other users' privacy and information. Since then, however, computer hacking to steal private data – and there has been a fast-increasing amount of data to steal – has joined the ranks of the world's most profitable businesses, alongside hedge funds, oil and gas production, weapons-making, and the health care/pharmaceutical industries.
Right from the outset, then, the security of the public network posed a thorny, lasting problem.
“At this point, you have the general public involved. This means that not only do you get a lot of good guys using the net, but you get a lot of bad guys, as well,” Cerf said. “And although there may not be that many bad guys, there are enough of them to cause a lot of trouble. They’re out there to take advantage of other people.”
Thus, computer hardware and software security has become a multi-billion-dollar business in the 28 years since the internet went mainstream in 1994 with the introduction of the Netscape browser. Cerf has suggested that perhaps enterprises, governments, and individuals should look to the origins of the internet to reinvent security of personal and business information in the network for the future.
“It’s very important for us to seriously revisit the ability of operating systems to defend themselves,” Cerf said, “to use strong authentication, two-factor authentication and to utilize mechanisms for applying cryptographic methods in order to defend every single computer from everything. And, of course, it’s getting more important to do that because some of the things that are joining this network are not only desktops and laptops and pads and mobiles but now intelligent appliances in the machine-to-machine world.”
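Cerf's prescription — authentication plus cryptographic methods applied by default — can be seen in miniature in modern TLS client defaults. As a hedged sketch (not part of the original article), Python's standard ssl module builds a client context that already refuses unauthenticated connections unless deliberately weakened; the final wrap_socket line is shown as a comment because it would need a live server:

```python
# A small sketch of "cryptographic defense by default": Python's ssl module
# creates a client context with strict settings out of the box.
import ssl

context = ssl.create_default_context()  # loads the system's trusted CA certs

# The default context enforces two of the properties Cerf describes:
# the peer must present a valid certificate, and its hostname must match.
print(context.verify_mode == ssl.CERT_REQUIRED)  # True
print(context.check_hostname)                    # True

# Wrapping any TCP socket with this context (e.g. before speaking HTTPS)
# would then both encrypt the stream and authenticate the server:
# secure_sock = context.wrap_socket(raw_sock, server_hostname="example.com")
```

The point is not this particular API but the posture: the safe configuration is the starting point, and weakening it requires an explicit decision.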
Internet usage – and data – exploded between 1995 and 2006
When those siloed desktop PCs finally became connected by standard telephone lines in the mid-1990s, the users of the internet — propelled by the easy-to-use Netscape browser and the new Microsoft browser — multiplied in extraordinary fashion. InternetWorldStats.com estimated that about 16 million people used the internet regularly in 1995 and that the number had swelled to 1.08 billion by September 2006, the year Amazon Web Services launched the first big commercial cloud service, the Simple Storage Service, or S3.
That's when the story of the cloud began to be told.
"I have to say that the most astonishing effect of the arrival of the internet and the World Wide Web was the enormous avalanche of content that flowed in because people just wanted to share what they knew, on the possibility that it would be useful to someone else," Cerf said. "I think that intention is still there, although this is a big tent, and all business models are welcome."
Yes, people's desire to find and post data for the benefit of others proved a magnanimous motive, but it certainly wasn't the internet's only mission. With the World Wide Web came the dawn of e-commerce, which opened the gates to a quantum leap in data creation and movement and accelerated its collection into what we now know as big data.
This is the first post of the “History of the Net” series. Please look out for two more stories coming soon.
Chris J. Preimesberger
Chris J. Preimesberger is a journalist, editor, and researcher based in Redwood City, California. He is a former Editor in Chief of eWEEK, where he supervised all enterprise IT coverage. In 2017 and 2018, he was named among the 250 most influential business journalists in the world (No. 65) by Richtopia, a research firm that used analytics to compile the ranking.