
Moving to the Cloud 

 

“History of the Net” series — post #2

 

Chris J. Preimesberger, IT Editor/Reporter/Panel Moderator

FEBRUARY 1, 2023

1993-2019

For most people, the internet became a “thing” in the early 1990s, when car manufacturers began tacking WWW addresses onto the closing seconds of their television commercials. Merchants began urging customers to “log on to our website,” and early internet users added the word “online” to their everyday vocabulary. That was when the internet started to become a mainstream activity, but it was already old hat to university students, researchers, and other power users, who had been working online throughout the 1980s. A good, brief timeline of internet history from Thomas Jefferson University can be found here.

As with any major trend, there were people who saw the digital future and started laying the groundwork for the coming “information superhighway,” as it was described at the time. During the late 1980s, the first internet service provider (ISP) companies were founded. Startups such as PSINet, UUNET, Netcom, and Portal Software provided service to the regional research networks and brought alternate network access, UUCP-based email, and Usenet News to the public. The first commercial dial-up ISP in the United States, The World, opened in 1989.

Sandy Lerner and Leonard Bosack, computing staff who had built on-campus networks at Stanford University, founded network hardware maker Cisco Systems in San Francisco in 1984. Data storage providers EMC, Veritas, and NetApp — which got their starts between the late 1970s and the early 1990s — added connectors for internet data. AT&T, found to hold a monopoly in a federal antitrust case, was broken up in 1984 into seven regional "Baby Bells"; the breakup gave consumers access to more choices and lower prices for long-distance service and phones. It also smoothed the way for ISPs to broker regional deals on the use of telephone lines for the coming online services.

Users accessed these online services on desktop and laptop computers made by IBM, Apple, Hewlett-Packard, Sun Microsystems, Silicon Graphics Inc., and several other manufacturers, many of which ran operating systems licensed from Microsoft, while others ran their makers' own proprietary or Unix-based systems.



Going mainstream

Despite all this action, it still took the world some 10 years (1995-2005) to become acquainted with the global network and to begin using it for everyday tasks such as email, e-commerce, gaming, and banking. The evidence is in the numbers: internet users multiplied from 16 million to 888 million during that span. According to AT&T Labs research (PDF), total internet data traffic grew from a mere 16.3 terabytes in 1994 to between 20,000 and 35,000 terabytes only six years later, in 2000.

When the internet went mainstream depends on which group of users you ask:

  • Universities: Many historians of the internet have observed that the internet was mainstream within the U.S. university system by the late 1980s. Every professor and graduate student at a research-oriented university could expect to have email and to find that their peers had it as well.

    The internet became mainstream with undergraduates in the early 1990s, and the deployment of Marc Andreessen's Mosaic web browser — particularly the release of a Windows-compatible version in late 1993 — made "surfing" the web a common occurrence among undergraduates. By 1994 it was commonly in use at all major universities.

  • Silicon Valley: Netscape's first browser, Netscape Navigator, launched in beta in the fall of 1994 and went on sale in early 1995. Its predecessor, NCSA Mosaic, was developed at the National Center for Supercomputing Applications (NCSA) at the University of Illinois at Urbana-Champaign beginning in late 1992.

    Mosaic was a graphical browser that ran on several popular office and home computers, and it was the first widely used browser to bring multimedia content to non-technical users by displaying images and text on the same page, unlike earlier browsers. Microsoft licensed Mosaic code from Spyglass to create Internet Explorer in 1995. At the urging of his own employees, Microsoft co-founder and CEO Bill Gates spent a night surfing the web in April 1995 and then wrote a memo to his executive staff, “The Internet Tidal Wave,” in May 1995.

    For those working in the tech business and its surrounding sectors, use of the internet became widespread in that same year. The stage was now set for the liftoff of the global internet as we know it.

  • General public: “Mainstream” here means adoption of the internet by a majority of households; once a technology reaches 50% or more of households, most analysts consider it mainstream.

    Data from the National Telecommunications and Information Administration (NTIA) indicates the internet reached a third of U.S. households in mid-1999 and 50% in mid-2001.

    So take your pick: the internet was mainstream among researchers by 1989, among Silicon Valley insiders by mid-1995, and in most U.S. households by 2001.


How Web 2.0 changed the business internet 

The internet's growth trajectory slowed for a couple of years (2001 to about 2003) after the dot-com investment "bubble" burst in 2000 and the economy slumped further following the 9/11 attacks; internet-related innovation stalled for much of that period. But the industry, as it had before, reinvented itself, and venture capital investment returned. With a few key companies leading the way (namely Amazon, Salesforce, and Microsoft), things began looking up again for the tech sector.

Cisco Systems, the world's largest internet networking "pipefitter," has kept global internet traffic and data statistics since 1990. In 1994, the company reported that 0.02 petabytes of data per month were being transmitted across the web; that figure had risen to 2,426 petabytes per month by 2005. At the start of the Web 2.0 era in 2008, data was pulsing through the internet at a rate of 10,174 petabytes per month. By 2017, the number had ballooned to 122,000 petabytes per month.
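
To put those Cisco figures in perspective, here is a quick back-of-the-envelope calculation (a simple Python sketch, using only the numbers quoted above) of the implied compound annual growth rate between 1994 and 2017:

    # Back-of-the-envelope compound annual growth rate (CAGR) from the Cisco
    # figures cited above: 0.02 PB/month in 1994 vs. 122,000 PB/month in 2017.
    # The traffic numbers come from the article; the math itself is only illustrative.

    def cagr(start_value: float, end_value: float, years: int) -> float:
        """Return the compound annual growth rate over the given number of years."""
        return (end_value / start_value) ** (1 / years) - 1

    traffic_1994 = 0.02       # petabytes per month, 1994
    traffic_2017 = 122_000.0  # petabytes per month, 2017

    rate = cagr(traffic_1994, traffic_2017, years=2017 - 1994)
    print(f"Implied annual growth rate: {rate:.0%}")  # prints roughly 97%

By that rough measure, monthly internet traffic nearly doubled every year for more than two decades.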

When VMware (short for "virtual machine"-ware) and Amazon Web Services (AWS, which began as a small division of Amazon) started up in the late 1990s and early 2000s, both were instrumental to the growth of the internet. VMware's virtualization software let data processing workloads be packaged as virtual machines and replicated wherever they were needed in seconds; this was something the internet absolutely required to grow and prosper. AWS let subscribers rent computing and data storage capacity in its new computing "cloud." A cloud, in this sense, was a computer in someone else's data center, reached over the internet from one's own personal computer or server.

From then on, companies providing data processing services transformed the business internet. The exponential growth of business and personal data demanded new ways to process it all. Cloudera was founded in 2008 in response, on the belief that open source, open standards, and open markets are best. Web 2.0 was now well under way, and data analytics became crucial for business success.

“At the time, people were just creating a data lake, and pulling in, and dumping whatever they had. That created data swamps, and you could not get any sense out of it, and you could not process it in a time and at a price point that you wanted,” said Abhas Ricky, Cloudera's Chief Strategy Officer. “Therefore, we created distributed computing that would allow customers to have meaningful data sets and meaningful workloads in a secure and enterprise-ready fashion at the price point that they wanted. That's what founded Cloudera.”

But early on, few companies wanted to risk storing their crown jewels – proprietary business data – in the cloud. It was too unproven in the mid-2000s.

"Cloudera was originally founded on the principle of taking open source software, created by Doug Cutting at Yahoo, and based on Google publications and making it ready for the enterprise, from the cloud," said Dr. Christopher Royles, Cloudera's Field CTO for EMEA. "But back in the 2000s, few wanted to do that on the cloud.

"In the last five to 10 years, we have seen broad industry adoption of cloud. The next five to 10 years will see a recognition that businesses have workloads, and that there are optimal places to run them. Cloud may not be the only answer."

Prior to this, the closest thing anybody had to a distributed processing service was an application service provider (ASP). Network channels were nowhere near as fast, and storage nowhere near as plentiful, as they are in 2022, so data backed up quickly whenever there were bottlenecks. The networking technology was also brittle. Like most pioneering technologies, enterprise ASPs had their issues: if too much data overflowed a storage silo or arrived too quickly for the storage software to process, the entire system would clog up and shut down. When deadline after deadline was missed because of these frequent data snags, companies retreated to moving data on physical hard drives and shipping those often heavy volumes by air or ground freight.


Exponential growth of cloud services

That's when AWS debuted its pioneering cloud storage service, S3 (Simple Storage Service), in 2006. This was the first major landmark for cloud computing. Companies could now subscribe to a cloud provider (AWS soon had plenty of competition from Microsoft Azure, Google Cloud, and others) to park their business files, images, and data logs without buying expensive physical servers or storage arrays. It took another 10 to 12 years for most businesses to trust the security of the cloud, but by 2019, even the most highly regulated financial, military, and government institutions were using the cloud to run their applications.
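
For a sense of what that subscription model looks like in practice, here is a minimal Python sketch using AWS's boto3 SDK; the bucket and file names are hypothetical, and running it would require an AWS account with credentials configured locally:

    # Minimal sketch of "parking" a file in S3 object storage with boto3.
    # The bucket name and file paths below are hypothetical examples.
    import boto3

    s3 = boto3.client("s3")

    # Upload a local file as an object in a bucket (no physical storage array needed)
    s3.upload_file("quarterly-report.csv", "example-company-data",
                   "reports/quarterly-report.csv")

    # Retrieve it later from anywhere with network access and the right permissions
    s3.download_file("example-company-data", "reports/quarterly-report.csv",
                     "quarterly-report-copy.csv")

The same pattern, with different SDKs, applies to the object storage services offered by Microsoft Azure, Google Cloud, and other providers.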

By 2022, the cloud had become second nature for people using smartphones, desktops, laptops, and tablets; the vast majority of everyday applications are now cloud-based.

“I believe no technology can compare to the cloud in the last 40 years: If you look at the growth in terms of billions of dollars, there is no technology or technological improvement that generated so much traction,” says Abhas. “But also, it’s very exciting for us that the cloud is about empowering data practitioners and business owners, because now, they're not beholden to a centralized function and they can actually make decisions for themselves better, faster, cheaper. That's what the cloud has been able to do.”

The internet wasn't going to stay stuck on Web 2.0 for long. Artificial intelligence, machine learning, edge computing, the internet of things, and the metaverse are now populating the new Web 3.0 — and what a Wild West show this is becoming!


Do look out for part 3 of the “History of the Net” series, coming soon.
 

In part 3, we will talk about Web 3.0, the metaverse, and where it's all going (2019 and beyond). We will address how new and inventive uses of AI/ML in online apps and services, 5G connectivity, 3D graphics, automation, data management/privacy, and the coming metaverse are rapidly changing everything. There are lots of theories about what digital life will look like 10 to 50 years from now, and we'll examine some of the more intriguing ones.

Article by


Chris J. Preimesberger

Chris J. Preimesberger is a journalist, editor, and researcher based in Redwood City, California. He is a former Editor in Chief of eWEEK, where he supervised all enterprise IT coverage. In 2017 and 2018, he was named among the 250 most influential business journalists in the world (No. 65) by Richtopia, a research firm that used analytics to compile the ranking.
