4 Lessons Learned from 20+ Years of the IT Industry

From the .com enthusiasm to the cloud: the dynamics of the IT industry, the supposed foundation of the future

Luca Vettor, The Note Strategist
7 min read · Apr 24, 2023
Photo by Alex Knight on Unsplash

Until the 2000s, the IT industry had an old-style mindset: software was built to explore what automation could do, but hardware was the protagonist.

Think of what Bill Gates recalled on the 40th anniversary of Microsoft:

Early on, Paul Allen and I set the goal of a computer on every desk and in every home. It was a bold idea and a lot of people thought we were out of our minds to imagine it was possible. It is amazing to think about how far computing has come since then, and we can all be proud of the role Microsoft played in that revolution. [Source: Bill Gates’ 40th anniversary email: Goal was ‘a computer on every desk’]

At the time, IT success was tied to spreading computers “on every desk and in every home.”

There was an awareness that IT was an enabler even before anyone knew what it would enable. Yet the need, I’d say the urgency, for everybody to have a computer was crystal clear. Software was little more than a proof of concept. The focus was on the hardware.

Then the 2000s came.

.com economy

Photo by Austin Distel on Unsplash

Once computers reached almost every desk, many people began to glimpse how software could be a lever to make money. The first to do so were people in finance.

Quoting Digital Economy 2000, by the U.S. Department of Commerce, June 2000:

In the twelve months since our previous digital economy report, confidence has increased among both experts and the American public that the new, proliferating forms of e-business and the extraordinary dynamism of the industries that produce information-technology products and services are harbingers of a new economic era. For most economists, the key measure of our new condition is the exceptional increase in productivity of the last five years, which has helped drive a welcome combination of falling inflation and very strong growth. For many people, however, the clearest evidence lies in the extraordinary increase in the electronic connectedness among individuals and businesses through the Internet. Three hundred million people now use the Internet, compared to three million in 1994. They can access more than one billion web pages, with an estimated three million new pages added every day.

Can you feel the enthusiasm? And the lack of awareness, too? Then again, at the beginning of a “new economic era,” everything was so sparkling and new that it necessarily looked like an opportunity.

Moreover, something crucial was happening that would be the game changer and make software emerge as the new protagonist:

The advent of this new era has coincided with dramatic cost reductions in computers, computer components, and communications equipment.

When the 2000s came, hardware became like air: everywhere and affordable. In other words, hardware stopped being a relevant cost for the IT industry.

Hardware, on the other hand, requires software, and software has been the most relevant cost ever since, up to now, 2023.

Lesson learned #1

The IT industry is cost-driven, focused on wherever the cost currently lies, with no purpose other than being an undefined opportunity spread everywhere.

Internet and social media

Photo by George Pagan III on Unsplash

Affordable computers on every desk were the nodes of a silent network. The next natural step was giving them a voice by connecting those nodes, still a balanced mix of hardware and software. The Internet was born.

True, the Internet was born well before the 2000s, as the Brief History of the Internet, written as early as 1997, describes:

The first recorded description of the social interactions that could be enabled through networking was a series of memos written by J.C.R. Licklider of MIT in August 1962 discussing his “Galactic Network” concept. He envisioned a globally interconnected set of computers through which everyone could quickly access data and programs from any site. In spirit, the concept was very much like the Internet of today. Licklider was the first head of the computer research program at DARPA,4 starting in October 1962. While at DARPA he convinced his successors at DARPA, Ivan Sutherland, Bob Taylor, and MIT researcher Lawrence G. Roberts, of the importance of this networking concept.

Yet it was only a vision: something that could happen, but equally might not. And it was still a purely technical matter.

Instead, in the 2000s, the “Galactic Network” started actually existing. And the big bang was that the intended network nodes were no longer computers but people. Computers, and soon, from the 2010s, smartphones, have become personal extensions, like arms and hands, directly connected to their owners’ brains and feelings. Today, in 2023, nobody says computers are connected; people are.

The time was ripe for the rise of social media. The History of Social Media in 33 Key Moments summarizes it well:

So, how old is social media? It depends, but early social media platforms started to pop up in the late 90s to early 2000s, with the technology budding as early as the 70s.

Lesson learned #2: IT is much more than technology; IT is a new version of humankind that roots its value in constant connection. Being connected has no explicit goal: it is a goal per se, one that enables a worldwide advertising ecosystem, though that may be accidental. Or not. Either way, it led to the app economy.

App economy

Photo by Alexander Shatov on Unsplash

What’s not accidental is that the app economy arose when the vision of “a computer on every desk and in every home” resulted in a smartphone in every hand.

Sitting in front of a computer was no longer necessary to be connected; this expanded the time available for being connected from a few hours a day to always. Consequently, the occasions when being connected can be useful, or at least enjoyable, have multiplied exponentially.

For each of those occasions, Apple built the Apple App Store and trademarked the slogan “there’s an app for that.” With Google Play, Google moved in the same direction.

So, what’s the app economy? Following The State of the App Economy and App Markets in 2022:

Put succinctly, the mobile app economy refers to the range of economic activities stemming from the development and distribution of mobile apps and games. The app economy includes device manufacturers, operating system developers, app distribution platforms (i.e. the Apple App Store, Google Play, etc.), mobile app and game publishers, software companies and API / SDK developers, ad networks and other mobile advertising companies, app data and analytics companies, and more.

Gaming is so relevant in the app economy that both Apple and Google give games dedicated sections in their stores. But gaming is entertainment, and that leads to another lesson learned.

Lesson learned #3: IT feeds on human time and returns value as a ubiquitous communication channel for knowledge sharing and entertainment.

Cloud economy

Photo by NASA on Unsplash

The cloud economy is the oil of the app economy. It sustains the business by providing the physical and logical infrastructure for running the app economy.

If you think of the cloud beyond the market-leading providers, like AWS, it’s easy to understand that it is also the ground for Web 3.0 and the many blockchains.

The cloud economy is both a consequence of and a necessity for IT permeating the world.

Lesson learned #4: IT consumes a tremendous amount of energy to permeate the world.

Conclusion

Let’s put together the 4 lessons learned:

  1. The IT industry is cost-driven, focused on wherever the cost currently lies, with no purpose other than being an undefined opportunity spread everywhere.
  2. IT is much more than technology; IT is a new version of humankind that roots its value in constant connection. Being connected has no explicit goal: it is a goal per se, one that enables a worldwide advertising ecosystem, though that may be accidental. Or not. Either way, it led to the app economy.
  3. IT feeds on human time and returns value as a ubiquitous communication channel for knowledge sharing and entertainment.
  4. IT consumes a tremendous amount of energy to permeate the world.

Twenty years is relatively short compared to the average life expectancy of about 80 years in the Western world: the story probably still has plot to develop.

Yet, 20 years have been enough for IT to transform the world by connecting it.

Is all that aimed at nothing more than advertising, entertainment, and some helpful sharing of knowledge and information?

Photo by Kittitep Khotchalee on Unsplash

The four lessons learned above say something different. The keywords are costs, connection, time, and energy. Believe it or not, that is a new strategy of life expanding upon humankind: beyond humanity, yet rooted in it.

My truth: Things are less complex when you write them down!

If you enjoyed my article and found it helpful, please share it and consider following me and subscribing to my new writings!

Not yet a Medium member? Join Medium through my referral link: you’ll support my writing and get into a sea of knowledge!
