The Internet, and the World Wide Web

You might say our story begins in the early 1990s - when the World Wide Web swept around the world following the connection of thousands of disparate computer systems to the "Internet". The internet itself was a second coming of sorts - an extension of the ARPANET that introduced the Transmission Control Protocol (TCP) and the Internet Protocol (IP).

Before packet-switched networking, computer systems had to be wired directly to each other in order to communicate. Afterwards, what we now know as routers and switches allowed terms such as "packet switching", "datagrams", and "hot potato" routing to enter the common vernacular.
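
If you've never met a "datagram", it's simply a self-contained message fired into the network with a destination address attached, leaving the routers in between to work out how it gets there. A minimal sketch in Python - purely illustrative, with a placeholder address and port - looks like this:

    import socket

    # UDP (built on top of IP) is the classic datagram protocol:
    # no connection, no dedicated wire - just address the packet and send it.
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)  # SOCK_DGRAM = datagram
    sock.sendto(b"hello, world", ("192.0.2.1", 9999))  # documentation-only address
    sock.close()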

The invention of "The Internet" grew the forest of interconnected computers from a few might oaks, to many thousands of trees of all shapes and sizes spread throughout the world. While many of the early servers connected to the internet ran commercial installations of "Unix", the search was on to find a flexible, fast, efficient, and preferably free operating system that could run the same software.

In a happy coincidence, at the same time a Finnish computer science student called Linus Torvalds was putting the finishing touches to a home-brew operating system kernel, and a software engineer straight from the Don Quixote mould called Richard Stallman was leading a project to rewrite Unix for the community - to replace what had been taken away when Unix became commercial. The result of course became known as Linux (or GNU/Linux, if you grow tired of being lectured).

Hindsight tells us that if Unix had not become commercial, Linux might never have existed. But Unix did become commercial, and Linux does exist.

LAMP

Linux quickly became wildly popular as an internet server operating system. It was designed from the start as a network operating system, and so had inherent advantages over anything coming out of Cupertino or Redmond. The source code of Linux was readily available - it could be extended and enhanced in whichever way developers saw fit - and boy, did they extend it.

Like chaotic dominoes toppling towards an inevitable conclusion, the Apache webserver, MySQL database, and PHP programming language converged on each other over the next several years - forming an unlikely partnership, and a platform upon which much of what we now know as the World Wide Web was originally built.
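
The pattern that partnership enabled - a script receives a request, queries a database, and prints HTML back to the browser - is simple enough to sketch. Here it is in Python rather than PHP, with SQLite standing in for MySQL so the example runs self-contained; the guestbook table and its contents are entirely hypothetical:

    import sqlite3

    # The canonical dynamic-page pattern the LAMP stack made ubiquitous:
    # take a request parameter, query a database, emit HTML.
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE guestbook (name TEXT, message TEXT)")
    db.execute("INSERT INTO guestbook VALUES (?, ?)", ("Ada", "Hello, early web!"))

    def render_page(visitor_name):
        rows = db.execute(
            "SELECT name, message FROM guestbook WHERE name = ?", (visitor_name,)
        ).fetchall()
        entries = "".join(f"<li>{name}: {message}</li>" for name, message in rows)
        return f"<html><body><ul>{entries}</ul></body></html>"

    print(render_page("Ada"))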

Along the way other technologies came and went - among them Perl (upon which PHP is loosely based), Active Server Pages (which scaled horribly), and Java Server Pages (which embedded Java in web pages). For a time, choosing a platform felt a bit like throwing spaghetti at the wall - trying to guess which strand might stick.

Hubris and Defeat

Following on from the successful infiltration of server farms throughout the world, in the early 2000s eyes turned towards a new prize - the ocean of desktop computers, and the hearts and minds of millions of consumers.

Could Linux really topple Microsoft and Apple? Was the world ready for a third option on the desktop? It turned out not to be. Red Hat, Corel, Mandrake, Debian, SuSE, Novell, and Ubuntu all tried. Some organisations bet the farm and destroyed themselves in the process.

In a strange sort of way, in a world where computers run a suite of installed software, a monopoly is the best option - it dramatically lowers training, management, and maintenance costs.

And then something unexpected happened

For perhaps the last twenty years, Microsoft and Apple have more-or-less owned the education market between them. Business studies students the world over have learned to use Microsoft Office, and media and art students have learned the Adobe Creative Suite. You might say the battle for the hearts and minds of the business and art worlds had been considered won for many years.

Stability breeds complacency.

While nobody was taking much notice, a team of educators in Cambridge, England, heard increasing concerns that the number of computer science graduates was falling. Many of the lecturers were of the generation that grew up during the 1980s home computer boom - "bedroom coders" - and wondered if it might be possible to re-create history.

The Raspberry Pi was born - first as a gadget for the "maker" community, and then as a Trojan horse into education - levering computer science back into the national curriculum. While the world of engineering had the "Bloodhound SSC" supersonic car to inspire the next generation, computer science bootstrapped itself with a home-grown computer that schools and hobbyists could afford - and that tiny little computer ran Linux.

This may have all been a distraction though.

The battle may have been over before it began

While the Raspberry Pi was busy winning hearts and minds across the world, Google was playing a very long game. Taking no notice of Apple and Microsoft competing to build the fastest, thinnest, most expensive laptops and tablets that anybody might ever have imagined, Google set about building a suite of cloud services, and an operating system just good enough to access them through a browser.

Microsoft wasn't entirely clueless - they had forked the business some years previously - with one side churning out the operating systems and applications they are famous for, and the other building a vast cloud city in the sky called "Azure". An operating-system-agnostic city.

Apple caught on to the sands shifting beneath their feet and scrambled to build their own services - bolting together the disastrous MobileMe, followed by the similarly woeful iCloud. It turns out some old dogs really struggle with new tricks.

Sometimes you find yourself in the right place at the right time - and sometimes it's by design. As the world was rocked by financial turmoil in recent years, Google arrived on the doorstep of educational establishments around the world armed with laptops that cost a fraction of the price of their competition, that didn't require expensive installation and management procedures, and that people already knew how to use.

The avalanche of Chromebooks through primary schools has happened astonishingly quickly. You might say it was something of a massacre for Microsoft and Apple, and it's still happening now.

Of course, Chromebooks run a variant of Linux.

Is there an endgame?

A generation of children is growing up using interchangeable, disposable computers, wheeled into classrooms on charging trolleys, and logging into virtual classrooms from home during a pandemic, where their work is "just there". A generation has learned that it's not about the device in their hand - it's about the thoughts and ideas they can store and access through it.

As I wrote once before, perhaps Linux was never "the thing" - perhaps Linux was "the thing that gets us to the thing".

The Second Coming

While Linux might not have immediately swept all before it after its first campaign across the internet landscape, that didn't stop a quiet army of developers from joining the cause over time. New desktop-targeted distributions such as Mint, Manjaro, and Elementary OS were hatched, and older distributions such as Ubuntu and SuSE doubled down - making long-term investments in a homogeneous future, where computers are seen more as utilitarian tools than luxury objects.

The desktop computer I'm writing this on runs the Windows Subsystem for Linux within Windows 10 - Linux is essentially consuming Windows from the inside out. I'm not using a word processor - I'm logged into a web application called "Notion" through a web browser called "Firefox", which is available for all of the common operating systems. The laptop propped on the corner of the desk is running Elementary OS - a derivative of Ubuntu Linux. It can dual-boot into Windows if absolutely required, but I haven't done so for months.

Linux is here - it's all around us. It's in our phones, our tablets, our laptops, our computers, and in the cloud. It just didn't announce its arrival.