
Today there's no question that Linux, and open source, belong at the heart of enterprise IT. Twenty years ago, though, Linux was the underdog, and it took a lot of faith to bet big workloads on it. Red Hat had the confidence that it could bring Linux into the enterprise, and it delivered on that confidence with Red Hat Enterprise Linux (RHEL).

Roughly 20 years ago we released the first version of RHEL, which we initially called Red Hat Advanced Server. Because 1.0 releases (and .0 releases in general) were always seen as iffy, Red Hat started with version 2.1 and embarked on an adventure few would have thought possible.

Early days: From server closet to supported hardware to major workloads

These days you can be fairly confident that server hardware is going to work with Linux. That wasn't the case 20 years ago. One thing RHEL brought to the table early on was a list of certified hardware you could count on, along with a stable platform for software vendors.

RHEL 2.1 was about "lighting up the hardware" and creating an ecosystem, along with a commitment to a long-term lifecycle with bug fixes that didn't require updating certified software. For how long? RHEL 2.1's end of support came in May 2009. Seven years of support was unheard of at the time.

One thing Red Hat understood early on is that solving the technical problem is only half the solution. It's not enough to run an enterprise application; the customer has to be able to get support. They have to be able to plan out an IT roadmap that spans years. It's hard to do that without long-term support. Red Hat provided it.

The next few years were about rising to challenging workloads on the technical side, too. Linux crept into the workplace doing useful but not mission-critical work: Running Samba, for example, for organizations that wanted file and print serving. Linux, of course, shone at running web servers and helped the web grow.

But high performance workloads? Those were the domain of expensive UNIX systems with proprietary hardware and operating systems. 

Meanwhile, x86 hardware became more powerful, and Linux kept pace. Red Hat's developers were busy working upstream, contributing to the Linux kernel, compilers, libraries and many other components that had to evolve to handle load at scale.

From one, many: Let's get Xen

With more powerful and accessible hardware and the growth of Linux applications to run the business, a new set of problems emerged. Namely, these systems often went underutilized. 

Running two or more major applications on a single system was non-trivial, since the applications might need different dependencies or have different security and access requirements, to name only a few issues. In the mid-2000s, virtualization started making inroads into enterprise environments. Red Hat first addressed this with the Xen hypervisor, and later with the Linux Kernel-based Virtual Machine (KVM).

This allowed businesses to pile several virtual machines (VMs) onto a single hefty server, make fuller use of the hardware and even run multiple operating systems side by side.
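
For a sense of what managing those VMs looks like from code, here is a minimal sketch using the libvirt Python bindings, the same plumbing that tools like virt-manager sit on top of. It assumes a host already running the libvirt daemon with the qemu/KVM driver, and it's an illustration rather than anything tied to a particular RHEL release.

    # Minimal sketch: list the guests defined on a local KVM host via libvirt.
    # Assumes the libvirt Python bindings are installed and libvirtd is running.
    import libvirt

    conn = libvirt.open("qemu:///system")   # connect to the local hypervisor

    # Walk every defined guest and report whether it is currently running.
    for dom in conn.listAllDomains():
        state = "running" if dom.isActive() else "shut off"
        print(f"{dom.name()}: {state}")

    conn.close()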

Virtualization on Linux opened a gateway to the next major evolution of computing: Cloud.

Into the cloud

The cloud, as they say, is just somebody else's computer. The thing is, lots of organizations love the ability to push hardware and infrastructure ownership and maintenance to another company. 

Now, dedicated hosting had already been a thing for many years. Lots of companies rented servers from hosting providers who handled network connectivity, power and cooling infrastructure, and the actual servers.

Virtualization allowed customers to make full use of that hardware, but early virtualization was not much more elegant than managing operating systems on bare metal.

Public cloud as we know it today really only came about due to the economics of Linux: no costly licenses, constant innovation and the ability to run on standardized hardware were key factors in the cloud boom. Building on the foundation of Linux, cloud computing brought new features that made it compelling over traditional hosting. First, it was usually billed by the hour instead of by the month or year. You could run workloads for a few hours and pay only for what you used, instead of having a dedicated server sitting idle and chewing up spend.

Even better, the cloud came with APIs that allowed customers to programmatically manage infrastructure at scale. 
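
As a rough sketch of what that programmability looks like, here is how launching and tearing down a short-lived instance might look with the AWS SDK for Python (boto3); the region, instance type and image ID below are placeholders rather than recommendations, and other providers expose similar APIs.

    # Minimal sketch: spin up a short-lived instance through a cloud API,
    # then tear it down when the work is done.
    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    response = ec2.run_instances(
        ImageId="ami-0123456789abcdef0",  # placeholder, not a real image ID
        InstanceType="t3.micro",
        MinCount=1,
        MaxCount=1,
    )
    instance_id = response["Instances"][0]["InstanceId"]
    print(f"Launched {instance_id}")

    # When the short-lived workload is finished, stop paying for it.
    ec2.terminate_instances(InstanceIds=[instance_id])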

RHEL rose to this challenge by adding testing and tools that made running in the cloud just as easy as running in your own data center. Red Hat-produced ‘gold images’, integration into provider marketplaces and complementary technologies like Ansible all made RHEL a great choice in the cloud.

Especially since it was all one RHEL — you could run RHEL on bare metal, virtualized, or in the cloud — it provided a lingua franca for customers to manage applications across all the footprints. Or more accurately: Across the open hybrid cloud.

But even with cloud APIs and everything, getting an application working locally wasn't the same thing as making it work in production. And what about short-lived applications? A VM takes a lot of time, computationally speaking, to spin up. The cloud and VMs left something to be desired. We needed to take application packaging to the next level.

You need a box, er, container for that?

Software management on Linux has come a long, long way since the first days of RHEL. With the RHEL 2.1 release, users had to install software package by package. That meant that if you wanted to install an application that required a handful of RPMs, you had to do it manually, often in the right order, with no automatic dependency resolution.

It beat compiling software from source each time, but only by a little. Within a few releases, RHEL adopted Yum, which let you name the package you wanted and would try to grab any dependencies and install those as well. You could even update all the packages on the system with a few commands. Life was good.
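
That resolve-and-fetch flow is still the heart of RHEL package management today. As a hedged illustration, yum's successor, dnf, exposes roughly the same steps through a Python API; this sketch assumes the dnf Python bindings, root privileges and 'httpd' standing in as an example package.

    # Minimal sketch of the yum/dnf flow via the dnf Python API:
    # resolve a package's dependencies, download everything, then install.
    # Requires root and the dnf Python bindings; "httpd" is just an example.
    import dnf

    with dnf.Base() as base:
        base.read_all_repos()    # load the enabled repository definitions
        base.fill_sack()         # build the package metadata cache
        base.install("httpd")    # mark the requested package for install
        base.resolve()           # work out the full dependency set
        base.download_packages(base.transaction.install_set)
        base.do_transaction()    # install the package and its dependencies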

But configuring the application and all of that? That was an exercise left to the user or admin. We then started shipping applications around as "virtual appliances" bundled into a single VM. 

With the RHEL 7 release in 2014, however, we shipped a major breakthrough in software management in the form of Linux containers. Popularized by Docker, Linux containers were not wholly new, but they were suddenly much more usable.

Now we were able to work on an application on a developer's laptop, put it into continuous integration/continuous delivery (CI/CD) pipelines, test it and move it into production more easily. 

Today, RHEL is at the heart of Red Hat OpenShift, the industry’s leading enterprise Kubernetes platform. With the introduction of Kubernetes, Linux containers became the basis of incredibly scalable applications that can run just about anywhere.
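
As a small taste of that in practice, the official Kubernetes Python client can drive a cluster, OpenShift or otherwise, from a few lines of code; this sketch assumes a reachable cluster and a kubeconfig already set up on the local machine.

    # Minimal sketch: ask a Kubernetes cluster what it is running right now.
    # Assumes the "kubernetes" Python client and an existing kubeconfig.
    from kubernetes import client, config

    config.load_kube_config()        # read credentials from ~/.kube/config
    core = client.CoreV1Api()

    # List every pod in every namespace, wherever its node happens to be.
    for pod in core.list_pod_for_all_namespaces().items:
        print(pod.metadata.namespace, pod.metadata.name, pod.status.phase)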

That's not all…

The past 20 years of enterprise Linux have been a parade of technical achievements, but those achievements were only half the story. RHEL has not only been up to the technical challenges; Red Hat has also led the way in meeting the business challenges.

RHEL's technical evolution has been impressive, keeping pace with emerging enterprise needs like edge computing, artificial intelligence/machine learning (AI/ML) and more, but the full offering is much more than that. Red Hat has built an ecosystem with an array of partners that depend on RHEL to help their customers. We have worked hand in hand with hardware providers, cloud providers and others to certify RHEL across the footprints.

In the past few years, we've added Red Hat Insights to RHEL, which amplifies system admins' ability to manage systems at scale. With Ansible Playbooks, we can automate many of the mundane, error-prone tasks to make system administration simpler, faster and more uniform.

It's impressive to see how far RHEL has come in 20 years, but the journey is just getting started. Just imagine what the next 20 years are going to hold for businesses and how RHEL will adapt to help our customers succeed.

 
