
Today, cloud providers are commonplace in the IT landscape. Companies of any size, whether a major enterprise on the scale of Netflix or a small startup with a few aspiring entrepreneurs, can access massive amounts of computing power on demand for the price of an hourly rental fee. This is all possible because of the emergence of the modern data center.

However, before cloud providers came along, large-scale computing was confined to those big businesses that had the money and expertise to host their own machines in their own facilities. It was a very exclusive club. But that was then, and this is now.

The appearance of the modern data center was not a grand reveal. Rather, it resulted from the evolution and convergence of two technologies that emerged out of the early 1980s. The first is the personal computer. The other is ubiquitous networking. Without either, the modern data center would not exist.

This article is the first installment in a four-part series that tells the story of how these two technologies transformed the personal computer from what was essentially a hobbyist obsession into the foundation of modern IT architecture. This evolution, from PC to data center, provides a way to understand not only our present but, more importantly, to anticipate what our future might become.

This technology was built on the shoulders of many different inventions.

A brief history of pre-connectivity computing

For context, it's helpful to imagine the world before immediate network connectivity was commonplace.

Computational tools have been around since the abacus, which goes back thousands of years to Mesopotamian times. Mechanical calculators that could do addition, subtraction, multiplication, and division appeared in the mid-17th century. Charles Babbage, a member of the Royal Astronomical Society, designed the Difference Engine in 1822. The machine was intended to do complex computations relevant to astronomy. Babbage was not successful in bringing his design to fruition, but the idea was planted.

Charles Xavier Thomas, founder of the fire insurance company Le Soleil, patented a mechanical calculator named the Arithmometer in 1820. He made the machine commercially available in 1851. The Comptometer, another device with similar capabilities, was manufactured by Dorr E. Felt starting in 1887. A year later, William Seward Burroughs was awarded a patent for his adding machine. Burroughs's company, the American Arithmometer Company, manufactured the device. In 1904 the company was renamed the Burroughs Adding Machine Company. That company evolved into the Burroughs Corporation, one of the early manufacturers of mainframe computers.

The business of making business machines

The road from the adding machine to the mainframe computer was not a direct one. Another type of machine, the tabulator, stood in the middle, and that machine was to become the foundation of IBM's entry into the mainframe computing business.

International Business Machines (IBM), a company that today is synonymous with computing, started out making, as its name implies, business machines. But it didn't start with the name IBM. Rather, it was founded in 1911 as the Computing-Tabulating-Recording Company (CTR).

The predecessor companies that eventually amalgamated into IBM manufactured a variety of devices. The Computing Scale Company manufactured a scale for businesses that allowed a clerk to determine an item's price according to its weight. Another soon-to-be subsidiary, Bundy Manufacturing Company, invented and manufactured time clocks that allowed companies to track an employee's arrival and departure at the workplace. This concept was improved by yet another eventual subsidiary, the International Time Recording Company (ITR), which made a machine that produced the timecard, a device that went on to become a staple of the factory floor.

CTR was to have another subsidiary, the Tabulating Machine Company (TMC), which manufactured the Hollerith Electric Tabulating System, a.k.a. the Hollerith machine. The Hollerith machine used punch cards to store large amounts of data and perform complex calculations on that data. It was used in the 1890 and 1900 US censuses, as well as the censuses of Russia, Austria, Canada, France, and Norway. TMC was absorbed into CTR in 1911, and in 1924 CTR was renamed IBM. Hollerith's punch cards would go on to power early mainframe computing.

While IBM was to become a leader in the field, it did not invent the first commercial mainframe. That was the work of the Eckert-Mauchly Computer Corporation (EMCC), which began building the machine it named UNIVAC in 1948 and delivered the first one in 1951. (See Figure 1, below.)


Figure 1: The front cover of the first UNIVAC manual

And that is where our story really begins.

The arrival of the mainframe

The UNIVAC was an amazing machine for its time. According to Tom's Hardware, it weighed 14.5 tons and had a clock speed of 2.25 MHz. It could do 455 multiplications per second. By today's standards, where a cellphone weighing a little over 5 ounces and running a 1.2 GHz processor can do hundreds of millions of multiplication operations per second, the UNIVAC is a stone-age relic. But back then, it was a game-changer.

The UNIVAC was built according to a uniform design. Earlier mainframes were manufactured as one-offs, not intended for mass production. Standardizing the UNIVAC made the machine a commodity, an expensive commodity, but a commodity nonetheless. Forty-six were eventually sold, with sales fueled by advertising and promotion.

The machine famously predicted the winner of the 1952 US presidential election, although the CBS executives who had commissioned the "electronic brain" were hesitant to believe it at first; the polls had predicted that the Democratic candidate, Adlai Stevenson, would win. Dwight Eisenhower, the Republican, went on to win the election, making mainframe computing front-page news. These room-sized machines were the beginning of massive computational power, the likes of which the world had never known.

Once mainframes became commercially viable, the dynamics of market competition set in. Even though the UNIVAC received all the attention at first, IBM, a company with strong relationships within the Fortune 500 customer base, was working on its own mainframe at the time. It introduced the IBM 701 to the public in 1953. Eventually, the company would dominate the mainframe industry, so much so that the phrase "IBM and the Seven Dwarfs" became a popular way to describe the industry. The Seven Dwarfs were Burroughs, Sperry Rand, Control Data, Honeywell, General Electric, RCA, and NCR.

Within the decade, computers were a thing. Mainframe sales went from 2,500 units in 1959 to 50,000 units in 1969. By 1964, American Airlines coordinated all of its airline reservations using the SABRE system, which had come online in 1960. Two IBM 7090 mainframes powered SABRE. According to BusinessWeek, in 1968, IBM stock was worth "at least as much as the combined shares of 21 of the 30 companies that go to make up the Dow-Jones industrial average."

Not only were computers affecting the bank accounts of Big Business, but they were also tugging on the fabric of society in the most unexpected ways. People who weren't typically part of the Big Business landscape were making significant contributions. Dorothy Vaughan, made famous in the 2016 movie Hidden Figures, was the first African-American woman to supervise the West Area Computers unit at NACA, the agency that was later absorbed into NASA. At first, she supervised the "human computers" who did the computational work necessary for World War II aeronautical research. Later, in the early 1960s, when mainframes came along, she taught herself and others FORTRAN.

A few years later, in 1965, a Catholic nun, Sister Mary Kenneth Keller, became one of the first two people in the USA to earn a Ph.D. in computer science. She received her doctorate from the University of Wisconsin. Her dissertation was titled Inductive Inference on Computer Generated Patterns.

As Bob Dylan wrote in 1963, the times they are a-changin'. And the mainframe computer loomed large over the landscape on which those changes were happening.

Moving beyond punchcards

Mainframe computers were becoming commonplace in the technical landscape. Yet for all the connectivity we have today, one of the more stunning facts about mainframe computing is that prior to the early 1970s, most mainframes were completely standalone machines with no monitor or keyboard attached. Communication was facilitated using Hollerith's punch card technology or tape, either paper or magnetic. (See Figure 2.)


Figure 2: The IBM 029 keypunch device prepared the punch cards that were the standard input for mainframe computing

Operators fed punch cards directly into the machine or loaded reels of tape onto spindles; the tape passed through read heads that converted the stored information into machine-readable commands for the computer to execute. (See Figure 3.)


Figure 3: The IBM 2401 transferred data on magnetic tape to a mainframe computer

Later on, programmers and data entry clerks interacted with the machines using a keyboard and printer, eliminating the punch card intermediary. They entered data into the computer using the keyboard, and the computer responded by sending output directly to a printer fed by a continuous stream of paper. (See Figure 4.)


Figure 4: Printers were used early on to display output from a mainframe

Eventually, terminals (a monitor and keyboard) were connected by wires using simple RS-232 serial connections. Initially, these connections only worked when the terminal was less than 15 meters from the mainframe. As a result, until modems and daisy-chaining technologies came along, working in a mainframe computing environment meant that you were physically very close to the machine. This need for proximity gave rise to the data processing center, a place where everyone gathered to create programs, convert them into machine-readable formats, and then feed both the programs and input data to the computer. These data processing centers were the seeds from which the modern data center grew.

The emergence of the connected mainframe

For the most part, mainframe computing was an undertaking in which big companies and organizations on the order of banks, insurance companies, and government agencies such as the IRS leased their machines from IBM and the Seven Dwarfs. These machines were delivered to special locations for the exclusive use of the customer. Connecting these machines together was not yet a mainstream idea. The need really wasn't there because having the machines work in isolation wasn't a problem. With the introduction of the Compatible Time-Sharing System (CTSS) in 1961, mainframes had become multiuser machines. As long as people were connected to the same computer, they could share data and even send emails to each other using programs such as MAILBOX and SNDMSG. In a way, mainframes were the data center before the data center.

However, by the mid-1960s, connectivity between different types of mainframes was being given serious thought. In October of 1969, about three months after the United States put the first man on the moon, the first computers of different types were connected on a network. This was an important event, but it wasn't really useful until a technology called the Network Control Program (NCP), an early predecessor to the Transmission Control Protocol (TCP), came along about a year later.

NCP enabled protocol-based communication, which is a critical aspect of modern networking. One of the first uses of NCP was to allow a user on one machine to log in to another machine via the Telnet protocol and upload a file using another protocol, FTP. More protocols emerged. The Mail Box Protocol, a precursor to SMTP, was proposed in 1971.
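To get a feel for what protocol-based communication means in practice, here is a minimal sketch, in Python, of a client speaking a line-oriented text protocol much like FTP's control channel, which still works this way today. The host name is a placeholder, and the replies shown in the comments are illustrative, not a transcript of a real server:

# A minimal sketch of protocol-based communication: two machines
# exchange structured text messages over a connection, each side
# following the rules the protocol defines.
import socket

HOST = "ftp.example.com"  # placeholder host, not a real server
PORT = 21                 # the well-known FTP control port

with socket.create_connection((HOST, PORT), timeout=10) as conn:
    # The protocol says the server speaks first, sending a numeric
    # status code and a message, e.g., "220 Service ready".
    print(conn.recv(1024).decode("ascii", errors="replace"))

    # The client replies with a command the protocol defines,
    # terminated by CRLF, e.g., "USER anonymous".
    conn.sendall(b"USER anonymous\r\n")

    # The server answers with another coded reply,
    # e.g., "331 Password required".
    print(conn.recv(1024).decode("ascii", errors="replace"))

The details have changed since the NCP era, but the core idea has not: as long as both sides honor the agreed-upon message format, any two machines can talk, regardless of who built them.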

By the early 1980s, the general computing infrastructure was unifying. Networked mainframes and multiuser minicomputers, a powerful yet smaller entry in the enterprise computing landscape, were becoming a mainstay. It seemed as if the vision of the wired planet was just around the corner, except for one gnawing fact: it all cost a small fortune. Then the PC came along, and with it the age of affordable computing.



About the author

Bob Reselman is a nationally known software developer, system architect, industry analyst, and technical writer/journalist. Over a career that spans 30 years, Bob has worked for companies such as Gateway, Cap Gemini, The Los Angeles Weekly, Edmunds.com, and the Academy of Recording Arts and Sciences, to name a few. He has held roles with significant responsibility, including, but not limited to, Platform Architect (Consumer) at Gateway, Principal Consultant with Cap Gemini, and CTO at the international trade finance company ItFex.
