A brief history of data centres

The History and Evolution of Mainframes to Data Centres

In a world of countless digital interactions, it’s easy to take seamless connectivity between people, places and things for granted. Yet, it’s the data centre that makes our modern world possible by enabling all the connectivity, processing and storage we depend on day-to-day.

Of course, data centres haven’t always been the sleek, efficient facilities we know and love. With that in mind, let’s take a look at the origins of data centres, how they’ve evolved and where they’re headed next.

When was the first data centre built?

The concept of the data centre dates back to the 1940s, when J. Presper Eckert and John Mauchly built ENIAC, the world’s first large-scale, general-purpose electronic computer, completed in 1946 at the University of Pennsylvania in the US. The room that housed ENIAC is often regarded as the world’s first data centre, although it was very different from the facilities we have today.

Even in the 1950s and 60s, data centres were a far cry from their modern cousins. In fact, they weren’t even called data centres, but mainframes. A mainframe is a large computer designed to handle significant processing loads and data storage. The CDC 6600, from Control Data Corporation, is often recalled as the first supercomputer and boasted a then-mighty clock speed of 10MHz. Costing the earth and custom-built for specific business uses, these ‘Big Iron’ computers were scarce, fickle and labour-intensive; keeping them operational for even days at a time was something of an achievement.

With no network connectivity, these early mainframes were islands of computing power in a pen-and-paper world. Here’s how Pitt Turner, Executive Director of the Uptime Institute, recalls the mainframe operation of a large regional bank: “In the evening, all these trucks would arrive…carrying paper. Through the night that paper would be processed, the data would be crunched, new printouts would be created and then they would send the documents back out to the branch banks so they could open in the morning”.

The evolution of data centres

Throughout the 1970s and 80s, Moore’s Law continued to thunder on: computing power climbed ever higher and desktop computers became a common sight. However, during this time, the evolution of data centres focused more on improving reliability than increasing processing power. Ensuring data storage integrity and preventing corruption became priorities, yet managing computing power remained costly, leading many organisations to outsource their computing needs rather than maintain in-house ‘machine rooms’.

The 1990s marked a turning point in the history of data centres. The combination of the microprocessor boom, the rise of the Internet and the adoption of client-server computing revolutionised the industry. IT suddenly became more agile, enabling businesses to deploy applications faster and more cost-effectively on relatively inexpensive hardware. Old mainframe rooms were transformed into in-house data centres filled with microprocessor-based servers, and this infrastructure gradually became standardised in both design and operation, giving rise to the modular racks we know today.

Meanwhile, external forces were also at play. As having a permanent presence on the Internet became crucial, network connectivity and colocation services became business-critical. Internet service providers and hosting companies began constructing large, external facilities to deliver their services, sparking widespread data centre adoption.

What’s more, Kenneth G. Brill, widely recognised as the father of the modern data centre industry, introduced tier certification. His system for classifying data centres in a standardised way provided a much-needed framework that remains in widespread use to this day.
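
To make the tier idea concrete, here is a minimal Python sketch that converts the commonly cited Uptime Institute availability targets into allowable downtime per year. The percentages are the widely published figures; the script itself is purely illustrative and not an official Uptime Institute tool.

```python
# Minimal sketch: translate the commonly cited Uptime Institute tier
# availability targets into allowable downtime per year.
# Illustrative only; not an official Uptime Institute calculation.

HOURS_PER_YEAR = 24 * 365

TIER_AVAILABILITY = {   # percent uptime commonly cited for each tier
    "Tier I": 99.671,
    "Tier II": 99.741,
    "Tier III": 99.982,
    "Tier IV": 99.995,
}

for tier, pct in TIER_AVAILABILITY.items():
    downtime_hours = HOURS_PER_YEAR * (1 - pct / 100)
    print(f"{tier}: {pct}% uptime allows about "
          f"{downtime_hours:.1f} hours of downtime per year")
```

Run as written, this prints roughly 28.8 hours for Tier I down to under half an hour for Tier IV, which is why the tier of a facility matters so much to businesses that depend on constant availability.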

The boom of the data centre industry

By the early 2000s, as the Internet matured, data centres had become indispensable. IT investment surged, and new facilities were built globally in response to the dot-com boom. However, when the bubble burst, the industry suffered a major downturn, with 17 of the 27 pan-European data centre providers going out of business.

Despite the turmoil, this period also saw the rise of virtualisation, a quieter but transformative revolution. In the wake of the crash, efficiency became paramount, and virtualisation enabled reductions of up to 80% in data centre power consumption, floor space and cooling requirements.
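
To illustrate where savings of that magnitude can come from, here is a rough Python sketch of the consolidation arithmetic behind virtualisation. All of the server counts and wattages below are hypothetical assumptions chosen for illustration; real savings depend on workloads and hardware.

```python
# Hypothetical consolidation arithmetic: many lightly loaded physical
# servers are replaced by a few virtualised hosts running the same
# workloads as virtual machines. All numbers are illustrative
# assumptions, not measurements from the article.

physical_servers = 100   # standalone servers, each lightly utilised
watts_per_server = 400   # assumed average draw per physical server

vm_hosts = 20            # virtualised hosts after consolidation
watts_per_host = 500     # assumed draw per busier virtualised host

power_before = physical_servers * watts_per_server   # 40,000 W
power_after = vm_hosts * watts_per_host              # 10,000 W

reduction = 1 - power_after / power_before
print(f"Estimated power reduction: {reduction:.0%}")  # ~75% under
# these assumptions, in the same ballpark as the "up to 80%" cited
# above, before counting the knock-on savings in space and cooling.
```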

The financial crisis of 2008 further accelerated the trend towards reducing IT spending, outsourcing, and taking advantage of economies of scale. This led to the rapid growth of the colocation market, a trend that continues today.

The future of the data centre industry

Thriving trends like cloud computing, the Internet of Things (IoT), artificial intelligence and the emerging field of cyber-physical systems will continue to put the data centre at the heart of the digital economy.

To meet stringent performance, reliability and security demands, organisations are increasingly choosing to abandon on-premise data centre strategies in favour of colocation. Today’s colocation facilities harness all the connectivity, sustainability, efficiency, resilience and expertise that have been so hard-won over the last half century. It’s no surprise, then, that business is booming; according to Research and Markets, the colocation industry was projected to reach a total value of US$55.31 billion by the end of 2021.

Of course, further change is inevitable. No one knows what the future holds, but state-of-the-art colocation facilities offer organisations the best chance to be ready for it.

FAQs

What is the role of a data centre in modern technology?

A data centre plays a critical role in processing, storing, and managing data. It acts as the backbone of the digital economy, facilitating seamless connectivity between people, devices, and services around the world.

How does a mainframe differ from a modern data centre?

Mainframes were large, standalone computers designed to handle significant amounts of data and processing power but lacked network connectivity. Modern data centres are highly connected, modular systems that offer much greater efficiency, scalability, and flexibility, often supporting cloud computing and colocation services.

When and where was the first data centre built?

ENIAC, completed in 1946 at the University of Pennsylvania in the US, was the world’s first large-scale, general-purpose electronic computer, and the room that housed it is often regarded as the world’s first data centre.

What led to the evolution from mainframes to data centres?

The transition from mainframes to data centres was driven by advances in technology, such as the microprocessor boom, the rise of the Internet, and client-server computing models. These innovations allowed IT systems to become more agile and cost-effective, transforming mainframe rooms into modern data centres.

What is colocation, and why is it important for businesses today?

Colocation refers to the practice of renting space in a third-party data centre to house servers and other IT infrastructure. It allows businesses to benefit from secure, efficient, and scalable facilities without having to invest heavily in building and maintaining their own data centres.

How has virtualisation changed the data centre industry?

Virtualisation has dramatically reduced the physical space, power, and cooling requirements in data centres by enabling multiple virtual machines to run on a single physical server. This has improved hardware utilisation and cost-efficiency, revolutionising how data centres operate.

Who is Kenneth G. Brill, and why is he important in the history of data centres?

Kenneth G. Brill is widely regarded as the father of the modern data centre industry. He introduced the tier certification system, which standardises the classification of data centres based on performance, reliability, and security. This framework is still extensively used today.

How has the Internet impacted the development of data centres?

The rise of the Internet in the 1990s and 2000s led to significant growth in the data centre industry, as businesses required constant, reliable connectivity. This demand spurred the construction of larger, more standardised facilities to handle the increased network traffic and data storage needs.

What trends are shaping the future of data centres?

The future of data centres is being influenced by trends such as cloud computing, the Internet of Things (IoT), and artificial intelligence. These technologies demand greater processing power, efficiency, and scalability, driving further innovation in data centre design and operations.

What is the global value of the data centre and colocation industry?

As of 2021, the colocation industry was projected to reach a total value of US$55.31 billion. This growth is driven by the increasing demand for secure, scalable and cost-effective data storage and processing solutions.
