How Servers Have Evolved And Improved
Although the 1980s are not widely remembered as a computing decade, they laid the foundations for modern digital life. Home computers reached the mainstream thanks to affordable devices like the BBC B and the Sinclair Spectrum. Microprocessors made possible portable devices like laptops, mobile phones and PDAs. And the first server units pointed to a future where individual computers didn’t operate as standalone data silos.
From spare room to cyberspace
In the beginning, servers were used to share data among terminals in commercial buildings. Individual minicomputers or desktops would be connected via Ethernet to a central repository of data and storage. These heavy plastic boxes contained expensive storage disks and required laborious setup by highly qualified IT personnel. Throughout the Eighties, servers evolved to offer username-and-password security, plus support for printers and an ever-widening array of electronic equipment. And as data requirements grew, so did the space servers occupied. Individual boxes became racks, expanding to fill entire rooms with snaking cables and noisy cooling fans.
In the Nineties, the World Wide Web expanded the remit of servers. No longer were they restricted to local area networks – information now had to be dispatched in response to requests from terminals around the world. And even though Moore’s Law meant server hardware was becoming ever more powerful, servers struggled to keep pace with mushrooming volumes of internet traffic. Towards the end of the decade, specialist web servers came to market, reducing the configuration required before a server could begin distributing data.
Virtual insanity
Dedicated server space was seen as increasingly inefficient by the early Noughties, yet nothing had arrived to challenge its monopoly on data storage. Virtualisation was perhaps an inevitable development. It subdivides a physical server into portions, each of which runs an independent operating system (usually Linux or Windows). Because each installation is isolated from the others, virtualisation enables one device to serve multiple masters. That helped to bring down unit costs, which in turn democratised online storage.
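The cost argument can be sketched in a few lines of Python. This is a toy illustration, not hypervisor code, and the prices are invented for the example:

```python
# Toy sketch of why virtualisation lowers unit costs.
# The server cost and tenant counts below are invented for illustration.

def cost_per_tenant(server_cost: float, tenants: int) -> float:
    """Split the cost of one physical server across its virtual machines."""
    if tenants < 1:
        raise ValueError("a server needs at least one tenant")
    return server_cost / tenants

# One dedicated server per customer vs. eight VPS slices on the same box.
dedicated = cost_per_tenant(2400.0, 1)    # each customer bears the full cost
virtualised = cost_per_tenant(2400.0, 8)  # the same hardware, shared eight ways
```

The same physical box, carved into eight virtual private servers, costs each tenant an eighth as much – which is the economics behind affordable cloud hosting.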
Without virtualisation, it’s doubtful many of today’s cloud-hosted services would exist. Dropbox and WordPress are just two examples of services built on partitioning online storage for multiple individual users. Virtual private servers also brought another unexpected benefit: the hypervisor software used to emulate standalone server space can also capture a real-time record of a virtual machine’s contents. Copying this snapshot onto another server duplicates the virtual machine, supporting instant restoration in the event of system failure or corruption.
Contain your excitement
Virtual private servers have transformed our lives, but another server-based innovation has received far less attention. Containers grew out of Linux kernel features, creating a hermetically sealed environment in which an individual program or application can run. Because a container has no interaction with the wider device, it minimises the risk of incompatibility or conflict. This makes containers incredibly stable, as well as fast and light on CPU resources.
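Docker, a popular modern descendant of Linux containers, makes the idea concrete. The sketch below is a hypothetical minimal image definition – the application file and dependency list are invented for illustration:

```dockerfile
# Hypothetical minimal container image: the application and its
# dependencies are packaged together, sealed off from the host system.
FROM python:3.12-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
CMD ["python", "app.py"]
```

Every container started from this image runs the same code against the same dependencies, which is where the stability comes from: a conflict inside one container cannot spill over into its neighbours.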
As society’s data needs grew exponentially, specialist companies evolved to offer vast amounts of server space and internet bandwidth for online service providers. UK2 is one such firm. We have developed a global network of data centres, each of which is packed with the latest Cisco hardware. These centres are nothing like the hot, windowless server rooms of the 1980s. Our data centres are manned around the clock, comprehensively protected against unauthorised access, and supported by dedicated backup generators. And with 200Gb of connectivity, we can supply data around the world at speeds users of early network servers would have found inconceivable…