Will “The Machine” change computing forever?
By now, everyone knows what a computer looks like and what it does. We’ve evolved from the days of mainframes and minicomputers through to the desktop, laptop and tablet model of the 21st century, and it’s all starting to look a lot like the kind of future envisaged on Star Trek.
Except maybe a change is on the way. That's the future envisaged by HP, which has been dropping a lot of hints about its next generation computing architecture, which it has simply dubbed "The Machine".
The brains of The Machine eschew the general purpose processors you find in today's systems in favour of highly specialised, power efficient processors. Where today's systems use physical, albeit microscopically tiny, electrical connections to manage input and output, The Machine will use photonic interconnects (in other words, light) to enable very high data rate, very low power data transfer. HP envisages being able to retrieve a single byte of data from a 160 petabyte storage system in under 250 nanoseconds. Fast, in other words.
It's not just about speed, but also about size and power efficiency. Current large scale technology deployments consume a staggering quantity of both hardware and power, which is why companies such as Google and Apple invest huge sums in both data centres and power generation. On the power generation side, they're at least looking at more environmentally friendly alternatives such as solar power. On the computing side, however, there's a brick wall looming in performance terms, and The Machine's pitch is that it clears that wall without simply throwing another 1,000 servers at the problem.
On the storage front, HP has been doing a lot of research into memristors, a high performance memory technology that retains its contents even when power is removed. There's still plenty of work to be done on memristors (HP originally estimated commercial memristor products would arrive by mid-2013), but the potential here is considerable. In theory, a memristor-based storage system wouldn't care about a sudden system crash or power outage, because data effectively becomes persistent the moment it's written.
It's pretty unlikely we'll have "Machines" on our desks or in our pockets even in the medium term. The first port of call for these particular systems will be areas that need serious data-crunching ability — think stock exchanges and medical research — but if HP's vision is accurate, they're the systems — ahem, "machines" — we'll be interfacing with before the turn of the decade. HP is pitching The Machine both as a low-cost, space efficient way to manage the enormous quantity of data that current cloud computing services require, and as a gateway to managing the even larger quantity of data envisaged under the Internet of Things.