
HPE’s prototype for ‘The Machine’ offers early glimpse of memory-centric computing

HPE The Machine prototype

If we were to invent the computer today, instead of relying on the design architecture laid down 60 years ago, what would we build?

That’s the question HP Labs asked itself when it started down the path of creating The Machine in 2014. The moonshot R&D initiative seeks to solve the problems posed by the coming flood of big data from the Internet of Things and growing connectivity around the globe.

“This is the ‘Hello World’ moment for the next generation of computing,” says Mark Potter, chief technology officer of the enterprise group at HPE. “Some of the fundamentals that have helped grow and revolutionize what we have as a world today are starting to slow down. This will continue to take us forward.”

Mark Potter, CTO at HPE, shows off a prototype of The Machine.

The basic problem HPE is trying to solve in today’s computer architecture is the constant exchange between memory and storage. A computer’s memory (RAM) holds the data the CPU can work on directly. When new data needs processing, it must first be retrieved from storage (the hard drive) and brought into RAM. So the CPU’s full processing power can only ever be applied to segments of data no larger than the computer’s RAM.
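
A minimal Python sketch of that constraint (the file path and chunk size here are hypothetical): a dataset bigger than RAM has to be streamed through memory a window at a time.

```python
CHUNK_SIZE = 64 * 1024 * 1024  # 64 MB: only this much data is resident at once

def process_large_file(path):
    """Stream a file through RAM chunk by chunk, e.g. summing its bytes."""
    total = 0
    with open(path, "rb") as f:
        while True:
            chunk = f.read(CHUNK_SIZE)   # storage -> RAM: the slow, mandatory hop
            if not chunk:
                break
            total += sum(chunk)          # the CPU only ever sees the resident chunk
    return total
```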

This CPU-defined approach (hence the name “central” processing unit), with its limited memory, has shaped the way computers are programmed.

“We treat memory as a precious resource, we say that if an algorithm takes a little less memory, it’s better than another algorithm,” says Keith McAuliffe, chief technologist at HPE. “This architecture turns this upside down. We have to unlearn a lot of computer science that we’ve learned in school.”
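
One way to picture the inversion McAuliffe describes is the textbook trade of memory for computation; here’s a toy Python sketch (Fibonacci is just a stand-in example):

```python
from functools import lru_cache

# The memory-frugal habit: store nothing, recompute everything.
def fib_recompute(n):
    return n if n < 2 else fib_recompute(n - 1) + fib_recompute(n - 2)

# The memory-centric habit: spend abundant memory to compute each value once.
@lru_cache(maxsize=None)
def fib_cached(n):
    return n if n < 2 else fib_cached(n - 1) + fib_cached(n - 2)

print(fib_cached(200))  # instant; fib_recompute(200) would take aeons
```

The first version treats memory as precious and pays for it in repeated work; the second spends memory freely and does each computation once. Abundant memory tilts the economics toward the second style.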

While the finished version of The Machine will build these vast memory pools from a new type of memory, the prototype uses DRAM. The long, custom boards HPE has built for it carry 4 TB of DRAM each, wired so that a powerful system-on-a-chip (SoC) has immediate access to all of that memory at once.
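
Conceptually, software on such a board sees one flat, byte-addressable pool rather than a RAM-plus-disk hierarchy. A rough Python sketch of the idea, using an ordinary anonymous memory map as a stand-in for the fabric-attached pool:

```python
import mmap

# Stand-in sketch: an anonymous memory map playing the role of the
# fabric-attached pool (sized 1 GB here; terabytes on the real boards).
POOL_SIZE = 1 * 1024 * 1024 * 1024

pool = mmap.mmap(-1, POOL_SIZE)  # one contiguous, directly addressable region
pool[0:5] = b"hello"             # plain loads/stores, no block I/O round trip
print(pool[0:5])                 # b'hello'
pool.close()
```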

Stacked together in a rack, the prototype’s boards combine to provide access to many terabytes of memory at once. Multiple board enclosures are linked by a fibre photonics connection rather than electrical wiring, explains Andrew Wheeler, deputy director of HP Labs.

“Right now computer chips ‘speak’ in electronic signaling, so you can do very high bit rates over short distances,” he says. “Photonics gives you the same high-speed signaling at greater distances, and there’s less power used.”

Fibre cabling is also more manageable than copper cabling as it’s more flexible and compact, he adds.

In short, the prototype still runs on conventional hardware, but it blends in the memory-centric architecture that the finished Machine will offer. That puts HPE in a position to start talking about the concept and to get developers interested in writing programs that can take advantage of it.

A couple of tweaks at the middleware and OS levels to take advantage of more memory could yield performance boosts of 200 to 800 per cent fairly quickly, Wheeler says. But those are just the “low-lift, get started” gains. Some applications stand to see as much as an 8,000x improvement, such as a financial model HPE created to predict individualized portfolio risk curves based on real-time market data.
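
HPE hasn’t published that model, but a toy sketch suggests the shape of the workload: with the full return history resident in memory, a risk curve becomes cheap to recompute on demand rather than in an overnight batch. (Everything below is invented for illustration.)

```python
import numpy as np

# Illustrative only: neither the data nor the method is HPE's actual model.
rng = np.random.default_rng(0)
returns = rng.normal(0.0005, 0.01, size=1_000_000)  # full history, resident in memory

def risk_curve(returns, quantiles=(0.90, 0.95, 0.99)):
    """Historical value-at-risk at several confidence levels."""
    return {q: float(-np.quantile(returns, 1.0 - q)) for q in quantiles}

print(risk_curve(returns))  # cheap to recompute on every market tick
```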

Then there are the applications that you just couldn’t have otherwise. Like discovering new laws of nature.

That’s Sean Carroll’s job. He’s a research professor of theoretical physics at the California Institute of Technology (Caltech), and he’s not exaggerating when he says, “We are constrained by our ability to store and manipulate vast amounts of information.”

To confirm the physical laws that govern our universe, Carroll says his team sets up experiments and matches the data against a template of what they’d expect to see if their hypotheses are correct. That’s how the gravitational waves detected by ground-based observatories earlier this year confirmed Einstein’s general theory of relativity. But it’s a limiting approach.
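
In signal-processing terms that template approach is matched filtering: slide the expected waveform across the noisy data and look for a correlation peak. A simplified Python sketch, with a toy signal invented for illustration:

```python
import numpy as np

# Invented toy signal: a short decaying "chirp" standing in for the template.
rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 200)
template = np.sin(2 * np.pi * 30 * t) * np.exp(-4 * t)

data = rng.normal(0.0, 0.5, 5000)  # noisy observation stream
data[3000:3200] += template        # the signal, buried in the noise

# Slide the template across the data; a strong peak marks a match.
score = np.correlate(data, template, mode="valid")
print("best match near sample", int(np.argmax(score)))  # ~3000
```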

“What we’d like to have is the ability to look for things that we weren’t expecting,” he says. “You could look for something like this and discover it so every telescope in the world could point at the same thing.”

The Machine will go through many more prototype phases before HPE’s memory-centric vision is fully realized. Wheeler says the immediate focus is to scale up from what they have today and to establish additional use cases and proof points.

There’s no hard timeline for when the next build of The Machine prototype will be completed. You never know what roadblocks you’ll hit when you’re working with entirely new components, he says.
