There’s a brand-new piece of hardware on the streets, and it can store the entire Library of Congress five times over.

Hewlett Packard Enterprise recently announced that it has finished a functional prototype of The Machine, a memory-driven computing system. The prototype is outfitted with a whopping 160TB of memory, spread across 40 nodes. The project is still in its comparatively early stages, however, and its creators hope that, given more time, it can be scaled first to an exabyte-scale single-memory system and, from there, to a 4,096-yottabyte pool of memory. It may be decades before this kind of technology reaches the home, but its creators seem certain that they stand at the start of a very exciting journey. Andrew Wheeler, Vice President and Deputy Director of Hewlett Packard Labs, discussed the project’s progress in an interview with Digital Trends last week.

“The architecture is a fundamental tenet, so we do want to exploit any memory technology that comes online and is made available from now, five years out, ten years out,” said Wheeler. “Certainly, we do believe one of the emerging non-volatile memories will provide us with the density, and really, the cost, that allows us to build for the problem at hand.”