Hewlett Packard Enterprise Co on Tuesday unveiled a new computer prototype that it said could handle more data than any similar system in the world.
The Palo Alto, California-based company said the prototype contains 160 terabytes of memory, capable of managing the information from every book in the U.S. Library of Congress five times over.
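For scale, the claim implies roughly 32 terabytes per copy of the Library's books. A quick back-of-the-envelope check in Python; the roughly one-megabyte average for a digitized book is an assumption for illustration, not a figure from HPE:

```python
# Sanity check of HPE's "five times over" claim.
# The 160 TB figure and the factor of 5 come from the article;
# the 1 MB average size per digitized book is an assumption.
TOTAL_MEMORY_TB = 160
COPIES = 5
ASSUMED_BOOK_SIZE_MB = 1.0

per_copy_tb = TOTAL_MEMORY_TB / COPIES                       # ~32 TB per copy
books_per_copy = per_copy_tb * 1_000_000 / ASSUMED_BOOK_SIZE_MB  # TB -> MB

print(f"Implied data per Library copy: {per_copy_tb:.0f} TB")
print(f"Implied books per copy: {books_per_copy:,.0f}")
```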
It is the latest prototype from HPE's "The Machine" research project, which aims to create super-fast computers by designing them around memory rather than processors. In traditional designs, the constant shuttling of data among processors, storage and memory can bog computers down.
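A toy sketch of that bottleneck, assuming nothing about HPE's actual design: the same scan runs noticeably slower when the data must first be read back from storage than when it already sits in memory, the gap memory-driven computing aims to close.

```python
# Illustrative only: compares scanning data already in memory with
# scanning the same data after it has to come back from storage.
import os
import tempfile
import time

data = os.urandom(200 * 1024 * 1024)  # 200 MB of sample data

# Path 1: the data is already resident in memory.
start = time.perf_counter()
checksum = sum(memoryview(data)[::4096])  # touch one byte per 4 KB page
in_memory = time.perf_counter() - start

# Path 2: the same data must be fetched from storage first.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(data)
    path = f.name
start = time.perf_counter()
with open(path, "rb") as f:
    fetched = f.read()
checksum = sum(memoryview(fetched)[::4096])
via_storage = time.perf_counter() - start
os.unlink(path)

print(f"in-memory scan: {in_memory:.3f}s, via storage: {via_storage:.3f}s")
```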
The prototype underscores HPE's ambition to lead computer technology as huge datasets place new strains on devices.
"We need a computer built for the Big Data era," HPE's Chief Executive Meg Whitman said in a news release.
While large data centers that lash together many computers may have enough calculating power, they cannot move data between those machines efficiently, Kirk Bresniker, chief architect at Hewlett Packard Labs, said in an interview. That means HPE's single-system model may one day compete with the infrastructure spearheaded by cloud-computing companies such as Amazon.com Inc.
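A minimal sketch of that transfer cost, again an illustration rather than anything from HPE or Amazon: fetching records one at a time from a peer over a socket, standing in for another machine in a cluster, versus reading them from a single pool of local memory.

```python
# Toy comparison (illustrative, not any vendor's architecture):
# per-record fetches over a localhost socket, standing in for a
# cluster peer, versus reads from one shared in-process memory pool.
import socket
import threading
import time

RECORD = b"x" * 1024   # a 1 KB record
N = 10_000             # number of fetches

def recv_exact(conn: socket.socket, n: int) -> bytes:
    buf = b""
    while len(buf) < n:
        buf += conn.recv(n - len(buf))
    return buf

def serve(server: socket.socket) -> None:
    conn, _ = server.accept()
    with conn:
        for _ in range(N):
            recv_exact(conn, 3)   # wait for a "get" request
            conn.sendall(RECORD)

server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)
threading.Thread(target=serve, args=(server,), daemon=True).start()

client = socket.create_connection(server.getsockname())

# Remote path: one network round trip per record.
start = time.perf_counter()
for _ in range(N):
    client.sendall(b"get")
    recv_exact(client, len(RECORD))
remote = time.perf_counter() - start

# Local path: the records already live in this process's memory.
pool = {i: RECORD for i in range(N)}
start = time.perf_counter()
for i in range(N):
    _ = pool[i]
local = time.perf_counter() - start

print(f"{N} remote fetches: {remote:.3f}s; {N} local reads: {local:.4f}s")
```

Even with both endpoints on the same machine, the remote loop runs far slower than the local reads; across a real network the gap widens further.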
HPE expects the design to accommodate ever-larger amounts of memory over time. And while the prototype remains years away from commercial availability, HPE is already bringing some of the technology from the research program to market.
Still, companies and the scientific community have yet to agree on which technology will best serve users.
"You need computing that scales up with the size of the dataset," said Kathy Yelick, a professor of electrical engineering and computer sciences at the University of California at Berkeley.
There's still discussion "about what the right answer is."
(Reporting by Jeffrey Dastin in San Francisco; Editing by Stephen Coates)