Will Obama's supercomputer be obsolete by 2025?

Intel backs the US challenge to China's supercomputer dominance, but is the project doomed before it starts?

President Barack Obama's project to build the world's fastest supercomputer by 2025 could be doomed from the start, thanks to Moore's law of computing.

The president signed an executive order this week calling for the US to get back into the supercomputer race, with a new machine ready within 10 years, after losing the crown to China, whose Tianhe-2 has topped the world rankings for the past two years.


The National Strategic Computing Initiative (NSCI) will oversee the building of the machine, which is set to be 20 times more powerful than China's model, making it the world's first exaflop supercomputer.

This means it will perform one quintillion operations every second, ideal for high-performance computing (HPC) tasks involving large quantities of data.
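A quick sanity check of those figures. The sketch below assumes the standard definitions (one exaflop = 10^18 operations per second) and Tianhe-2's theoretical peak of 54.9 petaflops as quoted later in this article:

```python
# Rough scale comparison: a one-exaflop machine vs Tianhe-2's
# reported theoretical peak (figures as quoted in this article).
EXAFLOP = 1e18            # one quintillion operations per second
TIANHE_2_PEAK = 54.9e15   # 54.9 petaflops

ratio = EXAFLOP / TIANHE_2_PEAK
print(f"An exaflop machine would be roughly {ratio:.0f}x Tianhe-2's peak")
```

The result, roughly 18 times Tianhe-2's peak, lines up with the "20 times more powerful" figure in round numbers.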

"HPC has historically focused on using numerical techniques to simulate a variety of complex natural and technological systems," an official blog stated, "such as galaxies, weather and climate, molecular interactions, electric power grids, and aircraft in flight."

Many manufacturers are fully behind these developments, as Charlie Wuischpard, vice president of Data Center Group and general manager of Workstations and HPC at Intel, demonstrates. 

"Intel absolutely believes achieving exascale computing is feasible over the next several generations of silicon process technology," he says, adding that Intel is "all-in" on achieving the exascale goal.


"Industry, government and academic collaborations are going to be crucial for this to be successful. We believe this Executive Order establishes the imperative for all of us to look at new ways of cooperating." 


But other experts have doubted the project's feasibility, warning that the pace of development in technology could scupper the supercomputer's chances before it is even built.


Jason Ward, senior director of UK enterprise at EMC, said: "Speed in technology is everything, whether it is to provide the biggest, fastest or be the first to market.

"Those building the newest supercomputer will need to accurately anticipate the technological environment 10 to 20 years from now in order to ensure that it is not outdated before it is even launched."

Ten years of development is a long time in technology, and keeping up with the bleeding edge of hardware may be difficult as Moore's law, the observation that the number of transistors on a chip doubles roughly every two years, continues to hold.

A good example is Microsoft's original Xbox, which debuted with a 733MHz Intel Pentium 3 CPU and 64MB of RAM. Just five years later, the Xbox 360 shipped with 512MB of RAM and a triple-core 3.2GHz processor.

This issue becomes more pronounced when looking at high-performance computing. IBM's Blue Gene/L, which held the title of world's fastest supercomputer from 2004 to 2007, had a theoretical peak of around 183 teraflops.


Tianhe-2, developed roughly ten years later, holds the current speed record. Its theoretical peak, by contrast, is 54,902 teraflops, around 300 times that of Blue Gene/L, and proof that the industry can move at an astonishing pace.
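That 300-fold jump can be translated into an implied doubling rate. A back-of-the-envelope sketch, assuming the peak figures quoted above and a gap of roughly nine years between the two machines:

```python
# How many doublings separate Blue Gene/L (~183 teraflops, circa 2004)
# from Tianhe-2 (54,902 teraflops, 2013), and how often did peak
# performance double over that period?
import math

old_peak, new_peak = 183.0, 54_902.0   # teraflops, as quoted above
years = 9                              # roughly 2004 to 2013

ratio = new_peak / old_peak                    # ~300x, as the article says
doublings = math.log2(ratio)                   # ~8.2 doublings
months_per_doubling = years * 12 / doublings   # ~13 months

print(f"{ratio:.0f}x faster: a doubling roughly every {months_per_doubling:.0f} months")
```

A doubling roughly every 13 months, comfortably ahead of Moore's two-year cadence, which is precisely why a 2025 target set in 2015 is such a moving goalpost.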

Back in 2008, IBM built the first petaflop computer, nicknamed Roadrunner, to simulate the decay of America's ageing nuclear arsenal. But by 2013 it was already outclassed, and was decommissioned due to its hefty power consumption.

There is a real danger, then, that the NSCI will have its work cut out simply avoiding instant obsolescence, though by taking advantage of more advanced technology than its competitors, the US machine could still top the 2025 list.

It's possible, for example, that the new computer will leverage Intel's new 3D XPoint memory, a non-volatile technology that retains data without power and which Intel claims is up to 1,000 times faster than traditional NAND flash.
