Will Obama's supercomputer be obsolete by 2025?

Intel backs the US challenge to China's supercomputer dominance, but is the project doomed before it starts?

President Barack Obama's project to build the world's fastest supercomputer by 2025 could be doomed from the start, thanks to Moore's law of computing.

The president signed an executive order this week that calls for the US to get back into the supercomputer race, with a new machine ready within 10 years, after losing the crown to China, whose Tianhe-2 has topped the world's supercomputer rankings for the past two years.

The National Strategic Computing Initiative (NSCI) will oversee the building of the machine, which is set to be 20 times more powerful than China's model, making it the world's first exaflop supercomputer.

This means it will perform one quintillion operations every second, making it ideal for high-performance computing (HPC) tasks involving large quantities of data.
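
To put those figures in perspective, here's a quick back-of-the-envelope check, a rough sketch in Python using the theoretical peak for Tianhe-2 quoted later in this article, showing how the "one quintillion" and "20 times" claims line up:

```python
# Rough sanity check on the exascale figures quoted in this article.
TERAFLOP = 10**12  # floating-point operations per second
EXAFLOP = 10**18   # one quintillion operations per second

tianhe2_peak = 54_902 * TERAFLOP  # Tianhe-2's theoretical peak (see below)
exascale_target = 1 * EXAFLOP     # the NSCI's stated goal

# An exaflop machine would be roughly 18x Tianhe-2's theoretical peak,
# in the same ballpark as the "20 times" claim.
print(f"{exascale_target / tianhe2_peak:.1f}x")  # ~18.2x
```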

"HPC has historically focused on using numerical techniques to simulate a variety of complex natural and technological systems," an official blog stated, "such as galaxies, weather and climate, molecular interactions, electric power grids, and aircraft in flight."

Many manufacturers are fully behind these developments, as Charlie Wuischpard, vice president of Intel's Data Center Group and general manager of its Workstations and HPC business, makes clear.

"Intel absolutely believes achieving exascale computing is feasible over the next several  generations of silicon process technology", he says, stating that "Intel is "all-in" on the achieving the exascale goal". 

"Industry, government and academic collaborations are going to be crucial for this to be successful. We believe this Executive Order establishes the imperative for all of us to look at new ways of cooperating." 

But other experts have doubted the project's feasibility, warning that the pace of development in technology could scupper the supercomputer's chances before it is even built.

Jason Ward, senior director of UK enterprise at EMC, said: "Speed in technology is everything, whether it is to provide the biggest, fastest or be the first to market.

"Those building the newest supercomputer will need to accurately anticipate the technological environment 10 to 20 years from now in order to ensure that it is not outdated before it is even launched."

Ten years of development is a long time in technology, and keeping up with the bleeding edge of hardware may be difficult as Moore's law, the observation that the number of transistors on a chip doubles roughly every two years, continues to be proven right.
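
A rough sketch of that arithmetic: at a doubling every two years, a decade of development implies five doublings, a 32-fold gain that rival machines launched in 2025 would also enjoy.

```python
# Moore's law as a growth factor: transistor counts doubling every two years.
def moores_law_factor(years: float, doubling_period: float = 2.0) -> float:
    """Return the projected growth factor after `years` years."""
    return 2 ** (years / doubling_period)

print(moores_law_factor(10))  # 32.0 -- five doublings over a ten-year build
```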

A good example is Microsoft's original Xbox, which debuted in 2001 with a 733MHz Intel Pentium III CPU and 64MB of RAM. Just four years later, the Xbox 360 shipped with 512MB of RAM and a triple-core 3.2GHz processor.

This issue becomes even more pronounced in high-performance computing. IBM's BlueGene/L, which held the title of world's fastest supercomputer from 2004 to 2007, had a theoretical peak of around 183 teraflops.

Tianhe-2, the current record holder, was developed roughly ten years later. Its theoretical peak, by contrast, is 54,902 teraflops: around 300 times the speed of BlueGene/L, and proof that the industry can move at an astonishing pace.
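
Working through the article's own peak figures (a rough sketch; the exact gap depends on which TOP500 lists you compare):

```python
# Peak performance figures quoted in this article, in teraflops.
bluegene_l = 183    # IBM BlueGene/L, circa 2004
tianhe_2 = 54_902   # Tianhe-2, circa 2013

speedup = tianhe_2 / bluegene_l
print(round(speedup))  # ~300x in roughly a decade

# Equivalent to peak performance nearly doubling every year -- faster
# than the two-year cadence Moore's law alone would suggest.
print(f"{speedup ** (1/10):.2f}x per year")  # ~1.77x
```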

Back in 2008, IBM built Roadrunner, the first supercomputer to break the petaflop barrier, to model the decay of America's aging nuclear arsenal. But by 2013 it was already outclassed, and it was decommissioned partly because of its hefty power consumption.

It means the NSCI will have its work cut out simply to avoid instant obsolescence, although by taking advantage of more advanced technology than its competitors, the US supercomputer could still top the rankings in 2025.

It's possible, for example, that the new computer will leverage Intel's new 3D XPoint memory, a non-volatile technology that retains data without power and supposedly boasts speeds up to 1,000 times those of traditional NAND flash.
