Will Obama's supercomputer be obsolete by 2025?

Intel backs the US challenge to China's supercomputer dominance, but is the project doomed before it starts?

President Barack Obama's project to build the world's fastest supercomputer by 2025 could be doomed from the start, thanks to Moore's law of computing.

The president signed an executive order this week that calls for the US to get back into the supercomputer race, with a new machine ready within 10 years, after losing the crown to China, which has topped the world's supercomputer rankings for the last two-and-a-half years with the Tianhe-2.

The National Strategic Computing Initiative (NSCI) will oversee the building of the machine, which is set to be 20 times more powerful than China's model, making it the world's first exaflop supercomputer.

This means it will perform one quintillion operations every second, making it ideal for high-performance computing (HPC) tasks involving large quantities of data.
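For a rough sense of scale, the short Python sketch below checks that arithmetic using the figures quoted in this article; it is an illustration only, not an official NSCI specification.

```python
# Back-of-the-envelope scale check using the figures quoted in the article;
# not an official NSCI specification.
EXAFLOP = 1e18              # one quintillion floating-point operations per second
TIANHE_2_PEAK = 54_902e12   # Tianhe-2's theoretical peak: 54,902 teraflops

print(f"1 exaflop is roughly {EXAFLOP / TIANHE_2_PEAK:.0f}x Tianhe-2's peak")
# -> roughly 18x, in the same ballpark as the stated "20 times more powerful"
```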

"HPC has historically focused on using numerical techniques to simulate a variety of complex natural and technological systems," an official blog stated, "such as galaxies, weather and climate, molecular interactions, electric power grids, and aircraft in flight."

Many manufacturers are fully behind these developments, as Charlie Wuischpard, vice president of Intel's Data Center Group and general manager of Workstations and HPC, demonstrates.

"Intel absolutely believes achieving exascale computing is feasible over the next several  generations of silicon process technology", he says, stating that "Intel is "all-in" on the achieving the exascale goal". 

"Industry, government and academic collaborations are going to be crucial for this to be successful. We believe this Executive Order establishes the imperative for all of us to look at new ways of cooperating." 

But other experts have doubted the project's feasibility, warning that the pace of development in technology could scupper the supercomputer's chances before it is even built.

Jason Ward, senior director of UK enterprise at EMC, said: "Speed in technology is everything, whether it is to provide the biggest, fastest or be the first to market.

"Those building the newest supercomputer will need to accurately anticipate the technological environment 10 to 20 years from now in order to ensure that it is not outdated before it is even launched."

Ten years of development is a long time in technology, and keeping up with the bleeding edge of hardware may be difficult as Moore's law, the observation that the number of transistors on a chip doubles roughly every two years, continues to hold true.

A good example is Microsoft's original Xbox, which debuted in 2001 with a 733MHz Intel Pentium III-based CPU and 64MB of RAM. Just four years later, the Xbox 360 shipped with 512MB of RAM and a triple-core 3.2GHz processor.
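To see why a decade matters, here is a minimal illustrative calculation of what a fixed two-year doubling period implies for a ten-year build window; this is a simplified reading of Moore's law, not a forecast for any particular chip or for the NSCI machine.

```python
# Illustrative only: what a fixed two-year doubling period implies over a
# multi-year build window. A simplified reading of Moore's law, not a forecast.
def projected_gain(years, doubling_period_years=2):
    """Return the capability multiple after `years` of steady doubling."""
    return 2 ** (years / doubling_period_years)

print(f"5 years:  ~{projected_gain(5):.0f}x")
print(f"10 years: ~{projected_gain(10):.0f}x")
# -> roughly 6x and 32x; hardware frozen at the start of a ten-year project
#    risks looking an order of magnitude slower by delivery.
```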

This issue becomes more pronounced when looking at high performance computing. IBM's Blue Gene/L, which held the title of world's fastest supercomputer from 2004 to 2007, had a theoretical peak of around 183 teraflops.

Tianhe-2, developed roughly ten years later, holds the current speed record with a theoretical peak of 54,902 teraflops. That's around 300 times the speed of Blue Gene/L, and proof that the industry can move at an astonishing pace.
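Taking the article's peak figures at face value, a quick sketch shows the compound growth rate that jump implies (these are theoretical peaks rather than measured Linpack scores, so the real Top500 history is somewhat messier):

```python
# Growth rate implied by the article's own peak figures (theoretical peaks,
# not measured Linpack scores, so this is a rough illustration).
BLUE_GENE_L = 183      # teraflops, fastest system c. 2004-2007
TIANHE_2 = 54_902      # teraflops, fastest system at the time of writing
YEARS_APART = 10       # "roughly ten years later", per the article

ratio = TIANHE_2 / BLUE_GENE_L
per_year = ratio ** (1 / YEARS_APART)
print(f"Total speed-up: ~{ratio:.0f}x, or ~{per_year:.1f}x per year compounded")
# -> roughly 300x overall, about 1.8x per year, which is ahead of the ~1.4x
#    per year that a plain two-year doubling would suggest.
```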

Back in 2008, IBM built the first petaflop computer, nicknamed Roadrunner, to track the decay of America's aging nuclear arsenal. But by 2013 it was already outclassed, and was decommissioned due to its hefty power consumption.

This means the NSCI will have its work cut out simply to avoid instant redundancy, though by taking advantage of more advanced technology than its competitors, the US supercomputer could still top the 2025 list.

It's possible, for example, that the new computer will leverage Intel's new 3D XPoint memory, a non-volatile storage technology that retains data without power and supposedly boasts speeds up to 1,000 times faster than traditional NAND flash.
