There is a worldwide race to build the next generation of supercomputers, but US efforts have stalled. China and Europe, in particular, are moving ahead with their own exascale programmes, and Japan is increasingly picking up the pace.

The US government, meanwhile, has yet to put in place a plan for achieving exascale computing.

Exascale programmes aren't just about building supercomputers. Development of exascale platforms will also seed new processor, storage and networking technologies. Breakthroughs in these areas in other countries may give rise to new challenges to US tech dominance.

Exascale systems, capable of 1 quintillion (1 million trillion) floating point operations per second, one thousand times the performance of a petaflop machine, could help solve some of the world's greatest scientific problems.
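For a sense of scale, a quick back-of-the-envelope comparison (a minimal sketch in Python; the figures are simply the standard SI prefixes, not measurements of any particular machine):

    # Orders of magnitude separating petascale from exascale computing.
    petaflop = 10**15   # 1 quadrillion floating point operations per second
    exaflop = 10**18    # 1 quintillion (1 million trillion) operations per second

    print(exaflop // petaflop)  # 1000: an exaflop is a thousand petaflops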

If the US falls behind, the research would increasingly be done in other countries. In sum, the world has awakened to the need for high performance computing. The US, for now, is dozing.

Here are five reasons the US lead in high performance computing is in danger:

The US doesn't have an exascale plan

An exascale development project would cost the US billions. Europe has estimated that its own exascale effort will cost €3.5 billion (£3 billion) over ten years. China is putting untold amounts of money into its effort.

In 2008, China had 15 systems represented on the Top 500 list of the world's most powerful systems. In the latest list, released this month, 74 Chinese-built systems appeared, or 14.8% of the world's total. In 2010, a China-built system topped the list. Japan now owns the top spot on the supercomputing list as its government shows renewed interest in high performance computing development.

The US continues to fund big projects such as IBM's planned 20 petaflop computer for Lawrence Livermore National Laboratory that's due next year. That system may put the US back in first place on the Top 500 list.

But despite what's going on in Europe and China, America has yet to set a budget for exascale development.

The Department of Energy is shortly due to deliver to Congress a timetable and cost estimate for building an exascale system. The delivery couldn't come at a worse time, particularly with this week's failure of the Congressional Super Committee to reach a budget agreement, which will trigger mandated cuts.

US scientists have been warning for a year that Europe and China are on a faster exascale development path.

"The EU effort is more organised at this stage with respect to exascale with strong backing from the European Commission," said Jack Dongarra, a professor of computer science at University of Tennessee, a distinguished research staff member at Oak Ridge National Laboratory, as well as an organiser of the Top 500 list.

"The Europeans see this as an opportunity to work together on a software stack and be competitive on the world stage," Dongarra said. "The bottom line is that the US appears stalled and the EU, China and Japan are gearing up for the next generation."

It's mistakenly assumed the US will win the exascale race

Although China's supercomputing development effort gets much attention, the Europeans are focused on developing a technology infrastructure to rival the US.

The Large Hadron Collider (LHC), a 16.8-mile circular tunnel straddling the French-Swiss border, is establishing Europe as the world's centre for high energy physics research. Physicists who once wanted to work in the US may now find Europe more attractive, and that may help seed the creation of new industries there.

The US once had plans to build a 54-mile supercollider tunnel in Texas, but Congress pulled the funding and abandoned the partially constructed project after its projected cost increased from about $5 billion in the late 1980s to $11 billion in 1993.

European nations are also acting jointly to build Galileo, their own satellite navigation system and a rival to GPS, a $20 billion project.

LHC and Galileo illustrate that European nations are willing to pool resources and work together on technology. They see a similar opportunity in exascale, especially in software development.

"The US, Europe, China and Japan all have the potential to realise the first exascale system," concluded the European Exascale Software Initiative, the group that's leading Europe's effort, in a report last month.

The path to exascale is uncharted, which opens the door to challengers

Although the US has not produced a plan for exascale development, it has outlined some requirements for a system. The system must be ready by 2019-2020 and can't use more than 20 MW of power, a modest budget for a machine that may contain millions of processors.
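A rough calculation shows why the 20 MW cap is so demanding (a sketch in Python; the exaflop and 20 MW figures come from the requirements above, while the comparison efficiency is an approximate figure for today's most efficient systems, not a quoted specification):

    # What power efficiency would an exascale system need to fit in 20 MW?
    target_flops = 10**18       # 1 exaflop
    power_budget_watts = 20e6   # 20 MW ceiling

    required_gflops_per_watt = target_flops / power_budget_watts / 1e9
    print(required_gflops_per_watt)  # 50.0 GFLOPS per watt

    # The most efficient systems today manage roughly 2 GFLOPS per watt
    # (approximate), so something like a 25x efficiency gain is required.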

The need for low-power systems is prompting new approaches to development. The Barcelona Supercomputing Centre, as part of Europe's exascale initiative, is working with ARM, the smartphone chip designer, on technology that combines ARM processors with Nvidia's graphics processors. The project may also use forthcoming ARM co-processors.

Alex Ramirez, computer architecture research manager at the Barcelona centre, said the project is demonstrating that a high performance computing cluster can be built on the ARM architecture. The centre is also building a complete software stack for the cluster.

"There are a big number of challenges ahead," said Ramirez, mostly getting the software to work in an environment that is different from servers or mobile computing. "The human effort and investment in software development is going to be significant," he added.

Europe has other exascale developments in progress, including one using Intel technology.

Ramirez said the Barcelona effort is now two years old, and the ultimate goal is to build a system that can reach exascale performance at reasonable power levels. But he also sees Europe-wide goals in this effort.

"There is an opportunity to keep embedded and high performance industry in Europe in the front line," said Ramirez. "There is a clear convergence between embedded technology and high performance computing technology."

If the US doesn't lead in exascale, what happens when planning for zettascale begins?

A computer science freshman starting today should, within four years, be able to see the pathway to an exascale system. By the time that same student completes his or her graduate work, there will be discussion of a zettascale system, something one thousand times more powerful still.

If high performance computing maintains its historic development pattern, a zettascale system can be expected around 2030. But no one knows what a zettascale system will look like, or whether it's even possible. Zettascale computing may require entirely new approaches, such as quantum computing.
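The 2030 estimate follows from the roughly thousandfold jump the field has made about every 11 years (a sketch in Python; the teraflop and petaflop dates are the commonly cited milestones, the exaflop date is the target discussed above, and the result is an extrapolation rather than a forecast):

    # Extrapolating the historic ~1000x-per-11-years pattern in peak performance.
    milestones = {
        1997: 10**12,  # first sustained teraflop system
        2008: 10**15,  # first sustained petaflop system
        2019: 10**18,  # exaflop target date discussed above
    }

    years_per_thousandfold = 11
    print(max(milestones) + years_per_thousandfold)  # 2030: a plausible zettaflop date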

The White House says it doesn't want to be in an "arms race" in building ever faster computers, and warned in a report a year ago this month that a focus on speed "could divert resources away from basic research aimed at developing the fundamentally new approaches to HPC that could ultimately allow us to 'leapfrog' other nations."

But the US is in a computing arms race whether it wants to be or not. To develop technology that leapfrogs other nations, the country will need sustained funding for basic research as well as for building an exascale system.

"A lot of countries have realised that one of the reasons the US became so great was because of things like federally funded research," said Luis von Ahn, an associate professor of computer science at Carnegie Mellon University and a staff research scientist at Google

"There are lot of countries that are trying to really invest in science and technology. I think it's important to continue funding that in the US. Otherwise it is just going to lose the edge, it's as simple as that. "

The US hasn't explained what's at stake

President Barack Obama was the first US president to mention exascale computing, but he didn't really explain the potential of such systems.

Supercomputers can help scientists create models, at an atomic level, of human cells and how a virus may attack them. They can be used to model earthquakes and help find ways to predict them, as well as to design structures that can withstand them. They are increasingly used by industry to create products and test them in virtual environments.

Supercomputers can be applied to almost any problem imaginable, and the more power and compute capability available, the more precise the science.

Today, the US dominates the market. IBM alone accounts for nearly 45% of the systems on the Top 500 list, followed by HP at 28%. Nearly 53% of the systems on the list are installed in the US.

At the SC11 supercomputing conference held earlier this month in Seattle, there were 11,000 attendees, more than double the number from five years ago. A key reason: The growing importance of visualisation and modelling.

This conference draws people from around the globe because the US today is the centre of high performance computing, a position the rest of the world is beginning to challenge on the path to exascale.