Featured Article: US Supercomputer Breaks The ‘Exascale’ Barrier
The world’s first ‘exascale’ computer, the ‘Frontier’ supercomputer at Oak Ridge National Laboratory in Tennessee, has smashed the exascale computing speed barrier.
What Is The Exascale?
The exascale is a threshold of computing performance: a system capable of at least one exaflop, i.e. a quintillion (a billion billion, or 10^18 = 1,000,000,000,000,000,000) mathematical calculations per second. Each individual calculation on a number containing a decimal point is known as a ‘floating point operation’, or ‘FLOP’ for short. That’s an awful lot of FLOPS! By way of contrast, in terms of how far we’ve come, one of the world’s first programmable electronic computers was Colossus, a vacuum-tube machine built in Britain during WWII, which could manage only a few thousand operations per second.
To put that into perspective, to match what an exascale supercomputer can do in one second, every human on the planet would have to perform one calculation per second, 24 hours a day without a break, for more than four years!
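As a rough sanity check of that analogy, here is a minimal Python sketch that divides one exaflop-second’s worth of calculations across everyone on Earth and converts the result into years. The world population figure is an assumption for illustration (roughly 8 billion), not a number taken from the article.

```python
# Back-of-the-envelope check of the "every human for four years" analogy.
# Assumption (not from the article): a world population of roughly 8 billion,
# with each person managing one calculation per second.

EXAFLOP = 10**18                     # calculations Frontier performs in one second
WORLD_POPULATION = 8_000_000_000     # assumed head count
SECONDS_PER_YEAR = 365 * 24 * 60 * 60

seconds_needed = EXAFLOP / WORLD_POPULATION   # seconds for humanity to match one exaflop-second
years_needed = seconds_needed / SECONDS_PER_YEAR

print(f"Seconds needed: {seconds_needed:,.0f}")
print(f"Years needed: {years_needed:.1f}")    # comes out at roughly four years
```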
A computer capable of breaking the exascale barrier is, therefore, more than twice as fast as the next most powerful supercomputer in use today.
Scale
First announced back in 2019 as a project by the U.S. Department of Energy and Cray Inc., the Frontier supercomputer is housed in 74 separate cabinets comprising 9,400 CPUs (standard computer processors) and 37,000 GPUs, giving it 8,730,112 cores capable of parallel computing tasks.
No.1 In The ‘Top500’
Breaking the exascale barrier has put the Frontier system in the no.1 position at the top of the Top500, the international collaboration that ranks the world’s most powerful supercomputers.
What is particularly impressive is that the Frontier supercomputer represents 25 per cent of the total performance of the whole list!
To try and put the speed and power of the Frontier system in context, whereas Frontier has 8,730,112 cores capable of parallel computing tasks, a typical laptop only has between five and nine. And although a typical laptop (currently, at best) is capable of an impressive-sounding few ‘teraflops’ (a trillion operations per second), this is still hundreds of thousands of times less than the Frontier system.
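To make that laptop comparison concrete, the short sketch below works out the ratio between an exaflop machine and a laptop assumed (purely for illustration; the figure is not from the article) to sustain around 5 teraflops.

```python
# Rough ratio of Frontier's headline speed to a typical laptop's.
# Assumption (illustrative only): the laptop sustains about 5 teraflops.

FRONTIER_FLOPS = 1e18      # at least one exaflop, as reported for Frontier
LAPTOP_FLOPS = 5e12        # assumed laptop throughput of a few teraflops

ratio = FRONTIER_FLOPS / LAPTOP_FLOPS
print(f"Frontier is roughly {ratio:,.0f} times faster than the laptop")   # on the order of 200,000x
```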
Could Get Even Faster
Even though the Frontier system has smashed the exascale barrier, it is expected that, with further optimised software, it could become even faster in the near future and could reach a theoretical peak of 2 exaflops.
What Can It Be Used For?
An exascale computer of this size can be used as a powerful tool by businesses, scientists, and academics to accomplish a vast range of tasks. Oak Ridge National Laboratory (ORNL), which developed the Frontier system, sees exascale computers as playing important roles in enabling scientists to develop new technologies for energy, medicine, and materials, and in delivering breakthroughs in scientific discovery, energy assurance, economic competitiveness, and even (US) national security. Supercomputers with the capability of the Frontier system could also be used for brain mapping, weather and climate forecasting, product design, astronomy, and other applications.
The Frontier supercomputer is also a second-generation AI system (following on from ORNL’s ‘Summit’ system), which means that it can also provide new capabilities for deep learning, machine learning, and data analytics for applications ranging from manufacturing to human health.
Environmental Issues
There are, however, some environmental issues around the operation of supercomputers like the Frontier system. For example:
– Supercomputers require a massive amount of electricity to operate, meaning, ironically, that although they may be capable of helping to speed up the solving of some of the world’s biggest challenges, they could also contribute to the global warming behind the very changes in weather conditions they are capable of predicting. Back in 2020, for example, the Met Office invited potential providers to come up with low-carbon options, and it is likely that much of the processing work could be located in countries within the European Economic Area with easy and abundant sources of clean energy, e.g. Iceland (geothermal energy) or Norway (hydropower).
– Supercomputers also require massive amounts of water. For example, at peak power, the Frontier supercomputer generates so much heat that it needs four high-powered pumps to send more than 25,000 litres of water around the machine each minute. This means that new supercomputers may need to be located near large water sources and use renewable energy for their pumping systems.
Other Threats and Concerns
In addition to the enormous potential benefits that supercomputers offer in solving complex problems in dramatically reduced timescales, there are concerns about computers becoming so powerful that they pose a threat to humanity, unlock frightening new possibilities, or become a security threat if used by bad (state) actors. Some of the concerns include:
– Ethical issues about the possible development of computers that have a kind of ‘consciousness’, what such an artificial brain could and should be used for, and whether a computer powerful and complex enough to act as a kind of ‘artificial brain’ should be brought into existence at all.
– Possible unforeseen moral issues which could arise from the use of such supercomputers once they are developed.
– Quantum computers, the next generation of computing technology, use quantum algorithms to accelerate digital computation and could be a staggering 150+ million times faster than the most sophisticated supercomputers. Despite this enormous potential for good, there is a fear that someone (e.g. threat actors or a foreign power) could use a functioning quantum computer to break the kind of encryption that we trust to secure our data, transactions, and communications. This fear is often called the ‘quantum apocalypse’.
The Future
Following on from exascale and quantum computers, further down the line (some predictions say by 2035) zettascale, data-centric computers look likely to be developed, i.e. machines capable of one zettaFLOPS, equal to 1,000 exaFLOPS. Some tech commentators have even suggested that decentralised computing may be a possibility, although it would bring many challenges of its own.
What Does This Mean For Your Business?
There is no doubt that developing this exascale barrier-busting system is a massive achievement that could bring huge benefits and breakthroughs in many critical areas such as medicine and energy. For the US-based ORNL (and Cray Inc), this milestone is also likely to be an important victory over competitors in Japan and China, and the race to build ever more powerful computers looks set to continue apace (although some say it is slowing down).

Despite the huge benefits they can bring, there are clearly some environmental issues around the operation of supercomputers like the Frontier system, i.e. huge power and water requirements, and the need for these to be supplied in a way that minimises the environmental impact. There may also be ethical and moral concerns about trying to develop future generations of computers that are more like ‘brains’, or that could create unforeseen problems and/or pose a threat to our own existence. That said, for the time being, the potential for good, and for being able to solve some of our biggest challenges quickly at a time when we are facing huge challenges with climate, weather, and health, should be celebrated. It should also be recognised that exascale computing holds enormous potential for businesses in multiple industries around the world and could contribute to significant innovation.