Editor’s note: last summer, I was asked to contribute to a book called ‘Faster than the future’, commissioned by the Digital Future Society, a programme supported by the Government of Spain in collaboration with Mobile World Capital Barcelona that seeks to build an inclusive, equitable and sustainable future in the digital era. In collaboration with the Barcelona Supercomputing Center, I wrote a chapter on ‘supercomputing’ or high-performance computing, and now that the book is out – download it for free here – I am sharing it on tech.eu in three parts, slightly updated with the latest news/information since the initial write-up.
Supercomputing in Europe: the present
Going back to the TOP500, which ranks and details the most powerful supercomputer systems in the world, Japan, China and the US are unquestionably in the lead, actively duking it out for the top spots on the list.
Europe plays a secondary role at best, with only eight supercomputers ranked in the top 25, four of which are located in Italy (alongside two in France, and one each in Switzerland and Germany). The highest-ranked supercomputer in Europe today is the HPC5, the sixth most powerful supercomputer in the world, and the overall leader among non-governmental systems – it was unveiled in February 2020 by Italian energy company Eni.
An update to the book chapter:
Since the book’s publication, the list has been updated. There are now nine instead of eight Europe-based supercomputers in the top 25, and the fastest of those is Germany-based ‘JUWELS’, which overtook HPC5.
But supercomputers in Europe also rely heavily, almost without exception, on non-European technology, which limits the EU's ability to compete effectively and comes with its own set of challenges around data sovereignty, intellectual property, security and geopolitics.
As is often the case, the European Union suffers from multiple (often conflicting) dynamics, in part driven by a fragmented approach to innovation and research in the high-performance computing field, in part by nationalistic reflexes.
This is how the EuroHPC Joint Undertaking, a €1 billion joint initiative between the EU and several member states to develop a top-tier supercomputing and high-performance computing innovation ecosystem in Europe, describes the situation on its website:
“The computing and data needs of European scientists and industry do not currently match the computation capabilities available in the EU. No EU supercomputer is in the global top 10, and the existing ones depend on non-European technology. This brings an increasing risk for the EU of being deprived of the strategic or technological know-how for innovation and competitiveness. This situation may create problems related to privacy, data protection, commercial trade secrets or ownership of data. In addition, Europe consumes about 29% of HPC resources worldwide today, but the EU industry provides only ~5% of such resources.”
Not coincidentally, the Barcelona Supercomputing Center is one of the most active centers in the EuroHPC project, as director Valero continued to advocate for the importance of a European-scale supercomputing initiative and strategy after Arm's sale to SoftBank in 2016.
In 2021, BSC will have one of the first supercomputers co-funded by the European Commission (dubbed ‘MareNostrum 5’) and is taking an active role in the research of key technologies (processors, accelerators, software and hardware stacks, etc.). Even though it continues to work with Arm, it is betting heavily on RISC-V; indeed, the Barcelona Supercomputing Center is a member of the RISC-V Foundation and heavily involved in the further development of the technology.
Says BSC director Valero: “Open source hardware makes it possible to avoid being a prisoner of the countries that control the large multinational supercomputer vendors and related technologies in the sector.”
Europe’s plan to catch up (so far)
Indeed, as highlighted previously, competing in the supercomputing field is of great strategic importance for sovereign countries and blocs for a variety of reasons, which has led to the European Union launching and supporting a number of initiatives to up its game and give a significant boost to its competitiveness. As previously mentioned, the EuroHPC Joint Undertaking is the largest, with its aim to develop a world-class high-performance computing innovation ecosystem in Europe, but there are several others worth highlighting.
An update to the book chapter:
In March 2021, the EU said it plans to invest a further €8 billion in high-performance computing systems through 2033.
One such initiative, the HPC Europa3 project, fits within the pioneering Distributed European Infrastructure for Supercomputing Applications (or DEISA), which was formed in 2002 as a consortium of eleven supercomputing centers from seven European countries. In essence, it provides academic and industrial researchers with access to world-class supercomputers and other high-performance computing systems, albeit with a limited budget (€9.2 million).
Another is Eurolab4HPC, a two-year Horizon 2020 project meant to strengthen academic research excellence and innovation in high-performance computing across Europe. There’s also the non-profit association PRACE (‘Partnership for Advanced Computing in Europe’), an EU-funded initiative that aims to enable impactful scientific discovery and engineering R&D “across all disciplines to enhance European competitiveness for the benefit of society”.
And then there is ETP4HPC, essentially an industry-led think tank and advisory group of companies and research centres that was set up in 2012 and is heavily involved in high-performance computing research in Europe.
An update to the book chapter:
A new consortium, Exscalate4CoV, was created to leverage supercomputing in the battle against Covid-19.
An EU supercomputer chip in the making?
A fifth and arguably most notable project is the European Processor Initiative (EPI), which is building a new low-power central processing unit (or CPU) based on European technology. This CPU will admittedly rely on a closed-source processor core, but will bundle an accelerator based on the open-source RISC-V architecture.
“Ultimately, the goal is to create a microprocessor for an ‘exascale machine’ based on European tech, rather than proprietary alternatives,” as Valero explains.
This supercomputer will be capable of one exaflop of performance: one quintillion, or a billion billion, operations per second – around a billion times faster than your average desktop computer. Exascale capability is the next frontier; to give you some idea, it is a level of computing power roughly comparable to the combined computing capabilities of every smartphone owned by the EU’s entire population.
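To sanity-check the orders of magnitude in that comparison, here is a rough back-of-the-envelope sketch; the desktop and smartphone figures below are illustrative assumptions on my part, not measured benchmarks:

```python
# Rough order-of-magnitude comparison of exascale performance.
# Assumption: a typical desktop CPU sustains on the order of
# 1 gigaflop/s (10^9 operations/s) in general-purpose workloads.

EXAFLOP = 10**18        # 1 exaflop/s = one quintillion operations per second
DESKTOP_FLOPS = 10**9   # ~1 gigaflop/s, a deliberately conservative figure

speedup = EXAFLOP // DESKTOP_FLOPS
print(f"An exascale machine is roughly {speedup:,}x faster than such a desktop")
# i.e. about a billion times faster, matching the comparison above
```

The exact multiple obviously shifts with the baseline chosen (a modern desktop with a discrete GPU sustains far more than a gigaflop per second), but the exa-scale prefix itself is fixed: 10^18 operations per second.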
In November 2019, the Barcelona Supercomputing Center, one of the leading supercomputing centers in Europe, announced the opening of the European Laboratory for Open Computer Architecture – or LOCA. Its self-described mission is to design and develop energy-efficient and high-performance chips, based on open architectures such as RISC-V, but also OpenPOWER, and MIPS, for use within future exascale supercomputers.
In March 2020, the research center followed up on the LOCA announcement with the unveiling of the ‘MareNostrum Experimental Exascale Platform’ (or MEEP), an emulation platform that will explore hardware/software co-designs for exascale supercomputers and other hardware targets, based on European-developed IP.
On a related side note, there is more innovation happening on the processor front in Europe. A British startup called Graphcore is taking on several semiconductor titans with a recently launched chip designed specifically for running cutting-edge artificial intelligence algorithms.
When Graphcore, which is based in the English city of Bristol, unveiled the new computer chip, it said it managed to fit a remarkable 59.4 billion transistors and almost 1,500 processing units onto a single chip. The company also said that in benchmark tests, its chip performed up to 16 times faster than those from Nvidia, which currently leads the market for chips designed specifically for intensive machine-learning applications.
Europe and the exascale road ahead
In its policy outline upon its formation, the current Ursula von der Leyen-led European Commission predicted that exascale supercomputers would be available around 2022, and made clear that it has every intention of playing in the top league:
“Pooling and rationalising efforts at the European Union level is essential to reach exascale capabilities and place a European supercomputer among the world top three by 2022.”
One reason why this is considered such a vital part of the EU’s strategy: a third of the global demand for high-performance computing capabilities comes from European industry players, SMEs and researchers, yet a mere 5% of those capabilities are currently provided by European supercomputing centres.
As a result, European innovators are increasingly using supercomputers located outside the European Union, which leads to important risks in terms of access, data protection, cybersecurity, and privacy.
But if the EU wants this to change, it will also need to find a way to bankroll the effort. As the European Investment Bank wrote in a ‘Financing the future of supercomputing’ paper published in June 2018:
“In order to address this investment gap, securing the appropriate financing to cover the high costs of funding and maintaining the world-leading position of European High Performance Computing remains a challenge. While the EU and national governments are actively promoting the sector with various initiatives, public funding alone will not be sufficient to finance the broad uptake of HPC by industry and SMEs in the coming years.
The EU needs to continue acting as a cornerstone investor, mobilising public funding alongside a clear public value proposition. This should ultimately be recognised and rewarded by the private sector through co-investments.”
Put frankly, the EU will need to put its euros where its mouth is.
If you look at the top 5 fastest supercomputers in the world today, it would appear the high-performance computing race involves only American, Chinese and Japanese contestants.
But with a number of supercomputers currently ranked in the list of top 20 speediest machines, Europe has healthy ambitions not to stay in the backseat for much longer.
Put simply: Europe wants to be a player, not a buyer, in the field of high-performance computing. It is considered a strategic resource, not some vanity contest.
After all, high-performance computing systems are bound to play a major role in things like drug development, cancer research, artificial intelligence, modelling climate change and weather forecasting, molecular chemistry, quantum mechanics, astrophysics and much more, and supercomputers can also be employed for a number of military and large industrial purposes.
The EU has committed to doubling down on its efforts to create a cluster of pan-European innovation ecosystems linked to the field of high-performance computing, and to build supercomputers that can ultimately achieve exascale performance.
“The Fugaku demonstrates that new HPC technology can break through and become the top machine, for a short time. This is exciting because we can trace the origins of the machine back to the research done by the BSC. We believe this can be done again, but leveraging a completely open ecosystem, software and hardware, based on Linux, many other open-source software components and RISC-V as the basis for the hardware,” explained Barcelona Supercomputing Center director Mateo Valero.
Indeed, Europe has a deeply rooted history in the creation and mainstream adoption of open-source software (e.g. the Linux kernel for operating systems and Arduino for IoT devices), and there are many lessons it can draw on to keep a seat at the high-performance computing table and maintain, or even increase, its global competitiveness.
“Coupled with a large, active and world-class scientific and research community, Europe has an opportunity to benefit from current advancements in open hardware ecosystems and standards to create and operate supercomputers that do not rely heavily on non-European technology, which would represent a real risk to its future data and technological sovereignty,” Valero added.
“As the United States and China continue to be locked into an ongoing trade war and economic conflict, it is not an option for Europe to stay on the sidelines when it comes to supercomputing. In fact, it has a clear incentive to take its destiny in its own hands,” Valero concludes.