TREX Centre of Excellence in HPC for Quantum Chemistry
To compete in the fast-moving field of high-precision quantum chemical simulation, the TREX Centre of Excellence (CoE) federates European scientists, High Performance Computing (HPC) stakeholders, and SMEs to develop and apply high-performance software solutions for quantum mechanical simulations at the exascale. The project's ultimate goal is to develop a set of flagship Quantum Monte Carlo codes able to exploit the capabilities of upcoming exascale computers to the fullest.
To achieve this goal, TREX's main focus will be the development of a user-friendly, open-source software suite for stochastic quantum chemistry simulations that integrates the TREX community codes within an interoperable, high-performance platform. This will greatly enhance the tools available to the scientific community for designing new materials and understanding the fundamental properties of matter. In parallel, TREX will work on showcases that leverage this methodology for commercial applications, and will develop and implement software components and services that make it easier for commercial operators and user communities to use HPC resources for these applications.
Advancing Quantum Monte Carlo technology
In quantum chemistry and materials science, it is notoriously difficult to achieve exascale scaling and optimally exploit the computing resources that will become available in the near future. The Quantum Monte Carlo (QMC) approaches at the heart of TREX are among the few methods in the field of quantum simulations that can fully exploit the massive parallelism of future exascale supercomputers. The marriage of these advanced methods with exascale hardware will enable simulations at the nanoscale of unprecedented accuracy, targeting a fully consistent description of the quantum mechanical electron problem for very large systems.
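The reason QMC scales so well is structural: the workload splits into independent random-sampling streams that need almost no communication until a final averaging step. A minimal illustration of this pattern (not TREX code; the pi-estimation task and all function names here are invented for the example) is to run several independently seeded Monte Carlo streams and combine only their averages:

```python
import math
import random
import statistics

def mc_pi(seed, n):
    """One independent Monte Carlo stream: estimate pi by random dart throwing."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n) if rng.random()**2 + rng.random()**2 < 1.0)
    return 4.0 * hits / n

def combine(n_streams=16, n_per_stream=50_000):
    # In a real QMC run, each stream would live on its own core or node;
    # the only communication required is this final reduction of the averages.
    estimates = [mc_pi(seed, n_per_stream) for seed in range(n_streams)]
    mean = statistics.fmean(estimates)
    stderr = statistics.stdev(estimates) / math.sqrt(n_streams)
    return mean, stderr

mean, stderr = combine()
print(f"pi ~= {mean:.4f} +/- {stderr:.4f}")
```

Because the streams never exchange data during sampling, adding processors shrinks the statistical error (roughly as one over the square root of the total sample count) without any scaling bottleneck, which is exactly the property that makes QMC a natural fit for exascale machines.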
Quantum Monte Carlo methods are cutting-edge computational algorithms for quantum systems that enable a wide range of numerical simulations while providing extremely accurate results. By delivering results of high accuracy at scale, QMC methods can transform how research is done, reducing the need for some costly laboratory experiments and guiding those that remain.
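To make the idea concrete, here is a hedged, toy-sized sketch of the simplest QMC flavour, variational Monte Carlo, for the hydrogen atom in atomic units (exact ground-state energy: -0.5 hartree). It samples the trial wavefunction exp(-alpha*r) with the Metropolis algorithm and averages the local energy; all names and parameter choices are illustrative, not taken from the TREX codes:

```python
import math
import random

def local_energy(r, alpha):
    # For psi = exp(-alpha*r): E_L = -alpha^2/2 + (alpha - 1)/r (atomic units).
    return -0.5 * alpha**2 + (alpha - 1.0) / r

def vmc_energy(alpha=1.0, n_steps=200_000, step=0.5, seed=0):
    """Metropolis sampling of |psi|^2 = exp(-2*alpha*r), averaging E_L."""
    rng = random.Random(seed)
    x, y, z, r = 1.0, 0.0, 0.0, 1.0
    total, n_kept = 0.0, 0
    for i in range(n_steps):
        # Propose a uniform random displacement of the electron.
        xn = x + rng.uniform(-step, step)
        yn = y + rng.uniform(-step, step)
        zn = z + rng.uniform(-step, step)
        rn = math.sqrt(xn * xn + yn * yn + zn * zn)
        # Metropolis acceptance: ratio of |psi|^2 at new and old positions.
        if rng.random() < math.exp(-2.0 * alpha * (rn - r)):
            x, y, z, r = xn, yn, zn, rn
        if i >= n_steps // 10:  # discard the first 10% as equilibration
            total += local_energy(r, alpha)
            n_kept += 1
    return total / n_kept

print(vmc_energy(alpha=1.0))  # alpha=1 makes E_L constant: exactly -0.5
```

For alpha = 1 the trial wavefunction is exact and the estimator has zero variance; for any other alpha the variational principle guarantees an average energy above -0.5. Production QMC codes apply the same sample-and-average structure to many-electron wavefunctions across thousands of nodes.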
- Co-design of computational kernels of flagship QMC codes with efficient algorithms that are scalable for HPC applications, flexible to adapt to future architectures, and can cater to a large base of HPC users and players in synergy with existing CoEs.
- Rational design of an ecosystem of highly scalable, optimized, and interoperable QMC codes for exascale applications of ultimate accuracy in the domains of quantum chemistry and computational materials design, improving the codes by adopting parallel paradigms able to fully exploit the potential of exascale architectures.
- Robust management of complex, scalable QMC workflows in high-throughput calculations for materials simulations to further leverage exascale performance, thereby ensuring the convergence of HPC, high-throughput computing (HTC), and high-performance data analytics (HPDA).
- Fostering wider access to, usage of, and uptake of knowledge in HPC through the direct involvement of present and potential user communities, via demonstrators, in the development of our scalable ecosystem of QMC codes within an integrated HPC, HTC, and HPDA framework.
Quantum chemistry has become an indispensable tool in chemistry, physics, biology and materials science, and the arrival of exascale computers in the coming years has the potential to increase that utility even further. However, this potential will not be realized unless existing software is radically redesigned so that it can fully exploit the massively parallel architectures on which exascale computers are based. The family of so-called Monte Carlo quantum chemistry methods is likely to play a major role in the redesign effort, and it is essential that the leading Monte Carlo methodologists in Europe work closely with mainstream quantum chemistry software developers to ensure that their cutting-edge advances are made available to researchers quickly and efficiently.
High-performance computing and high-throughput computing are highly demanding in terms of energy. Producing tailored software able to exploit parallel computing will therefore open new scenarios for using such infrastructures to their full potential, optimising output while reducing costs. The development of Quantum Monte Carlo kernels and codes will likewise contribute to the exploitation of exascale machines.
High-performance computing (HPC) refers to systems with extremely high computational capabilities, involving hundreds of thousands of processors working in parallel to analyse billions of pieces of data in real time. Today's most powerful systems can perform calculations thousands of times faster than a normal desktop computer. HPC is a strategic resource for Europe's future: it allows researchers to study and understand complex phenomena, policymakers to make better decisions, and industry to innovate in products and services, enabling deeper scientific understanding and breakthroughs in nearly every scientific field.
The applications of supercomputing in science are countless: from fundamental physics (advancing the frontiers of knowledge of matter or exploring the universe) to material science (designing new critical components for the pharmaceutical or energy sectors) and earth science (modelling the atmospheric and oceanic phenomena at planetary level).
The European Union has recognized high-performance computing as a key component of its Digital Single Market strategy.
For more information visit: https://eurohpc-ju.europa.eu/ and https://www.hpccoe.eu/eu-hpc-centres-of-excellence2/