See how Exascale computing “black hole” will change the world

Supercomputing will help scientists capture the first-ever video of a black hole, and exascale computing will change the world.

Exascale computing refers to computing systems capable of performing at least 10¹⁸ floating-point operations per second (1 exaFLOPS). The term commonly refers to the performance of supercomputer systems, and although no single machine had reached this goal as of January 2021, systems were being designed to reach the milestone. In April 2020, the distributed computing network Folding@home attained one exaFLOPS of computing performance.
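To make that scale concrete, here is a minimal back-of-the-envelope sketch in Python. The hardware figures are rough assumptions for illustration, not measurements:

```python
# Back-of-the-envelope comparison of exascale throughput.
# All hardware figures below are rough, illustrative assumptions.

EXAFLOPS = 1e18          # 1 exaFLOPS = 10**18 floating-point operations per second
LAPTOP_FLOPS = 1e11      # assume ~100 gigaFLOPS for a typical laptop

# Suppose a simulation needs 10**21 floating-point operations in total.
total_ops = 1e21

exascale_seconds = total_ops / EXAFLOPS
laptop_seconds = total_ops / LAPTOP_FLOPS

print(f"Exascale system: {exascale_seconds:,.0f} s (~{exascale_seconds / 60:.0f} minutes)")
print(f"Laptop:          {laptop_seconds:,.0f} s (~{laptop_seconds / 3.15e7:.0f} years)")
```

Under these assumed numbers, a job an exascale machine finishes over a coffee break would occupy a laptop for centuries.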

Exascale computing would be a major achievement in computer engineering. Above all, it would enable more demanding scientific applications and better predictions, for example in weather forecasting, climate modeling, and personalized medicine.

Exascale also matches the estimated processing power of the human brain at the neural level, a target of the Human Brain Project. As with the TOP500 list, there is also a race to be the first country to build an exascale computer.

Black holes on an exacomputer: scientists develop simulation code for a new generation of supercomputers. What happens when two black holes merge, or when stars collide with a black hole? This has now been simulated using a novel numerical method. The simulation code ‘ExaHyPE’ is designed so that it will be able to calculate gravitational waves on the future generation of ‘exascale’ supercomputers.

The challenge in simulating black holes lies in the need to solve the complex Einstein field equations. This can only be done numerically, exploiting the power of parallel supercomputers. How accurately and how quickly a solution can be approximated depends on the algorithm used. In this case, the team headed by Professor Luciano Rezzolla from the Institute for Theoretical Physics at Goethe University and FIAS achieved a milestone. Over the long term, this theoretical work could expand the experimental possibilities for detecting gravitational waves from astronomical bodies other than black holes.

The novel numerical method, which employs the ideas of the Russian physicist Galerkin, allows the computation of gravitational waves on supercomputers with very high accuracy and speed. “Reaching this result, which has been the goal of many groups worldwide for many years, was not easy,” says Prof. Rezzolla. “Although what we accomplished is only a small step toward modeling realistic black holes, we expect our approach to become the paradigm of all future calculations.”
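The article does not show the ExaHyPE code itself, but the core Galerkin idea it builds on (expand the unknown solution in basis functions and require the residual to be orthogonal to those same functions) can be illustrated on a toy problem. Below is a minimal sketch in Python for a 1D Poisson equation with piecewise-linear finite elements; the problem, function names, and numbers are illustrative assumptions, not the actual ExaHyPE implementation, which targets high-order discontinuous Galerkin schemes for the Einstein equations.

```python
import numpy as np

# Toy Galerkin (linear finite element) solver for -u''(x) = f(x) on [0, 1]
# with u(0) = u(1) = 0.  Illustrative only.

def galerkin_poisson(n, f):
    """Solve -u'' = f with n interior nodes using piecewise-linear 'hat' basis functions."""
    h = 1.0 / (n + 1)                      # uniform mesh spacing
    x = np.linspace(h, 1 - h, n)           # interior nodes

    # Stiffness matrix: the integrals of phi_i' * phi_j' for hat functions
    # give the classic tridiagonal [-1, 2, -1] / h pattern.
    A = (np.diag(2 * np.ones(n))
         - np.diag(np.ones(n - 1), 1)
         - np.diag(np.ones(n - 1), -1)) / h

    # Load vector: the integral of f * phi_i, approximated as f(x_i) * h.
    b = f(x) * h

    return x, np.linalg.solve(A, b)

# Manufactured solution: u(x) = sin(pi x)  =>  f(x) = pi^2 sin(pi x).
f = lambda x: np.pi**2 * np.sin(np.pi * x)
x, u = galerkin_poisson(100, f)
print("max error:", np.abs(u - np.sin(np.pi * x)).max())  # roughly 1e-4; shrinks as n grows
```

The same projection principle scales from this toy tridiagonal solve up to the massively parallel, high-order schemes that exascale codes rely on.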

HPE is delivering Frontier to Oak Ridge National Lab, a system that will stimulate innovation in science and technology and maintain US leadership in high-performance computing and artificial intelligence.

Exascale computing seeks to accelerate innovation and hasten scientific discovery. From pharmaceutical research to extreme weather prediction to space exploration, high-performance computing systems speed discovery, increase accuracy, and stretch the limits of human imagination as we explore new frontiers.

Computing at scale with flexibility
With the rise of cloud computing and the availability of HPC as a service, companies have the opportunity to tackle projects that require heavy-duty compute on an on-demand basis. This flexibility enables unprecedented project speed, agility, and scalability.

Powering the next generation of artificial intelligence
AI and machine learning are moving from labs to industry, and supercomputers provide critical infrastructure for intensive data processing. With businesses across many sectors facing the increasing need to analyze and gain insights from large data sets, supercomputing will become the ultimate competitive advantage.

Today’s science, technology, and big data questions are bigger, more complex, and more urgent than ever. Answering those questions demands an entirely new approach to computing.

Now, researchers aim to build more sophisticated wildfire models by simulating how weather and fire interact, combining machine learning with physical data, without sacrificing modeling speed.

“The resolution of weather forecasts today is a few miles. For firefighters to understand what wildfires might do and how to respond, they need models that are significantly finer. Upgrading to the latest generation of supercomputers will give us the processing power we need to enable those kinds of simulations.”

The added value:
Support insights into terrain and conditions to within 30 meters.
Equip stakeholders to get predictions within 1–3 hours.
Improve the ability to model variables and quantify uncertainties.
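To see why 30-meter resolution calls for exascale hardware, here is a rough back-of-the-envelope sketch in Python. All numbers are assumptions for illustration (the quote above says only “a few miles”, taken here as roughly 5 km): in a 3D time-stepping simulation, refining the grid by a factor r multiplies the work by roughly r^4, since each spatial dimension gains r times more cells and the stable time step shrinks by a factor of r (the CFL condition).

```python
# Rough cost scaling for refining a 3D time-stepping weather/fire simulation.
# All numbers are illustrative assumptions, not figures from the article.

coarse_dx = 5000.0   # assume "a few miles" means roughly a 5 km grid spacing
fine_dx = 30.0       # target resolution from the list above: 30 meters

refinement = coarse_dx / fine_dx          # ~167x finer in each dimension

# Work grows with the cell count (refinement**3 in 3D) times the number of
# time steps (another factor of refinement, from the CFL stability limit).
cost_factor = refinement**4

print(f"Refinement per axis:  {refinement:.0f}x")
print(f"Total compute factor: {cost_factor:.2e}x")   # ~7.7e8, nearly a billion times more work
```

Under these assumptions, the finer model needs nearly a billion times the compute of the mile-scale one, which is the gap exascale systems are meant to close.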

The world is now moving toward this breakthrough technology.
