How are supercomputers kept up to date?

The physicists feed all their knowledge into the computer, which reproduces the formation of the universe numerically: a purely digital brew of 20 billion virtual stars clumps together into a structure resembling today's cosmos.

"If you take into account not only gravity and gas pressure, but also the formation of individual stars, it is very complex," says the Potsdam astrophysicist Stefan Gottlöber. "We want to understand how galaxies arose from tiny density fluctuations in the cosmos 300,000 years after the Big Bang," adds his doctoral student Arman Khalatyan.

Such a huge project cannot be carried out on ordinary computers: a desktop PC would have to calculate day and night for 114 years to process the masses of data. In other research areas, too, the demands on computing power have grown enormously in recent years.

Biologists simulate chemical reactions in body cells, and climate researchers predict the effects of climate change with increasingly reliable models. They all need supercomputers - the most powerful machines available worldwide. Unlike the legendary mainframes of old, modern supercomputers consist of numerous networked processors working in parallel.

52 days instead of 114 years

Europe's currently fastest supercomputer, and the fifth fastest in the world, stands in Barcelona, framed by the cool walls of a former church. The physicists around Stefan Gottlöber have also booked computing time on the "MareNostrum" supercomputer.

Its 10,240 processors deliver a computing power of 63 teraflops - 63 trillion operations per second, around 20,000 times faster than a standard PC. MareNostrum completes Gottlöber's cosmic simulation on 800 processors in 52 days instead of the 114 years a PC would need.
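A quick back-of-envelope check of those figures (the constants come from the text; the comparison with the processor count is my own observation, not a claim from the article):

```python
# Figures from the text: a PC would need 114 years, while
# MareNostrum needs 52 days on 800 of its processors.
YEARS_PC = 114
DAYS_SUPERCOMPUTER = 52
PROCESSORS_USED = 800

# Overall speedup of the 800-processor run versus a single PC.
speedup = YEARS_PC * 365.25 / DAYS_SUPERCOMPUTER
print(round(speedup))  # ≈ 801
```

The speedup of roughly 800 is strikingly close to the number of processors used, meaning the simulation scales almost linearly across them.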

Supercomputers are fast because many processors gnaw at a problem simultaneously. To make this division of labor efficient, the researchers have to break their calculation steps into suitable packages. Each processor in the supercomputer then works on one sub-problem - the computing cores exchange data only now and then to keep each other up to date.
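The pattern described above - split the work into packages, compute independently, combine results only occasionally - can be sketched in a few lines. This is an illustrative toy (summing squares), not the astrophysicists' actual code; real supercomputer applications typically distribute such packages across thousands of nodes with MPI:

```python
from concurrent.futures import ThreadPoolExecutor

def partial_sum(bounds):
    # Each worker processes one "package" of the overall problem.
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

def parallel_sum_of_squares(n, workers=4):
    # Break the full range into roughly equal packages,
    # one per worker; the last chunk absorbs any remainder.
    step = n // workers
    chunks = [(k * step, (k + 1) * step if k < workers - 1 else n)
              for k in range(workers)]
    # Workers compute independently; results are combined only at
    # the end -- the occasional "exchange data" step from the text.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, chunks))

print(parallel_sum_of_squares(1_000))  # 332833500
```

Threads are used here only for portability; Python threads do not actually run CPU work in parallel, but the decomposition pattern is the same one real codes apply to processes or MPI ranks.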