The digital twin of our planet is intended to simulate the Earth system into the future. It is designed to support national policymakers in taking appropriate measures to better prepare for extreme events. A new strategy paper by European climate scientists and computer scientists at ETH Zurich shows how this can be achieved.
To become climate neutral by 2050, the European Union has launched two ambitious programs: the Green Deal and the Digital Strategy. As a key component of their successful implementation, climate scientists and computer scientists have launched the Destination Earth initiative, which will begin in mid-2021 and is expected to run for up to ten years. During this period, a high-precision digital model of the Earth, a digital twin of the Earth, will be created in order to map climate development and extreme events in space and time as accurately as possible.
Observational data will be continuously fed into the digital twin to make the Earth model steadily more accurate, track its evolution, and predict possible future trajectories of change. But in addition to the observational data commonly used to model weather and climate, the researchers also want to integrate new data on relevant human activities into the model. The new model of the Earth system will represent as realistically as possible almost all processes on the planet's surface, including human influence on the management of water, food, and energy resources, as well as the processes of the physical Earth system.
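Continuously blending observations into a running model is known in weather and climate science as data assimilation. As a loose illustration of the idea only (the function, names, and numbers below are hypothetical and not part of Destination Earth), a scalar, Kalman-style update weights the model's background estimate against an observation according to their error variances:

```python
def assimilate(background, obs, var_b, var_o):
    """Blend a model background value with an observation.

    The gain weights the two sources by their error variances:
    a noisy observation (large var_o) moves the state only slightly.
    """
    gain = var_b / (var_b + var_o)  # Kalman-style gain in [0, 1]
    return background + gain * (obs - background)

# The model says 21.0 degC at a grid point; a station measures 23.0 degC.
# With equal trust in both, the analysis lands halfway in between.
analysis = assimilate(21.0, 23.0, var_b=1.0, var_o=1.0)  # -> 22.0
```

Real assimilation systems apply this principle to millions of coupled state variables at once, which is one of the places the authors see a role for AI.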
The digital twin is intended to be an information system for developing and testing scenarios that show more sustainable development and can thus better inform policy.
“For example, if you are planning to build a two-meter dam in the Netherlands, I can look at the data in my digital twin and see whether the dam will still protect against anticipated extreme events in 2050,” says Peter Bauer, Deputy Director of Research at the European Centre for Medium-Range Weather Forecasts (ECMWF) and co-initiator of Destination Earth.
The digital twin will also be used for strategic planning of fresh water and food supplies or wind and solar power plants.
The researchers point to the steady evolution of weather models since the 1940s. Meteorologists were the first to simulate physical processes on the world's largest computers. As a result, today's weather and climate models are ideally placed to identify entirely new ways of using supercomputers efficiently, to the benefit of many other scientific disciplines.
In the past, weather and climate modeling used different approaches to simulating the Earth system. While climate models represent a very broad range of physical processes, they typically neglect the small-scale processes that are essential for more accurate weather forecasts; weather models, in turn, cover fewer processes. The digital twin will unite both areas and make it possible to simulate the complex processes of the entire Earth system at high resolution. But to do this, the codes of the simulation programs must be adapted to new technologies that promise much greater computing power.
With the computers and algorithms available today, such highly complex simulations can hardly be performed at the planned extremely high resolution of one kilometer, because code development has stagnated from a computer science perspective for decades. Climate research long benefited from being able to gain performance from each new generation of processors without fundamentally reworking its programs. This free performance gain ended about ten years ago. As a result, today's programs can often use only 5% of the peak performance of conventional processors.
To achieve the necessary improvements, the scientists emphasize the need for co-design, that is, the joint and simultaneous development of hardware and algorithms, as the research team has successfully demonstrated over the past ten years. They propose paying special attention to generic data structures, optimized spatial discretization of the computational grid, and optimized time-step lengths. The scientists also want to separate the code for solving the scientific problem from the code that performs the computation optimally on the underlying system architecture. This more flexible program structure will allow a faster and more efficient switch to future architectures.
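The proposed separation can be pictured as follows. In this minimal sketch (all names are hypothetical, chosen for illustration only), the scientific formulation of a diffusion time step is written once against two abstract primitives, while a backend supplies their architecture-specific implementations; a GPU backend could provide the same two primitives with entirely different internals:

```python
class ReferenceBackend:
    """Architecture-specific layer: a plain-Python reference implementation.

    A production code might swap in a vectorized or GPU backend that
    exposes the same two primitives.
    """

    def shift(self, u, k):
        # Periodic shift of a 1-D field by k grid points.
        n = len(u)
        return [u[(i + k) % n] for i in range(n)]

    def axpy(self, a, x, y):
        # Elementwise a*x + y.
        return [a * xi + yi for xi, yi in zip(x, y)]


def diffusion_step(backend, u, alpha):
    """Scientific layer: one explicit diffusion step, u + alpha * laplacian(u).

    Knows nothing about how shift/axpy are executed on the hardware.
    """
    lap = backend.axpy(
        -2.0, u,
        backend.axpy(1.0, backend.shift(u, -1), backend.shift(u, 1)),
    )
    return backend.axpy(alpha, lap, u)


# A point disturbance spreads to its neighbors after one step.
u0 = [0.0, 0.0, 1.0, 0.0, 0.0]
u1 = diffusion_step(ReferenceBackend(), u0, 0.25)  # -> [0.0, 0.25, 0.5, 0.25, 0.0]
```

Only the backend class needs to change when moving to a new architecture; the scientific code, and the scientists maintaining it, stay untouched.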
The authors also see great potential in artificial intelligence. It could be used, for example, for data assimilation and the processing of observational data, for representing uncertain physical processes in the models, and for data compression. AI thus makes it possible to speed up the simulations and to filter out the most important information.
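One way machine learning can speed up a simulation is by replacing an expensive or poorly constrained physical parameterization with a cheap learned surrogate. The sketch below is purely illustrative (the routine, its form, and all names are hypothetical, and a real emulator would be a neural network trained on model output, not a hand-fitted line), but it shows the train-then-substitute pattern:

```python
def expensive_parameterization(x):
    # Stand-in for a costly physics routine; here it is secretly linear
    # so the tiny surrogate below can recover it exactly.
    return 3.0 * x + 1.0


# "Training data": sample the expensive routine offline.
xs = [0.0, 1.0, 2.0, 3.0]
ys = [expensive_parameterization(x) for x in xs]

# Fit y ~ a*x + b by ordinary least squares (closed form in 1-D).
n = len(xs)
mx = sum(xs) / n
my = sum(ys) / n
a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
b = my - a * mx


def surrogate(x):
    # Cheap learned stand-in, called inside the model's time loop
    # instead of the expensive routine.
    return a * x + b
```

In practice the surrogate is only trusted within the regime covered by its training data, which is why such emulators are validated against the original physics before deployment.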