Date: February 7th, 2022 2:57 PM
Author: DrakeMallard
The science was settled
https://www.wsj.com/articles/climate-change-global-warming-computer-model-11642191155
Climate Scientists Encounter Limits of Computer Models, Bedeviling Policy
Supercomputer simulations are running up against the complexity of
programming thousands of interacting weather variables, such as the
far-reaching impact of clouds
BOULDER, Colo.—For almost five years, an international
consortium of scientists was chasing clouds, determined to solve a
problem that bedeviled climate-change forecasts for a generation: How do
these wisps of water vapor affect global warming?
They reworked 2.1 million lines of supercomputer code used to
explore the future of climate change, adding more-intricate equations
for clouds and hundreds of other improvements. They tested the
equations, debugged them and tested again.
The scientists would find that even the best tools at hand can’t
model climates with the sureness the world needs as rising temperatures
impact almost every region.
When they ran the updated simulation in 2018, the conclusion
jolted them: Earth’s atmosphere was much more sensitive to greenhouse
gases than decades of previous models had predicted, and future
temperatures could be much higher than feared—perhaps even beyond hope
of practical remedy.
“We thought this was really strange,” said Gokhan Danabasoglu,
chief scientist for the climate-model project at the National Center for
Atmospheric Research, or NCAR, at its Mesa Laboratory in Boulder. “If
that number was correct, that was really bad news.”
At least 20 older, simpler global-climate models disagreed with
the new one at NCAR, an open-source model called the Community Earth
System Model 2, or CESM2. Funded mainly by the U.S. National Science
Foundation, it is arguably the world’s most influential climate program.
Then, one by one, a dozen climate-modeling groups around the world
produced similar forecasts. “It was not just us,” Dr. Danabasoglu said.
‘You solve one problem and create another,’ says Andrew
Gettelman, right, at the NCAR Mesa Laboratory; left, NCAR’s Gokhan
Danabasoglu.
The scientists soon concluded their new calculations had been
thrown off kilter by the physics of clouds in a warming world, which may
amplify or damp climate change. “The old way is just wrong, we know
that,” said Andrew Gettelman, a physicist at NCAR who specializes in
clouds and helped develop the CESM2 model. “I think our higher
sensitivity is wrong too. It’s probably a consequence of other things we
did by making clouds better and more realistic. You solve one problem
and create another.”
Since then the CESM2 scientists have been reworking their
climate-change algorithms using a deluge of new information about the
effects of rising temperatures to better understand the physics at work.
They have abandoned their most extreme calculations of climate
sensitivity, but their more recent projections of future global warming
are still dire—and still in flux.
As world leaders consider how to limit greenhouse gases, they
depend heavily on what computer climate models predict. But as the
algorithms and the computers they run on become more powerful, able to
crunch far more data and run finer simulations, that very complexity has
left climate scientists grappling with mismatches among competing
models.
While vital to calculating ways to survive a warming world,
climate models are hitting a wall. They are running up against the
complexity of the physics involved; the limits of scientific computing;
uncertainties around the nuances of climate behavior; and the challenge
of keeping pace with rising levels of carbon dioxide, methane and other
greenhouse gases. Despite significant improvements, the new models are
still too imprecise to be taken at face value, which means
climate-change projections still require judgment calls.
Scientists compute future climate change at 5.34 quadrillion
calculations per second on the Cheyenne supercomputer at the National
Center for Atmospheric Research.
“We have a situation where the models are behaving strangely,”
said Gavin Schmidt, director of the National Aeronautics and Space
Administration’s Goddard Institute for Space Studies, a leading center
for climate modeling. “We have a conundrum.”
Policy tools
The United Nations Intergovernmental Panel on Climate Change
collates the latest climate data drawn from thousands of scientific
papers and dozens of climate models, including the CESM2 model, to set
an international standard for evaluating the impacts of climate change.
That provides policy makers in 195 countries with the most up-to-date
scientific consensus related to global warming. Its next major advisory
report, which will serve as a basis for international negotiations, is
expected later this year.
For climate modelers, the difference in projections amounts to a
few degrees of average temperature change in response to levels of
carbon dioxide added to the atmosphere in years ahead. A few degrees
will be more than enough, most scientists say, to worsen storms,
intensify rainfall, boost sea-level rise—and cause more-extreme heat
waves, droughts and other temperature-related consequences such as crop
failures and the spread of infectious diseases.
Climate models put the planet in a digital test tube. When world
leaders in 1992 met in Rio de Janeiro to negotiate the first
comprehensive global climate treaty, there were only four rudimentary
models that could generate global-warming projections for treaty
negotiators.
In November 2021, as leaders met in Glasgow to negotiate limits
on greenhouse gases under the auspices of the 2015 Paris Accords, there
were more than 100 major global climate-change models produced by 49
different research groups, reflecting an influx of people into the
field. During the treaty meeting, U.N. experts presented climate-model
projections of future global-warming scenarios, including data from the
CESM2 model.
There were only four rudimentary models that could generate
global-warming projections for treaty negotiators in 1992, top. Bottom,
the U.N. climate-change conference in Glasgow last year. Photos: Antonio
Ribeiro/Gamma-Rapho/Getty Images; Yves Herman/Reuters
“We’ve made these models into a tool to indicate what could
happen to the world,” said Gerald Meehl, a senior scientist at the NCAR
Mesa Laboratory. “This is information that policy makers can’t get any
other way.”
The Royal Swedish Academy of Sciences in October awarded the
Nobel Prize in Physics to scientists whose work laid the foundation for
computer simulations of global climate change.
Skeptics have scoffed at climate models for decades, saying they
overstate the hazards of carbon dioxide. But a growing body of research
shows many climate models have been uncannily accurate. For one recent
study, scientists at NASA, the Breakthrough Institute in Berkeley,
Calif., and the Massachusetts Institute of Technology evaluated 17
models used between 1970 and 2007 and found most predicted climate
shifts were “indistinguishable from what actually occurred.”
Climate scientist Zeke Hausfather at the Breakthrough Institute,
an environmental-research group, who led the analysis, said: “The fact
that these early models got the future right should give us confidence.”
Still, models remain prone to technical glitches and hampered by
an incomplete understanding of the variables that control how our
planet responds to heat-trapping gases. There are still unanswered
climate questions about the subtle interplay of land, oceans and the
atmosphere. Oceans may be warming faster than previous models predicted.
The effect of airborne dust, soot, grit and aerosols is still hard to
pin down.
In its guidance to governments last year, the U.N.
climate-change panel for the first time played down the most extreme
forecasts.
Before making new climate predictions for policy makers, an
independent group of scientists used a technique called “hind-casting,”
testing how well the models reproduced changes that occurred during the
20th century and earlier. Only models that re-created past climate
behavior accurately were deemed acceptable.
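At its core, hind-casting amounts to running a model over a period with a known observational record and scoring the mismatch. A minimal sketch of that scoring step, using synthetic arrays in place of real model output and observations (the data and tolerance here are illustrative, not the panel's actual criteria):

```python
import numpy as np

# Synthetic stand-ins: simulated and observed global-mean temperature
# anomalies (deg C), one value per year over the 20th century.
years = np.arange(1900, 2001)
observed = 0.007 * (years - 1900) + 0.1 * np.sin((years - 1900) / 7.0)
simulated = observed + np.random.default_rng(0).normal(0.0, 0.05, years.size)

# Score the hindcast: root-mean-square error and mean bias.
rmse = np.sqrt(np.mean((simulated - observed) ** 2))
bias = np.mean(simulated - observed)
print(f"RMSE = {rmse:.3f} C, bias = {bias:.3f} C")

# Illustrative acceptance rule; the real screening is far richer,
# covering regional patterns, ocean heat uptake and more.
TOLERANCE = 0.1  # deg C, hypothetical
print("acceptable" if rmse < TOLERANCE else "rejected")
```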
The NCAR Mesa Laboratory in Boulder, Colo.
In the process, the NCAR-consortium scientists checked whether
the advanced models could reproduce the climate during the last Ice Age,
21,000 years ago, when carbon-dioxide levels and temperatures were much
lower than today. CESM2 and other new models projected temperatures
much colder than the geologic evidence indicated. University of Michigan
scientists then tested the new models against the climate 50 million
years ago when greenhouse-gas levels and temperatures were much higher
than today. The new models projected higher temperatures than evidence
suggested.
While accurate across almost all other climate factors, the new
models seemed overly sensitive to changing carbon-dioxide levels and,
for the past several years, scientists have been meticulously
fine-tuning them to narrow the uncertainties.
Computing clouds
Then there is the cloud conundrum.
Because clouds can both reflect solar radiation into space and
trap heat from Earth’s surface, they are among the biggest challenges
for scientists honing climate models.
At any given time, clouds cover more than two-thirds of the
planet. Their impact on climate depends on how reflective they are, how
high they rise and whether it is day or night. They can accelerate
warming or damp it. They operate at scales as broad as an ocean and
as small as a hair’s width. Their behavior can be affected, studies
show, by factors ranging from cosmic rays to ocean microbes, which emit
sulfur particles that become the nuclei of water droplets or ice
crystals.
Wind turbines outside Cheyenne, Wyo., last year.
“If you don’t get clouds right, everything is out of whack,”
said Tapio Schneider, an atmospheric scientist at the California
Institute of Technology and the Climate Modeling Alliance, which is
developing an experimental model. “Clouds are crucially important for
regulating Earth’s energy balance.”
Older models, which rely on simpler methods to represent clouds’
effects, for decades projected that doubling the atmosphere’s carbon
dioxide over preindustrial levels would warm the world between 2.7 and
8.1 degrees Fahrenheit (1.5 and 4.5 degrees Celsius).
New models account for clouds’ physics in greater detail. CESM2
predicted that a doubling of carbon dioxide would cause warming of 9.5
degrees Fahrenheit (5.3 degrees Celsius)—almost a third higher than the
previous version of their model, the consortium scientists said.
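Those doubling figures connect to a standard back-of-envelope relation: CO2's radiative forcing grows roughly logarithmically with concentration (the Myhre et al. 1998 approximation, F ≈ 5.35·ln(C/C0) W/m²), and equilibrium warming is that forcing times a sensitivity parameter. A sketch of how different sensitivity parameters reproduce the ranges quoted above (the lambda values are chosen for illustration, not taken from CESM2 or any other model):

```python
import math

def co2_forcing(c_ppm, c0_ppm=280.0):
    """Myhre et al. (1998) logarithmic approximation for CO2 radiative
    forcing relative to a baseline concentration, in W/m^2."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# Forcing from doubling preindustrial CO2 (280 -> 560 ppm): ~3.7 W/m^2.
F2x = co2_forcing(560.0)

# Equilibrium warming = forcing x sensitivity parameter (K per W/m^2).
# Illustrative lambda values bracketing the article's quoted ranges.
for lam in (0.40, 1.21, 1.43):
    print(f"lambda = {lam:.2f}  ->  warming = {lam * F2x:.1f} C")
# 0.40 -> ~1.5 C (low end of the older range)
# 1.21 -> ~4.5 C (high end of the older range)
# 1.43 -> ~5.3 C (roughly the CESM2 figure)
```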
In an independent assessment of 39 global-climate models last
year, scientists found that 13 of the new models produced significantly
higher estimates of the global temperatures caused by rising atmospheric
levels of carbon dioxide than the older computer models—scientists
called them the “wolf pack.” Weighed against historical evidence of
temperature changes, those estimates were deemed unrealistic.
By adding far-more-detailed equations to simulate clouds, the
scientists might have introduced small errors that could make their
models less accurate than the blunt-force cloud assumptions of older
models, according to a study by NCAR scientists published in January
2021.
Taking the uncertainties into account, the U.N.’s climate-change
panel narrowed its estimate of climate sensitivity to a range between
4.5 and 7.2 degrees Fahrenheit (2.5 to 4 degrees Celsius) in its most
recent report for policy makers last August. That suggests global
warming could still be high enough to challenge goals set by the 2015
Paris climate agreement, scientists on the panel said.
Dr. Gettelman, who helped develop CESM2, and his colleagues in
their initial upgrade added better ways to model polar ice caps and how
carbon and nitrogen cycle through the environment. To make the ocean
more realistic, they added wind-driven waves. They fine-tuned the
physics in its algorithms and made its vintage Fortran code more
efficient.
To analyze climate change, the Cheyenne supercomputer is linked
to a specialized cluster of 22 graphics processors, top. Redundant power
cables, bottom, guard against failures during calculations that can
consume centuries of simulated time.
It is hard to know just where the complexity of clouds waylaid
them, said Dr. Danabasoglu. “With so many lines of code and so much
physics, things can happen,” he said. “Emotionally, we had so much
invested in getting the best model we can put together.”
Even the simplest diagnostic test is challenging. The model
divides Earth into a virtual grid of 64,800 cubes, each 100 kilometers
on a side, stacked in 72 layers. For each projection, the computer must
calculate 4.6 million data points for every 30 minutes of simulated
time. To test an upgrade or correction, researchers typically let the
model run for 300 years of simulated time.
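The arithmetic behind those figures is easy to check (a quick sketch; the cell counts and time step are the ones cited above, the rest is multiplication):

```python
# Check the article's grid arithmetic for a single CESM2 test run.
columns = 64_800          # surface cells (equivalent to a 360 x 180 grid)
layers = 72               # vertical layers stacked above each surface cell
cells = columns * layers
print(f"{cells:,} data points")        # 4,665,600 -- the ~4.6 million cited

steps_per_year = 365 * 24 * 2          # one step per 30 simulated minutes
steps = 300 * steps_per_year           # a typical 300-year test run
print(f"{steps:,} time steps")         # 5,256,000
print(f"{cells * steps:.2e} cell-updates per field")  # ~2.45e+13
```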
In their initial analysis, scientists discovered a flaw in how
CESM2 modeled the way moisture interacts with soot, dust or sea-spray
particles that allow water vapor to condense into cloud droplets. It
took a team of 10 climate experts almost five months to trace the flaw
to an error in their data and correct it, the scientists said.
Through field experiments, they next learned that bright
low-level clouds off Antarctica’s coast were made not of ice crystals
or ordinary cloud droplets, as models had assumed, but of supercooled
liquid water, which affected how the clouds cooled the surface.
Since releasing the open-source software in 2018, the NCAR
scientists have updated the CESM2 model five times, with more
improvements in development. “We are still digging,” said Jean-Francois
Lamarque, director of NCAR’s Climate and Global Dynamics Laboratory and
the project’s former chief scientist. “It is going to take quite a
few years.”
The site of the new $40 million Derecho supercomputer in Cheyenne in September 2021.
Moreover, clouds are changing in response to rising global
temperatures in ways that may make warming worse—just as older climate
models had predicted—according to a satellite-data analysis by
scientists at the Scripps Institution of Oceanography in San Diego.
Since the 1980s, the scientists said, the world has become cloudier
toward the poles and less cloudy in the midlatitudes. Thunderclouds have
also grown taller.
As ocean temperatures have risen in recent years, fewer bright,
reflective low-lying clouds have formed over broad areas of open seas,
according to a new study published in September by researchers at
California’s Big Bear Solar Observatory and New York University. That
means more of the sun’s heat is being trapped in the atmosphere, where
it gives rising temperatures a boost—a process that appears to be
accelerating, the researchers said.
Strained supercomputers
The NCAR scientists in Boulder would like to delve more deeply
into the behavior of clouds, ice sheets and aerosols, but they already
are straining their five-year-old Cheyenne supercomputer, according to
NCAR officials. A climate model able to capture the subtle effects of
individual cloud systems, storms, regional wildfires and ocean currents
at a more detailed scale would require a thousand times more computer
power, they said.
“There is this balance between building in all the complexity we
know and being able to run the model for hundreds of years multiple
times,” said Andrew Wood, an NCAR scientist who works on the CESM2
model. “The more complex a model is, the slower it runs.”
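A rough sketch of where the thousandfold figure comes from: refining the horizontal grid tenfold multiplies the number of columns by 100, and the CFL stability condition forces roughly ten times as many time steps (the resolution targets below are illustrative):

```python
# Rough cost scaling for refining a climate model's horizontal grid.
def relative_cost(dx_old_km, dx_new_km):
    refine = dx_old_km / dx_new_km
    columns = refine ** 2   # cells shrink in two horizontal dimensions
    timesteps = refine      # CFL condition: finer grid, shorter time step
    return columns * timesteps

# Going from ~100 km cells to ~10 km cells (illustrative targets):
print(f"{relative_cost(100, 10):,.0f}x the compute")  # 1,000x
```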
Climate models need to link rising temperatures on a global
scale to changing conditions in a local forest, watershed, grassland or
agricultural zone, says NCAR forest ecologist Jacquelyn Shuman, right;
NCAR scientist Gerald Meehl, left.
Researchers now are under pressure to make reliable local
forecasts of future climate changes so that municipal managers and
regional planners can protect heavily populated locales from more
extreme flooding, drought or wildfires. That means the next generation
of climate models needs to link rising temperatures on a global scale to
changing conditions in a local forest, watershed, grassland or
agricultural zone, said Jacquelyn Shuman, a forest ecologist at NCAR who
is researching how to model the impact of climate change on regional
wildfires.
“Computer models that contain both large-scale and small-scale
models allow you to really do experiments that you can’t do in the real
world,” she said. “You can really ramp up the temperature, dial down the
precipitation or completely change the amount of fire or lightning
strikes that an area is seeing, so you can really diagnose how it all
works together. That’s the next step. It would be very computationally
expensive.”
The NCAR scientists are installing a new $40 million
supercomputer named Derecho, built by Hewlett Packard Enterprise and
designed to run climate-change calculations at three times the speed of
their current machine. Once it becomes operational this year, it is
expected to rank among the world’s top 25 or so fastest supercomputers,
NCAR officials said.
The U.S. Energy Department is developing a supercomputer for
climate research and other applications that the department says will be
10 times faster than its current most powerful machine, able to perform a
billion-billion calculations a second. Other groups are harnessing
artificial intelligence and machine learning to better capture the
micro-physics of clouds.
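One common pattern in that work is to train a small neural network to emulate an expensive cloud-microphysics calculation, then call the cheap emulator inside the climate model. A toy sketch of the idea, with a synthetic formula standing in for the expensive scheme (everything here is illustrative; real projects train on output from detailed cloud-resolving simulations):

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Synthetic stand-in for an expensive microphysics scheme: map
# (temperature K, relative humidity) to a cloud fraction in [0, 1].
X = np.column_stack([rng.uniform(230.0, 310.0, 5000),
                     rng.uniform(0.0, 1.0, 5000)])
y = np.clip((X[:, 1] - 0.5) * 2.0, 0.0, 1.0) * \
    np.exp(-((X[:, 0] - 260.0) / 40.0) ** 2)

# Train a small neural-network emulator on the scheme's inputs/outputs.
emulator = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000,
                        random_state=0)
emulator.fit(X, y)

# Inside a model time step, the cheap emulator replaces the costly scheme.
print(emulator.predict([[280.0, 0.9]]))  # predicted cloud fraction
```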
“I think the climate models are the best tool we have to
understand the future, even though they are far from perfect,” said Dr.
Gettelman. “I’m not worried that the new models might be wrong. What
scares me is that they might be right.”
More than 2,200 scientists from over 300 universities and
federal labs use the Cheyenne supercomputer to study climate change,
severe weather, air quality and wildfires.
(http://www.autoadmit.com/thread.php?thread_id=5026309&forum_id=2#43924231)