EO (Earth Observation) Topics on Climate Change
Since the start of the space age, Earth observation has been providing its share of evidence for a better perception and understanding of our Earth system and its response to natural and human-induced changes.
Earth is a complex, dynamic system we do not yet fully understand. The Earth system comprises diverse components that interact in complex ways. We need to understand the Earth's atmosphere, lithosphere, hydrosphere, cryosphere, and biosphere as a single connected system. Our planet is changing on all spatial and temporal scales.
Over the years, the entire Earth observation community, including the space agencies, other governmental bodies, and many international organizations (UN, etc.), has been cooperating on a global scale to come to grips with modeling the Earth system, including a continuous process of re-assessing and improving these models. The goal is to provide scientific evidence to help guide society onto a sustainable pathway during rapid global change.
In the second decade of the 21st century, there is alarming evidence that important tipping points, leading to irreversible changes in major ecosystems and the planetary climate system, may already have been reached or passed. Ecosystems as diverse as the Amazon rainforest and the Arctic tundra may be approaching thresholds of dramatic change through warming and drying. Mountain glaciers are in alarming retreat, and the downstream effects of reduced water supply in the driest months will have repercussions that transcend generations. 1)
Table 1: Overview of some major international bodies involved in global-change research programs 2)
The UN Framework Convention on Climate Change (UNFCCC) is an intergovernmental treaty developed to address the problem of climate change. The Convention, which sets out an agreed framework for dealing with the issue, was negotiated from February 1991 to May 1992 and opened for signature at the June 1992 UN Conference on Environment and Development (UNCED) — also known as the Rio Earth Summit. The UNFCCC entered into force on 21 March 1994, ninety days after the 50th country's ratification had been received. By December 2007, the convention had been ratified by 192 countries. 3)
Since then, there have been many UN conferences on climate change, starting with the UN climate conference in Kyoto, Japan, in December 1997. The Kyoto Protocol set binding emission targets for certain industrialized countries; those targets expired in 2012.
Meanwhile, greenhouse gas emissions from both developed and developing countries have been increasing rapidly. Even today, the nations responsible for the most pollution are unwilling to enforce stricter environmental standards, in order to protect their global business interests. The result is a vicious cycle between these national interests and the deteriorating environment, producing more frequent and violent catastrophes on a global scale. All people on Earth are affected, even those in countries that abide by strict environmental rules.
The following chapters present short descriptions, in reverse chronological order, of selected topics in climate change, to give the reader community an overview of research results in this wide field of global climate and environmental change.
Multiyear Study of Arctic Sea Ice Coverage
October 11, 2018: The Arctic Ocean's blanket of sea ice has changed since 1958 from predominantly older, thicker ice to mostly younger, thinner ice, according to new research published by NASA scientist Ron Kwok of the Jet Propulsion Laboratory, Pasadena, California. With so little thick, old ice left, the rate of decrease in ice thickness has slowed. New ice grows faster but is more vulnerable to weather and wind, so ice thickness is now more variable, rather than dominated by the effect of global warming. 4) 5)
Kwok's research combined decades of declassified U.S. Navy submarine measurements with more recent data from four satellites to create the 60-year record of changes in Arctic sea ice thickness. He found that since 1958, Arctic ice cover has lost about two-thirds of its thickness, as averaged across the Arctic at the end of summer. Older ice has shrunk in area by >2 million km2. Today, 70 percent of the ice cover consists of ice that forms and melts within a single year, which scientists call seasonal ice.
Sea ice of any age is frozen ocean water. However, as sea ice survives through several melt seasons, its characteristics change. Multiyear ice is thicker, stronger and rougher than seasonal ice. It is much less salty than seasonal ice; Arctic explorers used it as drinking water. Satellite sensors observe enough of these differences that scientists can use spaceborne data to distinguish between the two types of ice.
Thinner, weaker seasonal ice is innately more vulnerable to weather than thick, multiyear ice. It can be pushed around more easily by wind, as happened in the summer of 2013. During that time, prevailing winds piled up the ice cover against coastlines, which made the ice cover thicker for months.
The ice's vulnerability may also be demonstrated by the increased variation in Arctic sea ice thickness and extent from year to year over the last decade. In the past, sea ice rarely melted in the Arctic Ocean. Each year, some multiyear ice flowed out of the ocean into the East Greenland Sea and melted there, and some ice grew thick enough to survive the melt season and become multiyear ice. As air temperatures in the polar regions have warmed in recent decades, however, large amounts of multiyear ice now melt within the Arctic Ocean itself. Far less seasonal ice now thickens enough over the winter to survive the summer. As a result, not only is there less ice overall, but the proportions of multiyear ice to seasonal ice have also changed in favor of the young ice.
Seasonal ice now grows to a depth of about two meters in winter, and most of it melts in summer. That basic pattern is likely to continue, Kwok said. "The thickness and coverage in the Arctic are now dominated by the growth, melting and deformation of seasonal ice."
The increase in seasonal ice also means record-breaking changes in ice cover such as those of the 1990s and 2000s are likely to be less common, Kwok noted. In fact, there has not been a new record sea ice minimum since 2012, despite years of warm weather in the Arctic. "We've lost so much of the thick ice that changes in thickness are going to be slower due to the different behavior of this ice type," Kwok said.
Kwok used data from U.S. Navy submarine sonars from 1958 to 2000; satellite altimeters on NASA's ICESat and the European CryoSat-2, which span from 2003 to 2018; and scatterometer measurements from NASA's QuikSCAT and the European ASCAT from 1999 to 2017.
Figure 1: Small remnants of thicker, multiyear ice float with thinner, seasonal ice in the Beaufort Sea on 30 September, 2016 (image credit: NASA/GSFC/Alek Petty)
NASA Study Connects Southern California, Mexico Faults
October 8, 2018: A multiyear study has uncovered evidence that a 34-kilometer-long section of a fault links known, longer faults in Southern California and northern Mexico into a much longer continuous system. The entire system is at least 350 km long. Knowing how faults are connected helps scientists understand how stress transfers between faults. Ultimately, this helps researchers understand whether an earthquake on one section of a fault would rupture multiple fault sections, resulting in a much larger earthquake. 6) 7)
A team led by scientist Andrea Donnellan of NASA's Jet Propulsion Laboratory in Pasadena, California, recognized that the south end of California's Elsinore fault is linked to the north end of the Laguna Salada fault system, just north of the international border with Mexico. The short length of the connecting fault segment, which they call the Ocotillo section, is consistent with an immature fault zone that is still developing, where repeated earthquakes have not yet created a smoother, single fault instead of several strands.
The Ocotillo section was the site of a magnitude 5.7 aftershock that ruptured on an 8-kilometer-long fault buried under the California desert two months after the 2010 El Mayor-Cucapah earthquake in Baja California, Mexico. The magnitude 7.2 earthquake caused severe damage in the Mexican city of Mexicali and was felt throughout Southern California. It and its aftershocks caused dozens of faults in the region, including many not previously identified, to move.
Seismic activity in the region is a sign of its complex geology. The Pacific and North American plates are grinding past each other in Southern California. In the Gulf of California, there's a spreading zone where plates are moving apart. "The plate boundary is still sorting itself out," Donnellan said.
In the new study, Donnellan's team was also able to better define where Earth's crust continued slipping or deforming following the El Mayor-Cucapah earthquake and where other factors are important. "The shaking is only part of the earthquake process," she said. "The Earth keeps on moving for years [after the shaking stops]. What's cool about UAVSAR (Uninhabited Aerial Vehicle Synthetic Aperture Radar), an L-band InSAR platform, and GPS is that you can see the rest of the process."
Figure 2: The approximate location of the newly mapped Ocotillo section, which ties together California's Elsinore fault and Mexico's Laguna Salada fault into one continuous fault system (image credit: NASA/JPL-Caltech)
Arctic sea ice extent arrives at its minimum for 2018
September 27, 2018: Arctic sea ice likely reached its lowest seasonal extent for the year on 19 and 23 September 2018, according to NASA and the NASA-supported NSIDC (National Snow and Ice Data Center) at the University of Colorado Boulder. Analysis of satellite data by NSIDC and NASA showed that, at 1.77 million square miles (4.59 million km2), 2018 effectively tied with 2008 and 2010 for the sixth lowest summertime minimum extent in the satellite record. 8) 9)
This appears to be the lowest extent of the year. In response to the setting sun and falling temperatures, ice extent will begin expanding through autumn and winter. However, a shift in wind patterns or a period of late season melt could still push the ice extent lower.
The minimum extent was reached 5 and 9 days later than the 1981 to 2010 median minimum date of September 14. The interquartile range of minimum dates is September 11 to September 19. This year's minimum date of September 23 is one of the latest dates to reach the minimum in the satellite record, tying with 1997. The lateness of the minimum appears to be at least partially caused by southerly winds from the East Siberian Sea, which brought warm air into the region and prevented ice from drifting or growing southward.
Figure 3: Arctic sea ice extent for September 23, 2018 was 4.59 million km2 (1.77 million square miles). The orange line shows the 1981 to 2010 average extent for that day (image credit: NSIDC)
Figure 4: The map above compares Arctic sea ice extent on September 19, 2018 and September 23, 2018, when Arctic sea ice reached its minimum extent for the year (image credit: NSIDC)
"This year's minimum is relatively high compared to the record low extent we saw in 2012, but it is still low compared to what it used to be in the 1970s, 1980s and even the 1990s," said Claire Parkinson, a climate change senior scientist at NASA's Goddard Space Flight Center in Greenbelt, Maryland.
Parkinson and her colleague Nick DiGirolamo calculated that, since the late 1970s, the Arctic sea ice extent has shrunk on average about 21,000 square miles (54,000 km2) with each passing year. That is equivalent to losing a chunk of sea ice the size of Maryland and New Jersey combined every year for the past four decades.
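The quoted figures can be checked with a few lines of arithmetic; the only inputs beyond the text are the standard square-mile-to-km2 conversion factor and the roughly 40-year satellite record:

```python
# Average annual loss of Arctic sea ice extent, per Parkinson and DiGirolamo.
loss_sq_miles_per_year = 21_000
KM2_PER_SQ_MILE = 2.589988          # standard conversion factor

loss_km2_per_year = loss_sq_miles_per_year * KM2_PER_SQ_MILE
print(f"Annual loss: {loss_km2_per_year:,.0f} km2")          # ~54,000 km2, as quoted

# Cumulative loss over the ~40-year satellite record
print(f"Over 40 years: {loss_km2_per_year * 40 / 1e6:.2f} million km2")
```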
This summer, the weather conditions across the Arctic have been a mixed bag, with some areas experiencing warmer than average temperatures and rapid melt and other regions remaining cooler than normal, which leads to persistent patches of sea ice. Still, the 2018 minimum sea ice extent is 629,000 square miles (1.63 million km2) below the 1981-2010 average of yearly minimum extents.
One of the most unusual features of this year's melt season has been the reopening of a polynya-like hole in the icepack north of Greenland, where the oldest and thickest sea ice of the Arctic typically resides. In February of this year, a similar opening appeared in the same area, catching the attention of sea ice scientists everywhere. The first appearance of the hole raised concerns that the region could become vulnerable if the original, thicker ice cover was replaced with thinner ice as the exposed seawater refroze. NASA's Operation IceBridge mission probed the area in March, finding that the ice was indeed thinner and thus more susceptible to being pushed around by the winds and ocean currents.
"This summer, the combination of thin ice and southerly warm winds helped break up and melt the sea ice in the region, reopening the hole," said Melinda Webster, a sea ice researcher with Goddard. "This opening matters for several reasons; for starters, the newly exposed water absorbs sunlight and warms up the ocean, which affects how quickly sea ice will grow in the following autumn. It also affects the local ecosystem; for example, it impacts seal and polar bear populations that rely on thicker, snow-covered sea ice for denning and hunting."
Measurements of sea ice thickness, an important additional factor in determining the mass and volume changes of the sea ice cover, have been far less complete than the measurements of ice extent and distribution in the past four decades. Now, with the successful launch of NASA's ICESat-2 (Ice, Cloud and land Elevation Satellite-2) on 15 September, scientists will be able to use the data from the spacecraft's advanced laser altimeter to create detailed maps of sea ice thickness in both the Arctic and the Antarctic.
Figure 5: Lowest sea ice minimum extents on record (satellite record, 1979 to present), image credit: NASA
Contrasting effects on deep convective clouds by different types of aerosols
September 24, 2018: Convective clouds produce a significant proportion of the global precipitation and play an important role in the energy and water cycles. A new NASA-led study helps answer decades-old questions about the role of smoke and human-caused air pollution on clouds and rainfall. Looking specifically at deep convective clouds — tall clouds like thunderclouds, formed by warm air rising — the study shows that smoky air makes it harder for these clouds to grow. Pollution, on the other hand, energizes their growth, but only if the pollution isn't heavy. Extreme pollution is likely to shut down cloud growth. 10)
Researchers led by scientist Jonathan Jiang of NASA's Jet Propulsion Laboratory in Pasadena, California, used observational data from two NASA satellites to investigate the effects of smoke and human-made air pollutants at different concentrations on deep convective clouds. 11)
The two satellites — CALIPSO (Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation) and CloudSat — orbited on the same track only a few seconds apart from 2006 until this year. CloudSat uses a radar to measure cloud locations and heights worldwide, and CALIPSO uses an instrument called a LIDAR to measure smoke, dust, pollution and other microscopic particles in the air, which are collectively referred to as aerosols, at the same locations at almost the same time. The combined data sets allow scientists to study how aerosol particles affect clouds.
CALIPSO is able to classify aerosols into several types, a capability which was improved two years ago when the CALIPSO mission team developed improved data-processing techniques. At about the same time, the CloudSat team also improved its classification of the cloud types. Jiang's team knew that these improvements had the potential to clarify how different aerosols affect the ability of clouds to grow. It took him and his colleagues about two years to go through both data sets, choose the best five-year period and Earth regions to study, and do the analysis.
Clouds typically cannot form without some aerosols, because water vapor in the air does not easily condense into liquid water or ice unless it comes in contact with an aerosol particle. But there are many types of aerosols — not only the ones studied here but volcanic ash, sea salt and pollen, for example — with a wide range of sizes, colors, locations and other characteristics. All of these characteristics affect the way aerosols interact with clouds. Even the same type of aerosol may have different effects at different altitudes in the atmosphere or at different concentrations of particles.
Smoke particles absorb heat radiation emitted by the ground. This increases the temperature of the smoke particles, which can then warm the air. At the same time they block incoming sunlight, which keeps the ground cooler. That reduces the temperature difference between the ground and the air. For clouds to form, the ground needs to be warmer and the air cooler so that moisture on the ground can evaporate, rise and condense higher in the atmosphere. By narrowing the temperature gap between the ground and the air, smoke suppresses cloud formation and growth.
Human-pollutant aerosols like sulfates and nitrates, on the other hand, do not absorb much heat radiation. In moderate concentrations, they add more particles to the atmosphere for water to condense onto, enabling clouds to grow taller. If pollution is very heavy, however, the sheer number of particles in the sky blocks incoming sunlight — an effect often visible in the world's most polluted cities. That cools the ground just as smoke aerosols do, inhibiting the formation of clouds.
The scientists also studied dust aerosols and found that their characteristics varied so much from place to place that they could either suppress or energize cloud formation. "It's about the complexity in dust color and size," Jiang said. "Sahara dust may be lighter, while dust from an Asian desert might likely be darker." A blanket of lighter-colored or smaller dust scatters incoming sunlight while not warming the air. Larger or darker dust particles absorb sunlight and warm the air.
Study links natural climate oscillations in north Atlantic to Greenland Ice Sheet Melt
September 18, 2018: Scientists have known for years that warming global climate is melting the Greenland Ice Sheet, the second largest ice sheet in the world. A new study from the Woods Hole Oceanographic Institution (WHOI), however, shows that the rate of melting might be temporarily increased or decreased by two existing climate patterns: the North Atlantic Oscillation (NAO), and the Atlantic Multidecadal Oscillation (AMO). 12)
Both patterns can have a major impact on regional climate. The NAO, which is measured as the atmospheric pressure difference between the Azores and Iceland, can affect the position and strength of the westerly storm track. The study, published in Geophysical Research Letters, found that when the NAO stays in its negative phase (meaning that air pressure is high over Greenland) it can trigger extreme ice melt in Greenland during the summer season. Likewise, the AMO, which alters sea surface temperatures in the North Atlantic, can cause major melting events when it is in its warm phase, raising the temperature of the region as a whole. 13)
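The station-based definition of the NAO index mentioned above can be sketched in a few lines: the difference of the standardized sea-level-pressure anomalies at the Azores and Iceland stations. This is a simplified illustration; the pressure values below are invented, and operational indices standardize against a long-term climatology rather than the sample mean:

```python
import numpy as np

def nao_index(slp_azores, slp_iceland):
    """Simple station-based NAO index: difference of the standardized
    sea-level-pressure series at the Azores and Iceland stations.
    Positive values indicate a stronger-than-normal pressure gradient
    (positive NAO phase); negative values indicate high pressure over
    the subpolar North Atlantic (negative NAO phase)."""
    az = np.asarray(slp_azores, dtype=float)
    ic = np.asarray(slp_iceland, dtype=float)
    az_std = (az - az.mean()) / az.std()
    ic_std = (ic - ic.mean()) / ic.std()
    return az_std - ic_std

# Hypothetical monthly mean sea-level pressures (hPa), for illustration only
azores  = [1024.1, 1022.3, 1019.8, 1025.6, 1021.0]
iceland = [ 998.5, 1004.2, 1009.9,  996.1, 1003.3]
print(nao_index(azores, iceland))
```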
If global climate change continues at its current rate, the Greenland ice sheet may eventually melt entirely—but whether it meets this fate sooner rather than later could be determined by these two oscillations, says Caroline Ummenhofer, a climate scientist at WHOI and co-author on the study. Depending on how the AMO and NAO interact, excess melting could happen two decades earlier than expected, or two decades later this century.
"We know the Greenland ice sheet is melting in part because of warming climate, but that's not a linear process," Ummenhofer said. "There are periods where it will accelerate, and periods where it won't."
Scientists like Ummenhofer see a pressing need to understand how natural variability can play a role in speeding up or slowing down the melting process. "The consequences go beyond just the Greenland Ice Sheet—predicting climate on the scale of the next few decades will also be useful for resource management, city planners and other people who will need to adapt to those changes," she added.
Actually forecasting environmental conditions on a decadal scale isn't easy. The NAO can switch between positive and negative phases over the course of a few weeks, but the AMO can take more than 50 years to go through a full cycle. Since scientists first started tracking climate in the late 19th century, only a handful of AMO cycles have been recorded, making it extremely difficult to identify reliable patterns. To complicate things even more, the WHOI scientists needed to tease out how much of the melting effect is caused by human-related climate change, and how much can be attributed to the AMO and NAO.
Figure 6: Scientists stand on the edge of a crevasse formed by meltwater flowing across the top of the Greenland Ice Sheet during a WHOI-led expedition in 2007 (image credit: Sarah Das, Woods Hole Oceanographic Institution)
To do so, the team relied on data from the Community Earth System Model's Large Ensemble, a massive set of climate model simulations at the NCAR (National Center for Atmospheric Research) in Boulder, CO. From that starting point, the researchers looked at 40 different iterations of the model covering 180 years over the 20th and 21st century, with each one using slightly different starting conditions.
Although the simulations all included identical human factors, such as the rise of greenhouse gases over two centuries, they used different conditions at the start—a particularly cold winter, for example, or a powerful Atlantic storm season—that led to distinct variability in the results. The team could then compare those results to each other and statistically remove the effects caused by climate change, letting them isolate the effects of the AMO and NAO.
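The "statistically remove" step described above amounts to ensemble-mean subtraction: the forced (human-caused) response is common to all members, while internal variability differs between them, so averaging across members estimates the forced signal and subtracting it isolates the natural variability. A minimal sketch with synthetic data (the trend and noise magnitudes are invented, not taken from the CESM Large Ensemble):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic large ensemble: 40 members x 180 years of some climate variable.
# Every member shares the same forced trend (the human-caused signal) but has
# its own internal variability, analogous to the AMO/NAO contribution.
n_members, n_years = 40, 180
years = np.arange(n_years)
forced_trend = 0.02 * years                        # identical in all members
internal = rng.normal(0.0, 0.5, (n_members, n_years))
ensemble = forced_trend + internal                 # shape (40, 180)

# The forced response is estimated by the ensemble mean at each year;
# subtracting it leaves each member's internal (natural) variability.
forced_estimate = ensemble.mean(axis=0)
natural_variability = ensemble - forced_estimate
```

With 40 members, the ensemble mean averages the noise down by a factor of about sqrt(40), which is why a large ensemble gives the statistical robustness Hahn describes.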
"Using a large ensemble of model output gave more statistical robustness to our findings," said Lily Hahn, the paper's lead author. "It provided many more data points than a single model run or observations alone. That's very helpful when you're trying to investigate something as complex as atmosphere-ocean-ice interactions."
Hahn was formerly a Summer Student Fellow (SSF) and guest student at WHOI while she was an undergraduate at Yale University. She is currently working on her Ph.D. at the University of Washington. Also collaborating on the study was Young-Oh Kwon, a physical oceanographer at WHOI. This research was supported by WHOI's SSF and guest student programs, and by the U.S. National Science Foundation.
The Woods Hole Oceanographic Institution is a private, non-profit organization on Cape Cod, Mass., dedicated to marine research, engineering, and higher education. Established in 1930 on a recommendation from the National Academy of Sciences, its primary mission is to understand the oceans and their interaction with the Earth as a whole, and to communicate a basic understanding of the oceans' role in the changing global environment.
Three Causes of Earth's Spin Axis Drift Identified
September 19, 2018: A typical desk globe is designed to be a geometric sphere and to rotate smoothly when you spin it. Our actual planet is far less perfect — in both shape and in rotation.
Earth is not a perfect sphere. When it rotates on its spin axis — an imaginary line that passes through the North and South Poles — it drifts and wobbles. These spin-axis movements are scientifically referred to as "polar motion." Measurements for the 20th century show that the spin axis drifted about 10 cm per year. Over the course of a century, that becomes more than 10 meters. 14)
Using observational and model-based data spanning the entire 20th century, NASA scientists have for the first time identified three broadly-categorized processes responsible for this drift — contemporary mass loss primarily in Greenland, glacial rebound, and mantle convection.
"The traditional explanation is that one process, glacial rebound, is responsible for this motion of Earth's spin axis. But recently, many researchers have speculated that other processes could have potentially large effects on it as well," said first author Surendra Adhikari of NASA's Jet Propulsion Laboratory in Pasadena, California. "We assembled models for a suite of processes that are thought to be important for driving the motion of the spin axis. We identified not one but three sets of processes that are crucial — and melting of the global cryosphere (especially Greenland) over the course of the 20th century is one of them."
In general, the redistribution of mass on and within Earth — like changes to land, ice sheets, oceans and mantle flow — affects the planet's rotation. As temperatures increased throughout the 20th century, Greenland's ice mass decreased. In fact, a total of about 7,500 gigatons — the weight of more than 20 million Empire State Buildings — of Greenland's ice melted into the ocean during this time period. This makes Greenland one of the top contributors of mass being transferred to the oceans, causing sea level to rise and, consequently, a drift in Earth's spin axis.
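As a rough cross-check of what 7,500 gigatons means for sea level (a sketch; the ~361.8 Gt-per-millimeter conversion, based on a global ocean surface area of about 3.618 x 10^8 km2, is a commonly used figure and not taken from the text):

```python
# Rough sea-level equivalent of Greenland's 20th-century ice loss.
# One gigaton of meltwater is ~1 km3; spread over the ~3.618e8 km2 ocean
# surface, ~361.8 Gt raises global mean sea level by about 1 mm.
ICE_LOSS_GT = 7_500
GT_PER_MM_SEA_LEVEL = 361.8

sea_level_rise_mm = ICE_LOSS_GT / GT_PER_MM_SEA_LEVEL
print(f"~{sea_level_rise_mm:.1f} mm of global mean sea-level rise")
```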
While ice melt is occurring in other places (like Antarctica), Greenland's location makes it a more significant contributor to polar motion.
"There is a geometrical effect that if you have a mass that is 45 degrees from the North Pole — which Greenland is — or from the South Pole (like Patagonian glaciers), it will have a bigger impact on shifting Earth's spin axis than a mass that is right near the Pole," said coauthor Eric Ivins, also of JPL.
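Ivins' point about the 45-degree geometry can be illustrated with a first-order weighting: the ability of a surface mass load to excite polar motion scales roughly with sin(2 x latitude), which peaks at 45 degrees and vanishes at the equator and the poles. This is a simplified sketch, not the study's actual model:

```python
import math

def polar_motion_weight(latitude_deg):
    """First-order geometric weighting of a surface mass load's ability to
    shift the spin axis: proportional to |sin(2 * latitude)|, peaking at
    45 degrees and vanishing at the pole and the equator (simplified)."""
    return abs(math.sin(math.radians(2.0 * latitude_deg)))

for lat in (0, 45, 72, 90):   # equator, mid-latitude, Greenland-like, pole
    print(f"lat {lat:>2} deg -> weight {polar_motion_weight(lat):.2f}")
```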
Previous studies identified glacial rebound as the key contributor to long-term polar motion. And what is glacial rebound? During the last ice age, heavy glaciers depressed Earth's surface much like a mattress depresses when you sit on it. As that ice melts, or is removed, the land slowly rises back to its original position. In the new study, which relied heavily on a statistical analysis of such rebound, scientists figured out that glacial rebound is likely to be responsible for only about a third of the polar drift in the 20th century.
Figure 7: The observed direction of polar motion, shown as a light blue line, compared with the sum (pink line) of the influence of Greenland ice loss (blue), postglacial rebound (yellow) and deep mantle convection (red). The contribution of mantle convection is highly uncertain (image credit: NASA/ JPL-Caltech)
The authors argue that mantle convection makes up the final third. Mantle convection is responsible for the movement of tectonic plates on Earth's surface. It is basically the circulation of material in the mantle caused by heat from Earth's core. Ivins describes it as similar to a pot of soup placed on the stove. As the pot, or mantle, heats, the pieces of the soup begin to rise and fall, essentially forming a vertical circulation pattern — just like the rocks moving through Earth's mantle.
With these three broad contributors identified, scientists can distinguish mass changes and polar motion caused by long-term Earth processes over which we have little control from those caused by climate change. They now know that if Greenland's ice loss accelerates, polar motion likely will, too.
The paper in Earth and Planetary Science Letters is titled "What drives 20th century polar motion?" 15) Besides JPL, coauthor institutions include the German Research Centre for Geosciences, Potsdam; the University of Oslo, Norway; Technical University of Denmark, Kongens Lyngby; the Geological Survey of Denmark and Greenland, Copenhagen, Denmark; and the University of Bremen, Germany. An interactive simulation of how multiple processes contribute to the wobbles in Earth's spin axis is available at: https://vesl.jpl.nasa.gov/sea-level/polar-motion/
A World on Fire
August 23, 2018: The world is on fire. Or so it appears in this image from NASA's Worldview (Figure 8). The red points overlaid on the image mark actively burning fires detected by thermal bands. Africa has the most concentrated fires, most likely agricultural burns: their location, widespread nature, and sheer number suggest they were deliberately set to manage land. Farmers often use fire to return nutrients to the soil and to clear the ground of unwanted plants. While fire helps enhance crops and grasses for pasture, the fires also produce smoke that degrades air quality. 16)
Elsewhere, such as in North America, the fires are for the most part wildfires. In South America, Chile in particular has had a horrendous number of wildfires this year. A study conducted by Montana State University found that: "Besides low humidity, high winds and extreme temperatures—some of the same factors contributing to fires raging across the United States—central Chile is experiencing a mega drought and large portions of its diverse native forests have been converted to more flammable tree plantations, the researchers said." 17)
However, in Brazil the fires are both wildfires and man-made fires set to clear crop fields of detritus from the last growing season. Fires are also commonly used during Brazil's dry period to deforest land and clear it for raising cattle or for other agricultural or extraction purposes. The problem with these fires is that they quickly grow out of control due to climate conditions: hot, dry weather coupled with wind drives fires far from their originally intended burn area. The Global Fire Watch site recorded 30,964 fire alerts between 15 and 22 August.
Australia also tends to experience large bushfires in its more remote areas. Hotter, drier summers in Australia will mean longer fire seasons, and urban sprawl into bushland is putting more people at risk when those fires break out. For large areas in the north and west, the bushfire season has been brought forward a full two months to August, well into winter, which officially began 1 June. According to the Australian Bureau of Meteorology (BOM), the January-to-July period of 2018 was the warmest in NSW (New South Wales) since 1910. As the climate continues to change and areas become hotter and drier, more and more extreme bushfires will break out across the entire Australian continent.
NASA's Earth Observing System Data and Information System (EOSDIS) Worldview application provides the capability to interactively browse over 700 global, full-resolution satellite imagery layers and then download the underlying data. Many of the available imagery layers are updated within three hours of observation, essentially showing the entire Earth as it looks "right now." This satellite image was collected on August 22, 2018. Actively burning fires, detected by thermal bands, are shown as red points. Image courtesy: NASA Worldview, Earth Observing System Data and Information System (EOSDIS).
Figure 8: NASA's Worldview image of fires on a global scale. - In particular, Chile has had horrendous numbers of wildfires this year according to a study by Montana State University (image credit: EOSDIS Worldview) 18)
The results of the Montana Study Team were published in PLoS (Public Library of Science) ONE. 19) Chile has replaced many of its native forests with plantation forests to supply pulp and timber mills that produce paper and wood products. According to David McWethy of Montana State University, the lead author of the study, highly flammable non-native pine and eucalypt forests now cover the region. Eucalypt trees, which are native to Australia, and pine trees native to the United States contain oils and resins in their leaves that, when dry, can easily ignite.
"Chile replaced more heterogeneous, less flammable native forests with structurally homogenous, flammable exotic forest plantations at a time when the climate is becoming warmer and drier," said McWethy. "This situation will likely facilitate future fires to spread more easily and promote more large fires into the future."
Co-author Anibal Pauchard, professor at the University of Concepcion and researcher at the Institute of Ecology and Biodiversity in Chile, said wildfires have been a part of the Chilean landscape for centuries, but they have grown larger and more intense in recent decades, despite costly government efforts to control them. "Unfortunately, fires in central Chile are promoted by increasing human ignitions, drier and hotter climate, and the availability of abundant flammable fuels associated with pine plantations and degraded shrublands dominated by invasive species," Pauchard said.
In 2016-2017 alone, fires burned nearly 1.5 million acres (607,000 hectares)—almost twice the area of the U.S. state of Rhode Island. It was the largest area burned during a single fire season since detailed recordkeeping began in the early 1960s. In 2014, major fires near the cities of Valparaiso and Santiago destroyed thousands of homes and forced more than 10,000 people to evacuate.
The devastation prompted the Chilean government to ask what land-use policies and environmental factors were behind these fires, McWethy said. That led to a national debate about preventing and reducing the consequences of future fires and to the involvement of McWethy and his collaborators.
To better understand the Chilean fires, the researchers compared satellite information with records from the Chilean Forest Service for 2001 through 2017. They studied eight types of vegetation as well as climate conditions, elevation, slope and population density across a wide range of latitudes in Chile.
"Now we have compelling evidence that after climate, landscape composition is crucial in determining fire regimes. In particular, exotic forest plantations need to be managed to purposely reduce fire hazard," Pauchard said. "Which forestry species we plant and how we manage them matters in terms of fire frequency and intensity."
Among other things, the researchers recommended in the paper that Chile try to move away from exotic plantations toward more heterogeneous, less flammable native forests.
Figure 9: A forest of Nothofagus antarctica trees that burned in fire that covered 40,000 acres in Torres del Paine National Park, Chile in 2012 (image credit: David McWethy)
Scientists trace atmospheric rise in CO2 during deglaciation to the deep Pacific Ocean
August 13, 2018: How carbon made it out of the ocean and into the atmosphere has remained one of the most important mysteries of science. A new study provides some of the most compelling evidence for how it happened — a 'flushing' of the deep Pacific Ocean caused by the acceleration of water circulation patterns that begin around Antarctica. 20)
Long before humans started injecting carbon dioxide into the atmosphere by burning fossil fuels like oil, gas, and coal, the level of atmospheric CO2 rose significantly as the Earth came out of its last ice age. Many scientists have long suspected that the source of that carbon was the deep sea. But researchers had not been able to document just how the carbon made it out of the ocean and into the atmosphere, and it has remained one of the most important mysteries of science.
A new study, published today in the journal Nature Geoscience, provides some of the most compelling evidence for how it happened — a "flushing" of the deep Pacific Ocean caused by the acceleration of water circulation patterns that begin around Antarctica. 21)
The concern, researchers say, is that it could happen again, potentially magnifying and accelerating human-caused climate change. "The Pacific Ocean is big and you can store a lot of stuff down there — it's kind of like Grandma's root cellar — stuff accumulates there and sometimes doesn't get cleaned out," said Alan Mix, an Oregon State University oceanographer and co-author on the study.
"We've known that CO2 in the atmosphere went up and down in the past, we know that it was part of big climate changes, and we thought it came out of the deep ocean. But it has not been clear how the carbon actually got out of the ocean to cause the CO2 rise."
Lead author Jianghui Du, a doctoral student in oceanography at Oregon State, said there is a circulation pattern in the Pacific that begins with water around Antarctica sinking and moving northward at great depth a few miles below the surface. It continues all the way to Alaska, where it rises, turns back southward, and flows back to Antarctica where it mixes back up to the sea surface.
It takes a long time for the water's round trip journey in the abyss — almost 1,000 years, Du said. Along with the rest of the OSU team, Du found that flow slowed down during glacial maximums but sped up during deglaciation, as the Earth warmed. This faster flow flushed the carbon from the deep Pacific Ocean — "cleaning out Grandma's root cellar" — and brought the CO2 to the surface near Antarctica. There it was released into the atmosphere.
"It happened roughly in two steps during the last deglaciation — an initial phase from 18,000 to 15,000 years ago, when CO2 rose by about 50 parts per million, and a second pulse later added another 30 parts per million," Du said. That total is just a bit less than the amount by which CO2 has risen since the industrial revolution. So the ocean can be a powerful source of carbon.
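The figures quoted in this section — the ~1,000-year abyssal round trip, the roughly threefold deglacial speed-up, and the two-step CO2 rise — can be pulled together in a small back-of-the-envelope sketch. Note that the implied post-speed-up transit time is an inference from those numbers, not a value from the paper itself:

```python
# Back-of-the-envelope summary of the numbers quoted above.
abyssal_round_trip_yr = 1000   # glacial-era round trip: Antarctica -> Alaska -> back
speedup_factor = 3             # deglacial acceleration of the overturning flow

co2_pulses_ppm = [50, 30]      # two deglacial CO2 steps reported by Du et al.
total_rise_ppm = sum(co2_pulses_ppm)

# Implied (inferred, not measured) round-trip time once the flow sped up
flushed_round_trip_yr = abyssal_round_trip_yr / speedup_factor

print(total_rise_ppm)                # 80 ppm total deglacial rise
print(round(flushed_round_trip_yr))  # ~333 years
```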
Brian Haley, also an Oregon State University oceanographer and co-author on the study, noted that carbon is always falling down into the deep ocean. Up near the surface, plankton grow, but when they die they sink and decompose. That is a biological pump that is always sending carbon to the bottom. "The slower the circulation," Haley said, "the more time the water spends down there, and carbon can build up."
Du said that during a glacial maximum, the water slows down and accumulates lots of carbon. "When the Earth began warming, the water movement sped up by about a factor of three," he noted, "and that carbon came back to the surface."
The key to the researchers' discovery is the analysis of neodymium isotopes in North Pacific sediment cores. Haley noted that the isotopes are "like a return address label on a letter from the deep ocean." When the ratio of isotope 143 to 144 is higher in the sediments, the water movement during that period was slower. When water movement speeds up during warming events, the ratio of neodymium isotopes reflects that too.
"This finding that the deep circulation sped up is the smoking gun in this mystery story about how CO2 got out of the deep sea," Mix said. "We now know how it happened, and the deep Pacific is the culprit — a partner in crime with Antarctica."
What concerns the researchers is that it could happen again as the climate continues to warm. "We don't know that the circulation will speed up and bring that carbon to the surface, but it seems like a reasonable thing to think about," Du said. "Our evidence that this actually happened in the past will help the people who run climate models figure out whether it is a real risk for the future."
The researchers say their findings should be considered from a policy perspective. "So far the ocean has absorbed about a third of the total carbon emitted from fossil fuels," Mix said. "That has helped slow down warming. The Paris Climate Agreement has set goals of containing warming to 1.5 to 2 degrees (Celsius) and we know pretty well how much carbon can be released to the atmosphere while keeping to that level. But if the ocean stops absorbing the excess CO2, and instead releases more from the deep sea, that spells trouble. Ocean release would subtract from our remaining emissions budget and that means we're going to have to get our emissions down a heck of a lot faster. We need to figure out how much."
The authors are from the College of Earth, Ocean, and Atmospheric Sciences at OSU (Oregon State University), and from USGS (United States Geological Survey). The study was supported by the NSF (National Science Foundation).
Climate change is making night-shining clouds more visible
July 2, 2018: Increased water vapor in Earth's atmosphere due to human activities is making shimmering high-altitude clouds more visible, a new study finds. The results suggest these strange but increasingly common clouds seen only on summer nights are an indicator of human-caused climate change, according to the study's authors. 22)
Noctilucent, or night-shining, clouds are the highest clouds in Earth's atmosphere. They form in the middle atmosphere, or mesosphere, roughly 80 km (50 miles) above Earth's surface. The clouds form when water vapor freezes around specks of dust from incoming meteors.
Humans first observed noctilucent clouds in 1885, after the eruption of Krakatoa volcano in Indonesia spewed massive amounts of water vapor into the air. Sightings of the clouds became more common during the 20th century, and in the 1990s scientists began to wonder whether climate change was making them more visible.
In a new study, researchers used satellite observations and climate models to simulate how the effects of increased greenhouse gases from burning fossil fuels have contributed to noctilucent cloud formation over the past 150 years. Extracting and burning fossil fuels delivers carbon dioxide, methane and water vapor into the atmosphere, all of which are greenhouse gases.
The study's results suggest methane emissions have increased water vapor concentrations in the mesosphere by about 40 percent since the late 1800s, which has more than doubled the amount of ice that forms in the mesosphere. They conclude human activities are the main reason why noctilucent clouds are significantly more visible now than they were 150 years ago.
"We speculate that the clouds have always been there, but the chance to see one was very, very poor, in historical times," said Franz-Josef Lübken, an atmospheric scientist at the Leibniz Institute of Atmospheric Physics in Kühlungsborn, Germany and lead author of the new study in Geophysical Research Letters, a journal of the American Geophysical Union. 23)
The results suggest noctilucent clouds (NLCs) are a sign that human-caused climate change is affecting the middle atmosphere, according to the authors. Whether thicker, more visible noctilucent clouds could influence Earth's climate themselves is the subject of future research, Lübken said.
"Our methane emissions are impacting the atmosphere beyond just temperature change and chemical composition," said Ilissa Seroka, an atmospheric scientist at the Environmental Defense Fund in Washington, D.C. who was not connected to the new study. "We now detect a distinct response in clouds."
Figure 10: Noctilucent clouds form only in the summertime and are only visible at dawn and dusk. New research suggests they are becoming more visible and forming more frequently due to climate change (image credit: NASA)
Studying cloud formation over time: Conditions must be just right for noctilucent clouds to be visible. The clouds can only form at mid to high latitudes in the summertime, when mesospheric temperatures are cold enough for ice crystals to form. And they're only visible at dawn and dusk, when the Sun illuminates them from below the horizon.
Humans have injected massive amounts of greenhouse gases into the atmosphere by burning fossil fuels since the start of the industrial period 150 years ago. Researchers have wondered what effect, if any, this has had on the middle atmosphere and the formation of noctilucent clouds.
Figure 11: This diagram shows the major layers of Earth's atmosphere. Noctilucent clouds form in the mesosphere, high above where normal weather clouds form (image credit: Randy Russel/UCAR)
In the new study, Lübken and colleagues ran computer simulations to model the Northern Hemisphere's atmosphere and noctilucent clouds from 1871 to 2008. They wanted to simulate the effects of increased greenhouse gases, including water vapor, on noctilucent cloud formation over this time period.
The researchers found the presence of noctilucent clouds fluctuates from year to year and even from decade to decade, depending on atmospheric conditions and the solar cycle. But over the whole study period, the clouds have become significantly more visible.
The reasons for this increased visibility were surprising, according to Lübken. Carbon dioxide warms Earth's surface and the lower part of the atmosphere, but actually cools the middle atmosphere where noctilucent clouds form. In theory, this cooling effect should make noctilucent clouds form more readily.
But the study's results showed increasing carbon dioxide concentrations since the late 1800s have not made noctilucent clouds more visible. It seems counterintuitive, but when the middle atmosphere becomes colder, more ice particles form but they are smaller and therefore harder to see, Lübken explained. "Keeping water vapor constant and making it just colder means that we would see less ice particles," he said.
Figure 12: Noctilucent clouds over the city of Wismar, Germany in July 2015. Tropospheric clouds are visible as dark patches near the horizon (image credit: Leibniz Institute of Atmospheric Physics)
Instead, the study found that more water vapor in the middle atmosphere is making ice crystals larger and noctilucent clouds more visible. Water vapor in the middle atmosphere comes from two sources: water vapor from Earth's surface that is transported upward, and methane, a potent greenhouse gas that produces water vapor through chemical reactions in the middle atmosphere.
The study found the increase in atmospheric methane since the late 1800s has significantly increased the amount of water vapor in the middle atmosphere. This more than doubled the amount of mesospheric ice present in the mid latitudes from 1871 to 2008, according to the study.
People living in the mid to high latitudes now have a good chance of seeing noctilucent clouds several times each summer, Lübken said. In the 19th century, they were probably visible only once every several decades or so, he said. "The result was rather surprising that, yes, on these time scales of 100 years, we would expect to see a big change in the visibility of clouds," according to Lübken.
NASA study solves glacier puzzle in northwest Greenland
June 21, 2018: A new NASA study explains why the Tracy and Heilprin glaciers, which flow side by side into Inglefield Gulf in northwest Greenland, are melting at radically different rates. 24)
Using ocean data from NASA's OMG (Oceans Melting Greenland) campaign, the study documents a plume of warm water flowing up Tracy's underwater face, and a much colder plume in front of Heilprin. Scientists have assumed plumes like these exist for glaciers all around Greenland, but this is the first time their effects have been measured.
The finding highlights the critical role of oceans in glacial ice loss and their importance for understanding future sea level rise. A paper on the research was published June 21 in the journal Oceanography. 25)
Tracy and Heilprin were first observed by explorers in 1892 and have been measured sporadically ever since. Even though the adjoining glaciers experience the same weather and ocean conditions, Heilprin has retreated upstream less than 4 km in 125 years, while Tracy has retreated more than 15 km. That means Tracy is losing ice almost four times faster than its next-door neighbor.
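A quick arithmetic check of those retreat figures — treating the record as roughly 125 years and using the 4 km and 15 km bounds quoted above — reproduces the "almost four times faster" comparison:

```python
# Average retreat rates for the two glaciers over the ~125-year record.
years = 125
heilprin_retreat_km = 4    # "less than 4 km"
tracy_retreat_km = 15      # "more than 15 km"

heilprin_rate_m_yr = heilprin_retreat_km / years * 1000
tracy_rate_m_yr = tracy_retreat_km / years * 1000

print(round(heilprin_rate_m_yr))                       # 32 m/yr
print(round(tracy_rate_m_yr))                          # 120 m/yr
print(round(tracy_rate_m_yr / heilprin_rate_m_yr, 2))  # 3.75 -> "almost four times"
```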
This is the kind of puzzle OMG was designed to explain. The five-year campaign is quantifying ice loss from all glaciers that drain the Greenland Ice Sheet with an airborne survey of ocean and ice conditions around the entire coastline, collecting data through 2020. OMG is making additional boat-based measurements in areas where the seafloor topography and depths are inadequately known.
About a decade ago, NASA's Operation IceBridge campaign used ice-penetrating radar to document a major difference between the glaciers: Tracy is seated on bedrock at a depth of about 610 m below the ocean surface, while Heilprin extends only 350 m beneath the waves.
Scientists would expect this difference to affect the melt rates, because the top ocean layer around Greenland is colder than the deep water, which has traveled north from the midlatitudes in ocean currents. The warm water layer starts about 200 m down from the surface, and the deeper the water, the warmer it is. Naturally, a deeper glacier would be exposed to more of this warm water than a shallower glacier would.
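Taking the depths above at face value — the warm layer starting about 200 m down, Tracy grounded at about 610 m and Heilprin at about 350 m — a minimal sketch shows how much more of Tracy's face sits in the warm layer. The numbers are the approximate ones quoted in this section, not precise bathymetry:

```python
WARM_LAYER_TOP_M = 200   # approximate top of the warm subsurface layer

def warm_exposure_m(grounding_depth_m, layer_top_m=WARM_LAYER_TOP_M):
    """Vertical extent of a glacier face sitting below the warm-layer top."""
    return max(0, grounding_depth_m - layer_top_m)

print(warm_exposure_m(610))  # Tracy:    410 m exposed to warm water
print(warm_exposure_m(350))  # Heilprin: 150 m exposed
```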
When OMG Principal Investigator Josh Willis of NASA's Jet Propulsion Laboratory in Pasadena, California, looked for more data to quantify the difference between Tracy and Heilprin, "I couldn't find any previous observations of ocean temperature and salinity in the fjord at all," he said. There was also no map of the seafloor in the gulf.
OMG sent a research boat into the Inglefield Gulf in the summer of 2016 to fill in the data gap. The boat's soundings of ocean temperature and salinity showed a river of meltwater draining out from under Tracy. Because freshwater is more buoyant than the surrounding seawater, as soon as the water escapes from under the glacier, it swirls upward along the glacier's icy face. The turbulent flow pulls in surrounding subsurface water, which is warm for a polar ocean at about 0.5 degree Celsius. As it gains volume, the plume spreads like smoke rising from a smokestack.
"Most of the melting happens as the water rises up Tracy's face," Willis said. "It eats away at a huge chunk of the glacier."
Figure 13: Tracy and Heilprin glaciers in northwest Greenland. The two glaciers flow into a fjord that appears black in this image (image credit: NASA)
Heilprin also has a plume, but its shallower depth limits the plume's damage in two ways: the plume has a shorter distance to rise and gathers less seawater; and the shallow seawater it pulls in has a temperature of only about minus 0.5 degree Celsius. As a result, even though Heilprin is a bigger glacier and more water drains from underneath it than from Tracy, its plume is smaller and colder.
The study produced another surprise by first mapping a ridge, called a sill, only about 250 m below the ocean surface in front of Tracy, and then proving that this sill did not keep warm water from the ocean depths away from the glacier. "In fact, quite a lot of warm water comes in from offshore, mixes with the shallower layers and comes over the sill," Willis said. Tracy's destructive plume is evidence of that.
Figure 14: This figure shows estimated ice flow velocities of Tracy and Heilprin glaciers (right) and the depths of the fjord in front of the glaciers. The approximate location of the sill in front of Tracy is shown as a dashed yellow line. Research ship cruise tracks are shown in orange (image credit: NASA/JPL-Caltech)
Ice losses from Antarctica speed Sea Level Rise - IMBIE Study
June 13, 2018: Ice losses from Antarctica have tripled since 2012, increasing global sea levels by 3 mm in that timeframe alone, according to a major new international climate assessment funded by NASA and ESA (European Space Agency). 26)
In a major collaborative effort, scientists from around the world have used information from satellites to reveal that ice melting in Antarctica has not only raised sea levels by 7.6 mm since 1992, but, critically, almost half of this rise has occurred in the last five years. 27)
According to the study, ice losses from Antarctica are causing sea levels to rise faster today than at any time in the past 25 years. Results of the IMBIE (Ice Sheet Mass Balance Inter-comparison Exercise) study were published on 13 June 2018 in the journal Nature. 28)
"This is the most robust study of the ice mass balance of Antarctica to date," said assessment team co-lead Erik Ivins at NASA's Jet Propulsion Laboratory (JPL). "It covers a longer period than our 2012 IMBIE study, has a larger pool of participants, and incorporates refinements in our observing capability and an improved ability to assess uncertainties." Andrew Shepherd from the University of Leeds in the UK and Erik Ivins from NASA/JPL led a group of 84 scientists from 44 international organizations, combining 24 satellite surveys of Antarctica into the most complete assessment of Antarctic ice mass changes to date.
ESA's CryoSat-2 and the Copernicus Sentinel-1 mission were particularly useful for the study. Carrying a radar altimeter, CryoSat-2 is designed to measure changes in the height of the ice, which is used to calculate changes in the volume of the ice. It is also especially designed to measure changes around the margins of ice sheets where ice is calved as icebergs. The two-satellite Sentinel-1 radar mission, which is used to monitor ice motion, can image Earth regardless of the weather and whether it is day or night – which is essential during the dark polar winters.
Figure 15: ESA's Earth Explorer CryoSat mission is dedicated to precise monitoring of changes in the thickness of marine ice floating in the polar oceans and variations in the thickness of the vast ice sheets that blanket Greenland and Antarctica (image credit: ESA/AOES Medialab)
The team looked at the mass balance of the Antarctic ice sheet from 1992 to 2017 and found ice losses from Antarctica raised global sea levels by 7.6 mm, with a sharp uptick in ice loss in recent years. They attribute the threefold increase in ice loss from the continent since 2012 to a combination of increased rates of ice melt in West Antarctica and the Antarctic Peninsula, and reduced growth of the East Antarctic ice sheet.
Prior to 2012, ice was lost at a steady rate of about 76 billion metric tons per year, contributing about 0.2 mm a year to sea level rise. Since 2012, the amount of ice loss per year has tripled to 219 billion metric tons – equivalent to about 0.6 mm of sea level rise.
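Those mass-loss and sea-level figures are mutually consistent under the standard approximation that ~362 Gt of ice corresponds to 1 mm of global mean sea level (a conversion based on the ocean's surface area, assumed here rather than taken from the IMBIE paper):

```python
GT_PER_MM_SLR = 362.0  # approx. gigatonnes of ice per mm of global mean sea level

def mass_loss_to_slr_mm_per_yr(mass_loss_gt_per_yr):
    """Convert an ice mass-loss rate (Gt/yr) to sea level rise (mm/yr)."""
    return mass_loss_gt_per_yr / GT_PER_MM_SLR

print(round(mass_loss_to_slr_mm_per_yr(76), 2))   # 0.21 mm/yr (pre-2012 rate)
print(round(mass_loss_to_slr_mm_per_yr(219), 2))  # 0.6 mm/yr (rate since 2012)
```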
West Antarctica experienced the greatest recent change, with ice loss rising from 53 billion metric tons per year in the 1990s, to 159 billion metric tons a year since 2012. Most of this loss came from the huge Pine Island and Thwaites Glaciers, which are retreating rapidly due to ocean-induced melting.
At the northern tip of the continent, ice-shelf collapse at the Antarctic Peninsula has driven an increase of 25 billion metric tons in ice loss per year since the early 2000s. Meanwhile, the team found the East Antarctic ice sheet has remained relatively balanced during the past 25 years, gaining an average of 5 billion metric tons of ice per year.
Figure 16: Changes in the Antarctic ice sheet's contribution to global sea level, 1992 to 2017 (image credit: IMBIE/Planetary Visions)
Antarctica's potential contribution to global sea level rise from its land-held ice is almost 7.5 times greater than all other sources of land-held ice in the world combined. The continent stores enough frozen water to raise global sea levels by 58 meters, if it were to melt entirely. Knowing how much ice it's losing is key to understanding the impacts of climate change now and its pace in the future.
"The datasets from IMBIE are extremely valuable for the ice sheet modeling community," said study co-author Sophie Nowicki of NASA's Goddard Space Flight Center. "They allow us to test whether our models can reproduce present-day change and give us more confidence in our projections of future ice loss."
The satellite missions providing data for this study are NASA's Ice, Cloud and land Elevation Satellite (ICESat); the joint NASA/German Aerospace Center Gravity Recovery and Climate Experiment (GRACE); ESA's first and second European Remote Sensing satellites (ERS-1 and -2), Envisat and CryoSat-2; the European Union's Sentinel-1 and Sentinel-2 missions; the Japan Aerospace Exploration Agency's Advanced Land Observing Satellite (ALOS); the Canadian Space Agency's RADARSAT-1 and RADARSAT-2 satellites; the Italian Space Agency's COSMO-SkyMed satellites; and the German Aerospace Center's TerraSAR-X satellite.
Tom Wagner, cryosphere program manager at NASA Headquarters, hopes to welcome a new era of Antarctic science with the May 2018 launch of the Gravity Recovery and Climate Experiment Follow-on (GRACE-FO) mission and the upcoming launch of NASA's Ice, Cloud and land Elevation Satellite-2 (ICESat-2). "Data from these missions will help scientists connect the environmental drivers of change with the mechanisms of ice loss to improve our projections of sea level rise in the coming decades," Wagner said.
Invisible barrier on ocean surface reduces carbon uptake by half
Scientists from Exeter, Heriot-Watt and Newcastle universities (UK) published their research in the journal Nature Geoscience, and say the findings have major implications for predicting our future climate. 31)
The world's oceans currently absorb around a quarter of all anthropogenic carbon dioxide emissions, making them the largest long-term sink of carbon on Earth. Atmosphere-ocean gas exchange is controlled by turbulence at the sea surface, the main cause of which is waves generated by wind. Greater turbulence means increased gas exchange and, until now, it was difficult to calculate the effect of biological surfactants on this exchange.
The team, funded by NERC (the Natural Environment Research Council, Swindon, UK), the Leverhulme Trust and ESA (European Space Agency), developed a novel experimental system that directly compares "the surfactant effect" between different sea waters collected along oceanographic cruises, in real time. Using this and satellite observations, the team then found that surfactants can reduce carbon dioxide exchange by up to 50 percent.
Dr Ryan Pereira, a Lyell Research Fellow at Heriot-Watt University in Edinburgh, said: "As surface temperatures rise, so too do surfactants, which is why this is such a critical finding. The warmer the ocean surface gets, the more surfactants we can expect, and an even greater reduction in gas exchange. What we discovered at 13 sites across the Atlantic Ocean is that biological surfactants suppress the rate of gas exchange caused by the wind. We made unique measurements of gas transfer using a purpose-built tank that could measure the relative exchange of gases impacted only by surfactants present at these sites. These natural surfactants aren't necessarily visible like an oil slick, or a foam, and they are even difficult to identify from the satellites monitoring our ocean's surface. We need to be able to measure and identify the organic matter on the surface microlayer of the ocean so that we can reliably estimate rates of gas exchange of climate active gases, such as carbon dioxide and methane."
Professor Rob Upstill-Goddard, professor of marine biogeochemistry at Newcastle University, said: "These latest results build on our previous findings that, contrary to conventional wisdom, large sea surface enrichments of natural surfactants counter the effects of high winds. The suppression of carbon dioxide uptake across the ocean basin due to surfactants, as revealed by our work, implies slower removal of anthropogenic carbon dioxide from the atmosphere and thus has implications for predicting future global climate."
The University of Exeter team, Drs Jamie Shutler (Geography) and Ian Ashton (Renewable Energy) led the satellite component of the work. Ian Ashton said: "Combining this new research with a wealth of satellite data available allows us to consider the effect of surfactants on gas exchange across the entire Atlantic Ocean, helping us to monitor carbon dioxide on a global scale."
The team collected samples across the Atlantic Ocean in 2014, during a NERC study on the Atlantic Meridional Transect (AMT). Each year the AMT cruise undertakes biological, chemical and physical oceanographic research between the UK and the Falkland Islands, South Africa or Chile, a distance of up to 13,500 km, to study the health and function of our oceans.
The research cruise crosses a range of ecosystems from sub-polar to tropical and from coastal and shelf seas and upwelling systems to oligotrophic mid-ocean gyres.
NOAA finds rising emissions of ozone-destroying chemical banned by Montreal Protocol
May 16, 2018: CFCs (Chlorofluorocarbons) were once considered a triumph of modern chemistry. Stable and versatile, these chemicals were used in hundreds of products, from military systems to the ubiquitous can of hairspray. Then in 1987, NOAA scientists were part of an international team that proved this family of wonder chemicals was damaging Earth's protective ozone layer and creating the giant hole in the ozone layer that forms over Antarctica each September. The Montreal Protocol, signed later that year, committed the global community to phasing out their use. Production of the second-most abundant CFC, CFC-11, would end completely by 2010. 32)
A new analysis of long-term atmospheric measurements by NOAA scientists shows emissions of the chemical CFC-11 are rising again, most likely from new, unreported production from an unidentified source in East Asia. The results are published today in the journal Nature. 33)
"We're raising a flag to the global community to say, ‘This is what's going on, and it is taking us away from timely recovery of the ozone layer,'" said NOAA scientist Stephen Montzka, the study's lead author. "Further work is needed to figure out exactly why emissions of CFC-11 are increasing, and if something can be done about it soon."
The findings of Montzka and his team of researchers — from CIRES (the Cooperative Institute for Research in Environmental Sciences, a partnership of the University of Colorado Boulder and NOAA, Boulder, CO, USA), the UK, and the Netherlands — represent the first time that emissions of one of the three most abundant, long-lived CFCs have increased for a sustained period since production controls took effect in the late 1980s.
CFC-11 is the second-most abundant ozone-depleting gas in the atmosphere because of its long life and continuing emissions from a large reservoir of the chemical in foam building insulation and appliances manufactured before the mid-1990s. A smaller amount of CFC-11 also exists today in older refrigerators and freezers.
The Montreal Protocol has been effective in reducing ozone-depleting gases in the atmosphere because all countries in the world agreed to legally binding controls on the production of most human-produced gases known to destroy ozone. As a result, CFC-11 concentrations have declined by 15% from peak levels measured in 1993.
Though concentrations of CFC-11 in the atmosphere are still declining, they're declining more slowly than they would if there were no new sources, Montzka said.
The results from the new analysis of NOAA atmospheric measurements explain why. From 2014 to 2016, emissions of CFC-11 increased by 25 percent above the average measured from 2002 to 2012.
Scientists had been predicting that by the mid- to late century, the abundance of ozone-depleting gases would fall to levels last seen before the Antarctic ozone hole began to appear in the early 1980s.
Montzka said the new analysis can't definitively explain why emissions of CFC-11 are increasing, but in the paper, the team discusses potential reasons why. "In the end, we concluded that it's most likely that someone may be producing the CFC-11 that's escaping to the atmosphere," he said. "We don't know why they might be doing that and if it is being made for some specific purpose, or inadvertently as a side product of some other chemical process."
If the source of these new emissions can be identified and controlled soon, the damage to the ozone layer should be minor, Montzka said. If not remedied soon, however, substantial delays in ozone layer recovery could be expected.
Emerging Trends in Global Freshwater Availability
• May 16, 2018: In a first-of-its-kind study, scientists have combined an array of NASA satellite observations of Earth with data on human activities to map locations where freshwater is changing around the globe and to determine why. 34)
The study, published on 16 May in the journal Nature, finds that Earth's wet land areas are getting wetter and dry areas are getting drier due to a variety of factors, including human water management, climate change and natural cycles. 35)
Figure 17: This map depicts a time series of data collected by NASA's GRACE (Gravity Recovery and Climate Experiment) mission from 2002 to 2016, showing where freshwater storage was higher (blue) or lower (red) than the average for the 14-year study period (image credit: GRACE study team, NASA)
A team led by Matt Rodell of NASA/GSFC (Goddard Space Flight Center) in Greenbelt, Maryland, used 14 years of observations from the U.S./German-led GRACE spacecraft mission to track global trends in freshwater in 34 regions around the world (Figure 18). To understand why these trends emerged, they needed to pull in satellite precipitation data from the Global Precipitation Climatology Project, NASA/USGS (U.S. Geological Survey) Landsat imagery, irrigation maps, and published reports of human activities related to agriculture, mining and reservoir operations. Only through analysis of the combined data sets were the scientists able to get a full understanding of the reasons for Earth's freshwater changes as well as the sizes of those trends.
"This is the first time that we've used observations from multiple satellites in a thorough assessment of how freshwater availability is changing, everywhere on Earth," said Rodell. "A key goal was to distinguish shifts in terrestrial water storage caused by natural variability – wet periods and dry periods associated with El Niño and La Niña, for example – from trends related to climate change or human impacts, like pumping groundwater out of an aquifer faster than it is replenished."
"What we are witnessing is major hydrologic change," said co-author Jay Famiglietti of NASA/JPL in Pasadena, California. "We see a distinctive pattern of the wet land areas of the world getting wetter – those are the high latitudes and the tropics – and the dry areas in between getting dryer. Embedded within the dry areas we see multiple hotspots resulting from groundwater depletion."
Famiglietti noted that while water loss in some regions, like the melting ice sheets and alpine glaciers, is clearly driven by warming climate, it will require more time and data to determine the driving forces behind other patterns of freshwater change. "The pattern of wet-getting-wetter, dry-getting-drier during the rest of the 21st century is predicted by the Intergovernmental Panel on Climate Change models, but we'll need a much longer dataset to be able to definitively say whether climate change is responsible for the emergence of any similar pattern in the GRACE data," he said.
The twin GRACE satellites, launched in 2002 as a joint mission with DLR (German Aerospace Center), precisely measured the distance between the two spacecraft to detect changes in Earth's gravity field caused by movements of mass on the planet below. Using this method, GRACE tracked monthly variations in terrestrial water storage until its science mission ended in October 2017.
Groundwater, soil moisture, surface waters, snow and ice are dynamic components of the terrestrial water cycle. Although they are not static on an annual basis (as early water-budget analyses supposed), in the absence of hydroclimatic shifts or substantial anthropogenic stresses they typically remain range-bound. Recent studies have identified locations where TWS (Terrestrial Water Storage) appears to be trending below previous ranges, notably where ice sheets or glaciers are diminishing in response to climate change and where groundwater is being withdrawn at an unsustainable rate.
Figure 18: Trends in TWS (Terrestrial Water Storage, in cm/year) obtained on the basis of GRACE observations from April 2002 to March 2016. The cause of the trend in each outlined study region is briefly explained and color-coded by category. The trend map was smoothed with a 150-km-radius Gaussian filter for the purpose of visualization; however, all calculations were performed at the native 3º resolution of the data product (image credit: GRACE study team, NASA)
However, the GRACE satellite observations alone couldn't tell Rodell, Famiglietti and their colleagues what was causing the apparent trends. "We examined information on precipitation, agriculture and groundwater pumping to find a possible explanation for the trends estimated from GRACE," said co-author Hiroko Beaudoing of Goddard and the University of Maryland in College Park, MD.
For instance, although pumping groundwater for agricultural uses is a significant contributor to freshwater depletion throughout the world, groundwater levels are also sensitive to cycles of persistent drought or rainy conditions. Famiglietti noted that such a combination was likely the cause of the significant groundwater depletion observed in California's Central Valley from 2007 to 2015, when decreased groundwater replenishment from rain and snowfall combined with increased pumping for agriculture.
Southwestern California lost 4 gigatons (equivalent to 4 x 10⁹ m³, or 4 km³) of freshwater per year during the same period. A gigaton of water would fill 400,000 Olympic swimming pools. A majority of California's freshwater comes in the form of rainfall and snow that collects in the Sierra Nevada snowpack and is then managed as it melts into surface waters through a series of reservoirs. When natural cycles led to less precipitation and caused diminished snowpack and surface waters, people relied on groundwater more heavily.
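These equivalences can be checked with a couple of lines of arithmetic: a tonne of fresh water occupies roughly one cubic meter, so one gigaton of water corresponds to about one cubic kilometer. The 2,500 m³ Olympic-pool volume used here is the conventional assumption, not a figure from the study.

```python
# Sanity check of the water-volume equivalences quoted above.
# Assumptions: 1 tonne of fresh water ~ 1 m^3, so 1 Gt ~ 1e9 m^3;
# an Olympic pool holds ~2,500 m^3 (50 m x 25 m x 2 m).
GIGATON_M3 = 1e9
OLYMPIC_POOL_M3 = 2.5e3

loss_gt_per_year = 4.0                      # southwestern California, 2007-2015
volume_m3 = loss_gt_per_year * GIGATON_M3   # 4e9 m^3 per year
volume_km3 = volume_m3 / 1e9                # 1 km^3 = 1e9 m^3
pools_per_gigaton = GIGATON_M3 / OLYMPIC_POOL_M3

print(volume_km3)                # 4.0 km^3 per year
print(int(pools_per_gigaton))    # 400000 pools per gigaton
```

The result matches the figures in the text: 4 km³ per year, and 400,000 Olympic pools per gigaton.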
Downward trends in freshwater seen in Saudi Arabia also reflect agricultural pressures. From 2002 to 2016, the region lost 6.1 gigatons per year of stored groundwater. Imagery from Landsat satellites shows an explosive growth of irrigated farmland in the arid landscape from 1987 to the present, which may explain the increased drawdown.
The team's analyses also identified large, decade-long trends in terrestrial freshwater storage that do not appear to be directly related to human activities. Natural cycles of high or low rainfall can cause a trend that is unlikely to persist, Rodell said. An example is Africa's western Zambezi basin and Okavango Delta, a vital watering hole for wildlife in northern Botswana. In this region, water storage increased at an average rate of 29 gigatons per year from 2002 to 2016. This wet period during the GRACE mission followed at least two decades of dryness. Rodell believes it is a case of natural variability that occurs over decades in this region of Africa.
The researchers found that a combination of natural and human pressures can lead to complex scenarios in some regions. Xinjiang province in northwestern China, about the size of Kansas, is bordered by Kazakhstan to the west and the Taklamakan desert to the south and encompasses the central portion of the Tien Shan Mountains. During the first decades of this century, previously undocumented water declines occurred in Xinjiang.
Rodell and his colleagues pieced together multiple factors to explain the loss of 5.5 gigatons of terrestrial water storage per year in Xinjiang province. Less rainfall was not the culprit. Additions to surface water were also occurring from climate change-induced glacier melt, and the pumping of groundwater out of coal mines. But these additions were more than offset by depletions caused by an increase in water consumption by irrigated cropland and evaporation of river water from the desert floor.
The successor to GRACE, called GRACE-FO (GRACE Follow-On), a joint mission with the GFZ (German Research Center for Geosciences), currently is at Vandenberg Air Force Base in California undergoing final preparations for launch no earlier than 22 May 2018.
Earth's magnetic field is NOT about to reverse
April 30, 2018: A study of the most recent near-reversals of the Earth's magnetic field by an international team of researchers, including the University of Liverpool, has found it is unlikely that such an event will take place anytime soon. 36)
There has been speculation that the Earth's geomagnetic field may be about to reverse, with substantial implications, because the field has weakened over at least the last two hundred years and because an identified weak region in the field, called the South Atlantic Anomaly, which stretches from Chile to Zimbabwe, has been expanding.
In a paper published in the Proceedings of the National Academy of Sciences (PNAS), an international team of researchers modeled observations of the geomagnetic field during the two most recent geomagnetic excursion events: the Laschamp, approximately 41,000 years ago, and the Mono Lake, around 34,000 years ago, in which the field came close to reversing but recovered its original structure (Figure 19). 37)
The model reveals field structures comparable to the current geomagnetic field at approximately 49,000 and 46,000 years ago, with an intensity structure similar to, but much stronger than, today's South Atlantic Anomaly (SAA); their timing and severity are confirmed by records of cosmogenic nuclides. However, neither of these SAA-like fields developed into an excursion or reversal.
Richard Holme, Professor of Geomagnetism at the University of Liverpool, said: "There has been speculation that we are about to experience a magnetic polar reversal or excursion. However, by studying the two most recent excursion events, we show that neither bears resemblance to current changes in the geomagnetic field, and therefore it is probably unlikely that such an event is about to happen. Our research suggests instead that the current weakened field will recover without such an extreme event, and therefore is unlikely to reverse."
The strength and structure of the Earth's magnetic field have varied at different times throughout geological history. At certain periods, the geomagnetic field weakened to such an extent that it was able to swap the positions of magnetic north and magnetic south, whilst geographic north and geographic south remained the same.
Called a geomagnetic reversal, the last time this happened was 780,000 years ago. However, geomagnetic excursions, where the field comes close to reversing but recovers its original structure, have occurred more recently.
The magnetic field shields the Earth from solar winds and harmful cosmic radiation. It also aids in human navigation, animal migrations and protects telecommunication and satellite systems. It is generated deep within the Earth in a fluid outer core of iron, nickel and other metals that creates electric currents, which in turn produce magnetic fields.
Figure 19: Intensity at Earth's surface (left) and radial field (Br) at the CMB (right). Top: mid-point of the Laschamp excursion; bottom: mid-point of the Mono Lake excursion. The field is truncated at spherical harmonic degree five (image credit: University of Liverpool)
Legend to Figure 19: The geomagnetic field has been decaying at a rate of ~5% per century from at least 1840, with indirect observations suggesting a decay since 1600 or even earlier. This has led to the assertion that the geomagnetic field may be undergoing a reversal or an excursion. The study team has derived a model of the geomagnetic field spanning 30–50 ka (where ka stands for kilo anni; hence, 40 ka are 40,000 years), constructed to study the behavior of the two most recent excursions: the Laschamp and Mono Lake, centered at 41 and 34 ka, respectively.
The research also involved the University of Iceland and GFZ German Research Centre for Geosciences.
West Greenland Ice Sheet melting at the fastest rate in centuries
April 3, 2018: The West Greenland Ice Sheet melted at a dramatically higher rate over the last twenty years than at any other time in the modern record, according to a study led by Dartmouth College (Hanover, NH, USA). The research, appearing in the journal Geophysical Research Letters, shows that melting in west Greenland since the early 1990s is at the highest levels in at least 450 years. 38) 39)
While natural patterns of certain atmospheric and ocean conditions are already known to influence Greenland melt, the study highlights the importance of a long-term warming trend to account for the unprecedented west Greenland melt rates in recent years. The researchers suggest that climate change most likely associated with human greenhouse gas emissions is the probable cause of the additional warming.
"We see that west Greenland melt really started accelerating about twenty years ago," said Erich Osterberg, assistant professor of earth sciences at Dartmouth and the lead scientist on the project. "Our study shows that the rapid rise in west Greenland melt is a combination of specific weather patterns and an additional long-term warming trend over the last century."
According to research cited in the study, loss of ice from Greenland is one of the largest contributors to global sea level rise. Although glaciers calving into the ocean cause much of the ice loss in Greenland, other research cited in the study shows that the majority of ice loss in recent years is from increased surface melt and runoff.
Figure 20: Record of melt from two west Greenland ice cores showing that modern melt rates (red) are higher than at any time in the record since at least 1550 CE (black). The record is plotted as the percent of each year's layer represented by refrozen melt water (image credit: Erich Osterberg)
While satellite measurements and climate models have detailed this recent ice loss, there are far fewer direct measurements of melt collected from the ice sheet itself. For this study, researchers from Dartmouth and Boise State University spent two months on snowmobiles to collect seven ice cores from the remote "percolation zone" of the West Greenland Ice Sheet.
When warm temperatures melt snow on the surface of the percolation zone, the melt water trickles down into the deeper snow and refreezes into ice layers. Researchers were easily able to distinguish these ice layers from the surrounding compacted snow in the cores, preserving a history of how much melt occurred back through time. The more melt, the thicker the ice layers.
"Most ice cores are collected from the middle of the ice sheet where it rarely ever melts, or on the ice sheet edge where the meltwater flows into the ocean. We focused on the percolation zone because that's where we find the best record of Greenland melt going back through time in the form of the refrozen ice layers," said Karina Graeter, the lead author of the study as a graduate student in Dartmouth's Department of Earth Sciences.
The cores, some as long as 30 m, were transported to Dartmouth where the research team used a light table to measure the thickness and frequency of the ice layers. The cores were also sampled for chemical measurements in Dartmouth's Ice Core Laboratory to determine the age of each ice layer.
The cores reveal that the ice layers became thicker and more frequent beginning in the 1990s, with recent melt levels that are unmatched since at least the year 1550 CE (Common Era).
"The ice core record ends about 450 years ago, so the modern melt rates in these cores are the highest of the whole record that we can see," said Osterberg. "The advantage of the ice cores is that they show us just how unusual it is for Greenland to be melting this fast."
Year-to-year changes in Greenland melt since 1979 were already known to be closely tied to North Atlantic ocean temperatures and high-pressure systems that sit above Greenland during the summer — known as summer blocking highs. The new study extends the record back in time to show that these were important controls on west Greenland melt going back to at least 1870.
The study also shows that an additional summertime warming factor of 1.2 °C is needed to explain the unusually strong melting observed since the 1990s. The additional warming caused a near-doubling of melt rates in the twenty-year period from 1995 to 2015 compared to previous times when the same blocking and ocean conditions were present.
"It is striking to see how a seemingly small warming of only 1.2 °C can have such a large impact on melt rates in west Greenland," said Graeter.
The study concludes that North Atlantic ocean temperatures and summer blocking activity will continue to control year-to-year changes in Greenland melt into the future. Some climate models suggest that summer blocking activity and ocean temperatures around Greenland might decline in the next several decades, but it remains uncertain. However, the study points out that continued warming from human activities would overwhelm those weather patterns over time to further increase melting.
"Cooler North Atlantic ocean temperatures and less summer blocking activity might slow down Greenland melt for a few years or even a couple decades, but it would not help us in the long run," said Osterberg. "Beyond a few decades, Greenland melting will almost certainly increase and raise sea level as long as we continue to emit greenhouse gases."
Landslide Threats in Near Real-Time During Heavy Rains
February/March 2018: For the first time, scientists can look at landslide threats anywhere around the world in near real-time, thanks to satellite data and a new model developed by NASA. The model, developed at NASA/GSFC (Goddard Space Flight Center) in Greenbelt, Maryland, estimates potential landslide activity triggered by rainfall. Rainfall is the most widespread trigger of landslides around the world. If conditions beneath Earth's surface are already unstable, heavy rains act as the last straw that causes mud, rocks or debris — or all combined — to move rapidly down mountains and hillsides. 40)
The model is designed to increase our understanding of where and when landslide hazards are present and improve estimates of long-term patterns. A global analysis of landslides over the past 15 years using the new open source Landslide Hazard Assessment for Situational Awareness model was published in a study released online on March 22 in the journal Earth's Future. 41)
Determining where, when, and how landslide hazards may vary and affect people at the global scale is fundamental to formulating mitigation strategies, appropriate and timely responses, and robust recovery plans. While monitoring systems exist for other hazards, no such system exists for landslides. A near-global LHASA (Landslide Hazard Assessment model for Situational Awareness) has been developed to provide an indication of potential landslide activity at the global scale every 30 minutes. The model combines surface susceptibility and satellite rainfall data to issue "nowcasts" of moderate or high potential landslide hazard. This research describes the global LHASA currently running in near real-time and discusses the performance and potential applications of the system. LHASA is intended to provide situational awareness of landslide hazards in near real-time. It can also leverage nearly two decades of satellite precipitation data to better understand long-term trends in potential landslide activity.
"Landslides can cause widespread destruction and fatalities, but we really don't have a complete sense of where and when landslides may be happening to inform disaster response and mitigation," said Dalia Kirschbaum, a landslide expert at Goddard and co-author of the study. "This model helps pinpoint the time, location and severity of potential landslide hazards in near real-time all over the globe. Nothing has been done like this before."
The model estimates potential landslide activity by first identifying areas with heavy, persistent and recent precipitation. Rainfall estimates are provided by a multi-satellite product developed using data from the NASA/JAXA (Japan Aerospace Exploration Agency) GPM (Global Precipitation Measurement) mission, which provides precipitation estimates around the world every 30 minutes. The model flags areas where GPM data exceed a critical rainfall threshold, looking back over the last seven days.
In places where precipitation is unusually high, the model then uses a susceptibility map to determine if the area is prone to landslides. This global susceptibility map is developed using five features that play an important role in landslide activity: if roads have been built nearby, if trees have been removed or burned, if a major tectonic fault is nearby, if the local bedrock is weak and if the hillsides are steep.
If the susceptibility map shows the area with heavy rainfall is vulnerable, the model produces a "nowcast" identifying the area as having a high or moderate likelihood of landslide activity. The model produces new nowcasts every 30 minutes.
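The two-stage decision described above (a seven-day rainfall trigger, then a susceptibility check) can be sketched roughly as follows. The threshold values, category names and data structure are illustrative assumptions, not the published LHASA parameters.

```python
# Hedged sketch of an LHASA-style nowcast decision for one grid cell.
# All numeric thresholds here are assumptions for illustration only.
from dataclasses import dataclass

@dataclass
class Cell:
    rainfall_7day_mm: float   # accumulated satellite precipitation, last 7 days
    susceptibility: float     # 0 (stable) .. 1 (highly landslide-prone)

def nowcast(cell: Cell,
            rain_threshold_mm: float = 100.0,
            high_susc: float = 0.7,
            moderate_susc: float = 0.4) -> str:
    """Return 'high', 'moderate' or 'none' for one 30-minute update."""
    if cell.rainfall_7day_mm < rain_threshold_mm:
        return "none"                      # rainfall trigger not met
    if cell.susceptibility >= high_susc:
        return "high"                      # wet AND highly prone
    if cell.susceptibility >= moderate_susc:
        return "moderate"                  # wet AND somewhat prone
    return "none"                          # wet but not landslide-prone

print(nowcast(Cell(150.0, 0.8)))   # high
print(nowcast(Cell(150.0, 0.5)))   # moderate
print(nowcast(Cell(20.0, 0.9)))    # none: dry, however prone the slope
```

The key point the sketch captures is that heavy rainfall alone never produces a nowcast; it must coincide with a susceptible cell in the static map.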
Figure 21: This animation shows the potential landslide activity by month averaged over the last 15 years as evaluated by NASA's Landslide Hazard Assessment model for Situational Awareness model. Here, you can see landslide trends across the world (image credit: NASA/GSFC / Scientific Visualization Studio)
The study shows long-term trends when the model's output was compared to landslide databases dating back to 2007. The team's analysis showed a global "landslide season" with a peak in the number of landslides in July and August, most likely associated with the Asian monsoon and tropical cyclone seasons in the Atlantic and Pacific oceans.
"The model has been able to help us understand immediate potential landslide hazards in a matter of minutes," said Thomas Stanley, landslide expert with the Universities Space Research Association at Goddard and co-author of the study. "It also can be used to retroactively look at how potential landslide activity varies on the global scale seasonally, annually or even on decadal scales in a way that hasn't been possible before."
Study of Antarctic ice loss
February 20, 2018: A NASA study based on an innovative technique for crunching torrents of satellite data provides the clearest picture yet of changes in Antarctic ice flow into the ocean. The findings confirm accelerating ice losses from the West Antarctic Ice Sheet and reveal surprisingly steady rates of flow from its much larger neighbor to the east. 42)
The computer-vision technique crunched data from hundreds of thousands of NASA/USGS (U.S. Geological Survey) Landsat satellite images to produce a high-precision picture of changes in ice-sheet motion.
The new work provides a baseline for future measurement of Antarctic ice changes and can be used to validate numerical ice sheet models that are necessary to make projections of sea level. It also opens the door to faster processing of massive amounts of data.
"We're entering a new age," said the study's lead author, cryospheric researcher Alex Gardner of NASA's Jet Propulsion Laboratory in Pasadena, California. "When I began working on this project three years ago, there was a single map of ice sheet flow that was made using data collected over 10 years, and it was revolutionary when it was published back in 2011. Now we can map ice flow over nearly the entire continent, every year. With these new data, we can begin to unravel the mechanisms by which the ice flow is speeding up or slowing down in response to changing environmental conditions."
The innovative approach by Gardner and his international team of scientists largely confirms earlier findings, though with a few unexpected twists. - Among the most significant: a previously unmeasured acceleration of glacier flow into Antarctica's Getz Ice Shelf, on the southwestern part of the continent — likely a result of ice-shelf thinning.
Speeding up in the west, steady flow in the east: The research, published in the journal "The Cryosphere," also identified the fastest speed-up of Antarctic glaciers during the seven-year study period. The glaciers feeding Marguerite Bay, on the western Antarctic Peninsula, increased their rate of flow by 400 to 800 m/year, probably in response to ocean warming. 43)
Perhaps the research team's biggest discovery, however, was the steady flow of the East Antarctic Ice Sheet. During the study period, from 2008 to 2015, the sheet had essentially no change in its rate of ice discharge — ice flow into the ocean. While previous research inferred a high level of stability for the ice sheet based on measurements of volume and gravitational change, the lack of any significant change in ice discharge had never been measured directly.
Figure 22: The speed of Antarctic ice flow, derived from Landsat imagery over a seven-year period (image credit: NASA)
The study also confirmed that the flow of West Antarctica's Thwaites and Pine Island glaciers into the ocean continues to accelerate, though the rate of acceleration is slowing.
In all, the study found an overall ice discharge for the Antarctic continent of 1,929 gigatons per year in 2015, with an uncertainty of plus or minus 40 gigatons. That represents an increase of 36 gigatons per year, plus or minus 15, since 2008. A gigaton is one billion tons (10⁹ tons).
The study found that ice flow from West Antarctica — the Amundsen Sea sector, the Getz Ice Shelf and Marguerite Bay on the western Antarctic Peninsula — accounted for 89 percent of the increase.
Computer vision: The science team developed software that processed hundreds of thousands of pairs of images of Antarctic glacier movement from Landsat-7 and Landsat-8, captured from 2013 to 2015. These were compared to earlier radar satellite measurements of ice flow to reveal changes since 2008.
"We're applying computer vision techniques that allow us to rapidly search for matching features between two images, revealing complex patterns of surface motion," Gardner said.
Instead of researchers comparing small sets of very high-quality images from a limited region to look for subtle changes, the novelty of the new software is that it can track features across hundreds of thousands of images per year — even those of varying quality or obscured by clouds — over an entire continent. "We can now automatically generate maps of ice flow annually — a whole year — to see what the whole continent is doing," Gardner said.
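The feature-matching step Gardner describes can be illustrated with a minimal normalized cross-correlation search: take a small reference patch from one image and find where it best matches inside a search window from a later image; the offset of the best match is the displacement. This is a generic sketch, not the team's software; the brute-force loop and the synthetic scene are illustrative (operational systems use fast frequency-domain correlation over real image pairs).

```python
# Minimal feature tracking by normalized cross-correlation (illustrative).
import numpy as np

def track_offset(ref, search):
    """Return the (row, col) position of `ref` inside the larger
    `search` window that maximizes normalized cross-correlation."""
    rh, rw = ref.shape
    sh, sw = search.shape
    refz = (ref - ref.mean()) / ref.std()      # zero-mean, unit-variance
    best, best_rc = -2.0, (0, 0)
    for r in range(sh - rh + 1):
        for c in range(sw - rw + 1):
            patch = search[r:r + rh, c:c + rw]
            pz = (patch - patch.mean()) / (patch.std() + 1e-12)
            ncc = (refz * pz).mean()           # 1.0 for a perfect match
            if ncc > best:
                best, best_rc = ncc, (r, c)
    return best_rc

rng = np.random.default_rng(0)
scene = rng.random((40, 40))                   # stand-in for image texture
ref = scene[10:20, 12:22]                      # a surface feature in image 1
# Searching a second image would reveal how far the feature moved; here we
# search the same scene, so the known location is recovered.
print(track_offset(ref, scene))                # (10, 12)
```

Repeating this over many patches and dividing each displacement by the time between image acquisitions yields a velocity field, which is the quantity mapped in Figure 22.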
The new Antarctic baseline should help ice sheet modelers better estimate the continent's contribution to future sea level rise. "We'll be able to use this information to target field campaigns, and understand the processes causing these changes," Gardner said. "Over the next decade, all this is going to lead to rapid improvement in our knowledge of how ice sheets respond to changes in ocean and atmospheric conditions, knowledge that will ultimately help to inform projections of sea level change."
Seismic footprint study to track Hurricanes and Typhoons
February 15, 2018: Climatologists are often asked, "Is climate change making hurricanes stronger?" but they can't give a definitive answer because the global hurricane record only goes back to the dawn of the satellite era. But now, an intersection of disciplines—seismology, atmospheric sciences, and oceanography—offers an untapped data source: the continuous seismic record, which dates back to the early 20th century.
An international team of researchers has found a new way to identify the movement and intensity of hurricanes, typhoons and other tropical cyclones by tracking the way they shake the seafloor, as recorded on seismometers on islands and near the coast. After looking at 13 years of data from the northwest Pacific Ocean, they have found statistically significant correlations between seismic data and storms. Their work was published Feb. 15 in the journal Earth and Planetary Science Letters. 44) 45)
The group of experts was assembled by Princeton University's Lucia Gualtieri, a postdoctoral research associate in geosciences, and Salvatore Pascale, an associate research scholar in atmospheric and oceanic sciences.
Most people associate seismology with earthquakes, said Gualtieri, but the vast majority of the seismic record shows low-intensity movements from a different source: the oceans. "A seismogram is basically the movement of the ground. It records earthquakes, because an earthquake makes the ground shake. But it also records all the tiny other movements," from passing trains to hurricanes. "Typhoons show up very well in the record," she said.
Because there is no way to know when an earthquake will hit, seismometers run constantly, always poised to record an earthquake's dramatic arrival. In between these earth-shaking events, they track the background rumbling of the planet. Until about 20 years ago, geophysicists dismissed this low-intensity rumbling as noise, Gualtieri said.
"What is noise? Noise is a signal we don't understand," said Pascale, who is also an associate research scientist at the National and Oceanic and Atmospheric Administration's Geophysical Fluid Dynamics Laboratory.
Just as astronomers have discovered that the static between radio stations gives us information about the cosmic background, seismologists have discovered that the low-level "noise" recorded by seismograms is the signature of wind-driven ocean storms, the cumulative effect of waves crashing on beaches all over the planet or colliding with each other in the open sea.
One ocean wave acting alone is not strong enough to generate a seismic signature at the frequencies she was examining, explained Gualtieri, because typical ocean waves only affect the upper few feet of the sea. "The particle motion decays exponentially with depth, so at the seafloor you don't see anything," she said. "The main mechanism to generate a seismic signal from a typhoon is to have two ocean waves interacting with each other." When two waves collide, they generate vertical pressure that can reach the seafloor and jiggle a nearby seismometer.
When a storm is large enough—and storms classified as hurricanes or typhoons are—it will leave a seismic record lasting several days. Previous researchers have successfully traced individual large storms on a seismogram, but Gualtieri came at the question from the opposite side: can a seismogram find any large storm in the area?
Figure 23: Lucia Gualtieri, a postdoctoral researcher in geosciences at Princeton University, superimposed an image of the seismogram recording a tropical cyclone above a satellite image showing the storm moving across the northwest Pacific Ocean. Gualtieri and her colleagues have found a way to track the movement and intensity of typhoons and hurricanes by looking at seismic data, which has the potential to extend the global hurricane record by decades and allow a more definitive answer to the question, "Are hurricanes getting stronger?" (image credit: Photo illustration by Lucia Gualtieri, satellite image courtesy of NASA/NOAA)
Gualtieri and her colleagues found a statistically significant agreement between the occurrence of tropical cyclones and large-amplitude, long-lasting seismic signals with short periods, between three and seven seconds, called "secondary microseisms." They were also able to calculate the typhoons' strength from these "secondary microseisms," or tiny fluctuations, which they successfully correlated to the observed intensity of the storms.
In short, the seismic record had enough data to identify when typhoons happened and how strong they were (Figure 23).
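Isolating the 3-7 second "secondary microseism" band is the first step such an analysis implies, and it can be sketched with a simple spectral mask: keep only the frequency components between 1/7 and 1/3 Hz. The sampling rate and the two-component synthetic trace below are illustrative assumptions, not data from the study.

```python
# Extract the 3-7 s period band from a (synthetic) seismic trace.
import numpy as np

fs = 20.0                      # samples per second (assumed)
n = 12000                      # 10 minutes of ground motion
t = np.arange(n) / fs

# Synthetic trace: a 5 s-period secondary microseism (amplitude 0.5)
# buried under a larger 30 s-period signal outside the band of interest.
trace = (0.5 * np.sin(2 * np.pi * t / 5.0)
         + 2.0 * np.sin(2 * np.pi * t / 30.0))

# Keep only spectral components with periods between 3 and 7 seconds,
# i.e. frequencies between 1/7 and 1/3 Hz.
freqs = np.fft.rfftfreq(n, d=1 / fs)
spectrum = np.fft.rfft(trace)
band = (freqs >= 1 / 7.0) & (freqs <= 1 / 3.0)
microseism = np.fft.irfft(spectrum * band, n=n)

# Only the 5 s component survives; its RMS amplitude is 0.5 / sqrt(2).
print(round(float(microseism.std()), 3))   # 0.354
```

The amplitude of this band-limited signal over time is the kind of quantity the team correlated with observed storm intensity.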
So far, the researchers have focused on the ocean off the coast of Asia because of its powerful typhoons and good network of seismic stations. Their next steps include refining their method and examining other storm basins, starting with the Caribbean and the East Pacific.
And then they will tackle the historic seismic record: "When we have a very defined method and have applied this method to all these other regions, we want to start to go back in time," said Gualtieri.
While global storm information goes back only to the early days of the satellite era, in the late 1960s and early 1970s, the first modern seismograms were created in the 1880s. Unfortunately, the oldest records exist only on paper, and few historical records have been digitized.
"If all this data can be made available, we could have records going back more than a century, and then we could try to see any trend or change in intensity of tropical cyclones over a century or more," said Pascale. "It's very difficult to establish trends in the intensity of tropical cyclones—to see the impact of global warming. Models and theories suggest that they should become more intense, but it's important to find observational evidence."
"This new technique, if it can be shown to be valid across all tropical-cyclone prone basins, effectively lengthens the satellite era," said Morgan O'Neill, a T. C. Chamberlin Postdoctoral Fellow in geosciences at the University of Chicago who was not involved in this research. "It extends the period of time over which we have global coverage of tropical cyclone occurrence and intensity," she said.
The researchers' ability to correlate seismic data with storm intensity is vital, said Allison Wing, an assistant professor of earth, ocean and atmospheric science at Florida State University, who was not involved in this research. "When it comes to understanding tropical cyclones—what controls their variability and their response to climate and climate change—having more data is better, in particular data that can tell us about intensity, which their method seems to do. ... It helps us constrain the range of variability that hurricane intensity can have."
This connection between storms and seismicity began when Gualtieri decided to play with hurricane data in her free time, she said. But when she superimposed the hurricane data over the seismic data, she knew she was on to something. "I said, 'Wow, there's something more than just play. Let's contact someone who can help.'"
Her research team ultimately grew to include a second seismologist, two atmospheric scientists and a statistician. "The most challenging part was establishing communications with scientists coming from different backgrounds," said Pascale. "Often, in different fields in science, we speak different dialects, different scientific dialects." Once they developed a "shared dialect," he said, they began to make exciting discoveries. "This is how science evolves," said Pascale. "Historically, it's always been like that. Disciplines first evolve within their own kingdom, then a new field is born."
New Study Finds Sea Level Rise Accelerating
February 13, 2018: The rate of global sea level rise has been accelerating in recent decades, rather than increasing steadily, according to a new study based on 25 years of NASA and European satellite data. 46) 47) 48)
This acceleration, driven mainly by increased melting in Greenland and Antarctica, has the potential to double the total sea level rise projected for 2100 when compared to projections that assume a constant rate of sea level rise, according to lead author Steve Nerem. Nerem is a professor of Aerospace Engineering Sciences at the University of Colorado Boulder, a fellow at Colorado's CIRES (Cooperative Institute for Research in Environmental Sciences), and a member of NASA's Sea Level Change team.
If the rate of ocean rise continues to change at this pace, sea level will rise 65 cm by 2100 — enough to cause significant problems for coastal cities, according to the new assessment by Nerem and colleagues from NASA/GSFC (Goddard Space Flight Center) in Greenbelt, Maryland; CU Boulder; the University of South Florida in Tampa; and Old Dominion University in Norfolk, Virginia. The team, driven to understand and better predict Earth's response to a warming world, published their work Feb. 12 in the journal PNAS (Proceedings of the National Academy of Sciences). 49)
"This is almost certainly a conservative estimate," Nerem said. "Our extrapolation assumes that the sea level continues to change in the future as it has over the last 25 years. Given the large changes we are seeing in the ice sheets today, that's not likely."
Figure 24: NASA Scientific Visualization Studio image by Kel Elkins, using data from Jason-1, Jason-2, and TOPEX/Poseidon. Story by Katie Weeman, CIRES, and Patrick Lynch, NASA GSFC. Edited by Mike Carlowicz.
Rising concentrations of greenhouse gases in Earth's atmosphere increase the temperature of air and water, which causes sea level to rise in two ways. First, warmer water expands, and this "thermal expansion" of the ocean has contributed about half of the 7 cm of global mean sea level rise we've seen over the last 25 years, Nerem said. Second, melting land ice flows into the ocean, also increasing sea level across the globe.
These increases were measured using satellite altimeter measurements since 1992, including the Topex/Poseidon, Jason-1, Jason-2 and Jason-3 satellite missions, which have been jointly managed by multiple agencies, including NASA, CNES (Centre National d'Etudes Spatiales), EUMETSAT (European Organisation for the Exploitation of Meteorological Satellites), and NOAA (National Oceanic and Atmospheric Administration). NASA's Jet Propulsion Laboratory in Pasadena, California, manages the U.S. portion of these missions for NASA's Science Mission Directorate. The rate of sea level rise in the satellite era has risen from about 2.5 mm/year in the 1990s to about 3.4 mm/year today.
"The Topex/Poseidon/Jason altimetry missions have been essentially providing the equivalent of a global network of nearly half a million accurate tide gauges, providing sea surface height information every 10 days for over 25 years," said Brian Beckley, of NASA Goddard, second author on the new paper and lead of a team that processes altimetry observations into a global sea level data record. "As this climate data record approaches three decades, the fingerprints of Greenland and Antarctic land-based ice loss are now being revealed in the global and regional mean sea level estimates."
Table 2: Significance of global sea level rise
Ozone layer not recovering in lower latitudes, despite ozone hole healing at the poles
February 8, 2018: The ozone layer - which protects us from harmful ultraviolet radiation - is recovering at the poles, but unexpected decreases in part of the atmosphere may be preventing recovery at lower latitudes. Global ozone has been declining since the 1970s owing to certain man-made chemicals. Since these were banned, parts of the layer have been recovering, particularly at the poles. 50)
However, the new result, published in the EGU (European Geosciences Union) journal Atmospheric Chemistry and Physics, finds that the bottom part of the ozone layer at more populated latitudes is not recovering. The cause is currently unknown. 51)
Ozone is a substance that forms in the stratosphere - the region of the atmosphere between about 10 and 50 km altitude, above the troposphere that we live in. It is produced in tropical latitudes and distributed around the globe. A large portion of the resulting ozone layer resides in the lower part of the stratosphere. The ozone layer absorbs much of the UV radiation from the Sun, which, if it reaches the Earth's surface, can cause damage to DNA in plants, animals and humans.
In the 1970s, it was recognized that chemicals called CFCs (Chlorofluorocarbons), used for example in refrigeration and aerosols, were destroying ozone in the stratosphere. The effect was worst in the Antarctic, where an ozone 'hole' formed.
In 1987, the Montreal Protocol, an international treaty, was agreed, leading to the phase-out of CFCs and, recently, the first signs of recovery of the Antarctic ozone layer. The upper stratosphere at lower latitudes is also showing clear signs of recovery, proving the Montreal Protocol is working well.
However, despite this success, scientists have evidence that stratospheric ozone is likely not recovering at lower latitudes, between 60º N and 60º S, due to unexpected decreases in ozone in the lower part of the stratosphere.
Study co-author Professor Joanna Haigh, Co-Director of the Grantham Institute for Climate Change and the Environment at Imperial College London, said: "Ozone has been seriously declining globally since the 1980s, but while the banning of CFCs is leading to a recovery at the poles, the same does not appear to be true for the lower latitudes. The potential for harm in lower latitudes may actually be worse than at the poles. The decreases in ozone are less than we saw at the poles before the Montreal Protocol was enacted, but UV radiation is more intense in these regions and more people live there."
The cause of this decline is not certain, although the authors suggest a couple of possibilities. One is that climate change is altering the pattern of atmospheric circulation, causing more ozone to be carried away from the tropics.
The other possibility is that very short-lived substances (VSLSs), which contain chlorine and bromine, could be destroying ozone in the lower stratosphere. VSLSs include chemicals used as solvents, paint strippers, and as degreasing agents. One is even used in the production of an ozone-friendly replacement for CFCs.
Dr William Ball from ETH Zürich [Eidgenoessische Technische Hochschule, Zürich (Swiss Federal Institute of Technology, Zürich)] and PMOD/WRC [Physikalisch-Meteorologisches Observatorium Davos, World Radiation Center (Switzerland)], who led the analysis, said: "The finding of declining low-latitude ozone is surprising, since our current best atmospheric circulation models do not predict this effect. Very short-lived substances could be the missing factor in these models."
It was thought that very short-lived substances would not persist long enough in the atmosphere to reach the height of the stratosphere and affect ozone, but more research may be needed.
To conduct the analysis, the team developed new algorithms to combine the efforts of multiple international teams that have worked to connect data from different satellite missions since 1985 and create a robust, long time series.
William Ball said: "The study is an example of the concerted international effort to monitor and understand what is happening with the ozone layer; many people and organizations prepared the underlying data, without which the analysis would not have been possible."
Although individual datasets had previously hinted at a decline, the application of advanced merging techniques and time series analysis has revealed a longer term trend of ozone decrease in the stratosphere at lower altitudes and latitudes.
The researchers say the focus now should be on getting more precise data on the ozone decline, and determining what the cause most likely is, for example by looking for the presence of VSLSs in the stratosphere.
Dr Justin Alsing from the Flatiron Institute in New York, who took on a major role in developing and implementing the statistical technique used to combine the data, said: "This research was only possible because of a great deal of cross-disciplinary collaboration. My field is normally cosmology, but the technique we developed can be used in any science looking at complex datasets."
Table 3: Summary of the published paper (Ref. 51)
Heat loss from Earth's interior triggers Greenland's ice sheet slide towards the sea
January 30, 2018: In North-East Greenland, researchers have measured the loss of heat that comes up from the interior of the Earth. This enormous area is a geothermal 'hot spot' that melts the ice sheet from below and triggers the sliding of glaciers towards the sea. The melting takes place with increased strength and at a speed that no models have previously predicted. 52)
As reported in the journal Scientific Reports, researchers from the Arctic Research Center, Aarhus University (Aarhus, Denmark), and the Greenland Institute of Natural Resources (Nuuk, Greenland) present results that, for the first time, show that the deep bottom water of the north-eastern Greenland fjords is being warmed up by heat gradually lost from the Earth's interior. And the researchers point out that this heat loss triggers the sliding of glaciers from the ice sheet towards the sea. 53)
Icelandic conditions: "North-East Greenland has several hot springs where the water reaches temperatures of up to 60ºC and, like Iceland, the area has abundant underground geothermal activity," explains Professor Søren Rysgaard, who headed the investigations.
For more than ten years (2005-2015), the researchers have measured the temperature and salinity in the fjord Young Sound, located at Daneborg, north of Scoresbysund, which has many hot springs, and south of the glacier Nioghalvfjerdsfjorden, which melts rapidly and is connected to the North-East Greenland Ice Stream (NEGIS).
By focusing on an isolated basin in the fjord with a depth range between 200 and 340 m, the researchers have measured how the deep water is heated over a ten-year period. Based on the extensive data, researchers have estimated that the loss of heat from the Earth's interior to the fjord is about 100 mW/m². This corresponds to a 2 MW wind turbine sending electricity to a large heater at the bottom of the fjord all year round.
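The wind-turbine comparison follows directly from the measured flux: dividing 2 MW by 100 mW/m² implies a heated seabed area of about 20 km². This is a back-of-the-envelope check on the analogy, not a figure taken from the paper:

```python
flux_w_per_m2 = 0.100      # geothermal heat flux: 100 mW/m2 expressed in W/m2
turbine_power_w = 2e6      # a 2 MW wind turbine, in watts

# Seabed area over which the geothermal flux delivers as much power as the turbine
area_km2 = turbine_power_w / flux_w_per_m2 / 1e6   # m2 -> km2
print(f"Equivalent heated seabed area: {area_km2:.0f} km2")
```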
Heat from the Earth's interior — an important influence: It is not easy to measure the geothermal heat flux — heat emanating from the Earth's interior — below a glacier, but within the area there are several large glaciers connected directly to the ice sheet. If the Earth releases heat to a fjord, heat also seeps up to the bottom part of the glaciers. This means that the glaciers melt from below and thus slide more easily over the terrain on which they sit when moving to the sea.
"It is a combination of higher temperatures in the air and the sea, precipitation from above, local dynamics of the ice sheet and heat loss from the Earth's interior that determines the mass loss from the Greenland ice sheet," explains Søren Rysgaard.
The researchers expect that the new discoveries will improve the models of ice sheet dynamics, allowing better predictions of the stability of the Greenland ice sheet, its melting and the resulting global sea level rise.
Figure 25: Geothermal vent localities and ice surface speeds (2008–2009) for Greenland: geothermal vent localities on land with water temperatures >10ºC, boreholes, hydrothermal vent complexes offshore, and the present study site. Reconstructed geothermal anomalies are shown as contours in the inset box. Ice drilling localities are indicated by CC, NGRIP, GRIP and Dye (image credit: Research Team of Aarhus University)
Dust on Snow Controls Springtime River Rise
January 23, 2018: A new study has found that dust, not spring warmth, controls the pace of spring snowmelt that feeds the headwaters of the Colorado River. Contrary to conventional wisdom, the amount of dust on the mountain snowpack controls how fast the Colorado Basin's rivers rise in the spring regardless of air temperature, with more dust correlated with faster spring runoff and higher peak flows. 54)
The finding is valuable for western water managers and advances our understanding of how freshwater resources, in the form of snow and ice, will respond to warming temperatures in the future. By improving knowledge of what controls the melting of snow, it improves understanding of the controls on how much solar heat Earth reflects back into space and how much it absorbs — an important factor in studies of weather and climate.
When snow gets covered by a layer of windblown dust or soot, the dark topcoat increases the amount of heat the snow absorbs from sunlight. Tom Painter of NASA's Jet Propulsion Laboratory in Pasadena, California, has been researching the consequences of dust on snowmelt worldwide. This is the first study to focus on which has a stronger influence on spring runoff: warmer air temperatures or a coating of dust on the snow.
Windblown dust has increased in the U.S. Southwest as a result of changing climate patterns and human land-use decisions. With rainfall decreasing and more disturbances of the land, protective crusts on soil are removed and more bare soil is exposed. Winter and spring winds pick up the dusty soil and drop it on the Colorado Rockies to the northeast. Historical lake sediment analyses show there is currently an annual average of five to seven times more dust falling on the Rocky Mountain snowpack than there was before the mid-1800s.
Painter and colleagues looked at data on air temperature and dust in a mountain basin in southwestern Colorado from 2005 to 2014, and streamflow from three major tributary rivers that carry snowmelt from these mountains to the Colorado River. The Colorado River's basin spans about 246,000 square miles (637,000 km²) in parts of seven western states.
The researchers found that the effects of dust dominated the pace of the spring runoff even in years with unusually warm spring air temperatures. Conversely, there was almost no statistical correlation between air temperature and the pace of runoff.
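A finding like this typically rests on a simple correlation test between yearly dust loading (or air temperature) and a runoff-pace metric such as days to peak streamflow. The sketch below uses invented yearly values purely for illustration; the study's actual measurements, units, and variable names are not reproduced here:

```python
import numpy as np
from scipy.stats import pearsonr

# Invented illustrative values for ten hypothetical years:
# heavier dust loading paired with a faster (shorter) rise to peak streamflow
dust_loading = np.array([5.1, 9.8, 3.2, 8.7, 12.4, 10.9, 4.4, 6.0, 13.1, 7.5])
days_to_peak = np.array([48, 31, 55, 34, 25, 28, 51, 44, 23, 38])

r_dust, p_dust = pearsonr(dust_loading, days_to_peak)
print(f"dust vs. runoff pace: r = {r_dust:.2f}, p = {p_dust:.4f}")
```

A strongly negative correlation (more dust, fewer days to peak) with a small p-value is the pattern the study reports for dust; the same test applied to spring air temperature showed almost no correlation.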
"We found that when it's clean, the rise to the peak streamflow is slower, and generally you get a smaller peak." Painter said. "When the snowpack is really dusty, water just blasts out of the mountains." The finding runs contrary to the widely held assumption that spring air temperature determines the likelihood of flooding.
Coauthor McKenzie Skiles, an assistant professor in the University of Utah Department of Geography, said that while the impacts of dust in the air, such as reduced air quality, are well known, the impacts of the dust once it's been deposited on the land surface are not as well understood. "Given the reliance of the western U.S. on the natural snow reservoir, and the Colorado River in particular, it is critical to evaluate the impact of increasing dust deposition on the mountain snowpack," she said.
Figure 26: A coating of dust on snow speeds the pace of snowmelt in the spring (image credit: NASA)
Painter pointed out that the new finding doesn't mean air temperatures in the region can be ignored in considering streamflows and flooding, especially in the future. "As air temperature continues to climb, it's going to have more influence," he said. Temperature controls whether precipitation falls as snow or as rain, for example, so ultimately it controls how much snow there is to melt. But, he said, "temperature is unlikely to control the variability in snowmelt rates. That will still be controlled by how dirty or clean the snowpack is."
Skiles noted, "Dust on snow does not only impact the mountains that make up the headwaters of Colorado River. Surface darkening has been observed in mountain ranges all over the world, including the Alps and the Himalaya. What we learn about the role of dust deposition for snowmelt timing and intensity here in the western U.S. has global implications for improved snowmelt forecasting and management of snow water resources."
The study, titled "Variation in rising limb of Colorado River snowmelt runoff hydrograph controlled by dust radiative forcing in snow," was published today in the journal Geophysical Research Letters. Coauthors are from the University of Utah, Salt Lake City; University of Colorado, Boulder; and University of California, Santa Barbara. 55)
Study of Extreme Wintertime Arctic Warm Event
January 16, 2018: In the winter of 2015/16, something happened that had never before been seen on this scale: at the end of December, temperatures rose above zero degrees Celsius for several days in parts of the Arctic. Temperatures of up to eight degrees Celsius were registered north of Svalbard. Temperatures this high have not been recorded in the winter half of the year since the beginning of systematic measurements at the end of the 1970s. As a result of this unusual warmth, the sea ice began to melt. 56)
"We heard about this from the media," says Heini Wernli, Professor of Atmospheric Dynamics at ETH Zurich. The news aroused his scientific curiosity, and a team led by his then doctoral student Hanin Binder investigated the issue. In December 2017, they published their analysis of this exceptional event in the journal Geophysical Research Letters. 57)
The researchers show in their paper how these unusual temperatures arose: three different air currents met over the North Sea between Scotland and southern Norway, carrying warm air northwards at high speed as though on a "highway" (Figure 27).
One air current originated in the Sahara and brought near-surface warm air with it. To begin with, the temperature of this air was about 20º Celsius. While it cooled off on its way to the Arctic, it was still above zero when it arrived. "It's extremely rare for warm, near-surface subtropical air to be transported as far as the Arctic," says Binder.
The second air current originated in the Arctic itself, a fact that astonished the scientists. To begin with, this air was very cold. However, the air mass – which also lay close to the ground – moved towards the south along a curved path and, while above the Atlantic, was warmed significantly by the heat flux from the ocean before joining the subtropical air current.
The third warm air current started as a cold air mass in the upper troposphere, from an altitude above 5 km. These air masses were carried from west to east and descended in a stationary high-pressure area over Scandinavia. Compression thereby warmed the originally cold air, before it entered the "highway to the Arctic".
Figure 27: Schematic illustration of the unusual processes that led to the Arctic warm event (warm air highway), image credit: Sandro Bösch / ETH Zurich
Poleward warm air transport: This highway of air currents was made possible by a particular constellation of pressure systems over northern Europe. During the period in question, intense low-pressure systems developed over Iceland while an extremely stable high-pressure area formed over Scandinavia. This created a kind of funnel above the North Sea, between Scotland and southern Norway, which channelled the various air currents and steered them northwards to the Arctic.
This highway lasted approximately a week. The pressure systems then decayed and the Arctic returned to its typical frozen winter state. However, the warm period sufficed to reduce the thickness of the sea ice in parts of the Arctic by 30 cm – during a period in which ice usually becomes thicker and more widespread.
"These weather conditions and their effect on the sea ice were really exceptional," says Binder. The researchers were not able to identify a direct link to global warming. "We only carried out an analysis of a single event; we didn't research the long-term climate aspects" emphasizes Binder.
However, the melting of Arctic sea ice during summer is a different story. The long-term trend is clear: the minimum extent and thickness of the sea ice in late summer has been shrinking continually since the end of the 1970s. Sea ice melted particularly severely in 2007 and 2012 – a fact which climate researchers have thus far been unable to fully explain. Along with Lukas Papritz from the University of Bergen, Wernli investigated the causes of these outliers.
According to their research, the severe melting in the aforementioned years was caused by stable high-pressure systems that formed repeatedly throughout the summer months. Under these cloud-free weather conditions, the high level of direct sunlight – the sun shines 24 hours a day at this time of year – particularly intensified the melting of the sea ice.
The extreme event was the result of a very unusual large-scale flow configuration in early winter 2015/2016 that came along with overall anomalously warm conditions in Europe (National Oceanic and Atmospheric Administration, 2016) and other regional extremes, for example, flooding in the UK. 58) In this study (Ref. 57), we focus on the Arctic. At the North Pole, buoys measured maximum surface temperatures of -0.8ºC on 30 December 59), and at the Svalbard airport station values of 8.7ºC were observed, the warmest temperatures ever recorded at that station between November and April (The Norwegian Meteorological Institute, 2016). According to operational analyses from the ECMWF (European Center for Medium-Range Weather Forecasts), the maximum 2 m temperature (T2m) north of 82ºN reached values larger than 0ºC during three short episodes between 29 December 2015 and 4 January 2016—almost 30 K above the winter climatological mean in this region (Figure 28a). They occurred in the Eurasian Arctic sector in the region around Svalbard and over the Kara Sea (purple contour in Figure 28b) and were the highest winter values since 1979 (Figure 28c). The warm event led to a thinning of the sea ice by more than 30 cm in the Barents and Kara Seas, and contributed to the record low Northern Hemisphere sea ice extent observed in January and February 2016 (National Snow and Ice Data Center, 2016).
Figure 28: Illustration of the Arctic warm event and its extremeness. (a) Temporal evolution of the domain maximum (red) and mean (blue) T2m (ºC) between 20 December 2015 and 10 January 2016 at latitudes ≥82ºN and between 120ºW and 120ºE, derived from operational analyses. Also shown are the domain mean December–February 1979–2014 climatological mean T2m (black), and the corresponding ±1 standard deviation envelope (grey) from ERA-Interim reanalysis data. (b) Maximum T2m (ºC) between 00 UTC 30 December 2015 and 18 UTC 4 January 2016 from operational analyses, with the purple contour highlighting the regions ≥82ºN with maximum T2m ≥ 0ºC. (c) Rank of maximum T2m shown in Figure 28b among all 6-hourly values in winter 1979–2014 in the ERA-Interim reanalyses (consisting of a total of 13,232 values), image credit: study team
Long-Term Warming Trend Continued in 2017
January 18, 2018: Earth's global surface temperatures in 2017 ranked as the second warmest since 1880, according to an analysis by NASA. Continuing the planet's long-term warming trend, globally averaged temperatures in 2017 were 1.62º Fahrenheit (0.90º Celsius) warmer than the 1951 to 1980 mean, according to scientists at NASA's GISS (Goddard Institute for Space Studies) in New York. That is second only to global temperatures in 2016. 60)
In a separate, independent analysis, scientists at NOAA (National Oceanic and Atmospheric Administration) concluded that 2017 was the third-warmest year in their record. The minor difference in rankings is due to the different methods used by the two agencies to analyze global temperatures, although over the long-term the agencies' records remain in strong agreement. Both analyses show that the five warmest years on record all have taken place since 2010.
Because weather station locations and measurement practices change over time, there are uncertainties in the interpretation of specific year-to-year global mean temperature differences. Taking this into account, NASA estimates that 2017's global mean change is accurate to within 0.1º Fahrenheit, with a 95 percent certainty level.
"Despite colder than average temperatures in any one part of the world, temperatures over the planet as a whole continue the rapid warming trend we've seen over the last 40 years," said GISS Director Gavin Schmidt.
The planet's average surface temperature has risen about 2 degrees Fahrenheit (a little more than 1 degree Celsius) during the last century or so, a change driven largely by increased carbon dioxide and other human-made emissions into the atmosphere. Last year was the third consecutive year in which global temperatures were more than 1.8 degrees Fahrenheit (1 degree Celsius) above late nineteenth-century levels.
Phenomena such as El Niño or La Niña, which warm or cool the upper tropical Pacific Ocean and cause corresponding variations in global wind and weather patterns, contribute to short-term variations in global average temperature. A warming El Niño event was in effect for most of 2015 and the first third of 2016. Even without an El Niño event – and with a La Niña starting in the later months of 2017 – last year's temperatures ranked between 2015 and 2016 in NASA's records.
Figure 29: This map shows Earth's average global temperature from 2013 to 2017, as compared to a baseline average from 1951 to 1980, according to an analysis by NASA's Goddard Institute for Space Studies. Yellows, oranges, and reds show regions warmer than the baseline (image credit: NASA's Scientific Visualization Studio)
In an analysis where the effects of the recent El Niño and La Niña patterns were statistically removed from the record, 2017 would have been the warmest year on record.
Weather dynamics often affect regional temperatures, so not every region on Earth experienced similar amounts of warming. NOAA found the 2017 annual mean temperature for the contiguous 48 United States was the third warmest on record.
Warming trends are strongest in the Arctic regions, where 2017 saw the continued loss of sea ice.
NASA's temperature analyses incorporate surface temperature measurements from 6,300 weather stations, ship- and buoy-based observations of sea surface temperatures, and temperature measurements from Antarctic research stations.
These raw measurements are analyzed using an algorithm that considers the varied spacing of temperature stations around the globe and urban heating effects that could skew the conclusions. These calculations produce the global average temperature deviations from the baseline period of 1951 to 1980.
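The core of such an analysis is the anomaly calculation: each station's temperatures are expressed relative to that station's own 1951-1980 mean before any spatial averaging, so stations with very different absolute climates can be combined. A minimal sketch (station values are invented; the real GISTEMP algorithm additionally weights for station spacing and corrects urban effects):

```python
import numpy as np

def station_anomalies(temps, years, base=(1951, 1980)):
    """Convert a station's annual mean temperatures (deg C) to anomalies
    relative to its own baseline-period mean."""
    years = np.asarray(years)
    temps = np.asarray(temps, dtype=float)
    in_base = (years >= base[0]) & (years <= base[1])
    return temps - temps[in_base].mean()

# Two invented stations with different absolute climates but similar warming trends
years = np.arange(1951, 2018)
warm_station = 15.0 + 0.012 * (years - 1951)   # mild coastal site
cold_station = -5.0 + 0.015 * (years - 1951)   # high-latitude site, faster warming

anoms = np.mean([station_anomalies(warm_station, years),
                 station_anomalies(cold_station, years)], axis=0)
print(f"2017 anomaly vs 1951-1980 baseline: {anoms[-1]:.2f} C")
```

Because each station is referenced to its own baseline, the 20-degree difference in absolute temperature between the two sites drops out of the averaged anomaly.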
NOAA scientists used much of the same raw temperature data, but with a different baseline period, and different methods to analyze Earth's polar regions and global temperatures. The full 2017 surface temperature data set and the complete methodology used to make the temperature calculation are available.
GISS is a laboratory within the Earth Sciences Division of NASA's Goddard Space Flight Center in Greenbelt, Maryland. The laboratory is affiliated with Columbia University's Earth Institute and School of Engineering and Applied Science in New York.
NASA uses the unique vantage point of space to better understand Earth as an interconnected system. The agency also uses airborne and ground-based monitoring, and develops new ways to observe and study Earth with long-term data records and computer analysis tools to better see how our planet is changing. NASA shares this knowledge with the global community and works with institutions in the United States and around the world that contribute to understanding and protecting our home planet.
Study of Antarctic Ozone Hole Recovery
January 5, 2018: For the first time, scientists have shown through direct observations of the ozone hole by an instrument on NASA's Aura mission, that levels of ozone-destroying chlorine are declining, resulting in less ozone depletion. Measurements show that the decline in chlorine, resulting from an international ban on chlorine-containing human-produced chemicals called chlorofluorocarbons (CFCs), has resulted in about 20 percent less ozone depletion during the Antarctic winter than there was in 2005 — the first year that measurements of chlorine and ozone during the Antarctic winter were made by the Aura satellite. 61)
- "We see very clearly that chlorine from CFCs is going down in the ozone hole, and that less ozone depletion is occurring because of it," said lead author Susan Strahan, an atmospheric scientist from NASA's Goddard Space Flight Center in Greenbelt, Maryland. The study was published in the journal Geophysical Research Letters. 62)
- CFCs are long-lived chemical compounds that eventually rise into the stratosphere, where they are broken apart by the Sun's ultraviolet radiation, releasing chlorine atoms that go on to destroy ozone molecules. Stratospheric ozone protects life on the planet by absorbing potentially harmful ultraviolet radiation that can cause skin cancer and cataracts, suppress immune systems and damage plant life.
- Two years after the discovery of the Antarctic ozone hole in 1985, nations of the world signed the Montreal Protocol on Substances that Deplete the Ozone Layer, which regulated ozone-depleting compounds. Later amendments to the Montreal Protocol completely phased out production of CFCs.
- Past studies have used statistical analyses of changes in the ozone hole's size to argue that ozone depletion is decreasing. This study is the first to use measurements of the chemical composition inside the ozone hole to confirm that not only is ozone depletion decreasing, but that the decrease is caused by the decline in CFCs.
- The Antarctic ozone hole forms during September in the Southern Hemisphere's winter as the returning Sun's rays catalyze ozone destruction cycles involving chlorine and bromine that come primarily from CFCs. To determine how ozone and other chemicals have changed year to year, scientists used data from JPL's MLS (Microwave Limb Sounder) aboard the Aura satellite, which has been making measurements continuously around the globe since mid-2004. While many satellite instruments require sunlight to measure atmospheric trace gases, MLS measures microwave emissions and, as a result, can measure trace gases over Antarctica during the key time of year: the dark southern winter, when the stratospheric weather is quiet and temperatures are low and stable.
Figure 30: Using measurements from NASA's Aura satellite, scientists studied chlorine within the Antarctic ozone hole over the last several years, watching as the amount slowly decreased (image credit: NASA/GSFC, Katy Mersmann)
The change in ozone levels above Antarctica from the beginning to the end of southern winter — early July to mid-September — was computed daily from MLS measurements every year from 2005 to 2016. "During this period, Antarctic temperatures are always very low, so the rate of ozone destruction depends mostly on how much chlorine there is," Strahan said. "This is when we want to measure ozone loss."
They found that ozone loss is decreasing, but they needed to know whether a decrease in CFCs was responsible. When ozone destruction is ongoing, chlorine is found in many molecular forms, most of which are not measured. But after chlorine has destroyed nearly all the available ozone, it reacts instead with methane to form hydrochloric acid, a gas measured by MLS. "By around mid-October, all the chlorine compounds are conveniently converted into one gas, so by measuring hydrochloric acid we have a good measurement of the total chlorine," Strahan said.
Nitrous oxide is a long-lived gas that behaves just like CFCs in much of the stratosphere. The CFCs are declining at the surface but nitrous oxide is not. If CFCs in the stratosphere are decreasing, then over time, less chlorine should be measured for a given value of nitrous oxide. By comparing MLS measurements of hydrochloric acid and nitrous oxide each year, they determined that the total chlorine levels were declining on average by about 0.8 percent annually.
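The trend estimate described above can be sketched numerically. The example below fits a log-linear trend to synthetic yearly chlorine-proxy values (hypothetical numbers, generated here to decline at the reported rate; the actual study derived the trend from MLS hydrochloric acid and nitrous oxide retrievals):

```python
import math

# Synthetic yearly chlorine-proxy values (HCl normalized by N2O), generated
# to decline at the reported ~0.8 %/yr. These are illustrative numbers,
# not MLS data.
years = list(range(2005, 2017))
ratio = [(1 - 0.008) ** (y - 2005) for y in years]

# Least-squares fit of ln(ratio) against year: the slope is the fractional
# trend per year.
n = len(years)
xm = sum(years) / n
ym = sum(math.log(r) for r in ratio) / n
slope = (sum((x - xm) * (math.log(r) - ym) for x, r in zip(years, ratio))
         / sum((x - xm) ** 2 for x in years))

print(f"total chlorine trend: {slope * 100:.2f} % per year")  # ≈ -0.80
```

Because the synthetic series is exactly exponential, the fit recovers the input rate; with real retrievals, measurement noise would widen the uncertainty on the slope.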
The 20 percent decrease in ozone depletion during the winter months from 2005 to 2016 as determined from MLS ozone measurements was expected. "This is very close to what our model predicts we should see for this amount of chlorine decline," Strahan said. "This gives us confidence that the decrease in ozone depletion through mid-September shown by MLS data is due to declining levels of chlorine coming from CFCs. But we're not yet seeing a clear decrease in the size of the ozone hole because that's controlled mainly by temperature after mid-September, which varies a lot from year to year."
Looking forward, the Antarctic ozone hole should continue to recover gradually as CFCs leave the atmosphere, but complete recovery will take decades. "CFCs have lifetimes from 50 to 100 years, so they linger in the atmosphere for a very long time," said Anne Douglass, a fellow atmospheric scientist at Goddard and the study's co-author. "As far as the ozone hole being gone, we're looking at 2060 or 2080. And even then there might still be a small hole."
Study solves a conflict in the post-2006 atmospheric methane budget
January 2, 2018: A new NASA-led study has solved a puzzle involving the recent rise in atmospheric methane, a potent greenhouse gas, with a new calculation of emissions from global fires. The new study resolves what looked like irreconcilable differences in explanations for the increase. 63)
Methane emissions have been rising sharply since 2006. Different research teams have produced viable estimates for two known sources of the increase: emissions from the oil and gas industry, and microbial production in wet tropical environments like marshes and rice paddies. But when these estimates were added to estimates of other sources, the sum was considerably more than the observed increase. In fact, each new estimate was large enough to explain the whole increase by itself.
John Worden of NASA's Jet Propulsion Laboratory in Pasadena, California, and colleagues focused on fires because they're also changing globally. The area burned each year decreased about 12 percent between the early 2000s and the more recent period of 2007 to 2014, according to a new study using observations by NASA's MODIS (Moderate Resolution Imaging Spectroradiometer) satellite instrument. The logical assumption would be that methane emissions from fires have decreased by about the same percentage. Using satellite measurements of methane and carbon monoxide, Worden's team found the real decrease in methane emissions was almost twice as much as that assumption would suggest.
When the research team subtracted this large decrease from the sum of all emissions, the methane budget balanced correctly, with room for both fossil fuel and wetland increases. The research is published in the journal Nature Communications. 64)
Most methane molecules in the atmosphere don't have identifying features that reveal their origin. Tracking down their sources is a detective job involving multiple lines of evidence: measurements of other gases, chemical analyses, isotopic signatures, observations of land use, and more. "A fun thing about this study was combining all this different evidence to piece this puzzle together," Worden said.
Carbon isotopes in the methane molecules are one clue. Of the three methane sources examined in the new study, emissions from fires contain the largest percentage of heavy carbon isotopes, microbial emissions have the smallest, and fossil fuel emissions are in between. Another clue is ethane, which (like methane) is a component of natural gas. An increase in atmospheric ethane indicates increasing fossil fuel sources. Fires emit carbon monoxide as well as methane, and measurements of that gas are a final clue.
Worden's team used carbon monoxide and methane data from the MOPITT (Measurements of Pollution in the Troposphere) instrument on NASA's Terra satellite and the TES (Tropospheric Emission Spectrometer) instrument on NASA's Aura satellite to quantify fire emissions of methane. The results show these emissions have been decreasing much more rapidly than expected.
Combining isotopic evidence from ground surface measurements with the newly calculated fire emissions, the team showed that about 17 teragrams per year of the increase is due to fossil fuels, another 12 is from wetlands or rice farming, while fires are decreasing by about 4 teragrams per year. The three numbers combine to a net emissions increase of ~25 Tg/year of CH4 — the same as the observed increase.
To illustrate the magnitude of the global CH4 masses involved: 1 Tg (teragram) = 10^12 g = 1,000,000 metric tons. Methane emissions are increasing by about 25 Tg/year, against a total current budget of ~550 Tg/year.
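The budget arithmetic above can be checked directly (values are those quoted in the text):

```python
# Bookkeeping check of the revised methane budget, in Tg CH4 per year
# (1 Tg = 1e12 g = 1,000,000 metric tons).
fossil_fuel_increase = 17   # oil and gas industry
wetland_rice_increase = 12  # wetlands or rice farming
fire_decrease = 4           # declining biomass burning

net_increase = fossil_fuel_increase + wetland_rice_increase - fire_decrease
print(net_increase)  # 25, matching the observed ~25 Tg/yr rise

# Relative to the ~550 Tg/yr total budget, the annual rise is about 4.5 %.
total_budget = 550
print(f"{net_increase / total_budget:.1%}")  # 4.5%
```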
Worden's coauthors are at the NCAR (National Center for Atmospheric Research), Boulder, Colorado; and the Netherlands Institute for Space Research and University of Utrecht, both in Utrecht, the Netherlands.
Figure 31: This time series was created using data from the MODIS instrument data onboard NASA's Terra and Aqua satellites. The burned area is estimated by applying an algorithm that detects rapid changes in visible and infrared surface reflectance imagery. Fires typically darken the surface in the visible part of the electromagnetic spectrum, and brighten the surface in several wavelength bands in the shortwave infrared that are sensitive to the surface water content of vegetation (image credit: NASA/GSFC/SVS)
Legend to Figure 31: Thermal emissions from actively burning fires also are measured by MODIS and are used to improve the burned area estimates in croplands and other areas where the fire sizes are relatively small. This animation portrays burned area between September 2000 and August 2015 as a percent of the 1/4 degree grid cell that was burned each month. The values on the color bar are on a log scale, so the regions shown in blue and green shades indicate small burned areas while those in red and orange represent a larger percent of the region burned. Beneath the burned area, the seasonal Blue Marble landcover shows the advance and retreat of snow in the northern hemisphere.
Trend in CH4 emissions from fires. Figure 32 shows the time series of CH4 emissions that were obtained from GFEDv4s (Global Fire Emissions Database, version 4s) and top-down estimates based on CO emission estimates and GFED4s-based emission ratios. The CO-based fire CH4 emissions estimates amount to 14.8 ± 3.8 Tg CH4 per year for the 2001–2007 time period and 11.1 ± 3 Tg CH4 per year for the 2008–2014 time period, with a 3.7 ± 1.4 Tg CH4 per year decrease between the two time periods. The mean burnt area (a priori)-based estimate from GFED4s is slightly larger and shows a slightly smaller decrease (2.3 Tg CH4 per year) in fire emissions after 2007 relative to the 2001–2006 time period. The range of uncertainties (shown as blue error bars in Figure 32) is determined by the uncertainty in top-down CO emission estimates that are derived empirically using the approaches discussed in the Methods. The red shading describes the range of uncertainty stemming from uncertainties in CH4/CO emission factors (Methods). By assuming temporally constant sector-specific CH4/CO emission factors, we find that mean 2001–2014 emissions average to 12.9 ± 3.3 Tg CH4 per year, and the decrease averages to 3.7 ± 1.4 Tg CH4 per year for 2008–2014, relative to 2001–2007. This decrease is largely accounted for by a 2.9 ± 1.2 Tg CH4 per year decrease during 2006–2008, which is primarily attributable to a biomass burning decrease in Indonesia and South America.
Figure 32: Trend of methane emissions from biomass burning. Expected methane emissions from fires based on the Global Fire Emissions Database (black) and the CO emissions plus CH4/CO ratios shown here (red). The range of uncertainties in blue is due to the calculated errors from the CO emissions estimate and the shaded red describes the range of error from uncertainties in the CH4/CO emission factors (image credit: Methane Study Team)
Industrial-age doubling of snow accumulation in the Alaska Range linked to tropical ocean warming
December 19, 2017: Snowfall on a major summit in North America's highest mountain range has more than doubled since the beginning of the Industrial Age, according to a study from Dartmouth College, the University of Maine, and the University of New Hampshire. The research not only finds a dramatic increase in snowfall, it further explains connections in the global climate system by attributing the record accumulation to warmer waters thousands of miles away in the tropical Pacific and Indian Oceans. 65)
The study demonstrates that modern snowfall in the iconic Alaska Range is unprecedented for at least the past 1200 years and far exceeds normal variability. "We were shocked when we first saw how much snowfall has increased," said Erich Osterberg, an assistant professor of Earth sciences at Dartmouth College and principal investigator for the research. "We had to check and double-check our results to make sure of the findings. Dramatic increases in temperature and air pollution in modern times have been well established in science, but now we're also seeing dramatic increases in regional precipitation with climate change."
According to the research, wintertime snowfall has increased 117 percent since the mid-19th century in southcentral Alaska in the United States. Summer snowfall also showed a significant increase of 49 percent over the same period of less than two hundred years.
The research, appearing in Scientific Reports, is based on analysis of two ice cores (each 208 m long) collected from the Mount Hunter summit plateau (62°56'N, 151°5'W, 3900 m) in Denali National Park, Alaska. A high snow accumulation rate (1.15 m water equivalent [w. e.] average since 1900) and infrequent surface melt (<0.5% of the core is composed of refrozen melt layers and lenses) at the Mt. Hunter drill site preserve robust seasonal oscillations of several chemical parameters (Na, Ca, Mg, NH4+, MSA (methanesulfonic acid), δ18O, liquid conductivity, dust concentration), facilitating annual layer counting back to 800 CE (Common Era, Figure 33). — According to the authors, accumulation records in the separate samples taken from just below the summit of the mountain known as "Denali's Child" are in nearly complete agreement. 66)
Figure 33: Annual layer counting in the Mt. Hunter ice core. (A) Three chemical series exhibiting annual layers are shown at a representative depth of core: Mg (black), δ18O (blue) and MSA (red). Each vertical dotted line represents the depth of Jan. 1st in a δ18O trough and just below a Mg peak. The distance between each vertical dotted line represents one year's snow accumulation (before thinning correction). The position of these years was selected three times by three independent researchers. We delineate summer (May-August) and winter (September-April) seasons by recording the late summer-fall peak positions of MSA (purple circles) and the spring peak positions of Mg (orange circles).
The annually resolved Denali snow accumulation record (Figure 34) indicates that the precipitation increase seen in Alaskan weather station records after 1950 actually began well before the 20th century, circa 1840 CE.
"It is now glaringly clear from our ice core record that modern snowfall rates in Alaska are much higher than natural rates before the Industrial Revolution," said Dominic Winski, a research assistant at Dartmouth and the lead author of the report. "This increase in precipitation is also apparent in weather station data from the past 50 years, but ice cores show the scale of the change well above natural conditions."
Once the researchers established snowfall rates, they set out to identify why precipitation has increased so rapidly in such a short amount of time. Scientific models predict as much as a 2 percent increase in global precipitation per degree of warming because warmer air holds more moisture, but this could not account for most of the dramatic increases in Denali snowfall over the studied period.
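A back-of-envelope calculation shows why the thermodynamic scaling falls far short of the observed change. The warming figure used here is a generous, hypothetical assumption chosen purely for illustration:

```python
# Can the ~2 %/°C precipitation scaling cited in the text explain the
# Denali snowfall increase on its own? Assume (hypothetically) a generous
# 3 °C of regional warming since the mid-19th century.
scaling_per_degC = 0.02
warming_degC = 3.0

expected_increase = (1 + scaling_per_degC) ** warming_degC - 1
observed_increase = 1.17  # the 117 % rise in wintertime snowfall

print(f"expected from scaling alone: {expected_increase:.1%}")  # ~6 %
print(f"observed: {observed_increase:.0%}")
```

Even under this generous warming assumption, moisture scaling accounts for only a few percent, more than an order of magnitude short of the observed doubling, which is why the authors looked to circulation changes (the strengthened Aleutian Low) for the remainder.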
Figure 34: The Mt. Hunter accumulation record. Annual (light gray line) and 21-year smoothed (black line) accumulation time series from the year 810 CE (Common Era) to present, constrained by 21-year smoothed error envelopes (blue shading) inclusive of stochastic, peak position and layer-thinning model uncertainties, including the total uncertainty range among all four modeling approaches. The inset shows seasonal trends in accumulation since 1867 with 21-year running means (bold lines). Snowfall accumulating between September and April (blue) has more than doubled, with a faster rise since 1976. Summer accumulation (April to August; red) remained comparatively stable except for a baseline shift between 1909 and 1925 (image credit: Dartmouth College, Dominic Winski)
The research suggests that warming tropical oceans have caused a strengthening of the Aleutian Low pressure system with its northward flow of warm, moist air, driving most of the snowfall increases. Previous research has linked the warming tropical ocean temperatures to higher greenhouse gas concentrations.
The analysis includes a series of dramatic graphs that demonstrate extreme shifts in precipitation and reinforce the global climate connections that link snowfall in the high reaches of the North American continent with warm tropical waters. As noted in the paper (Ref. 66), this same atmospheric connection accounts for a decrease in Hawaiian precipitation.
"Everywhere we look in the North Pacific, we're seeing this same fingerprint from warming tropical oceans. One result is that wintertime climate in the North Pacific is very different than it was 200 years ago. This doesn't just affect Alaska, but Hawaii and the entire Pacific Northwest are impacted as well," said Winski.
The research builds on a recent study using the same ice cores that showed that an intensification of winter storm activity in Alaska and Northwestern Canada, driven by the strengthening Aleutian Low, started in 1740 and is unprecedented in magnitude and duration over the past millennium. The new record shows the result of that increase in Aleutian Low storm activity on snow accumulation.
For this analysis, researchers were able to segment the ice core records by seasons and years using markers like magnesium from spring dust to separate winter snow from summer snow. To account for snow layers getting squeezed and thinned under their own weight, the researchers applied four separate equations used in other studies, and in all cases the corrected record shows at least a doubling of snowfall.
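As an illustration of what such a thinning correction does, the sketch below uses the classic Nye model (uniform vertical strain), one common choice for corrections of this kind; the study's four specific models and their parameters are not reproduced here, and the numbers are hypothetical:

```python
# Nye-model thinning: an annual layer deposited with thickness lambda0
# thins to lambda(z) = lambda0 * (H - z) / H at depth z in an ice column
# of total thickness H. Inverting this recovers the original accumulation.
def nye_corrected_accumulation(observed_thickness_m, depth_m, ice_thickness_m):
    """Undo Nye-model layer thinning for a layer observed at a given depth."""
    remaining = ice_thickness_m - depth_m
    if remaining <= 0:
        raise ValueError("depth must be less than total ice thickness")
    return observed_thickness_m * ice_thickness_m / remaining

# Hypothetical numbers: a layer 0.8 m thick observed at 150 m depth in a
# 208 m column corresponds to ~2.87 m of original accumulation.
print(round(nye_corrected_accumulation(0.8, 150.0, 208.0), 2))
```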
According to the paper, while numerous snow accumulation records exist, "to our knowledge, no other alpine ice core accumulation record has been developed with such a thorough characterization of the thinning regime or uncertainties; all of the thinning models produce a robust increase in accumulation since the mid-19th century above late-Holocene background values."
The researchers note that the findings imply that regions that are sensitive to warming tropical ocean waters may continue to experience rain and snowfall variability well outside the natural range of the past millennium.
"Climate change can impact specific regions in much more extreme ways than global averages indicate because of unexpected responses from features like the Aleutian Low," said Osterberg. "The Mount Hunter record captures the dramatic changes that can occur when you get a double whammy from climate change – warming air combined with more storms from warming ocean temperatures."
However, the researchers also note that the regional findings do not necessarily mean that the same level of snowfall increases will occur elsewhere throughout the mid- and high latitudes.
"Scientists keep discovering that on a regional basis, climate change is full of surprises. We need to understand these changes better to help communities prepare for what will come with even more carbon dioxide pollution in the air," said Osterberg.
As part of the analysis, the authors suggest that current climate models underestimate the sensitivity of North Pacific atmospheric connections to warming tropical ocean temperatures. They argue that refining the way the modeled atmosphere responds to tropical ocean temperatures may improve rain and snowfall predictions in a warming world.
This research was supported by the NSF (National Science Foundation) Paleoclimate Program (P2C2).
Arctic sea ice loss could dry out California
December 2017: Arctic sea ice loss of the magnitude expected in the next few decades could impact California's rainfall and exacerbate future droughts, according to new research led by LLNL (Lawrence Livermore National Laboratory) scientists. 67)
Figure 35: Extent of Arctic sea ice in September 2016 versus the 1981-2010 average minimum extent (gold line). Through satellite images, researchers have observed a steep decline in the average extent of Arctic sea ice for every month of the year (image credit: NASA)
The dramatic loss of Arctic sea ice cover observed over the satellite era is expected to continue throughout the 21st century. Over the next few decades, the Arctic Ocean is projected to become ice-free during the summer. A new study by Ivana Cvijanovic and colleagues from LLNL and University of California, Berkeley shows that substantial loss of Arctic sea ice could have significant far-field effects, and is likely to impact the amount of precipitation California receives. The research appears in the Dec. 5 edition of Nature Communications. 68)
The study identifies a new link between Arctic sea ice loss and the development of an atmospheric ridging system in the North Pacific. This atmospheric feature also played a central role in the 2012-2016 California drought and is known for steering precipitation-rich storms northward, into Alaska and Canada, and away from California. The team found that sea ice changes can lead to convection changes over the tropical Pacific. These convection changes can in turn drive the formation of an atmospheric ridge in the North Pacific, resulting in significant drying over California.
"On average, when considering the 20-year mean, we find a 10-15 percent decrease in California's rainfall. However, some individual years could become much drier, and others wetter," Cvijanovic said.
The study does not attribute the 2012-2016 drought to Arctic sea ice loss. However, the simulations indicate that the sea-ice driven precipitation changes resemble the global rainfall patterns observed during that drought, leaving the possibility that Arctic sea-ice loss could have played a role in the recent drought.
"The recent California drought appears to be a good illustration of what the sea-ice driven precipitation decline could look like," she explained.
California's winter precipitation has decreased over the last two decades, with the 2012-2016 drought being one of the most severe on record. The impacts of reduced rainfall have been intensified by high temperatures that have enhanced evaporation. Several studies suggest that recent Californian droughts have a manmade component arising from increased temperatures, with the likelihood of such warming-enhanced droughts expected to increase in the future.
Figure 36: Schematics of the teleconnection through which Arctic sea-ice changes drive precipitation decrease over California. Arctic sea-ice loss induced high-latitude changes first propagate into the tropics, triggering tropical circulation and convection responses. Decreased convection and decreased upper level divergence in the tropical Pacific then drive a northward propagating Rossby wavetrain, with anticyclonic flow forming in the North Pacific. This ridge is responsible for steering the wet tropical air masses away from California (image credit: LLNL, Kathy Seibert)
"Our study identifies one more pathway by which human activities could affect the occurrence of future droughts over California — through human-induced Arctic sea ice decline," Cvijanovic said. "While more research should be done, we should be aware that an increasing number of studies, including this one, suggest that the loss of the Arctic sea ice cover is not only a problem for remote Arctic communities, but could affect millions of people worldwide. Arctic sea ice loss could affect us, right here in California."
Other co-authors on the study include Benjamin Santer, Celine Bonfils, Donald Lucas and Susan Zimmerman from LLNL and John Chiang from the University of California, Berkeley.
The research is funded by DOE (Department of Energy) Office of Science. Cvijanovic and Bonfils were funded by the DOE Early Career Research Program Award and Lucas is funded by the DOE Office of Science through the SciDAC project on Multiscale Methods for Accurate, Efficient and Scale-Aware Models of the Earth System.
Increasing Wildfires in the boreal forests of northern Canada and Alaska due to Lightning
December 2017: Wildfires in the boreal forests of northern Canada and Alaska have been increasing in frequency and the amount of area burned, and the drivers of large fire years are still poorly understood. But recent NASA-funded research offers at least one possible cause: more lightning. As global warming continues, lightning storms and warmer conditions are expected to spread farther north, meaning fire could significantly alter the landscape over time. 69)
A record number of lightning-ignited fires burned in Canada's Northwest Territories in 2014 and in Alaska in 2015. Scientist Sander Veraverbeke (Vrije Universiteit Amsterdam and University of California, Irvine) and colleagues examined data from satellites and from ground-based lightning networks to see if they could figure out why those seasons were so bad.
The team found that the majority of fires in their study areas in 2014 and 2015 were ignited by lightning storms, as opposed to human activity. That is unsurprising, given the remoteness of the region, but it also points to more frequent lightning strikes in an area not known for as many thunderstorms as the tropics or temperate regions. Looking at longer trends, the researchers found that lightning-ignited fires in the region have been increasing by 2 to 5 percent per year since 1975, a trend that is consistent with climate change. The study was published in July 2017 in the journal Nature Climate Change. 70)
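Compounding the reported annual growth rates illustrates the cumulative size of the trend over the roughly 40-year record:

```python
# Cumulative effect of a 2-5 %/yr trend in lightning-ignited fires over
# the 1975-2015 record.
years = 2015 - 1975  # 40 years

low = 1.02 ** years   # at 2 %/yr
high = 1.05 ** years  # at 5 %/yr
print(f"{low:.1f}x to {high:.1f}x increase over {years} years")
```

A sustained 2 percent annual growth more than doubles ignitions over four decades, while 5 percent would multiply them roughly sevenfold.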
"We found that it is not just a matter of more burning with higher temperatures. The reality is more complex," Veraverbeke said. "Higher temperatures also spur more thunderstorms. Lightning from these thunderstorms is what has been igniting many more fires in these recent extreme events."
The map of Figure 37 shows the location and ignition source (lightning or human caused) for forest fires in interior Alaska in 2015. The map of Figure 38 shows the timing of the fires (June, July, or August) within the inset box. Both maps are based on data from the Veraverbeke study, which combined observations from the Alaska Fire Emissions Database, computer models, and fire observations from the MODIS (Moderate Resolution Imaging Spectroradiometer) instruments on NASA's Terra and Aqua satellites.
The fire season in the far north has typically peaked in July, after the spring thaw and the melting of winter snow. As global temperatures continue to rise, especially in the polar regions, thawing and warming tend to happen earlier in the spring and summer and at a more extensive level than in the past. The warmer weather also leads to more atmospheric instability, bringing more thunderstorms. The researchers asserted in the paper that "extreme fire years result when high levels of lightning ignition early in the growing season are followed by persistent warm and dry conditions that accelerate fire spread later in midsummer."
Figure 37: Location and ignition source (lightning or human caused) for forest fires in interior Alaska in 2015, acquired with MODIS on Terra and Aqua and in situ measurements (image credit: NASA Earth Observatory, maps and charts by Jesse Allen using data provided by Sander Veraverbeke (Vrije Universiteit); story by Mike Carlowicz (NASA Earth Observatory), Alan Buis (Jet Propulsion Laboratory), and Brian Bell (University of California, Irvine))
Figure 38: The timing of the fires (June, July, or August) within the inset box using data of the Veraverbeke study (image credit: NASA Earth Observatory and Lightning Study)
Brendan Rogers of the Woods Hole Research Center said these trends are likely to continue. "We expect an increasing number of thunderstorms, and hence fires, across the high latitudes in the coming decades as a result of climate change."
The researchers also found that wildfires are creeping farther north, closer to the transition zone between boreal forests and Arctic tundra. Together, these areas include at least 30 percent of the world's tree cover and 35 percent of its stored soil carbon.
Figure 39: Ignition density in the Northwest Territories (x 10^-5 ignitions/km^2) acquired in the timeframe 1975-2015 (image credit: NASA Earth Observatory and Lightning Study)
Figure 40: Ignition density in Alaska (x 10^-5 ignitions/km^2) acquired in the timeframe 1975-2015 (image credit: NASA Earth Observatory and Lightning Study)
"In these high-latitude ecosystems, permafrost soils store large amounts of carbon that become vulnerable after fires pass through," said James Randerson of UC Irvine. "Exposed mineral soils after tundra fires also provide favorable seedbeds for trees migrating north under a warmer climate."
"Taken together, we discovered a complex feedback loop between climate, lightning, fires, carbon and forests that may quickly alter northern landscapes," Veraverbeke said. "A better understanding of these relationships is critical to better predict future influences from climate on fires and from fires on climate."
Study co-author Charles Miller of NASA/JPL (Jet Propulsion Laboratory) added that while data from the lightning networks were critical to this study, it is challenging to use these data for trend detection because of continuing network upgrades. "A spaceborne sensor that provides high northern latitude lightning data would be a major step forward."
Global Carbon Budget 2017
November 2017: Following three years of no growth, global GHG (Greenhouse Gas) emissions from human activities are projected to increase by 2% by the end of 2017, according to the nongovernmental organization GCP (Global Carbon Project). The increase, to a record 37 billion tons of carbon dioxide equivalent, dashed hopes in the environmental community that CO2 emissions from human activity might have plateaued and begun turning downward. 71)
In a set of three reports published 13 November, GCP said the biggest cause of the increase is the 3.5% growth in China, the world's largest emitter of greenhouse gases. The country experienced higher energy demand, particularly from industry, and a decline in hydroelectric power due to sparse rainfall. 72) 73) 74)
In addition, the decade-long trend in emissions reductions by the US and the European Union, the second- and third-largest emitters, respectively, appears to have slowed this year. The EU's output hasn't declined appreciably since 2015. The US output declined by 0.4%, compared with a 1.2% average annual reduction during the previous 10 years. Coal consumption in the US inched up 0.5%, its first increase in five years.
India, the fourth-largest greenhouse gas emitter, limited its growth to 2% this year, compared with a 6% jump in 2016. Emissions from all other countries increased 2.3% from 2016, to 15.1 gigatons (Figure 41).
Figure 41: The world's four largest carbon dioxide emitters—China, the US, the European Union, and India—account for about 60% of global emissions. Although those countries have made strides recently, their emissions and those globally (expected year-to-year percent change and error bars shown under each country) will probably tick upward in 2017 (image credit: Global Carbon Project, CC BY 4.0)
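A quick consistency check on the quoted numbers (about 37 Gt of total projected emissions, 15.1 Gt from all other countries) reproduces the roughly 60 percent share attributed to the four largest emitters:

```python
# Share of global 2017 CO2 emissions from the four largest emitters,
# derived from the totals quoted in the text.
total_gt = 37.0      # record projected global emissions, Gt CO2
all_other_gt = 15.1  # emissions from all other countries, Gt CO2

top_four_share = (total_gt - all_other_gt) / total_gt
print(f"{top_four_share:.0%}")  # 59%, consistent with "about 60%"
```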
Despite the 2014–16 hiatus in global emissions growth, CO2 has continued to accumulate in the atmosphere at a faster pace than at any time during the 50 years that measurements have been kept. The elevated global temperatures resulting from the 2015–16 El Niño diminished the capacity of terrestrial ecosystems to take up CO2 from the atmosphere, the GCP reports said.
Corinne Le Quéré of the University of East Anglia (Norwich, UK), lead author of the principal report (Ref. 72) that was published in Earth System Science Data, said in an email that she expects emissions to plateau or grow slightly in the coming years. But they are unlikely to return to the 3% growth levels that were seen regularly in the decade that ended in 2010.
Kelly Levin of the nonprofit WRI (World Resources Institute) cautions against reading too much into a single year's data but also warns about the perilous big picture. "To have a chance of transforming the economy in time to stay below 2 °C, global GHG emissions must peak by 2020," she says. WRI's analysis, and another by the UNEP (United Nations Environment Program), predict on the basis of current trends and treaty commitments that the peak in global emissions won't occur until after 2030. At that point, the probability of limiting global warming to 2 °C could be as low as 50%, even with accelerated national reduction commitments, rapid abandonment of fossil fuel use, and deployment of carbon-removal technologies whose feasibility hasn't yet been demonstrated.
The 2 °C mark is thought by most climate scientists to be the threshold below which the worst impacts of climate change can be avoided. The 2015 Paris climate agreement set an "aspirational" goal of limiting temperature increase to 1.5 °C.
The WRI analysis says the number of countries whose emissions have peaked or are committed to peak will increase from 49 in 2010 to 53 by 2020 and to 57 by 2030. Those countries accounted for 36% of world greenhouse gas emissions in 2010 and will represent 60% of the total in 2030, when China has committed to peak its output.
Despite last year's emissions increase, China's coal consumption this year is still about 8% below its record 2013 high. The Chinese government has projected a near-doubling of the nation's solar energy production over the next two years, to 213 GW. China's nonfossil energy sources make up 14.3% of overall energy production, up by one percentage point in less than a year.
Study of Global Light Pollution at Night
November 22, 2017: They were supposed to bring about an energy revolution—but the popularity of LED (Light-Emitting Diode) lights is driving an increase in light pollution worldwide, with dire consequences for human and animal health, researchers said in their study. Five years of advanced satellite images show that there is more artificial light at night across the globe, and that the light at night is getting brighter. The rate of growth is approximately two percent each year in both the extent of lit area and the radiance of the light. 75) 76) 77)
An international team of scientists reported the results of a landmark study of global light pollution and the rise of LED outdoor lighting technology. The study finds both light pollution and energy consumption by lighting steadily increasing over much of the planet. The findings also challenge the assumption that increases in the energy efficiency of outdoor lighting technologies necessarily lead to an overall decrease in global energy consumption.
The team, led by Christopher Kyba of the GFZ (German Research Center for Geosciences) in Potsdam, Germany, analyzed five years of images from the Suomi NPP (Suomi National Polar-orbiting Partnership) satellite, operated jointly by NASA and NOAA (National Oceanic and Atmospheric Administration). The data show gains of 2% per year in both the amount of the Earth's surface that is artificially lit at night and the quantity of light emitted by outdoor lighting. Increases were seen almost everywhere the team looked, with some of the largest gains in regions that were previously unlit.
"Light is growing most rapidly in places that didn't have a lot of light to start with," Kyba noted. "That means that the fastest rates of increase are occurring in places that so far hadn't been very strongly affected by light pollution."
The results reported today confirm suggestions in earlier research based on data obtained with U.S. Department of Defense meteorological satellite measurements (DMSP series) going back to the 1970s. However, the better sensitivity of Suomi's cameras to light on the night side of Earth and significantly improved ground resolution led to more robust conclusions about the changing illumination of the world at night.
The study is among the first to examine the effects, as seen from space, of the ongoing worldwide transition to LED lighting. Kyba's team found that the energy saving effects of LED lighting on country-level energy budgets are lower than expected from the increase in the efficiency of LEDs compared to older lamps.
Figure 42: Infographic showing the number of countries experiencing various rates of change of night lights during 2012-2016 (image credit: Kyba and the Study Team)
Environmental Gains Unrealized: LED lighting requires significantly less electricity to yield the same quantity of light as older lighting technologies. Proponents of LED lighting have argued that the high energy efficiency of LEDs would contribute to slowing overall global energy demand, given that outdoor lighting accounts for a significant fraction of the nighttime energy budget of the typical world city.
The team tested this idea by comparing changes in nighttime lighting seen from Earth orbit to changes in countries' GDP (Gross Domestic Product) – a measure of their overall economic output – during the same time period. They concluded that financial savings from the improved energy efficiency of outdoor lighting appear to be invested into the deployment of more lights. As a consequence, the expected large reductions in global energy consumption for outdoor lighting have not been realized.
Kyba expects that the upward global trend in use of outdoor lighting will continue, bringing a host of negative environmental consequences. "There is a potential for the solid-state lighting revolution to save energy and reduce light pollution," he added, "but only if we don't spend the savings on new light".
IDA (International Dark-Sky Association) has campaigned for the last 30 years to bring attention to the known and suspected hazards associated with the use of artificial light at night. IDA Executive Director J. Scott Feierabend pointed out repercussions including harm to wildlife, threats to human wellbeing, and potentially compromised public safety. IDA drew public attention to concerns associated with the strong blue light emissions of LED lighting as early as 2010.
"Today's announcement validates the message IDA has communicated for years," Feierabend explained. "We hope that the results further sound the alarm about the many unintended consequences of the unchecked use of artificial light at night."
Satellite imagery: The VIIRS (Visible Infrared Imaging Radiometer Suite) DNB (Day-Night Band) of the Suomi NPP mission started observations in 2012 — just as outdoor use of LED lighting began in earnest. This sensor provides the first-ever global calibrated nighttime radiance measurements in a spectral band of 500 to 900 nm, which is close to the visible band, with a much higher radiometric sensitivity than the DMSP series, and at a spatial resolution of ~750 m. This improved spatial resolution allows neighborhood (rather than city or national) scale changes in lighting to be investigated for the first time.
The cloud-free DNB data show that over the period of 2012–2016, both lit area and the radiance of previously lit areas increased in most countries (Figure 43) in the 500–900 nm range, with global increases of 2.2% per year for lit area and 2.2% per year for the brightness of continuously lit areas. Overall, the radiance of areas lit above 5 nW cm-2 sr-1 increased by 1.8% per year. These factors decreased in very few countries, including several experiencing warfare. They were also stable in only a few countries, interestingly including some of the world's brightest (for example, Italy, Netherlands, Spain, and the United States). With few exceptions, growth in lighting occurred throughout South America, Africa, and Asia. Because the analysis of lit area and total radiance is not subject to a stability criterion, transient lights such as wildfires can cause large fluctuations.
Australia experienced a major decrease in lit area from 2012 to 2016 for this reason (Figures 43A and 44). However, fire-lit areas failed the stability test and were therefore not included in the radiance change analysis (Figure 43B). A small number of countries have "no data" because of either their extreme latitude (Iceland) or the lack of observed stable lights above 5 nW cm-2 sr-1 in the cloud-free composite (for example, Central African Republic).
Figure 43: Geographic patterns in changes in artificial lighting. Changes are shown as an annual rate for both lit area (A) and radiance of stably lit areas (B). Annual rates are calculated based on changes over the four-year period, that is, (A2016/A2012)^(1/4), where A2016 is the lit area observed in 2016 (image credit: Study Team)
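The annualized rate in the Figure 43 caption is simply the fourth root of the total change over 2012–2016, i.e. the constant yearly growth factor that would produce the observed change. A minimal sketch of that calculation (the lit-area values below are illustrative, not taken from the study's data):

```python
def annual_rate(a_start: float, a_end: float, years: int) -> float:
    """Mean annual percent change between two observations,
    computed as the years-th root of the total change factor."""
    return ((a_end / a_start) ** (1.0 / years) - 1.0) * 100.0

# A region whose lit area grew from 1000 km2 to 1091 km2 over the
# four-year period grew by roughly 2.2% per year, matching the
# global average for lit area reported above.
rate = annual_rate(1000.0, 1091.0, 4)
print(f"{rate:.1f}% per year")  # → 2.2% per year
```

Taking the fourth root rather than dividing the total change by four is what makes the rate compound correctly: four successive years of 2.2% growth reproduce the observed 2012-to-2016 change exactly.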
Figure 44: Absolute change in lit area from 2012 to 2016. Pixels increasing in area are shown as red, pixels decreasing in area are shown as blue, and pixels with no change in area are shown as yellow. Each pixel has a near-equal area of 6000 ± 35 km2. To ease interpretation, the color scale cuts off at 200 km2, but some pixels had changes of up to ±2000 km2 (image credit: Study Team)
Comparisons of the VIIRS data with photographs taken from aboard the ISS (International Space Station) show that the instrument on Suomi-NPP sometimes records a dimming of some cities even though those cities are in fact equally bright or even more brightly lit. The reason for this is that the sensor can't "see" light at wavelengths below 500 nm, i.e. blue light. When cities replace orange lamps with white LED lights that emit considerable radiation below 500 nm, VIIRS mistakes the change for a decrease. In short: The Earth's night-time surface brightness and especially the skyglow over cities is increasing, probably even in the cases where the satellite detects less radiation. 78)
There is, however, hope that things will change for the better. Christopher Kyba says: "Other studies and the experience of cities like Tucson, Arizona, show that well designed LED lamps allow a two-thirds or greater decrease of light emission without any noticeable effect on human perception." Kyba's earlier work has shown that the light emission per capita in the United States of America is 3 to 5 times higher than that in Germany. Kyba sees this as a sign that prosperity, safety, and security can be achieved with conservative light use. "There is a potential for the solid state lighting revolution to save energy and reduce light pollution," adds Kyba, "but only if we don't spend the savings on new light."
November 2017: The Changing Colors of our Living Planet
Life. It's the one thing that, so far, makes Earth unique among the thousands of other planets we've discovered. Since the fall of 1997, NASA satellites have continuously and globally observed all plant life at the surface of the land and ocean. During the week of Nov. 13-17, NASA is sharing stories and videos about how this view of life from space is furthering knowledge of our home planet and the search for life on other worlds. 79)
NASA satellites can see our living Earth breathe. In the Northern Hemisphere, ecosystems wake up in the spring, taking in carbon dioxide and exhaling oxygen as they sprout leaves — and a fleet of Earth-observing satellites tracks the spread of the newly green vegetation.
Meanwhile, in the oceans, microscopic plants drift through the sunlit surface waters and bloom into billions of carbon dioxide-absorbing organisms — and light-detecting instruments on satellites map the swirls of their color.
This fall marks 20 years since NASA has continuously observed not just the physical properties of our planet, but the one thing that makes Earth unique among the thousands of other worlds we've discovered: Life.
Satellites measured land and ocean life from space as early as the 1970s. But it wasn't until the launch of SeaWiFS (Sea-viewing Wide Field-of-view Sensor) in 1997 that the space agency began what is now a continuous, global view of both land and ocean life. A new animation captures the entirety of this 20-year record, made possible by multiple satellites, compressing a decades-long view of life on Earth into a captivating few minutes.
"These are incredibly evocative visualizations of our living planet," said Gene Carl Feldman, an oceanographer at NASA's Goddard Space Flight Center in Greenbelt, Maryland. "That's the Earth, that is it breathing every single day, changing with the seasons, responding to the Sun, to the changing winds, ocean currents and temperatures."
The space-based view of life allows scientists to monitor crop, forest and fisheries health around the globe. But the space agency's scientists have also discovered long-term changes across continents and ocean basins. As NASA begins its third decade of global ocean and land measurements, these discoveries point to important questions about how ecosystems will respond to a changing climate and broad-scale changes in human interaction with the land.
Satellites have measured the Arctic getting greener, as shrubs expand their range and thrive in warmer temperatures. Observations from space help determine agricultural production globally, and are used in famine early warning detection. As ocean waters warm, satellites have detected a shift in phytoplankton populations across the planet's five great ocean basins — the expansion of "biological deserts" where little life thrives. And as concentrations of carbon dioxide in the atmosphere continue to rise and warm the climate, NASA's global understanding of plant life will play a critical role in monitoring carbon as it moves through the Earth system.
Figure 45: From space, satellites can see Earth breathe. A new NASA visualization shows 20 years of continuous observations of plant life on land and at the ocean's surface, from September 1997 to September 2017. On land, vegetation appears on a scale from brown (low vegetation) to dark green (lots of vegetation); at the ocean surface, phytoplankton are indicated on a scale from purple (low) to yellow (high). This visualization was created with data from satellites including SeaWiFS, and instruments including the NASA/NOAA VIIRS (Visible Infrared Imaging Radiometer Suite) and the MODIS (Moderate Resolution Imaging Spectroradiometer) (image credit: NASA)
Sixty years ago, people were not sure that Earth's surface could be seen clearly from space. Many thought that the dust particles and other aerosols in the atmosphere would scatter the light, masking the oceans and continents, said Jeffrey Masek, chief of the Biospheric Sciences Laboratory at NASA Goddard.
The Gemini and Apollo programs demonstrated otherwise. Astronauts used specialized cameras to take pictures of Earth that show the beauty and complexity of our living planet, and helped kickstart the era of Earth science research from space. In 1972, the first Landsat mission began its 45-year record of vegetation and land cover. "As the satellite archive expands, you see more and more dynamics emerging," Masek said. "We're now able to look at long-term trends."
The grasslands of Senegal, for example, undergo drastic seasonal changes. Grasses and shrubs flourish during the rainy season from June to November, then dry up when the rain stops. With early weather satellite data in the 1970s and '80s, NASA Goddard scientist Compton Tucker was able to see that greening and die-back from space, measuring the chlorophyll in the plants below. He developed a way of comparing satellite data from two wavelengths, which gives a quantitative measurement of this greenness called the NDVI (Normalized Difference Vegetation Index).
"We were astounded when we saw the first images. They were amazing because they showed how vegetation changed annually, year after year," Tucker said, noting that others were surprised as well when the study came out in 1985. "When we produced this paper, people accused us of ‘painting by numbers,' or fudging data. But for the first time, you could study vegetation from space based on their photosynthetic capacity."
When the temperature is right, and water and sunlight are available, plants photosynthesize and produce vegetative material. Leaves strongly absorb blue and red light but reflect near-infrared light back into space. By comparing the ratio of red to near-infrared light, Tucker and his colleagues could quantify the vegetation covering the land.
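The two-wavelength comparison Tucker developed is the NDVI mentioned above: the difference between near-infrared and red reflectance, normalized by their sum. A minimal sketch, with illustrative (not measured) reflectance values:

```python
def ndvi(red: float, nir: float) -> float:
    """NDVI = (NIR - Red) / (NIR + Red).
    Healthy vegetation absorbs red light for photosynthesis but
    reflects near-infrared, so NDVI rises toward +1 over dense canopy
    and sits near zero over bare or senescent ground."""
    return (nir - red) / (nir + red)

# Illustrative surface reflectances (fraction of incoming light):
dense_forest = ndvi(red=0.05, nir=0.50)   # ~0.82: vigorous growth
dry_grassland = ndvi(red=0.30, nir=0.40)  # ~0.14: sparse, dry cover
print(f"forest NDVI = {dense_forest:.2f}, grassland NDVI = {dry_grassland:.2f}")
```

Normalizing by the sum is the key design choice: it largely cancels variations in illumination and viewing geometry, which is what made the index comparable across scenes, seasons, and satellites in the global records described here.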
Expanding these observations to the rest of the globe, the scientists could track the impact on plants of rainy and dry seasons elsewhere in Africa, see the springtime blooms in North America, and the after-effects of wildfires in forests worldwide.
But land is only part of the story. At the base of the ocean's food web is phytoplankton — tiny organisms that, like land plants, turn water and carbon dioxide into sugar and oxygen, aided by the right combination of nutrients and sunlight.
Satellites that can monitor the subtle changes in color of the ocean have helped scientists track changes in phytoplankton populations across the globe. The first view of ocean color came from the CZCS (Coastal Zone Color Scanner), a proof-of-concept instrument launched in 1979. Continuous observations of ocean color began with the launch of SeaWiFS in late 1997. The satellite was just in time to capture the transition from El Niño to La Niña conditions in 1998 — revealing just how quickly and dramatically phytoplankton respond to changing ocean conditions.
"The entire Eastern Pacific, from the coast of South America all the way to the dateline, transitioned from what was the equivalent of a biological desert to a thriving rainforest. And we watched it happen in real time," Feldman said. "For me, that was the first demonstration of the power of this kind of observation, to see how the ocean responds to one of the most significant environmental perturbations it could experience, over the course of just a few weeks. It also showed that the ocean and all the life within it is amazingly resilient — if given half a chance."
Figure 46: The SeaWiFS satellite was launched in late 1997, just in time to capture the phytoplankton that bloomed in the Eastern Equatorial Pacific as conditions changed from El Niño to La Niña, seen here in yellow (image credit: NASA)
Tracking change from satellites: With 20 years of satellite data tracking ocean plant life on a global scale, scientists are investigating how habitats and ecosystems are responding to changing environmental conditions.
Recent studies of ocean life have shown that a long-term trend of rising sea surface temperatures is causing ocean regions known as "biological deserts" to expand. These regions of low phytoplankton growth occur in the center of large, slow-moving currents called gyres.
"As the surface waters warm, it creates a stronger boundary between the deep, cold, nutrient-rich waters and the sunlit, generally nutrient-poor surface waters," Feldman said. This prevents nutrients from reaching phytoplankton at the surface, and could have significant consequences for fisheries and the marine ecosystem.
In the Arctic Ocean, an explosion of phytoplankton indicates change. As seasonal sea ice melts, warming waters and more sunlight will trigger a sudden, massive phytoplankton bloom that feeds birds, sea lions and newly hatched fish. But with warming atmospheric temperatures, that bloom is now happening several weeks early — before the animals are in place to take advantage of it. "It's not just the amount of food, it's the location and timing that are just as critical," Feldman said. "Spring bloom is coming earlier, and that's going to impact the ecosystem in ways we don't yet understand."
The climate is warming fastest in Arctic regions, and the impacts on land are visible from space as well. The tundra of Western Alaska, Quebec and elsewhere is turning greener as shrubs extend their reach northwards.
The neighboring northern forests are changing as well. Massive fires in 2004 and 2015 wiped out millions of acres of forests in Alaska, including spruce forests, noted Chris Potter, a research scientist at NASA's Ames Research Center in California's Silicon Valley. "These fires were amazing in the amount of forest area they burned and how hot they burned," Potter said. "When the air temperature hits 90 degrees Fahrenheit (32°C) in late May up there, and all these lightning strikes occurred, the forest burned very extensively — close to rivers, close to villages — and nothing could stop it."
Satellites help scientists routinely map fires, deforestation and other changes, and to analyze their impact on the carbon cycle, Potter said. Giant fires release many tons of carbon dioxide into the atmosphere, both from the charred trees and moss but also, especially in northern latitudes, from the soils. Potter and colleagues went to the burned areas of Central Alaska this year to measure the underlying permafrost — the thick mossy layer had burned off, exposing the previously frozen soils. "It's like taking the insulating layer off a cooler," he said. "The ice melts underneath and it becomes a slushy mess."
Forest types can change too, whether it's after wildfires, insect infestations or other disturbance. The Alaskan spruce forests are being replaced with birch. Potter and his colleagues are also keeping an eye on California forests burned in recent fires, where the concern is that pines will be replaced by oaks. "When drought is accentuated with these record high temperatures, nothing good seems to come from that for the existing forest type," he said. "I think we're seeing real clear evidence of climate causing land-cover change."
Keeping an eye on crops: Changing temperatures and rainfall patterns also influence crops, whether they are grown in California or Africa. The "greenness" measurement that scientists use to measure forests and grasslands can also be used for agriculture, to monitor the health of fields throughout the growing season.
Researchers and policy makers realized this potential early. One of the first applications of Landsat data in the 1970s was to predict grain yields in Russia and better understand commodities markets. In 1985, food security analysts from USAID (United States Agency for International Development) approached NASA to incorporate satellite images into their Famine Early Warning Systems Network, to identify regions where food production has been limited by drought. That partnership continues today. With rainfall estimates, vegetation measurements, as well as the recent addition of soil moisture information, NASA scientists can help organizations like USAID direct emergency help.
With improved data from Landsat, the MODIS instruments on NASA's Terra and Aqua spacecraft and other satellites, and by combining data from multiple sensors, researchers are now able to track the growth of crops in individual fields, Tucker said.
The view from space not only helps monitor crops, but can help improve agricultural practices as well. A winery in California, for example, uses individual pixels of Landsat data to determine when to irrigate and how much water to use.
The next step for NASA scientists is actually looking at the process of photosynthesis from space. When plants undergo that chemical process, some of the absorbed energy fluoresces faintly back, notes Joanna Joiner, a NASA Goddard research scientist. With satellites that detect signals in the very specific wavelengths of this fluorescence, and a fine-tuned analysis technique that blocks out background signals, Joiner and her colleagues can see where and when plants start converting sunlight into sugars. - "It was kind of a revelation that yes, you can measure it," Joiner said. An early study looked at the U.S. Corn Belt and found it fluoresces "like crazy," she said. "Those plants have some of the highest fluorescence rates on Earth at their peak."
Joiner and Tucker are using both the fluorescence data and vegetation indices to get the most information possible about plant growth at regional and global scales: "One of the big questions that still remains is how much carbon are the plants taking up, why does it vary year to year, and which areas are contributing to that variability," Joiner said.
Whether it's crops, forests or phytoplankton blooms, NASA scientists are tracking life on Earth. Just as satellites help researchers study the atmosphere, rainfall and other physical characteristics of the planet, the ever-improving view from above will allow them to study the interconnected life of the planet, Feldman said.