EO (Earth Observation) Topics on Climate Change
Since the start of the space age, Earth observation has been providing its share of evidence for a better perception and understanding of our Earth system and its response to natural and human-induced changes.
Earth is a complex, dynamic system we do not yet fully understand. The Earth system comprises diverse components that interact in complex ways. We need to understand the Earth's atmosphere, lithosphere, hydrosphere, cryosphere, and biosphere as a single connected system. Our planet is changing on all spatial and temporal scales.
Over the years, the entire Earth Observation community, including the space agencies, other governmental bodies, and many international organizations (UN, etc.), has been cooperating on a global scale to come to grips with the modeling of the Earth system, including a continuous process of re-assessment and improvement of these models. The goal is to provide scientific evidence to help guide society onto a sustainable pathway during this period of rapid global change.
In the second decade of the 21st century, there is alarming evidence that important tipping points, leading to irreversible changes in major ecosystems and the planetary climate system, may already have been reached or passed. Ecosystems as diverse as the Amazon rainforest and the Arctic tundra may be approaching thresholds of dramatic change through warming and drying. Mountain glaciers are in alarming retreat, and the downstream effects of reduced water supply in the driest months will have repercussions that transcend generations. 1)
Table 1: Overview of some major international bodies involved in global-change research programs 2)
The UN Framework Convention on Climate Change (UNFCCC) is an intergovernmental treaty developed to address the problem of climate change. The Convention, which sets out an agreed framework for dealing with the issue, was negotiated from February 1991 to May 1992 and opened for signature at the June 1992 UN Conference on Environment and Development (UNCED) — also known as the Rio Earth Summit. The UNFCCC entered into force on 21 March 1994, ninety days after the 50th country's ratification had been received. By December 2007, the convention had been ratified by 192 countries. 3)
Since then, there have been many UN conferences on Climate Change, starting with the UN climate conference in Kyoto, Japan, in December 1997. The Kyoto Protocol set binding emission targets for certain industrialized countries; those targets expired in 2012.
Meanwhile, greenhouse gas emissions from both developed and developing countries have been increasing rapidly. Even today, the nations responsible for the largest share of pollution are unwilling to enforce stricter environmental standards at home, in order to protect their global business interests. The result is a vicious cycle between national interests and a deteriorating environment, producing more frequent and more violent catastrophes on a global scale. All people on Earth are affected, even those who abide by strict environmental rules.
The short descriptions in the following chapters are presented in reverse chronological order and cover selected topics of climate change, to give the reader an overview of research results in the wide field of global climate and environmental change.
Study of Extreme Wintertime Arctic Warm Event
January 16, 2018: In the winter of 2015/16, something happened that had never before been seen on this scale: at the end of December, temperatures rose above zero degrees Celsius for several days in parts of the Arctic. Temperatures of up to eight degrees were registered north of Svalbard. Temperatures this high have not been recorded in the winter half of the year since the beginning of systematic measurements at the end of the 1970s. As a result of this unusual warmth, the sea ice began to melt. 4)
"We heard about this from the media," says Heini Wernli, Professor of Atmospheric Dynamics at ETH Zurich. The news aroused his scientific curiosity, and a team led by his then doctoral student Hanin Binder investigated the issue. In December 2017, they published their analysis of this exceptional event in the journal Geophysical Research Letters. 5)
The researchers show in their paper how these unusual temperatures arose: three different air currents met over the North Sea between Scotland and southern Norway, carrying warm air northwards at high speed as though on a "highway" (Figure 1).
One air current originated in the Sahara and brought near-surface warm air with it. To begin with, the temperature of this air was about 20ºC. While it cooled off on its way to the Arctic, it was still above zero when it arrived. "It's extremely rare for warm, near-surface subtropical air to be transported as far as the Arctic," says Binder.
The second air current originated in the Arctic itself, a fact that astonished the scientists. To begin with, this air was very cold. However, the air mass – which also lay close to the ground – moved towards the south along a curved path and, while above the Atlantic, was warmed significantly by the heat flux from the ocean before joining the subtropical air current.
The third warm air current started as a cold air mass in the upper troposphere, from an altitude above 5 km. These air masses were carried from west to east and descended in a stationary high-pressure area over Scandinavia. Compression thereby warmed the originally cold air, before it entered the "highway to the Arctic".
Figure 1: Schematic illustration of the unusual processes that led to the Arctic warm event (warm air highway), image credit: Sandro Bösch / ETH Zurich
Poleward warm air transport: This highway of air currents was made possible by a particular constellation of pressure systems over northern Europe. During the period in question, intense low-pressure systems developed over Iceland while an extremely stable high-pressure area formed over Scandinavia. This created a kind of funnel above the North Sea, between Scotland and southern Norway, which channelled the various air currents and steered them northwards to the Arctic.
This highway lasted approximately a week. The pressure systems then decayed and the Arctic returned to its typical frozen winter state. However, the warm period sufficed to reduce the thickness of the sea ice in parts of the Arctic by 30 cm – during a period in which ice usually becomes thicker and more widespread.
"These weather conditions and their effect on the sea ice were really exceptional," says Binder. The researchers were not able to identify a direct link to global warming. "We only carried out an analysis of a single event; we didn't research the long-term climate aspects," emphasizes Binder.
However, the melting of Arctic sea ice during summer is a different story. The long-term trend is clear: the minimum extent and thickness of the sea ice in late summer has been shrinking continually since the end of the 1970s. Sea ice melted particularly severely in 2007 and 2012 – a fact which climate researchers have thus far been unable to fully explain. Along with Lukas Papritz from the University of Bergen, Wernli investigated the causes of these outliers.
According to their research, the severe melting in the aforementioned years was caused by stable high-pressure systems that formed repeatedly throughout the summer months. Under these cloud-free weather conditions, the high level of direct sunlight – the sun shines 24 hours a day at this time of year – particularly intensified the melting of the sea ice.
The extreme event was the result of a very unusual large-scale flow configuration in early winter 2015/2016 that came along with overall anomalously warm conditions in Europe (National Oceanic and Atmospheric Administration, 2016) and other regional extremes, for example, flooding in the UK. 6) In this study (Ref. 5), we focus on the Arctic. At the North Pole, buoys measured maximum surface temperatures of -0.8ºC on 30 December 7), and at the Svalbard airport station values of 8.7ºC were observed, the warmest temperatures ever recorded at that station between November and April (The Norwegian Meteorological Institute, 2016). According to operational analyses from the ECMWF (European Center for Medium-Range Weather Forecasts), the maximum 2 m temperature (T2m) north of 82ºN reached values larger than 0ºC during three short episodes between 29 December 2015 and 4 January 2016, almost 30 K above the winter climatological mean in this region (Figure 2a). They occurred in the Eurasian Arctic sector in the region around Svalbard and over the Kara Sea (purple contour in Figure 2b) and were the highest winter values since 1979 (Figure 2c). The warm event led to a thinning of the sea ice by more than 30 cm in the Barents and Kara Seas, and contributed to the record low Northern Hemisphere sea ice extent observed in January and February 2016 (National Snow and Ice Data Center, 2016).
Figure 2: Illustration of the Arctic warm event and its extremeness. (a) Temporal evolution of the domain maximum (red) and mean (blue) T2m (ºC) between 20 December 2015 and 10 January 2016 at latitudes ≥82ºN and between 120ºW and 120ºE, derived from operational analyses. Also shown are the domain mean December–February 1979–2014 climatological mean T2m (black), and the corresponding ±1 standard deviation envelope (grey) from ERA-Interim reanalysis data. (b) Maximum T2m (ºC) between 00 UTC 30 December 2015 and 18 UTC 4 January 2016 from operational analyses, with the purple contour highlighting the regions ≥82ºN with maximum T2m ≥ 0ºC. (c) Rank of maximum T2m shown in Figure 2b among all 6-hourly values in winter 1979–2014 in the ERA-Interim reanalyses (consisting of a total of 13,232 values), image credit: study team
Long-Term Warming Trend Continued in 2017
January 18, 2018: Earth's global surface temperatures in 2017 ranked as the second warmest since 1880, according to an analysis by NASA. Continuing the planet's long-term warming trend, globally averaged temperatures in 2017 were 1.62º Fahrenheit (0.90º Celsius) warmer than the 1951 to 1980 mean, according to scientists at NASA's GISS (Goddard Institute for Space Studies) in New York. That is second only to global temperatures in 2016. 8)
In a separate, independent analysis, scientists at NOAA (National Oceanic and Atmospheric Administration) concluded that 2017 was the third-warmest year in their record. The minor difference in rankings is due to the different methods used by the two agencies to analyze global temperatures, although over the long-term the agencies' records remain in strong agreement. Both analyses show that the five warmest years on record all have taken place since 2010.
Because weather station locations and measurement practices change over time, there are uncertainties in the interpretation of specific year-to-year global mean temperature differences. Taking this into account, NASA estimates that 2017's global mean change is accurate to within 0.1º Fahrenheit, with a 95 percent certainty level.
"Despite colder than average temperatures in any one part of the world, temperatures over the planet as a whole continue the rapid warming trend we've seen over the last 40 years," said GISS Director Gavin Schmidt.
The planet's average surface temperature has risen about 2 degrees Fahrenheit (a little more than 1 degree Celsius) during the last century or so, a change driven largely by increased carbon dioxide and other human-made emissions into the atmosphere. Last year was the third consecutive year in which global temperatures were more than 1.8 degrees Fahrenheit (1 degree Celsius) above late nineteenth-century levels.
Phenomena such as El Niño or La Niña, which warm or cool the upper tropical Pacific Ocean and cause corresponding variations in global wind and weather patterns, contribute to short-term variations in global average temperature. A warming El Niño event was in effect for most of 2015 and the first third of 2016. Even without an El Niño event – and with a La Niña starting in the later months of 2017 – last year's temperatures ranked between 2015 and 2016 in NASA's records.
Figure 3: This map shows Earth's average global temperature from 2013 to 2017, as compared to a baseline average from 1951 to 1980, according to an analysis by NASA's Goddard Institute for Space Studies. Yellows, oranges, and reds show regions warmer than the baseline (image credit: NASA's Scientific Visualization Studio)
In an analysis where the effects of the recent El Niño and La Niña patterns were statistically removed from the record, 2017 would have been the warmest year on record.
Weather dynamics often affect regional temperatures, so not every region on Earth experienced similar amounts of warming. NOAA found the 2017 annual mean temperature for the contiguous 48 United States was the third warmest on record.
Warming trends are strongest in the Arctic regions, where 2017 saw the continued loss of sea ice.
NASA's temperature analyses incorporate surface temperature measurements from 6,300 weather stations, ship- and buoy-based observations of sea surface temperatures, and temperature measurements from Antarctic research stations.
These raw measurements are analyzed using an algorithm that considers the varied spacing of temperature stations around the globe and urban heating effects that could skew the conclusions. These calculations produce the global average temperature deviations from the baseline period of 1951 to 1980.
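The weighting idea behind such a global average can be sketched as follows. This is a simplified illustration with hypothetical station anomalies; the actual GISTEMP algorithm additionally grids the stations, interpolates between them, and adjusts urban records.

```python
import math

# Simplified sketch of one ingredient of a global mean temperature anomaly:
# averaging station anomalies with cos(latitude) weights so that densely
# sampled high latitudes do not dominate the mean. The station values below
# are hypothetical; the real analysis also grids, interpolates, and corrects
# for urban-heating effects.

def global_mean_anomaly(stations):
    """stations: list of (latitude_deg, anomaly_degC) pairs."""
    weights = [math.cos(math.radians(lat)) for lat, _ in stations]
    anomalies = [a for _, a in stations]
    return sum(w * a for w, a in zip(weights, anomalies)) / sum(weights)

stations = [(80.0, 2.5), (45.0, 1.0), (0.0, 0.6), (-45.0, 0.8)]
print(f"{global_mean_anomaly(stations):.2f} degC global mean anomaly")
```

Note how the 2.5ºC anomaly at 80ºN contributes far less to the mean than the equatorial station, reflecting the smaller surface area it represents.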
NOAA scientists used much of the same raw temperature data, but with a different baseline period, and different methods to analyze Earth's polar regions and global temperatures. The full 2017 surface temperature data set and the complete methodology used to make the temperature calculation are available.
GISS is a laboratory within the Earth Sciences Division of NASA's Goddard Space Flight Center in Greenbelt, Maryland. The laboratory is affiliated with Columbia University's Earth Institute and School of Engineering and Applied Science in New York.
NASA uses the unique vantage point of space to better understand Earth as an interconnected system. The agency also uses airborne and ground-based monitoring, and develops new ways to observe and study Earth with long-term data records and computer analysis tools to better see how our planet is changing. NASA shares this knowledge with the global community and works with institutions in the United States and around the world that contribute to understanding and protecting our home planet.
Study of Antarctic Ozone Hole Recovery
January 5, 2018: For the first time, scientists have shown through direct observations of the ozone hole by an instrument on NASA's Aura mission, that levels of ozone-destroying chlorine are declining, resulting in less ozone depletion. Measurements show that the decline in chlorine, resulting from an international ban on chlorine-containing human-produced chemicals called chlorofluorocarbons (CFCs), has resulted in about 20 percent less ozone depletion during the Antarctic winter than there was in 2005 — the first year that measurements of chlorine and ozone during the Antarctic winter were made by the Aura satellite. 9)
- "We see very clearly that chlorine from CFCs is going down in the ozone hole, and that less ozone depletion is occurring because of it," said lead author Susan Strahan, an atmospheric scientist from NASA's Goddard Space Flight Center in Greenbelt, Maryland. The study was published in the journal Geophysical Research Letters. 10)
- CFCs are long-lived chemical compounds that eventually rise into the stratosphere, where they are broken apart by the Sun's ultraviolet radiation, releasing chlorine atoms that go on to destroy ozone molecules. Stratospheric ozone protects life on the planet by absorbing potentially harmful ultraviolet radiation that can cause skin cancer and cataracts, suppress immune systems and damage plant life.
- Two years after the discovery of the Antarctic ozone hole in 1985, nations of the world signed the Montreal Protocol on Substances that Deplete the Ozone Layer, which regulated ozone-depleting compounds. Later amendments to the Montreal Protocol completely phased out production of CFCs.
- Past studies have used statistical analyses of changes in the ozone hole's size to argue that ozone depletion is decreasing. This study is the first to use measurements of the chemical composition inside the ozone hole to confirm that not only is ozone depletion decreasing, but that the decrease is caused by the decline in CFCs.
- The Antarctic ozone hole forms during September in the Southern Hemisphere's winter as the returning Sun's rays catalyze ozone destruction cycles involving chlorine and bromine that come primarily from CFCs. To determine how ozone and other chemicals have changed year to year, scientists used data from JPL's MLS (Microwave Limb Sounder) aboard the Aura satellite, which has been making measurements continuously around the globe since mid-2004. While many satellite instruments require sunlight to measure atmospheric trace gases, MLS measures microwave emissions and, as a result, can measure trace gases over Antarctica during the key time of year: the dark southern winter, when the stratospheric weather is quiet and temperatures are low and stable.
Figure 4: Using measurements from NASA's Aura satellite, scientists studied chlorine within the Antarctic ozone hole over the last several years, watching as the amount slowly decreased (image credit: NASA/GSFC, Katy Mersmann)
The change in ozone levels above Antarctica from the beginning to the end of southern winter — early July to mid-September — was computed daily from MLS measurements every year from 2005 to 2016. "During this period, Antarctic temperatures are always very low, so the rate of ozone destruction depends mostly on how much chlorine there is," Strahan said. "This is when we want to measure ozone loss."
They found that ozone loss is decreasing, but they needed to know whether a decrease in CFCs was responsible. When ozone destruction is ongoing, chlorine is found in many molecular forms, most of which are not measured. But after chlorine has destroyed nearly all the available ozone, it reacts instead with methane to form hydrochloric acid, a gas measured by MLS. "By around mid-October, all the chlorine compounds are conveniently converted into one gas, so by measuring hydrochloric acid we have a good measurement of the total chlorine," Strahan said.
Nitrous oxide is a long-lived gas that behaves just like CFCs in much of the stratosphere. The CFCs are declining at the surface but nitrous oxide is not. If CFCs in the stratosphere are decreasing, then over time, less chlorine should be measured for a given value of nitrous oxide. By comparing MLS measurements of hydrochloric acid and nitrous oxide each year, they determined that the total chlorine levels were declining on average by about 0.8 percent annually.
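The trend estimate described above can be illustrated with a minimal sketch: a simple least-squares fit to hypothetical HCl-to-N2O ratio values (not the actual MLS data), constructed to decline at roughly the reported rate.

```python
# Illustrative sketch (not the study's actual data or code): estimating a
# yearly chlorine decline by fitting a linear trend to a total-chlorine
# proxy, the way a trend would be fit to the MLS-derived values described
# in the text.

def linear_trend(years, values):
    """Ordinary least-squares slope of values vs. years."""
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(values) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, values))
    var = sum((x - mean_x) ** 2 for x in years)
    return cov / var

# Hypothetical total-chlorine proxy (HCl normalized by N2O), one value per
# year, declining by about 0.8% of the 2005 level each year.
years = list(range(2005, 2017))
chlorine = [3.2 * (1 - 0.008) ** (y - 2005) for y in years]

slope = linear_trend(years, chlorine)
percent_per_year = 100 * slope / chlorine[0]
print(f"trend: {percent_per_year:.2f}% per year")  # close to -0.8% per year
```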
The 20 percent decrease in ozone depletion during the winter months from 2005 to 2016 as determined from MLS ozone measurements was expected. "This is very close to what our model predicts we should see for this amount of chlorine decline," Strahan said. "This gives us confidence that the decrease in ozone depletion through mid-September shown by MLS data is due to declining levels of chlorine coming from CFCs. But we're not yet seeing a clear decrease in the size of the ozone hole because that's controlled mainly by temperature after mid-September, which varies a lot from year to year."
Looking forward, the Antarctic ozone hole should continue to recover gradually as CFCs leave the atmosphere, but complete recovery will take decades. "CFCs have lifetimes from 50 to 100 years, so they linger in the atmosphere for a very long time," said Anne Douglass, a fellow atmospheric scientist at Goddard and the study's co-author. "As far as the ozone hole being gone, we're looking at 2060 or 2080. And even then there might still be a small hole."
Study Solves a Conflict in the Post-2006 Atmospheric Methane Budget
January 2, 2018: A new NASA-led study has solved a puzzle involving the recent rise in atmospheric methane, a potent greenhouse gas, with a new calculation of emissions from global fires. The new study resolves what looked like irreconcilable differences in explanations for the increase. 11)
Methane emissions have been rising sharply since 2006. Different research teams have produced viable estimates for two known sources of the increase: emissions from the oil and gas industry, and microbial production in wet tropical environments like marshes and rice paddies. But when these estimates were added to estimates of other sources, the sum was considerably more than the observed increase. In fact, each new estimate was large enough to explain the whole increase by itself.
John Worden of NASA's Jet Propulsion Laboratory in Pasadena, California, and colleagues focused on fires because they're also changing globally. The area burned each year decreased about 12 percent between the early 2000s and the more recent period of 2007 to 2014, according to a new study using observations by NASA's MODIS (Moderate Resolution Imaging Spectroradiometer) satellite instrument. The logical assumption would be that methane emissions from fires have decreased by about the same percentage. Using satellite measurements of methane and carbon monoxide, Worden's team found the real decrease in methane emissions was almost twice as much as that assumption would suggest.
When the research team subtracted this large decrease from the sum of all emissions, the methane budget balanced correctly, with room for both fossil fuel and wetland increases. The research is published in the journal Nature Communications. 12)
Most methane molecules in the atmosphere don't have identifying features that reveal their origin. Tracking down their sources is a detective job involving multiple lines of evidence: measurements of other gases, chemical analyses, isotopic signatures, observations of land use, and more. "A fun thing about this study was combining all this different evidence to piece this puzzle together," Worden said.
Carbon isotopes in the methane molecules are one clue. Of the three methane sources examined in the new study, emissions from fires contain the largest percentage of heavy carbon isotopes, microbial emissions have the smallest, and fossil fuel emissions are in between. Another clue is ethane, which (like methane) is a component of natural gas. An increase in atmospheric ethane indicates increasing fossil fuel sources. Fires emit carbon monoxide as well as methane, and measurements of that gas are a final clue.
Worden's team used carbon monoxide and methane data from the MOPITT (Measurements of Pollution in the Troposphere) instrument on NASA's Terra satellite and the TES (Tropospheric Emission Spectrometer) instrument on NASA's Aura satellite to quantify fire emissions of methane. The results show these emissions have been decreasing much more rapidly than expected.
Combining isotopic evidence from ground surface measurements with the newly calculated fire emissions, the team showed that about 17 teragrams per year of the increase is due to fossil fuels, another 12 comes from wetlands or rice farming, while fire emissions are decreasing by about 4 teragrams per year. The three numbers combine to a net emissions increase of about 25 Tg/year of CH4, the same as the observed increase.
For scale: 1 Tg (teragram) = 10^12 g = 1 million metric tons. Methane emissions are increasing by about 25 Tg/year, out of total current emissions of about 550 Tg/year.
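The budget closure is simple arithmetic and can be checked directly, using the values quoted in the text:

```python
# Checking the methane budget closure with the numbers quoted above
# (all values in Tg CH4 per year, taken from the text).
fossil_fuel_increase = 17.0    # increase attributed to fossil fuels
wetland_rice_increase = 12.0   # increase from wetlands / rice farming
fire_decrease = 4.0            # decline in biomass-burning emissions

net_increase = fossil_fuel_increase + wetland_rice_increase - fire_decrease
assert net_increase == 25.0    # matches the observed ~25 Tg/year rise

# For scale: 1 Tg = 10**12 g = one million metric tons, against total
# current emissions of ~550 Tg/year.
fraction_of_total = net_increase / 550.0
print(f"annual increase is {fraction_of_total:.1%} of total emissions")
```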
Worden's coauthors are at the NCAR (National Center for Atmospheric Research), Boulder, Colorado; and the Netherlands Institute for Space Research and University of Utrecht, both in Utrecht, the Netherlands.
Figure 5: This time series was created using data from the MODIS instrument data onboard NASA's Terra and Aqua satellites. The burned area is estimated by applying an algorithm that detects rapid changes in visible and infrared surface reflectance imagery. Fires typically darken the surface in the visible part of the electromagnetic spectrum, and brighten the surface in several wavelength bands in the shortwave infrared that are sensitive to the surface water content of vegetation (image credit: NASA/GSFC/SVS)
Legend to Figure 5: Thermal emissions from actively burning fires also are measured by MODIS and are used to improve the burned area estimates in croplands and other areas where the fire sizes are relatively small. This animation portrays burned area between September 2000 and August 2015 as a percent of the 1/4 degree grid cell that was burned each month. The values on the color bar are on a log scale, so the regions shown in blue and green shades indicate small burned areas while those in red and orange represent a larger percent of the region burned. Beneath the burned area, the seasonal Blue Marble landcover shows the advance and retreat of snow in the northern hemisphere.
Trend in CH4 emissions from fires. Figure 6 shows the time series of CH4 emissions that were obtained from GFEDv4s (Global Fire Emissions Database, version 4s) and top-down estimates based on CO emission estimates and GFED4s-based emission ratios. The CO-based fire CH4 emissions estimates amount to 14.8 ± 3.8 Tg CH4 per year for the 2001–2007 time period and 11.1 ± 3 Tg CH4 per year for the 2008–2014 time period, with a 3.7 ± 1.4 Tg CH4 per year decrease between the two time periods. The mean burnt area (a priori)-based estimate from GFED4s is slightly larger and shows a slightly smaller decrease (2.3 Tg CH4 per year) in fire emissions after 2007 relative to the 2001–2006 time period. The range of uncertainties (shown as blue error bars in Figure 6) is determined by the uncertainty in top-down CO emission estimates that are derived empirically using the approaches discussed in the Methods. The red shading describes the range of uncertainty stemming from uncertainties in CH4/CO emission factors (Methods). By assuming temporally constant sector-specific CH4/CO emission factors, we find that mean 2001–2014 emissions average to 12.9 ± 3.3 Tg CH4 per year, and the decrease averages to 3.7 ± 1.4 Tg CH4 per year for 2008–2014, relative to 2001–2007. This decrease is largely accounted for by a 2.9 ± 1.2 Tg CH4 per year decrease during 2006–2008, which is primarily attributable to a biomass burning decrease in Indonesia and South America.
Figure 6: Trend of methane emissions from biomass burning. Expected methane emissions from fires based on the Global Fire Emissions Database (black) and the CO emissions plus CH4/CO ratios shown here (red). The range of uncertainties in blue is due to the calculated errors from the CO emissions estimate and the shaded red describes the range of error from uncertainties in the CH4/CO emission factors (image credit: Methane Study Team)
Industrial-age doubling of snow accumulation in the Alaska Range linked to tropical ocean warming
December 19, 2017: Snowfall on a major summit in North America's highest mountain range has more than doubled since the beginning of the Industrial Age, according to a study from Dartmouth College, the University of Maine, and the University of New Hampshire. The research not only finds a dramatic increase in snowfall, it further explains connections in the global climate system by attributing the record accumulation to warmer waters thousands of miles away in the tropical Pacific and Indian Oceans. 13)
The study demonstrates that modern snowfall in the iconic Alaska Range is unprecedented for at least the past 1200 years and far exceeds normal variability. "We were shocked when we first saw how much snowfall has increased," said Erich Osterberg, an assistant professor of Earth sciences at Dartmouth College and principal investigator for the research. "We had to check and double-check our results to make sure of the findings. Dramatic increases in temperature and air pollution in modern times have been well established in science, but now we're also seeing dramatic increases in regional precipitation with climate change."
According to the research, wintertime snowfall has increased 117 percent since the mid-19th century in southcentral Alaska in the United States. Summer snowfall also showed a significant increase, of 49 percent, over the same period of less than two hundred years.
The research, appearing in Scientific Reports, is based on analysis of two ice cores (each 208 m long) collected from the Mount Hunter summit plateau (62°56'N, 151°5'W, 3900 m) in Denali National Park, Alaska. A high snow accumulation rate (1.15 m water equivalent [w. e.] average since 1900) and infrequent surface melt (<0.5% of the core is composed of refrozen melt layers and lenses) at the Mt. Hunter drill site preserve robust seasonal oscillations of several chemical parameters (Na, Ca, Mg, NH4+, MSA (methanesulfonic acid), δ18O, liquid conductivity, dust concentration), facilitating annual layer counting back to 800 CE (Common Era, Figure 7). — According to the authors, accumulation records in the separate samples taken from just below the summit of the mountain known as "Denali's Child" are in nearly complete agreement. 14)
Figure 7: Annual layer counting in the Mt. Hunter ice core. (A) Three chemical series exhibiting annual layers are shown at a representative depth of core: Mg (black), δ18O (blue) and MSA (red). Each vertical dotted line represents the depth of Jan. 1st in a δ18O trough and just below a Mg peak. The distance between each vertical dotted line represents one year's snow accumulation (before thinning correction). The position of these years was selected three times by three independent researchers. We delineate summer (May-August) and winter (September-April) seasons by recording the late summer-fall peak positions of MSA (purple circles) and the spring peak positions of Mg (orange circles).
The annually resolved Denali snow accumulation record (Figure 8) indicates that the post-1950 precipitation increase seen in the Alaskan weather station records began well before the 20th century, circa 1840 CE.
"It is now glaringly clear from our ice core record that modern snowfall rates in Alaska are much higher than natural rates before the Industrial Revolution," said Dominic Winski, a research assistant at Dartmouth and the lead author of the report. "This increase in precipitation is also apparent in weather station data from the past 50 years, but ice cores show the scale of the change well above natural conditions."
Once the researchers established snowfall rates, they set out to identify why precipitation has increased so rapidly in such a short amount of time. Scientific models predict as much as a 2 percent increase in global precipitation per degree of warming because warmer air holds more moisture, but this could not account for most of the dramatic increases in Denali snowfall over the studied period.
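To see why thermodynamic scaling alone falls short, one can solve for the warming that a roughly 2-percent-per-degree moisture increase would need in order to produce the observed 117 percent winter snowfall rise. This is an illustrative calculation, not one from the study.

```python
import math

# If precipitation scaled only thermodynamically at ~2% per degree C of
# warming (the figure quoted in the text), what warming would a 117%
# snowfall increase require?
scaling_per_degC = 0.02    # fractional precipitation increase per degree C
observed_increase = 1.17   # 117% increase, i.e. a factor of 2.17

# (1 + scaling)^dT = 1 + observed_increase  ->  solve for dT
dT_required = math.log(1 + observed_increase) / math.log(1 + scaling_per_degC)
print(f"{dT_required:.0f} degrees C of warming would be needed")
```

The answer is several tens of degrees, vastly more warming than actually occurred, which is why the researchers looked beyond simple moisture scaling to circulation changes.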
Figure 8: The Mt. Hunter accumulation record. Annual (light gray line) and 21-year smoothed (black line) accumulation time series from the year 810 CE (Common Era) to present, constrained by 21-year smoothed error envelopes (blue shading) inclusive of stochastic, peak position and layer-thinning model uncertainties, including the total uncertainty range among all four modeling approaches. The inset shows seasonal trends in accumulation since 1867 with 21-year running means (bold lines). Snowfall accumulating between September and April (blue) has more than doubled, with a faster rise since 1976. Summer accumulation (April to August; red) remained comparatively stable except for a baseline shift between 1909 and 1925 (image credit: Dartmouth College, Dominic Winski)
The research suggests that warming tropical oceans have caused a strengthening of the Aleutian Low pressure system with its northward flow of warm, moist air, driving most of the snowfall increases. Previous research has linked the warming tropical ocean temperatures to higher greenhouse gas concentrations.
The analysis includes a series of dramatic graphs that demonstrate extreme shifts in precipitation and reinforce the global climate connections that link snowfall in the high reaches of the North American continent with warm tropical waters. As noted in the paper (Ref. 14), this same atmospheric connection accounts for a decrease in Hawaiian precipitation.
"Everywhere we look in the North Pacific, we're seeing this same fingerprint from warming tropical oceans. One result is that wintertime climate in the North Pacific is very different than it was 200 years ago. This doesn't just affect Alaska, but Hawaii and the entire Pacific Northwest are impacted as well," said Winski.
The research builds on a recent study using the same ice cores that showed that an intensification of winter storm activity in Alaska and Northwestern Canada, driven by the strengthening Aleutian Low, started in 1740 and is unprecedented in magnitude and duration over the past millennium. The new record shows the result of that increase in Aleutian Low storm activity on snow accumulation.
For this analysis, researchers were able to segment the ice core records by seasons and years using markers like magnesium from spring dust to separate winter snow from summer snow. To account for snow layers getting squeezed and thinned under their own weight, the researchers applied four separate equations used in other studies, and in all cases the corrected record shows at least a doubling of snowfall.
According to the paper, while numerous snow accumulation records exist, "to our knowledge, no other alpine ice core accumulation record has been developed with such a thorough characterization of the thinning regime or uncertainties; all of the thinning models produce a robust increase in accumulation since the mid-19th century above late-Holocene background values."
The researchers note that the findings imply that regions that are sensitive to warming tropical ocean waters may continue to experience rain and snowfall variability well outside the natural range of the past millennium.
"Climate change can impact specific regions in much more extreme ways than global averages indicate because of unexpected responses from features like the Aleutian Low," said Osterberg. "The Mount Hunter record captures the dramatic changes that can occur when you get a double whammy from climate change – warming air combined with more storms from warming ocean temperatures."
However, the researchers also note that the regional findings do not necessarily mean that the same level of snowfall increases will occur elsewhere throughout the mid- and high latitudes.
"Scientists keep discovering that on a regional basis, climate change is full of surprises. We need to understand these changes better to help communities prepare for what will come with even more carbon dioxide pollution in the air," said Osterberg.
As part of the analysis, the authors suggest that current climate models underestimate the sensitivity of North Pacific atmospheric connections to warming tropical ocean temperatures. They argue that refining the way the modeled atmosphere responds to tropical ocean temperatures may improve rain and snowfall predictions in a warming world.
This research was supported by the NSF (National Science Foundation) Paleoclimate Program (P2C2).
Arctic sea ice loss could dry out California
December 2017: Arctic sea ice loss of the magnitude expected in the next few decades could impact California's rainfall and exacerbate future droughts, according to new research led by LLNL (Lawrence Livermore National Laboratory) scientists. 15)
Figure 9: Extent of Arctic sea ice in September 2016 versus the 1981-2010 average minimum extent (gold line). Through satellite images, researchers have observed a steep decline in the average extent of Arctic sea ice for every month of the year (image credit: NASA)
The dramatic loss of Arctic sea ice cover observed over the satellite era is expected to continue throughout the 21st century. Over the next few decades, the Arctic Ocean is projected to become ice-free during the summer. A new study by Ivana Cvijanovic and colleagues from LLNL and University of California, Berkeley shows that substantial loss of Arctic sea ice could have significant far-field effects, and is likely to impact the amount of precipitation California receives. The research appears in the Dec. 5 edition of Nature Communications. 16)
The study identifies a new link between Arctic sea ice loss and the development of an atmospheric ridging system in the North Pacific. This atmospheric feature also played a central role in the 2012-2016 California drought and is known for steering precipitation-rich storms northward, into Alaska and Canada, and away from California. The team found that sea ice changes can lead to convection changes over the tropical Pacific. These convection changes can in turn drive the formation of an atmospheric ridge in the North Pacific, resulting in significant drying over California.
"On average, when considering the 20-year mean, we find a 10-15 percent decrease in California's rainfall. However, some individual years could become much drier, and others wetter," Cvijanovic said.
The study does not attribute the 2012-2016 drought to Arctic sea ice loss. However, the simulations indicate that the sea-ice driven precipitation changes resemble the global rainfall patterns observed during that drought, leaving the possibility that Arctic sea-ice loss could have played a role in the recent drought.
"The recent California drought appears to be a good illustration of what the sea-ice driven precipitation decline could look like," she explained.
California's winter precipitation has decreased over the last two decades, with the 2012-2016 drought being one of the most severe on record. The impacts of reduced rainfall have been intensified by high temperatures that have enhanced evaporation. Several studies suggest that recent Californian droughts have a manmade component arising from increased temperatures, with the likelihood of such warming-enhanced droughts expected to increase in the future.
Figure 10: Schematics of the teleconnection through which Arctic sea-ice changes drive precipitation decrease over California. Arctic sea-ice loss induced high-latitude changes first propagate into the tropics, triggering tropical circulation and convection responses. Decreased convection and decreased upper level divergence in the tropical Pacific then drive a northward propagating Rossby wavetrain, with anticyclonic flow forming in the North Pacific. This ridge is responsible for steering the wet tropical air masses away from California (image credit: LLNL, Kathy Seibert)
"Our study identifies one more pathway by which human activities could affect the occurrence of future droughts over California — through human-induced Arctic sea ice decline," Cvijanovic said. "While more research should be done, we should be aware that an increasing number of studies, including this one, suggest that the loss of the Arctic sea ice cover is not only a problem for remote Arctic communities, but could affect millions of people worldwide. Arctic sea ice loss could affect us, right here in California."
Other co-authors on the study include Benjamin Santer, Celine Bonfils, Donald Lucas and Susan Zimmerman from LLNL and John Chiang from the University of California, Berkeley.
The research is funded by DOE (Department of Energy) Office of Science. Cvijanovic and Bonfils were funded by the DOE Early Career Research Program Award and Lucas is funded by the DOE Office of Science through the SciDAC project on Multiscale Methods for Accurate, Efficient and Scale-Aware Models of the Earth System.
Increasing Wildfires in the boreal forests of northern Canada and Alaska due to Lightning
December 2017: Wildfires in the boreal forests of northern Canada and Alaska have been increasing in frequency and the amount of area burned, and the drivers of large fire years are still poorly understood. But recent NASA-funded research offers at least one possible cause: more lightning. As global warming continues, lightning storms and warmer conditions are expected to spread farther north, meaning fire could significantly alter the landscape over time. 17)
A record number of lightning-ignited fires burned in Canada's Northwest Territories in 2014 and in Alaska in 2015. Scientist Sander Veraverbeke (Vrije Universiteit Amsterdam and University of California, Irvine) and colleagues examined data from satellites and from ground-based lightning networks to see if they could figure out why those seasons were so bad.
The team found that the majority of fires in their study areas in 2014 and 2015 were ignited by lightning storms, as opposed to human activity. That is natural, given the remoteness of the region, but it also points to more frequent lightning strikes in an area not known for as many thunderstorms as the tropics or temperate regions. Looking at longer trends, the researchers found that lightning-ignited fires in the region have been increasing by 2 to 5 percent per year since 1975, a trend that is consistent with climate change. The study was published in July 2017 in the journal Nature Climate Change. 18)
"We found that it is not just a matter of more burning with higher temperatures. The reality is more complex," Veraverbeke said. "Higher temperatures also spur more thunderstorms. Lightning from these thunderstorms is what has been igniting many more fires in these recent extreme events."
The map of Figure 11 shows the location and ignition source (lightning or human caused) for forest fires in interior Alaska in 2015. The map of Figure 12 shows the timing of the fires (June, July, or August) within the inset box. Both maps are based on data from the Veraverbeke study, which combined observations from the Alaska Fire Emissions Database, computer models, and fire observations from the MODIS (Moderate Resolution Imaging Spectroradiometer) instruments on NASA's Terra and Aqua satellites.
The fire season in the far north has typically peaked in July, after the spring thaw and the melting of winter snow. As global temperatures continue to rise, especially in the polar regions, thawing and warming tend to happen earlier in the spring and summer and at a more extensive level than in the past. The warmer weather also leads to more atmospheric instability, bringing more thunderstorms. The researchers asserted in the paper that "extreme fire years result when high levels of lightning ignition early in the growing season are followed by persistent warm and dry conditions that accelerate fire spread later in midsummer."
Figure 11: Location and ignition source (lightning or human caused) for forest fires in interior Alaska in 2015, acquired with MODIS on Terra and Aqua and in situ measurements (image credit: NASA Earth Observatory, maps and charts by Jesse Allen using data provided by Sander Veraverbeke (Vrije Universiteit). Story by Mike Carlowicz (NASA Earth Observatory), Alan Buis (Jet Propulsion Laboratory), and Brian Bell (University of California, Irvine))
Brendan Rogers of the Woods Hole Research Center said these trends are likely to continue. "We expect an increasing number of thunderstorms, and hence fires, across the high latitudes in the coming decades as a result of climate change."
The researchers also found that wildfires are creeping farther north, closer to the transition zone between boreal forests and Arctic tundra. Together, these areas include at least 30 percent of the world's tree cover and 35 percent of its stored soil carbon.
Figure 13: Ignition density in the Northwest Territories (× 10⁻⁵ ignitions/km²) acquired in the timeframe 1975-2015 (image credit: NASA Earth Observatory and Lightning Study)
Figure 14: Ignition density in Alaska (× 10⁻⁵ ignitions/km²) acquired in the timeframe 1975-2015 (image credit: NASA Earth Observatory and Lightning Study)
"In these high-latitude ecosystems, permafrost soils store large amounts of carbon that become vulnerable after fires pass through," said James Randerson of UC Irvine. "Exposed mineral soils after tundra fires also provide favorable seedbeds for trees migrating north under a warmer climate."
"Taken together, we discovered a complex feedback loop between climate, lightning, fires, carbon and forests that may quickly alter northern landscapes," Veraverbeke said. "A better understanding of these relationships is critical to better predict future influences from climate on fires and from fires on climate."
Study co-author Charles Miller of NASA/JPL (Jet Propulsion Laboratory) added that while data from the lightning networks were critical to this study, it is challenging to use these data for trend detection because of continuing network upgrades. "A spaceborne sensor that provides high northern latitude lightning data would be a major step forward."
Global Carbon Budget 2017
November 2017: Following three years of no growth, global GHG (Greenhouse Gas) emissions from human activities are projected to increase by 2% by the end of 2017, according to the nongovernmental organization GCP (Global Carbon Project). The increase, to a record 37 billion tons of carbon dioxide equivalent, dashed hopes in the environmental community that CO2 emissions from human activity might have plateaued and begun turning downward. 19)
In a set of three reports published 13 November, GCP said the biggest cause of the increase is the 3.5% growth in China, the world's largest emitter of greenhouse gases. The country experienced higher energy demand, particularly from industry, and a decline in hydroelectric power due to sparse rainfall. 20) 21) 22)
In addition, the decade-long trend in emissions reductions by the US and the European Union, the second- and third-largest emitters, respectively, appears to have slowed this year. The EU's output hasn't declined appreciably since 2015. The US output declined by 0.4%, compared with a 1.2% average annual reduction during the previous 10 years. Coal consumption in the US inched up 0.5%, its first increase in five years.
India, the fourth-largest greenhouse gas emitter, limited its growth to 2% this year, compared with a 6% jump in 2016. Emissions from all other countries increased 2.3% from 2016, to 15.1 gigatons (Figure 15).
Figure 15: The world's four largest carbon dioxide emitters—China, the US, the European Union, and India—account for about 60% of global emissions. Although those countries have made strides recently, their emissions and those globally (expected year-to-year percent change and error bars shown under each country) will probably tick upward in 2017 (image credit: Global Carbon Project, CC BY 4.0)
Despite the 2014–16 hiatus in global emissions growth, CO2 has continued to accumulate in the atmosphere at a faster pace than at any time during the 50 years that measurements have been kept. The elevated global temperatures resulting from the 2015–16 El Niño diminished the capacity of terrestrial ecosystems to take up CO2 from the atmosphere, the GCP reports said.
Corinne Le Quéré of the University of East Anglia (Norwich, UK), lead author of the principal report (Ref. 20) that was published in Earth System Science Data, said in an email that she expects emissions to plateau or grow slightly in the coming years. But they are unlikely to return to the 3% growth levels that were seen regularly in the decade that ended in 2010.
Kelly Levin of the nonprofit WRI (World Resources Institute) cautions against reading too much into a single year's data but also warns about the perilous big picture. "To have a chance of transforming the economy in time to stay below 2 °C, global GHG emissions must peak by 2020," she says. WRI's analysis, and another by the UNEP (United Nations Environment Program), predict on the basis of current trends and treaty commitments that the peak in global emissions won't occur until after 2030. At that point, the probability of limiting global warming to 2 °C could be as low as 50%, even with accelerated national reduction commitments, rapid abandonment of fossil fuel use, and deployment of carbon-removal technologies whose feasibility hasn't yet been demonstrated.
The 2 °C mark is thought by most climate scientists to be the threshold below which the worst impacts of climate change can be avoided. The 2015 Paris climate agreement set an "aspirational" goal of limiting temperature increase to 1.5 °C.
The WRI analysis says the number of countries whose emissions have peaked or are committed to peak will increase from 49 in 2010 to 53 by 2020 and to 57 by 2030. Those countries accounted for 36% of world greenhouse gas emissions in 2010 and will represent 60% of the total in 2030, when China has committed to peak its output.
Despite last year's emissions increase, China's coal consumption this year is still about 8% below its record 2013 high. The Chinese government has projected a near-doubling of the nation's solar energy production over the next two years, to 213 GW. China's nonfossil energy sources make up 14.3% of overall energy production, up by one percentage point in less than a year.
Study of Global Light Pollution at Night
November 22, 2017: They were supposed to bring about an energy revolution—but the popularity of LED (Light-Emitting Diode) lights is driving an increase in light pollution worldwide, with dire consequences for human and animal health, researchers said in their study. Five years of advanced satellite images show that there is more artificial light at night across the globe, and that light at night is getting brighter. The rate of growth is approximately 2% per year, both in the amount of area lit and in the radiance of that light. 23) 24) 25)
An international team of scientists reported the results of a landmark study of global light pollution and the rise of LED outdoor lighting technology. The study finds both light pollution and energy consumption by lighting steadily increasing over much of the planet. The findings also challenge the assumption that increases in the energy efficiency of outdoor lighting technologies necessarily lead to an overall decrease in global energy consumption.
The team, led by Christopher Kyba of the GFZ (German Research Center for Geosciences) in Potsdam, Germany, analyzed five years of images from the Suomi NPP (Suomi National Polar-orbiting Partnership) satellite, operated jointly by NASA and NOAA (National Oceanic and Atmospheric Administration). The data show gains of 2% per year in both the amount of the Earth's surface that is artificially lit at night and the quantity of light emitted by outdoor lighting. Increases were seen almost everywhere the team looked, with some of the largest gains in regions that were previously unlit.
"Light is growing most rapidly in places that didn't have a lot of light to start with," Kyba noted. "That means that the fastest rates of increase are occurring in places that so far hadn't been very strongly affected by light pollution."
The results reported today confirm suggestions in earlier research based on data obtained with U.S. Department of Defense meteorological satellite measurements (DMSP series) going back to the 1970s. However, the better sensitivity of Suomi's cameras to light on the night side of Earth and significantly improved ground resolution led to more robust conclusions about the changing illumination of the world at night.
The study is among the first to examine the effects, as seen from space, of the ongoing worldwide transition to LED lighting. Kyba's team found that the energy saving effects of LED lighting on country-level energy budgets are lower than expected from the increase in the efficiency of LEDs compared to older lamps.
Figure 16: Infographic showing the number of countries experiencing various rates of change of night lights during 2012-2016 (image credit: Kyba and the Study Team)
Environmental Gains Unrealized: LED lighting requires significantly less electricity to yield the same quantity of light as older lighting technologies. Proponents of LED lighting have argued that the high energy efficiency of LEDs would contribute to slowing overall global energy demand, given that outdoor lighting accounts for a significant fraction of the nighttime energy budget of the typical world city.
The team tested this idea by comparing changes in nighttime lighting seen from Earth orbit to changes in countries' GDP (Gross Domestic Product) – a measure of their overall economic output – during the same time period. They concluded that financial savings from the improved energy efficiency of outdoor lighting appear to be invested into the deployment of more lights. As a consequence, the expected large reductions in global energy consumption for outdoor lighting have not been realized.
Kyba expects that the upward global trend in use of outdoor lighting will continue, bringing a host of negative environmental consequences. "There is a potential for the solid-state lighting revolution to save energy and reduce light pollution," he added, "but only if we don't spend the savings on new light".
IDA (International Dark-Sky Association) has campaigned for the last 30 years to bring attention to the known and suspected hazards associated with the use of artificial light at night. IDA Executive Director J. Scott Feierabend pointed out repercussions including harm to wildlife, threats to human wellbeing, and potentially compromised public safety. IDA drew public attention to concerns associated with the strong blue light emissions of LED lighting as early as 2010.
"Today's announcement validates the message IDA has communicated for years," Feierabend explained. "We hope that the results further sound the alarm about the many unintended consequences of the unchecked use of artificial light at night."
Satellite imagery: The VIIRS (Visible Infrared Imaging Radiometer Suite) DNB (Day-Night Band) of the Suomi NPP mission started observations in 2012, just as outdoor use of LED lighting began in earnest. This sensor provides the first-ever global calibrated nighttime radiance measurements in a spectral band of 500 to 900 nm, close to the visible band, with a much higher radiometric sensitivity than the DMSP series, and at a spatial resolution of ~750 m. This improved spatial resolution allows neighborhood-scale (rather than city- or national-scale) changes in lighting to be investigated for the first time.
The cloud-free DNB data show that over the period of 2012–2016, both lit area and the radiance of previously lit areas increased in most countries (Figure 17) in the 500–900 nm range, with global increases of 2.2% per year for lit area and 2.2% per year for the brightness of continuously lit areas. Overall, the radiance of areas lit above 5 nW cm⁻² sr⁻¹ increased by 1.8% per year. These factors decreased in very few countries, including several experiencing warfare. They were also stable in only a few countries, interestingly including some of the world's brightest (for example, Italy, Netherlands, Spain, and the United States). With few exceptions, growth in lighting occurred throughout South America, Africa, and Asia. Because the analysis of lit area and total radiance is not subject to a stability criterion, transient lights such as wildfires can cause large fluctuations.
Australia experienced a major decrease in lit area from 2012 to 2016 for this reason (Figures 17A and 18). However, fire-lit areas failed the stability test and were therefore not included in the radiance change analysis (Figure 17B). A small number of countries have "no data" because of either their extreme latitude (Iceland) or the lack of observed stable lights above 5 nW cm⁻² sr⁻¹ in the cloud-free composite (for example, Central African Republic).
Figure 17: Geographic patterns in changes in artificial lighting. Changes are shown as an annual rate for both lit area (A) and radiance of stably lit areas (B). Annual rates are calculated from changes over the four-year period, that is, (A2016/A2012)^(1/4), where A2016 is the lit area observed in 2016 (image credit: Study Team)
Figure 18: Absolute change in lit area from 2012 to 2016. Pixels increasing in area are shown as red, pixels decreasing in area are shown as blue, and pixels with no change in area are shown as yellow. Each pixel has a near-equal area of 6000 ± 35 km2. To ease interpretation, the color scale cuts off at 200 km2, but some pixels had changes of up to ±2000 km2 (image credit: Study Team)
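The annualized-rate calculation given in the Figure 17 caption can be sketched in a few lines of Python. The input areas below are made-up illustrative values, not figures from the study:

```python
def annual_rate(a_start, a_end, years=4):
    """Annualized growth factor over `years` years, i.e. (A_end / A_start)**(1/years),
    matching the caption's (A2016/A2012)^(1/4)."""
    return (a_end / a_start) ** (1.0 / years)

# Hypothetical region whose lit area grew from 1000 km2 to 1091 km2 over 2012-2016
rate = annual_rate(1000.0, 1091.0)
print(f"annual growth: {(rate - 1.0) * 100:.1f}% per year")  # prints: annual growth: 2.2% per year
```

Compounding 2.2% per year over four years yields the roughly 9% total increase used in this made-up example, consistent in magnitude with the global lit-area trend reported above.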
Comparisons of the VIIRS data with photographs taken from aboard the ISS (International Space Station) show that the instrument on Suomi-NPP sometimes records a dimming of some cities even though these cities are in fact just as bright, or even brighter. The reason is that the sensor can't "see" light at wavelengths below 500 nm, i.e. blue light. When cities replace orange lamps with white LED lights that emit considerable radiation below 500 nm, VIIRS mistakes the change for a decrease. In short: the Earth's night-time surface brightness, and especially the skyglow over cities, is increasing, probably even in cases where the satellite detects less radiation. 26)
There is, however, hope that things will change for the better. Christopher Kyba says: "Other studies and the experience of cities like Tucson, Arizona, show that well designed LED lamps allow a decrease in light emission of two-thirds or more without any noticeable effect on human perception." Kyba's earlier work has shown that the per capita light emission in the United States of America is 3 to 5 times higher than that in Germany. Kyba sees this as a sign that prosperity, safety, and security can be achieved with conservative light use. "There is a potential for the solid state lighting revolution to save energy and reduce light pollution," adds Kyba, "but only if we don't spend the savings on new light."
November 2017: The Changing Colors of our Living Planet
Life. It's the one thing that, so far, makes Earth unique among the thousands of other planets we've discovered. Since the fall of 1997, NASA satellites have continuously and globally observed all plant life at the surface of the land and ocean. During the week of Nov. 13-17, NASA is sharing stories and videos about how this view of life from space is furthering knowledge of our home planet and the search for life on other worlds. 27)
NASA satellites can see our living Earth breathe. In the Northern Hemisphere, ecosystems wake up in the spring, taking in carbon dioxide and exhaling oxygen as they sprout leaves — and a fleet of Earth-observing satellites tracks the spread of the newly green vegetation.
Meanwhile, in the oceans, microscopic plants drift through the sunlit surface waters and bloom into billions of carbon dioxide-absorbing organisms — and light-detecting instruments on satellites map the swirls of their color.
This fall marks 20 years since NASA has continuously observed not just the physical properties of our planet, but the one thing that makes Earth unique among the thousands of other worlds we've discovered: Life.
Satellites measured land and ocean life from space as early as the 1970s. But it wasn't until the launch of SeaWiFS (Sea-viewing Wide Field-of-view Sensor) in 1997 that the space agency began what is now a continuous, global view of both land and ocean life. A new animation captures the entirety of this 20-year record, made possible by multiple satellites, compressing a decades-long view of life on Earth into a captivating few minutes.
"These are incredibly evocative visualizations of our living planet," said Gene Carl Feldman, an oceanographer at NASA's Goddard Space Flight Center in Greenbelt, Maryland. "That's the Earth, that is it breathing every single day, changing with the seasons, responding to the Sun, to the changing winds, ocean currents and temperatures."
The space-based view of life allows scientists to monitor crop, forest and fisheries health around the globe. But the space agency's scientists have also discovered long-term changes across continents and ocean basins. As NASA begins its third decade of global ocean and land measurements, these discoveries point to important questions about how ecosystems will respond to a changing climate and broad-scale changes in human interaction with the land.
Satellites have measured the Arctic getting greener, as shrubs expand their range and thrive in warmer temperatures. Observations from space help determine agricultural production globally, and are used in famine early warning detection. As ocean waters warm, satellites have detected a shift in phytoplankton populations across the planet's five great ocean basins — the expansion of "biological deserts" where little life thrives. And as concentrations of carbon dioxide in the atmosphere continue to rise and warm the climate, NASA's global understanding of plant life will play a critical role in monitoring carbon as it moves through the Earth system.
Figure 19: From space, satellites can see Earth breathe. A new NASA visualization shows 20 years of continuous observations of plant life on land and at the ocean's surface, from September 1997 to September 2017. On land, vegetation appears on a scale from brown (low vegetation) to dark green (lots of vegetation); at the ocean surface, phytoplankton are indicated on a scale from purple (low) to yellow (high). This visualization was created with data from satellites including SeaWiFS, and instruments including the NASA/NOAA VIIRS (Visible Infrared Imaging Radiometer Suite) and the MODIS (Moderate Resolution Imaging Spectroradiometer) (image credit: NASA)
Sixty years ago, people were not sure that Earth's surface could be seen clearly from space. Many thought that the dust particles and other aerosols in the atmosphere would scatter the light, masking the oceans and continents, said Jeffrey Masek, chief of the Biospheric Sciences Laboratory at NASA Goddard.
The Gemini and Apollo programs demonstrated otherwise. Astronauts used specialized cameras to take pictures of Earth that show the beauty and complexity of our living planet, and helped kickstart the era of Earth science research from space. In 1972, the first Landsat mission began its 45-year record of vegetation and land cover. "As the satellite archive expands, you see more and more dynamics emerging," Masek said. "We're now able to look at long-term trends."
The grasslands of Senegal, for example, undergo drastic seasonal changes. Grasses and shrubs flourish during the rainy season from June to November, then dry up when the rain stops. With early weather satellite data in the 1970s and '80s, NASA Goddard scientist Compton Tucker was able to see that greening and die-back from space, measuring the chlorophyll in the plants below. He developed a way of comparing satellite data from two wavelengths, which gives a quantitative measurement of this greenness called the NDVI (Normalized Difference Vegetation Index).
"We were astounded when we saw the first images. They were amazing because they showed how vegetation changed annually, year after year," Tucker said, noting that others were surprised as well when the study came out in 1985. "When we produced this paper, people accused us of 'painting by numbers,' or fudging data. But for the first time, you could study vegetation from space based on their photosynthetic capacity."
When the temperature is right, and water and sunlight are available, plants photosynthesize and produce vegetative material. Leaves strongly absorb blue and red light but reflect near-infrared light back into space. By comparing the ratio of red to near-infrared light, Tucker and his colleagues could quantify the vegetation covering the land.
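The red/near-infrared comparison described above is the standard NDVI formula, (NIR − Red)/(NIR + Red). A minimal sketch, using made-up reflectance values purely for illustration:

```python
def ndvi(red, nir):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).
    Ranges from -1 to +1; dense, healthy vegetation approaches +1."""
    return (nir - red) / (nir + red)

# Healthy leaves absorb red strongly and reflect near-infrared strongly
print(ndvi(red=0.05, nir=0.50))   # high NDVI, about 0.82
# Bare soil reflects red and near-infrared similarly, so NDVI is near zero
print(ndvi(red=0.30, nir=0.35))   # low NDVI, about 0.08
```

Because the index is a normalized ratio of two bands, it is largely insensitive to overall illumination, which is what made it usable across scenes and seasons from early weather satellites onward.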
Expanding these observations to the rest of the globe, the scientists could track the impact on plants of rainy and dry seasons elsewhere in Africa, see the springtime blooms in North America, and the after-effects of wildfires in forests worldwide.
But land is only part of the story. At the base of the ocean's food web is phytoplankton — tiny organisms that, like land plants, turn water and carbon dioxide into sugar and oxygen, aided by the right combination of nutrients and sunlight.
Satellites that can monitor the subtle changes in color of the ocean have helped scientists track changes in phytoplankton populations across the globe. The first view of ocean color came from the CZCS (Coastal Zone Color Scanner), a proof-of-concept instrument launched in 1979. Continuous observations of ocean color began with the launch of SeaWiFS in late 1997. The satellite was just in time to capture the transition from El Niño to La Niña conditions in 1998 — revealing just how quickly and dramatically phytoplankton respond to changing ocean conditions.
"The entire Eastern Pacific, from the coast of South America all the way to the dateline, transitioned from what was the equivalent of a biological desert to a thriving rainforest. And we watched it happen in real time," Feldman said. "For me, that was the first demonstration of the power of this kind of observation, to see how the ocean responds to one of the most significant environmental perturbations it could experience, over the course of just a few weeks. It also showed that the ocean and all the life within it is amazingly resilient — if given half a chance."
Figure 20: The SeaWiFS satellite was launched in late 1997, just in time to capture the phytoplankton that bloomed in the Eastern Equatorial Pacific as conditions changed from El Niño to La Niña, seen here in yellow (image credit: NASA)
Tracking change from satellites: With 20 years of satellite data tracking ocean plant life on a global scale, scientists are investigating how habitats and ecosystems are responding to changing environmental conditions.
Recent studies of ocean life have shown that a long-term trend of rising sea surface temperatures is causing ocean regions known as "biological deserts" to expand. These regions of low phytoplankton growth occur in the center of large, slow-moving currents called gyres.
"As the surface waters warm, it creates a stronger boundary between the deep, cold, nutrient-rich waters and the sunlit, generally nutrient-poor surface waters," Feldman said. This prevents nutrients from reaching phytoplankton at the surface, and could have significant consequences for fisheries and the marine ecosystem.
In the Arctic Ocean, an explosion of phytoplankton indicates change. As seasonal sea ice melts, warming waters and more sunlight trigger a sudden, massive phytoplankton bloom that feeds birds, sea lions and newly hatched fish. But with warming atmospheric temperatures, that bloom is now happening several weeks earlier — before the animals are in place to take advantage of it. "It's not just the amount of food, it's the location and timing that are just as critical," Feldman said. "Spring bloom is coming earlier, and that's going to impact the ecosystem in ways we don't yet understand."
The climate is warming fastest in Arctic regions, and the impacts on land are visible from space as well. The tundra of Western Alaska, Quebec and elsewhere is turning greener as shrubs extend their reach northwards.
The neighboring northern forests are changing as well. Massive fires in 2004 and 2015 wiped out millions of acres of forests in Alaska, including spruce forests, noted Chris Potter, a research scientist at NASA's Ames Research Center in California's Silicon Valley. "These fires were amazing in the amount of forest area they burned and how hot they burned," Potter said. "When the air temperature hits 90 degrees Fahrenheit (32°C) in late May up there, and all these lightning strikes occurred, the forest burned very extensively — close to rivers, close to villages — and nothing could stop it."
Satellites help scientists routinely map fires, deforestation and other changes, and to analyze their impact on the carbon cycle, Potter said. Giant fires release many tons of carbon dioxide into the atmosphere, both from the charred trees and moss but also, especially in northern latitudes, from the soils. Potter and colleagues went to the burned areas of Central Alaska this year to measure the underlying permafrost — the thick mossy layer had burned off, exposing the previously frozen soils. "It's like taking the insulating layer off a cooler," he said. "The ice melts underneath and it becomes a slushy mess."
Forest types can change too, whether it's after wildfires, insect infestations or other disturbance. The Alaskan spruce forests are being replaced with birch. Potter and his colleagues are also keeping an eye on California forests burned in recent fires, where the concern is that pines will be replaced by oaks. "When drought is accentuated with these record high temperatures, nothing good seems to come from that for the existing forest type," he said. "I think we're seeing real clear evidence of climate causing land-cover change."
Keeping an eye on crops: Changing temperatures and rainfall patterns also influence crops, whether they are grown in California or Africa. The "greenness" measurement that scientists use to measure forests and grasslands can also be used for agriculture, to monitor the health of fields throughout the growing season.
Researchers and policy makers realized this potential early. One of the first applications of Landsat data in the 1970s was to predict grain yields in Russia and better understand commodities markets. In 1985, food security analysts from USAID (United States Agency for International Development) approached NASA to incorporate satellite images into their Famine Early Warning Systems Network, to identify regions where food production has been limited by drought. That partnership continues today. With rainfall estimates, vegetation measurements, as well as the recent addition of soil moisture information, NASA scientists can help organizations like USAID direct emergency help.
With improved data from Landsat, the MODIS instruments on NASA's Terra and Aqua spacecraft and other satellites, and by combining data from multiple sensors, researchers are now able to track the growth of crops in individual fields, Tucker said.
The view from space not only helps monitor crops, but can help improve agricultural practices as well. A winery in California, for example, uses individual pixels of Landsat data to determine when to irrigate and how much water to use.
The next step for NASA scientists is actually looking at the process of photosynthesis from space. When plants undergo that chemical process, some of the absorbed energy fluoresces faintly back, notes Joanna Joiner, a NASA Goddard research scientist. With satellites that detect signals in the very specific wavelengths of this fluorescence, and a fine-tuned analysis technique that blocks out background signals, Joiner and her colleagues can see where and when plants start converting sunlight into sugars. "It was kind of a revelation that yes, you can measure it," Joiner said. An early study looked at the U.S. Corn Belt and found it fluoresces "like crazy," she said. "Those plants have some of the highest fluorescence rates on Earth at their peak."
Joiner and Tucker are using both the fluorescence data and vegetation indices to get the most information possible about plant growth at regional and global scales: "One of the big questions that still remains is how much carbon are the plants taking up, why does it vary year to year, and which areas are contributing to that variability," Joiner said.
Whether it's crops, forests or phytoplankton blooms, NASA scientists are tracking life on Earth. Just as satellites help researchers study the atmosphere, rainfall and other physical characteristics of the planet, the ever-improving view from above will allow them to study the interconnected life of the planet, Feldman said.
Study Bolsters Theory of Heat Source Under West Antarctica
November 7, 2017: A new NASA/JPL study adds evidence that a geothermal heat source, called a mantle plume, lies deep below Antarctica's Marie Byrd Land, explaining some of the melting that creates lakes and rivers under the ice sheet. Although the heat source isn't a new or increasing threat to the West Antarctic ice sheet, it may help explain why the ice sheet collapsed rapidly in an earlier era of rapid climate change, and why it is so unstable today. 28)
The research team was led by Hélène Seroussi of the Jet Propulsion Laboratory, with support from researchers from the Department of Earth and Planetary Sciences at Washington University and the Alfred Wegener Institute, Helmholtz Center for Polar and Marine Research in Germany. 29)
Figure 21: Illustration of flowing water under the Antarctic ice sheet. Blue dots indicate lakes, lines show rivers. Marie Byrd Land is part of the bulging "elbow" leading to the Antarctic Peninsula, left center (image credit: NSF/Zina Deretsky)
The stability of an ice sheet is closely related to how much water lubricates it from below, allowing glaciers to slide more easily. Understanding the sources and future of the meltwater under West Antarctica is important for estimating the rate at which ice may be lost to the ocean in the future.
Antarctica's bedrock is laced with rivers and lakes, the largest of which is the size of Lake Erie. Many lakes fill and drain rapidly, forcing the ice sheet to rise and fall by as much as 6 meters. The motion allows scientists to estimate where and how much water must exist at the base.
Some 30 years ago, Wesley E. LeMasurier, a scientist at the University of Colorado Denver, suggested that heat from a mantle plume under Marie Byrd Land might explain regional volcanic activity and a topographic dome feature. Very recent seismic imaging has supported this concept. When Seroussi first heard the idea, however, "I thought it was crazy," she said. "I didn't see how we could have that amount of heat and still have ice on top of it."
With few direct measurements existing from under the ice, the research team concluded the best way to study the mantle plume idea was by numerical modeling. They used the ISSM (Ice Sheet System Model), a numerical depiction of the physics of ice sheets developed by scientists at JPL and the University of California, Irvine. Seroussi enhanced the ISSM to capture natural sources of heating and heat transport from freezing, melting and liquid water; friction; and other processes.
To assure the model was realistic, the scientists drew on observations of changes in the altitude of the ice sheet surface made by NASA's IceSat satellite and airborne Operation IceBridge campaign. "These place a powerful constraint on allowable melt rates — the very thing we wanted to predict," said Erik Ivins of JPL, a co-author of the study. Since the location and size of the possible mantle plume were unknown, they tested a full range of what was physically possible for multiple parameters, producing dozens of different simulations.
They found that the flux of energy from the mantle plume must be no more than 150 mW/m2. For comparison, in U.S. regions with no volcanic activity, the heat flux from Earth's mantle is 40 to 60 mW/m2. Under Yellowstone National Park — a well-known geothermal hot spot — the heat from below is about 200 mW/m2 averaged over the entire park, though individual geothermal features such as geysers are much hotter.
The research team's simulations, using a heat flow higher than 150 mW/m2, showed too much melting to be compatible with the space-based data, except in one location: an area inland of the Ross Sea known for intense flows of water. This region required a heat flow of at least 150-180 mW/m2 to agree with the observations. However, seismic imaging has shown that mantle heat in this region may reach the ice sheet through a rift, that is, a fracture in Earth's crust such as appears in Africa's Great Rift Valley.
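For a rough sense of what such fluxes mean for the ice, one can assume (a simplification, not the ISSM's treatment) that all geothermal heat goes into melting ice already at the melting point, giving a basal melt rate m = q / (ρ_ice · L_fusion):

```python
# Back-of-envelope basal melt from a geothermal flux q: all heat is
# assumed to melt ice at the pressure melting point, ignoring heat
# conducted upward into the ice sheet, so this is an upper bound.

RHO_ICE = 917.0           # kg/m^3, density of glacier ice
L_FUSION = 334e3          # J/kg, latent heat of fusion
SECONDS_PER_YEAR = 3.156e7

def basal_melt_mm_per_year(q_mw_per_m2):
    q = q_mw_per_m2 * 1e-3                # mW/m^2 -> W/m^2
    m = q / (RHO_ICE * L_FUSION)          # m/s of ice melted
    return m * SECONDS_PER_YEAR * 1000.0  # mm of ice per year

for q in (50, 150, 180):
    print(f"{q} mW/m2 -> {basal_melt_mm_per_year(q):.1f} mm/yr")
```

At the 150 mW/m2 ceiling this yields on the order of 15 mm of basal melt per year, which is why even modest flux differences matter for the subglacial water budget.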
Mantle plumes are thought to be narrow streams of hot rock rising through Earth's mantle and spreading out like a mushroom cap under the crust. The buoyancy of the material, some of it molten, causes the crust to bulge upward. The theory of mantle plumes was proposed in the 1970s to explain geothermal activity that occurs far from the boundary of a tectonic plate, such as Hawaii and Yellowstone.
The Marie Byrd Land mantle plume formed 50 to 110 million years ago, long before the West Antarctic ice sheet came into existence. At the end of the last ice age around 11,000 years ago, the ice sheet went through a period of rapid, sustained ice loss when changes in global weather patterns and rising sea levels pushed warm water closer to the ice sheet — just as is happening today. The research team suggests the mantle plume could facilitate this kind of rapid loss.
November 2017: Ozone Hole is Smallest Since 1988
Observations in 2017 showed that the "hole" in Earth's ozone layer—which forms over Antarctica at the end of each southern winter—was the smallest recorded since 1988. According to NASA satellite estimates, the ozone hole reached its annual peak extent on September 11, spreading across 19.6 million km2, an area about 2.5 times the size of the United States. Ground- and balloon-based measurements from the NOAA (National Oceanic and Atmospheric Administration) agreed with the satellite measurements. The average area of ozone hole maximums since 1991 has been roughly 26 million km2. 30)
Figure 22: The map shows the Antarctic ozone hole at its widest extent for the year, as measured on September 11, 2017. The observations were made by OMI (Ozone Monitoring Instrument) on NASA's Aura satellite (image credit: NASA Earth Observatory, images by Jesse Allen, using visuals provided by the NASA Ozone Watch team. Story by Katy Mersmann, NASA/GSFC, and Theo Stein, NOAA Office of Oceanic and Atmospheric Research, with Mike Carlowicz, Earth Observatory)
"The Antarctic ozone hole was exceptionally weak this year," said Paul Newman, chief scientist for Earth sciences at NASA/GSFC. "This is what we would expect to see given the weather conditions in the Antarctic stratosphere."
The smaller ozone hole in 2017 was strongly influenced by an unstable and warmer-than-usual Antarctic vortex, a low-pressure system that rotates clockwise in the atmosphere over far southern latitudes (similar to polar vortices in the northern hemisphere). The vortex helped minimize the formation of PSCs (Polar Stratospheric Clouds); the formation and persistence of PSCs are important precursors to the chlorine and bromine reactions that destroy ozone.
Although warmer stratospheric weather conditions have reduced ozone depletion during the past two years, ozone holes are still large because atmospheric concentrations of ozone-depleting substances (primarily chlorine and bromine) remain high enough to produce significant yearly ozone loss. The smaller ozone hole extent in 2017 is due to natural variability and not necessarily a signal of rapid healing.
First detected in 1985, the Antarctic ozone hole forms during late winter in the Southern Hemisphere as returning sunlight catalyzes reactions involving man-made, chemically active forms of chlorine and bromine. These reactions destroy ozone molecules in the stratosphere. At high altitudes, the ozone layer acts like a natural sunscreen, shielding the Earth's surface from harmful ultraviolet radiation that can cause skin cancer and cataracts, suppress immune systems, and damage plants.
Thirty years ago, the international community signed the Montreal Protocol on Substances that Deplete the Ozone Layer and began regulating ozone-depleting compounds. The ozone hole over Antarctica is expected to gradually become less severe as chlorofluorocarbons (CFCs) continue to decline. Scientists expect the Antarctic ozone hole to recover back to 1980 levels by 2070.
Figure 23: This image shows the Antarctic ozone hole on October 12, 2017, as observed by OMI. On that day, the ozone layer reached its annual minimum concentration, which measured 131 Dobson Units, the mildest depletion since 2002 (image credit: NASA Earth Observatory, images by Jesse Allen, using visuals provided by the NASA Ozone Watch team. Story by Katy Mersmann, NASA/GSFC, and Theo Stein, NOAA Office of Oceanic and Atmospheric Research, with Mike Carlowicz, Earth Observatory)
Legend to Figures 22 and 23: The uneven seam in the contours of the data (lower left quadrant) marks the location of the international date line. Ozone data are measured by polar-orbiting satellites that collect observations in a series of swaths over the course of the day; the passes are generally separated by about 90 minutes. Stratospheric circulation slowly shifts the contours of the ozone hole over the course of the day (like winds shift the location of clouds). The contours move little from any one swath to the next, but by the end of the day, the cumulative movement is apparent at the date line.
As both images show, the word hole is not literal; scientists use it as a metaphor for the area in which ozone concentrations drop below the historical threshold of 220 Dobson Units. During the 1960s, long before the Antarctic ozone hole occurred, average ozone concentrations above the South Pole ranged from 260 to 320 Dobson Units. Globally, the ozone layer today ranges from 300 to 500 Dobson Units.
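Given a gridded ozone map, the hole's extent follows directly from this 220 Dobson Unit definition; a minimal sketch with made-up values, not actual OMI data:

```python
# The ozone "hole" is the region where total-column ozone falls below
# 220 Dobson Units; its extent is the summed area of grid cells under
# that threshold.

HOLE_THRESHOLD_DU = 220

def ozone_hole_area(ozone_du, cell_areas_km2):
    """Sum the area of grid cells whose column ozone is below 220 DU."""
    return sum(area for o, area in zip(ozone_du, cell_areas_km2)
               if o < HOLE_THRESHOLD_DU)

# Toy 1-D "map": four cells of one million km2 each
print(ozone_hole_area([131, 210, 260, 305], [1e6] * 4))  # 2000000.0 km2
```

In practice the cell areas vary with latitude on a real satellite grid, but the threshold logic is the same.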
"In the past, we've seen ozone at some stratospheric altitudes go to zero ozone by the end of September," said Bryan Johnson, NOAA atmospheric chemist. "This year, our balloon measurements showed the ozone loss rate stalled by the middle of September and ozone levels never reached zero."
Greenland Maps Show More Glaciers at Risk
November 1, 2017: New maps of Greenland's coastal seafloor and bedrock beneath its massive ice sheet show that two to four times as many coastal glaciers are at risk of accelerated melting as previously thought (Figure 24). Researchers at UCI (University of California, Irvine), NASA and 30 other institutions have published the most comprehensive, accurate and high-resolution relief maps ever made of Greenland's bedrock and coastal seafloor. Among the many data sources incorporated into the new maps are data from NASA's OMG (Ocean Melting Greenland) campaign. 31) 32)
Lead author Mathieu Morlighem of UCI had demonstrated in an earlier paper that data from OMG's survey of the shape and depth, or bathymetry, of the seafloor in Greenland's fjords improved scientists' understanding not only of the coastline, but of the inland bedrock beneath glaciers that flow into the ocean. That's because the bathymetry where a glacier meets the ocean limits the possibilities for the shape of bedrock farther upstream.
The nearer to the shoreline, the more valuable the bathymetry data are for understanding on-shore topography, Morlighem said. "What made OMG unique compared to other campaigns is that they got right into the fjords, as close as possible to the glacier fronts. That's a big help for bedrock mapping." Additionally, the OMG campaign surveyed large sections of the Greenland coast for the first time ever. In fjords for which there are no data, it's difficult to estimate how deep the glaciers extend below sea level.
The OMG data are only one of many datasets Morlighem and his team used in the ice sheet mapper, which is named BedMachine. Another comprehensive source is NASA's Operation IceBridge airborne surveys. IceBridge measures the ice sheet thickness directly along a plane's flight path. This creates a set of long, narrow strips of data rather than a complete map of the ice sheet. Besides NASA, nearly 40 other international collaborators also contributed various types of survey data on different parts of Greenland.
No survey, not even OMG, covers every glacier on Greenland's long, convoluted coastline. To infer the bed topography in sparsely studied areas, BedMachine averages between existing data points using physical principles such as the conservation of mass.
The new maps reveal that two to four times more oceanfront glaciers extend deeper than 200 m below sea level than earlier maps showed. That's bad news, because the top 200 m of water around Greenland comes from the Arctic and is relatively cold. The water below it comes from farther south and is 3 to 4°C warmer than the water above. Deeper-seated glaciers are exposed to this warmer water, which melts them more rapidly.
Morlighem's team used the maps to refine their estimate of Greenland's total volume of ice and its potential to add to global sea level rise, if the ice were to melt completely — which is not expected to occur within the next few hundred years. The new estimate is higher by 7 cm for a total of 7.42 m.
OMG Principal Investigator Josh Willis of JPL, who was not involved in producing the maps, said, "These results suggest that Greenland's ice is more threatened by changing climate than we had anticipated."
On Oct. 23, the five-year OMG campaign completed its second annual set of airborne surveys to measure, for the first time, the amount that warm water around the island is contributing to the loss of the Greenland ice sheet. Besides the one-time bathymetry survey, OMG is collecting annual measurements of the changing height of the ice sheet and the ocean temperature and salinity in more than 200 fjord locations. Morlighem looks forward to improving BedMachine's maps with data from the airborne surveys.
Figure 24: This image shows a stretch of Greenland's coastline as created by BedMachine before and after the inclusion of new OMG data. Left: Greenland topography, color-coded from 1,500 m below sea level (dark blue) to 1,500 m above (brown). Right: Regions below sea level connected to the ocean; darker colors are deeper. The thin white line shows the current extent of the ice sheet (image credit: UCI)
Study: Melting snow aids absorption of carbon dioxide
October 30, 2017: It appears that something good can come from something bad. Rising global temperatures are causing seasonal snow cover to melt earlier in the spring, which allows snow-free boreal forests to absorb more carbon dioxide from our atmosphere.
Scientists believe that global warming is primarily caused by carbon dioxide emissions from human activities such as burning coal, the oil and gas industry, transportation and domestic heating. As global temperatures rise, we see changes in Earth's climate such as the accelerated melting of glaciers, rising sea levels and an increase in the frequency of extreme weather conditions.
To predict the increase of carbon dioxide in the atmosphere accurately, scientists need to consider both the sources of emissions as well as the absorption of carbon dioxide both on land and in the oceans. Boreal forests are well known to be an important carbon sink on land, but the amount of carbon these high-latitude northern forests can absorb is influenced by the amount of snow cover.
To help quantify changes in carbon absorption, ESA's GlobSnow project produced daily snow cover maps over the whole northern hemisphere from 1979 to 2015 using satellites. A team of climate and remote sensing scientists led by the FMI (Finnish Meteorological Institute) recently analyzed the information and found that the start of plant growth in the spring has shifted earlier by an average of eight days over the last 36 years. 33) 34)
Figure 25: Snow-free conditions: The animation shows when parts of the NH (Northern Hemisphere) became snow-free in the spring each year from 1979 to 2015. Blue represents earlier snow melt (January–March) while red depicts later snow melt (June), image credit: GlobSnow / Finnish Meteorological Institute
By combining this information with ground-based observations of the atmosphere–ecosystem carbon dioxide exchange from forests in Finland, Sweden, Russia and Canada, the team found that this earlier start to spring growth has increased the forest uptake of carbon dioxide from the atmosphere by 3.7% per decade. This acts as a brake on the growth of atmospheric carbon dioxide, helping to mitigate the rapid increase of carbon dioxide from man-made emissions.
The research team also found that the shift in spring recovery is much larger in Eurasian forests, leading to double the increase in carbon uptake compared to North American forests.
"Satellite data played an essential role in providing information on variability in the carbon cycle," said Prof. Jouni Pulliainen, who led the research team at the Finnish Meteorological Institute. "By combining satellite- and ground-based information, we were able to turn observations of melting snow into higher-order information on springtime photosynthetic activity and carbon uptake."
These new results will now be used to improve climate models and help to increase the accuracy in predictions of global warming.
Next year, ESA plans to improve the satellite-based record of global snow cover with the upcoming Snow_cci project of ESA's CCI (Climate Change Initiative).
Figure 26: This graph indicates the start of photosynthetic activity of boreal forests in the spring of each year from 1979 to 2015. Over the 36-year period, the start of photosynthetic activity – or plant growth – has shifted eight days earlier (image credit: GlobSnow / Finnish Meteorological Institute)
USGS study: Future Temperature and Soil Moisture May Alter Location of Agricultural Regions
October 2017: Future high temperature extremes and soil moisture conditions may cause some regions to become more suitable for rainfed, or non-irrigated, agriculture, while causing other areas to lose suitable farmland, according to a new USGS (U.S. Geological Survey) study. 35)
These future conditions will cause an overall increase in the area suitable to support rainfed agriculture within dryland areas. Increases are projected in North America, western Asia, eastern Asia and South America. In contrast, suitable areas are projected to decline in European dryland areas.
This study focused on understanding and projecting suitability for rainfed agriculture in temperate, or non-tropical, dryland regions. Drylands make up at least 40 percent of the Earth's land area, and rainfed croplands account for approximately 75 percent of global cropland. Worldwide, temperate regions account for 31 percent of the area used to grow wheat and 17 percent used for corn. "Understanding the future potential distribution of rainfed agriculture is important for resource managers in meeting economic and food security needs, especially as the Earth's population grows," said USGS scientist and lead author of the study, John Bradford. 36)
Future climate conditions are expected to increase the frequency of high temperature events and alter the seasonality of soil moisture in dryland systems, which are the factors found to be important in predicting regions suitable for agriculture in these water-limited areas. Findings for the temperate regions examined by this study indicate that many areas currently too cold for agriculture, particularly across Asia and North America, will likely become suitable for growing crops. However, some areas that are currently heavily cultivated, including regions of the United States such as the southern Great Plains, are likely to become less suitable for agriculture in the future.
USGS scientists and an international team of collaborators from Switzerland, Germany, China, Canada and several U.S. universities found that rainfed agriculture is abundant in areas with adequate soil moisture but restricted in areas with regular high temperature extremes. Bradford and collaborators simulated future soil moisture and temperature conditions, and utilized these results to identify where rainfed agriculture may be located in the future. Scientists referenced previously published estimates of rainfed agriculture areas generated using satellite remote sensing. Models were used to determine conditions that support current rainfed agriculture, as well as future suitability under altered climate conditions.
"Our results indicate the interaction of soil moisture and temperature extremes provides a powerful yet simple framework for understanding the conditions that define suitability for rainfed agriculture in drylands," said Bradford. "Integrating this framework with long-term projections that include rising temperature and changing soil moisture patterns reveals potentially important future shifts in areas that could support agriculture in the absence of irrigation."
Within the dryland regions that were the focus of this study, areas suitable for agriculture are those that experience relatively long periods of moist soils and reasonably warm temperatures. In contrast, areas that frequently experience extreme air temperatures above 93 degrees Fahrenheit (about 34°C) are less suitable for rainfed agriculture, even if sufficient moisture is available. Even for relatively cool dryland areas, periods of high temperatures during the growing season can negatively affect agriculture suitability.
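The two controls described here, a sufficiently long moist-soil period and infrequent extreme heat, can be combined into a toy suitability test. The thresholds below are illustrative placeholders, not the values from the USGS model:

```python
# Toy rainfed-agriculture suitability check for a dryland grid cell:
# suitable only when the moist-soil period is long enough AND
# growing-season days above ~93 F (about 34 C) are rare.
# Threshold values are invented for illustration.

def rainfed_suitable(moist_days, extreme_heat_days,
                     min_moist_days=90, max_heat_days=10):
    """True if soils stay moist long enough and heat extremes are rare."""
    return moist_days >= min_moist_days and extreme_heat_days <= max_heat_days

print(rainfed_suitable(moist_days=120, extreme_heat_days=3))   # True
print(rainfed_suitable(moist_days=120, extreme_heat_days=25))  # False: too hot
print(rainfed_suitable(moist_days=60, extreme_heat_days=2))    # False: too dry
```

The second case captures the study's key point: abundant soil moisture does not compensate for regular extreme heat during the growing season.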
Figure 27: Map showing areas expected to be suitable for rainfed agriculture. These maps illustrate areas that are expected to become more suitable for rainfed agriculture (shown in blue), and areas expected to lose suitable farmland (shown in red), image credit: USGS and the Study Team
October 2017: Atmospheric chemistry and physics study reveals new threat to the ozone layer
"Ozone depletion is a well-known phenomenon and, thanks to the success of the Montreal Protocol, is widely perceived as a problem solved," says University of East Anglia's David Oram. But an international team of researchers, led by Oram, has now found an unexpected, growing danger to the ozone layer from substances not regulated by the treaty. The study is published in Atmospheric Chemistry and Physics, a journal of the EGU (European Geosciences Union). 37)
Thirty years ago, the Montreal Protocol was agreed to phase-out chemicals destroying the ozone layer, the UV-radiation shield in the Earth's stratosphere. The treaty has helped the layer begin the slow process of healing, lessening the impact to human health from increased exposure to damaging solar radiation. But increasing emissions of ozone-destroying substances that are not regulated by the Montreal Protocol are threatening to affect the recovery of the layer, according to the new research.
The substances in question were not considered damaging before as they were "generally thought to be too short-lived to reach the stratosphere in large quantities," explains Oram, a research fellow of the UK's National Centre for Atmospheric Science. The new Atmospheric Chemistry and Physics study raises the alarm over fast-increasing emissions of some of these very short-lived chemicals in East Asia, and shows how they can be carried up into the stratosphere and deplete the ozone layer.
Emissions of ozone-depleting chemicals in places like China are especially damaging because of cold-air surges in East Asia that can quickly carry industrial pollution into the tropics. "It is here that air is most likely to be uplifted into the stratosphere," says co-author Matt Ashfold, a researcher at the University of Nottingham Malaysia Campus. This means the chemicals can reach the ozone layer before they are degraded and while they can still cause damage.
One of the new threats is dichloromethane, a substance with uses varying from paint stripping to agricultural fumigation and the production of pharmaceuticals. The amount of this substance in the atmosphere decreased in the 1990s and early 2000s, but over the past decade dichloromethane became approximately 60% more abundant. "This was a major surprise to the scientific community and we were keen to discover the cause of this sudden increase," says Oram.
"We expected that the new emissions could be coming from the developing world, where industrialization has been increasing rapidly," he says. The team set out to measure air pollution in East Asia to figure out where the increase in dichloromethane was coming from and if it could affect the ozone layer.
"Our estimates suggest that China may be responsible for around 50-60% of current global emissions [of dichloromethane], with other Asian countries, including India, likely to be significant emitters as well," says Oram.
The scientists collected air samples on the ground in Malaysia and Taiwan, in the region of the South China Sea, between 2012 and 2014, and shipped them back to the UK for analysis. They routinely monitor around 50 ozone-depleting chemicals in the atmosphere, some of which are now in decline as a direct consequence of the Montreal Protocol.
Dichloromethane was found in large amounts, and so was 1,2-dichloroethane, an ozone-depleting substance used to make PVC (Polyvinyl chloride). China is the largest producer of PVC, which is used in many construction materials, and its production in the country has increased rapidly in the past couple of decades. But the rise in dichloroethane emissions was unexpected and surprising because the chemical is both a "valuable commodity" and "highly toxic", says Oram. "One would expect that care would be taken not to release [dichloroethane] into the atmosphere."
Data collected from a passenger aircraft that flew over Southeast Asia between December 2012 and January 2014 showed that the substances weren't only present at ground level. "We found that elevated concentrations of these same chemicals were present at altitudes of 12 km over tropical regions, many thousands of kilometers away from their likely source, and in a region where air is known to be transferred into the stratosphere," says Oram.
Sample collection: 38)
Between 2012 and 2014, air samples were collected at various times at two coastal sites in Taiwan – Hengchun (22.0547º N, 120.6995º E) and Fuguei Cape (25.297º N, 121.538º E); at the Bachok Marine Research Station on the north-east coast of Peninsular Malaysia (6.009º N, 102.425º E); and during several flights of the IAGOS-CARIBIC aircraft between Germany and Thailand or Malaysia. IAGOS-CARIBIC (In-service Aircraft for a Global Observing System-Civil Aircraft for the Regular Investigation of the atmosphere Based on an Instrument Container) is a European project making regular measurements from an in-service passenger aircraft operated by Lufthansa (Airbus A340-600). The CARIBIC samples were all (n = 179) collected at altitudes between 10 and 12.3 km.
Note: IAGOS-CARIBIC is an innovative scientific project to study and monitor important chemical and physical processes in the Earth's atmosphere. Detailed and extensive measurements are made during long-distance flights. CARIBIC deploys a modified airfreight container with automated scientific apparatus connected to an air and particle (aerosol) inlet underneath the aircraft. Using a passenger Airbus A340-600 from Lufthansa, more than 530 flights have been successfully completed. 39)
Figure 28: Map of the region showing the location of each CARIBIC sample. The markers have been colored according to their CH2Cl2 concentration to highlight the regions where enhanced levels of VSLSs were observed. Also shown are the approximate locations of the three surface stations (orange crosses).
Legend to Figure 28: CH2Cl2 is one of a large group of halogenated compounds known as VSLSs (very short-lived substances). Owing to their relatively short atmospheric lifetimes (typically less than 6 months) and their correspondingly low ODPs (Ozone Depletion Potentials), VSLSs are not currently regulated by the Montreal Protocol.
Arctic Sea Ice Extent in the Autumn of 2017
• October 18, 2017: Every year, the process is generally the same: the cap of sea ice on the Arctic Ocean melts and retreats through spring and summer to an annual minimum extent. Then, as the ocean and air cool with autumn, ice cover grows again and the cycle continues. But when we take a look at smaller regions within the Arctic, we get a more detailed picture of what's been going on. 40)
The map of Figure 29 shows the extent of Arctic sea ice on September 13, 2017, when the ice reached its minimum extent for the year. Extent is defined as the total area in which the ice concentration is at least 15 percent. The map was compiled from observations by the AMSR-2 (Advanced Microwave Scanning Radiometer 2) instrument on the GCOM-W1 (Global Change Observation Mission 1st-Water)/Shizuku satellite mission, operated by JAXA (Japan Aerospace Exploration Agency). The yellow outline in Figure 29 shows the median sea ice extent observed in September from 1981 through 2010.
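The 15-percent convention lends itself to a simple computation. The following is a minimal sketch, not the JAXA/NSIDC processing chain, of how extent can be derived from a gridded concentration field; the function name, array names, and the toy 2 x 2 grid are illustrative assumptions:

```python
import numpy as np

def sea_ice_extent_km2(concentration, cell_area_km2, threshold=0.15):
    """Sea ice extent: total area of all grid cells whose ice
    concentration is at least the threshold (15 percent by convention)."""
    covered = concentration >= threshold
    return float(np.sum(cell_area_km2[covered]))

# Toy 2x2 grid of ice concentrations, with equal 625 km2 cells
conc = np.array([[0.90, 0.10],
                 [0.20, 0.05]])
area = np.full_like(conc, 625.0)
print(sea_ice_extent_km2(conc, area))  # 1250.0 (two cells at or above 15%)
```

Note the distinction from sea ice *area*, which would instead sum concentration times cell area over the same cells.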
- According to the NSIDC (National Snow and Ice Data Center) in Boulder, CO, the Arctic sea ice cover in 2017 shrank to 4.64 million km2, the eighth-lowest extent in the 39-year satellite record. Charting these annual minimums and maximums has revealed a steep decline in overall Arctic sea ice in the satellite era. But the decline is not the same everywhere across the Arctic Ocean. The Beaufort Sea north of Alaska, for example, is the region where sea ice has been retreating the fastest.
Figure 30: Regional ice extent in arctic seas, acquired in the period June 20-October 10, 2017 and analyzed by NSIDC (image credit: NASA Earth Observatory, images by Joshua Stevens, using data from the National Snow and Ice Data Center, story by Kathryn Hansen)
- This year, ice in the Chukchi and Beaufort seas reached its minimum extent toward the end of September, later than the Arctic as a whole. The graph of Figure 30 shows the ice in these two seas was still declining while other regions had started freezing. The melting persisted the longest in the Beaufort Sea, which finally started to refreeze after reaching a minimum on September 27. Data for the graph come from the NSIDC MASIE (Multi-sensor Analyzed Sea Ice Extent) product, which is based on operational sea ice analyses produced by the U.S. National Ice Center. Note: the MASIE observations also included the SSM/I instrument on the DMSP series, as well as in situ measurements.
- Ice loss in the Beaufort and Chukchi seas was not record-breaking this year, but the extents were much lower than usual. Notice in the map how the ice edge in these seas was farther north than average. According to Walt Meier, a scientist at NSIDC, the Chukchi and Beaufort seas entered the melt season with a lot of first-year ice. This ice type is generally thinner than multi-year or perennial ice (which survived the previous melt season); first-year ice tends to melt away more easily.
- Meier also notes that low-pressure weather systems persisted near the North Pole for much of the summer. "Low pressure will keep things cooler overall and generally will lead to a relatively higher ice extent overall," Meier said. "However, the position of the low this year led to winds blowing from the south and west that help move ice out of these regions. Also, the winds may have helped to bring in warmer ocean waters from the Bering Strait region as well."
NASA Study of the Causes of Earth's Recent Record Carbon Dioxide Spike
• October 12, 2017: A new NASA study provides spaceborne evidence that Earth's tropical regions were the cause of the largest annual increases in atmospheric carbon dioxide (CO2) concentration seen in at least 2,000 years. 41)
Scientists suspected the 2015-2016 El Niño — one of the largest on record — was responsible, but exactly how has been a subject of ongoing research. Analyzing the first 28 months of data from NASA's OCO-2 (Orbiting Carbon Observatory-2) satellite, researchers concluded that impacts of El Niño-related heat and drought in tropical regions of South America, Africa and Indonesia were responsible for the record spike in global carbon dioxide. The findings were published in the journal Science on 13 October 2017 as part of a collection of five research papers based on OCO-2 data. 42) 43) 44) 45) 46) 47) 48)
Figure 31: The last El Niño in 2015-16 impacted the amount of carbon dioxide that Earth's tropical regions released into the atmosphere, leading to Earth's recent record spike in atmospheric carbon dioxide. The effects of the El Niño were different in each region (image credit: NASA-JPL/Caltech)
"These three tropical regions released 2.5 gigatons more carbon into the atmosphere than they did in 2011," said Junjie Liu of NASA/JPL in Pasadena, California, who is lead author of the study. "Our analysis shows this extra carbon dioxide explains the difference in atmospheric carbon dioxide growth rates between 2011 and the peak years of 2015-2016. OCO-2 data allowed us to quantify how the net exchange of carbon between land and atmosphere in individual regions is affected during El Niño years." A gigaton (Gt = 10⁹ tons) is a billion tons.
In 2015 and 2016, OCO-2 recorded atmospheric carbon dioxide increases that were 50 percent larger than the average increase seen in recent years preceding these observations. These measurements are consistent with those made by NOAA (National Oceanic and Atmospheric Administration). That increase was about 3 parts per million of carbon dioxide per year — or 6.3 gigatons of carbon. In recent years, the average annual increase has been closer to 2 parts per million of carbon dioxide per year — or 4 gigatons of carbon. These record increases occurred even though emissions from human activities in 2015-2016 are estimated to have remained roughly the same as they were prior to the El Niño, which is a cyclical warming pattern of ocean circulation in the central and eastern tropical Pacific Ocean that can affect weather worldwide.
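The ppm-to-gigaton figures quoted above follow from a standard conversion of roughly 2.13 GtC per ppm of CO2, derivable from the total mass of the atmosphere and the molar masses involved. A minimal sketch of that arithmetic (the constants are textbook approximations, not values taken from the study):

```python
# Converting a CO2 mixing-ratio change (ppm) to a carbon mass (GtC).
# Factor of ~2.13 GtC per ppm: moles of air in the atmosphere
# (mass ~5.15e18 kg / mean molar mass ~28.97 g/mol), times the ppm
# fraction to get moles of CO2, times 12.01 g/mol of carbon.
ATM_MASS_KG = 5.15e18   # approximate total mass of the atmosphere
M_AIR = 28.97           # g/mol, mean molar mass of dry air
M_C = 12.01             # g/mol, carbon

def ppm_to_gtc(ppm):
    moles_air = ATM_MASS_KG * 1e3 / M_AIR    # kg -> g -> moles of air
    grams_c = moles_air * ppm * 1e-6 * M_C   # moles of CO2 -> grams of C
    return grams_c / 1e15                    # grams -> gigatons

print(round(ppm_to_gtc(3.0), 1))  # ~6.4 GtC, matching the ~6.3 Gt quoted
print(round(ppm_to_gtc(2.0), 1))  # ~4.3 GtC, matching the ~4 Gt quoted
```

The small differences from the article's rounded figures (6.3 and 4 Gt) reflect rounding in the quoted growth rates, not a different conversion.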
Using OCO-2 data, Liu's team analyzed how Earth's land areas contributed to the record atmospheric carbon dioxide concentration increases. They found the total amount of carbon released to the atmosphere from all land areas increased by 3 gigatons in 2015, due to the El Niño. About 80 percent of that amount — or 2.5 gigatons of carbon — came from natural processes occurring in tropical forests in South America, Africa and Indonesia, with each region contributing roughly the same amount.
The team compared the 2015 findings to those from a reference year — 2011 — using carbon dioxide data from the GOSAT (Greenhouse Gases Observing Satellite) mission of JAXA (Japan Aerospace Exploration Agency). In 2011, weather in the three tropical regions was normal and the amount of carbon absorbed and released by them was in balance.
"Understanding how the carbon cycle in these regions responded to El Niño will enable scientists to improve carbon cycle models, which should lead to improved predictions of how our planet may respond to similar conditions in the future," said OCO-2 Deputy Project Scientist Annmarie Eldering of JPL. "The team's findings imply that if future climate brings more or longer droughts, as the last El Niño did, more carbon dioxide may remain in the atmosphere, leading to a tendency to further warm Earth."
While the three tropical regions each released roughly the same amount of carbon dioxide into the atmosphere, the team found that temperature and rainfall changes influenced by the El Niño were different in each region, and the natural carbon cycle responded differently. Liu combined OCO-2 data with other satellite data to understand details of the natural processes causing each tropical region's response.
In eastern and southeastern tropical South America, including the Amazon rainforest, severe drought spurred by El Niño made 2015 the driest year in the past 30 years. Temperatures also were higher than normal. These drier and hotter conditions stressed vegetation and reduced photosynthesis, meaning trees and plants absorbed less carbon from the atmosphere. The effect was to increase the net amount of carbon released into the atmosphere.
In contrast, rainfall in tropical Africa was at normal levels, based on precipitation analysis that combined satellite measurements and rain gauge data, but ecosystems endured hotter-than-normal temperatures. Dead trees and plants decomposed more, resulting in more carbon being released into the atmosphere. Meanwhile, tropical Asia had the second-driest year in the past 30 years. Its increased carbon release, primarily from Indonesia, was mainly due to increased peat and forest fires — also measured by satellite instruments.
"We knew El Niños were one factor in these variations, but until now we didn't understand, at the scale of these regions, what the most important processes were," said Eldering. "OCO-2's geographic coverage and data density are allowing us to study each region separately."
Scott Denning, professor of atmospheric science at Colorado State University in Fort Collins and an OCO-2 science team member who was not part of this study, noted that while scientists have known for decades that El Niño influences the productivity of tropical forests and, therefore, the forests' net contributions to atmospheric carbon dioxide, researchers have had very few direct observations of the effects. "OCO-2 has given us two revolutionary new ways to understand the effects of drought and heat on tropical forests: directly measuring carbon dioxide over these regions thousands of times a day; and sensing the rate of photosynthesis by detecting fluorescence from chlorophyll in the trees themselves," said Denning. "We can use these data to test our understanding of whether the response of tropical forests is likely to make climate change worse or not."
The concentration of carbon dioxide in Earth's atmosphere is constantly changing. It changes from season to season as plants grow and die, with higher concentrations in the winter and lower amounts in the summer. Annually averaged atmospheric carbon dioxide concentrations have generally increased year over year since the early 1800s — the start of the widespread Industrial Revolution. Before then, Earth's atmosphere naturally contained about 595 gigatons of carbon in the form of carbon dioxide. Currently, that number is 850 gigatons.
The annual increase in atmospheric carbon dioxide levels and the magnitude of the seasonal cycle are determined by a delicate balance between Earth's atmosphere, ocean and land. Each year, the ocean, plants and trees take up and release carbon dioxide. The amount of carbon released into the atmosphere as a result of human activities also changes each year. On average, Earth's land and ocean remove about half the carbon dioxide released from human emissions, with the other half leading to increasing atmospheric concentrations. While natural processes are responsible for the exchange of carbon dioxide between the atmosphere, ocean and land, each year is different. In some years, natural processes remove as little as 20 percent of human emissions, while in other years they scrub as much as 80 percent.
OCO-2, launched in 2014, gathers global measurements of atmospheric carbon dioxide with the resolution, precision and coverage needed to understand how this important greenhouse gas — the principal human-produced driver of climate change — moves through the Earth system at regional scales, and how it changes over time. From its vantage point in space, OCO-2 is able to make roughly 100,000 measurements of atmospheric carbon dioxide each day, around the world.
Institutions involved in the Liu study include JPL; NCAR (National Center for Atmospheric Research) in Boulder, Colorado; the University of Toronto; Colorado State University; Caltech in Pasadena, California; and Arizona State University in Tempe, AZ.
Figure 32: The Science special collection of OCO-2-based papers give an unprecedented view from space of how carbon dioxide emissions vary within individual cities such as Los Angeles and its surroundings, shown here. Concentrations vary from more than 400 parts per million (red) over the city, foreground, to the high 300s (green) over the desert, background (image credit: NASA/JPL-Caltech/Google Earth) 49)
Monitoring of Groundwater Recovery in Silicon Valley
October 3, 2017: A NASA/university study finds that aggressive conservation helped the region's aquifer rebound quickly from one of the worst droughts in California history. Underground water reserves in California's Silicon Valley rebounded quickly from the state's recent severe drought, demonstrating the success of aggressive conservation measures, according to a new space-based study by NASA and university scientists. 50)
Using satellite data from COSMO-SkyMed, a constellation of four Italian Space Agency (Agenzia Spaziale Italiana, or ASI) satellites, a research team led by Estelle Chaussard at the University at Buffalo in New York, and including scientists from NASA/JPL (Jet Propulsion Laboratory) in Pasadena, California, used a technique called SAR (Synthetic Aperture Radar) interferometry to monitor the entire Santa Clara Valley aquifer near San Jose from 2011 to 2017. This type of radar can capture the subtle up-and-down movements of Earth's surface, of just a few millimeters, that occur when water levels rise or fall underground. The scientists used hundreds of radar images obtained under a license from ASI to calculate how much the land surface elevation changed over time. The measurements show the aquifer began to rebound in late 2014, when the drought was still going strong, and that groundwater levels had returned to pre-drought levels by 2017, thanks to conservation measures that intensified in 2014, and heavy winter rains in 2016.
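The millimeter-level sensitivity of SAR interferometry comes from the interferometric phase: one full phase cycle corresponds to half a radar wavelength of line-of-sight motion (the factor of two accounts for the two-way radar path). A minimal sketch of that conversion, assuming the nominal COSMO-SkyMed X-band wavelength of about 3.1 cm and an already-unwrapped phase value:

```python
import math

# InSAR converts an interferometric phase change to line-of-sight ground
# motion: displacement d = (lambda / (4*pi)) * delta_phase, so one full
# 2*pi fringe corresponds to lambda/2 of motion along the radar's view.
WAVELENGTH_M = 0.031  # assumed X-band wavelength (~3.1 cm)

def los_displacement_mm(delta_phase_rad):
    """Line-of-sight displacement (mm) implied by an unwrapped phase change."""
    return (WAVELENGTH_M / (4 * math.pi)) * delta_phase_rad * 1e3

# One full fringe (2*pi) corresponds to lambda/2 = 15.5 mm of motion
print(round(los_displacement_mm(2 * math.pi), 1))  # 15.5
```

Measurable fractions of a fringe are what allow surface motion of a few millimeters, such as aquifer uplift or subsidence, to be resolved from orbit.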
During the 2012-15 drought, the Santa Clara Valley Water District employed an array of conservation measures. These included restricting sprinkler use and asking customers to take shorter showers and convert lawns and pools into less-thirsty landscapes. The district also imported water from outside the region.
Chaussard says the actions may have helped stave off irreversible damage to the aquifer, which measures about 550 km2 and lies beneath a highly urbanized area. She explains when groundwater levels reach a record low, the porous sands and clays in which the reserves reside can dry up so much that the clays don't retain water anymore. The new study shows that thanks to the intensive water management efforts, this did not happen in the Santa Clara Valley.
Chaussard says the aquifer monitoring method her team used can work anywhere where there are soft-rock aquifer systems and where synthetic aperture radar satellite data are available, including in developing nations with few resources for monitoring. - "We wanted to see if we could use a remote sensing method that doesn't require ground monitoring to understand how our aquifers are responding to a changing climate and human activity," she says. "Our study further demonstrates the utility of SAR interferometry, which scientists also use to measure surface deformation related to volcanoes and earthquakes, for tracking ground deformation associated with changes in groundwater levels."
Figure 33: Ground motion in California's Santa Clara Valley from 2011 to 2015 as measured by ASI's Cosmo-SkyMed SAR constellation. Colors denote the speed of ground motion (blues indicate subsidence/sinking and reds indicate uplift). The image contains modified COSMO-SkyMed data (image credit: ASI/University at Buffalo/NASA-JPL/Caltech/Google Earth/U of Basilicata)
"This study further demonstrates a complementary method, in addition to traditional ground-based measurements, for water management districts to monitor ground deformation," added JPL co-author Pietro Milillo. "The technique marks an improvement over traditional methods because it allows scientists to gauge changes in ground deformation across a large region with unprecedented frequency." He said the COSMO-SkyMed satellites provided information for the aquifer as often as once a day.
Underground stockpiles of water — housed in layers of porous rock called aquifers — are one of the world's most important sources of drinking water. Some 2.5 billion people across the globe rely on aquifers for water, and many of these repositories are being drained more quickly than they can be refilled, according to the United Nations Educational, Scientific and Cultural Organization.
Yet keeping tabs on these precious reserves is expensive, says Chaussard. "To monitor aquifers, you need a lot of measurements in both space and time," she says. "Sampling water levels at wells may give you a continuous time series, but only if they are constantly monitored, and automated monitoring may not be common. Also, even a high density of wells may not adequately capture basin-wide spatial patterns of water storage, which is key to understanding processes at stake."
The methods employed in this study provide a more complete picture of how an aquifer responds during a drought and how water conservation methods can have a real and positive impact on sustaining the health and viability of pumped groundwater aquifers. The satellite radar imagery not only fills in data gaps between wells, but provides valuable insights into how aquifers are responding beyond the edges of monitoring well networks so that water agencies can more effectively manage their precious resources.
The upcoming NASA-ISRO (Indian Space Research Organization) Synthetic Aperture Radar (NISAR) satellite mission, planned for launch in 2021, will systematically collect radar imagery over nearly every aquifer in the world, improving our understanding of valuable groundwater resources and our ability to better manage them. In addition to tracking groundwater use in urban settings, NISAR will be able to measure surface motion associated with groundwater pumping and natural recharge in rural communities, in areas with extensive agriculture, and in regions with extensive vegetation, conditions that are typically more challenging.
The research was published Sept. 25 in the Journal of Geophysical Research - Solid Earth. Other participating institutions include the University of California, Berkeley; Purdue University, West Lafayette, Indiana; and the Santa Clara Valley Water District. 51)
Global patterns of drought recovery
September 13, 2017: As global temperatures continue to rise, the prevailing wisdom in the climate science community is that droughts will grow more frequent and more extreme in the 21st century. Though temperatures were already rising in the 20th century, the global trend in drought length and severity was ambiguous, with no clear pattern. However, the impacts of droughts were less ambiguous, particularly in recent decades. 52)
In a study published in August 2017 in the journal Nature, researchers from 17 institutions found that more of Earth's land surface is now being affected by drought and that ecosystems are taking longer to recover from dry spells. Recovery is taking particularly long in the tropics and at high latitudes, two areas that are already especially vulnerable to global change. 53)
The map of Figure 34 is based on data from that study, which was led by Christopher Schwalm of WHRC (Woods Hole Research Center). It depicts the average length of time that it took for vegetation to recover from droughts that occurred between 2000 and 2010. The darkest colors mark the areas with the longest drought recovery time. Land areas colored light gray were covered by ice or sand (deserts).
Up until now, most assessments of drought and recovery have focused on the hydrology; that is, has new rain and snowfall made up for the deficit of water in rivers, lakes, and soils? In this new study, researchers focused on the health and resilience of the trees and other plants because full reservoirs and streams do not necessarily mean that vegetation has recovered.
The research team combined observations from the MODIS (Moderate Resolution Imaging Spectroradiometer) on NASA's Terra satellite, ground measurements, and computer models to assess changes in drought. In particular, they measured changes in GPP (Gross Primary Productivity), or how well plants are consuming and storing carbon dioxide through photosynthesis. As the analysis showed, plants in many regions are taking longer to recover from drought, often because weather is more extreme (usually hotter) than in the past.
If the time between droughts grows shorter (as predicted) and the time to recover from them keeps growing longer, some ecosystems could reach a tipping point and change permanently. This could affect how much carbon dioxide is stored on land in trees and other vegetation (the land "carbon sink"). If less carbon is being captured and stored, then more of what humans produce would remain in the atmosphere, creating a feedback loop that amplifies the warming that leads to more drought.
"The most important implication of our study," said Schwalm, "is that under business-as-usual emissions of greenhouse gases, the time between drought events will likely become shorter than the time needed for recovery."
"Using the vantage point of space, we can see all of Earth's forests and other ecosystems getting hit repeatedly and increasingly by droughts," added co-author Josh Fisher of NASA's Jet Propulsion Laboratory. "Some of these ecosystems recover, but, with increasing frequency, others do not."
Figure 34: Recovery time by grid cell across all combinations of GPP and integration time. White areas are water, barren, or did not experience any relevant drought events (NASA Earth Observatory, image by Jesse Allen, using data provided by Christopher Schwalm (WHRC). Story by Michael Carlowicz, with reporting from JPL and WHRC)
Researchers find direct evidence of sea level 'fingerprints'
September 7, 2017: Researchers from NASA/JPL (Jet Propulsion Laboratory) in Pasadena, California, and the UCI (University of California), Irvine, have reported the first detection of sea level "fingerprints" in ocean observations: detectable patterns of sea level variability around the world resulting from changes in water storage on Earth's continents and in the mass of ice sheets. The results will give scientists confidence they can use these data to determine how much the sea level will rise at any point on the global ocean as a result of glacier ice melt. 54) 55)
Figure 35: Sea level rise fingerprints calculated from observations of mass changes in Greenland, Antarctica, continental glaciers and ice caps, and land water storage made by the GRACE satellites, January 2003 to April 2014 (image credit: NASA, UCI)
As ice sheets and glaciers undergo climate-related melting, they alter Earth's gravity field, resulting in sea level changes that aren't uniform around the globe. For example, when a glacier loses ice mass, its gravitational attraction is reduced. Ocean waters nearby move away, causing sea level to rise faster far away from the glacier. The resulting pattern of sea level change is known as a sea level fingerprint. Certain regions, particularly in Earth's middle and low latitudes, are hit harder, and Greenland and Antarctica contribute differently to the process. For instance, sea level rise in California and Florida generated by the melting of the Antarctic ice sheet is up to 52 percent greater than its average effect on the rest of the world.
To calculate sea level fingerprints associated with the loss of ice from glaciers and ice sheets and from changes in land water storage, the team used gravity data collected by the twin satellites of the U.S./German GRACE (Gravity Recovery and Climate Experiment) between April 2002 and October 2014. During that time, the loss of mass from land ice and from changes in land water storage increased global average sea level by about 1.8 mm per year, with 43 percent of the increased water mass coming from Greenland, 16 percent from Antarctica and 30 percent from mountain glaciers. The scientists then verified their calculations of sea level fingerprints using readings of ocean-bottom pressure from stations in the tropics.
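The percentages quoted above can be turned into per-source sea level trends; the residual share, not stated explicitly, is by construction attributable to the remaining source the study names, changes in land water storage. A minimal bookkeeping sketch (the variable names and the residual attribution are illustrative assumptions):

```python
# Partitioning the GRACE-derived barystatic sea level trend (~1.8 mm/yr)
# among sources, using the percentages quoted in the text. The remainder
# (~11%) is assigned to changes in land water storage.
total_mm_per_yr = 1.8
shares = {"Greenland": 0.43, "Antarctica": 0.16, "Mountain glaciers": 0.30}
shares["Land water storage"] = 1.0 - sum(shares.values())

for source, frac in shares.items():
    print(f"{source}: {frac * total_mm_per_yr:.2f} mm/yr")
```

This is why the 1.8 mm/yr contour in Figure 36 represents the uniform-redistribution case: it is the global mean of these combined contributions, before the gravitational fingerprint redistributes the rise unevenly.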
Figure 36: Sea level fingerprints (patterns of variation in sea level rise) calculated from GRACE satellite observations, 2002-2014. The blue contour (1.8 mm per year) shows the average sea level rise if all the water added to the ocean were spread uniformly around Earth (image credit: NASA, UCI)
"Scientists have a solid understanding of the physics of sea level fingerprints, but we've never had a direct detection of the phenomenon until now," said co-author Isabella Velicogna, UCI professor of Earth system science and JPL research scientist. "It was very exciting to observe the sea level fingerprints in the tropics, far from the glaciers and ice sheets," said lead author Chia-Wei Hsu, a graduate student researcher at UCI.
GRACE is a joint NASA mission with the German Aerospace Center (DLR) and the German Research Center for Geosciences (GFZ), in partnership with the University of Texas at Austin.
Global Record Temperature Streak Study
August 10, 2017: The year 2016, aided in part by a historically large El Niño event, set a new global temperature record. We have thus witnessed three consecutive record-breaking annual mean temperatures (2014, 2015, and 2016) in most global and/or hemispheric surface temperature series for the first time since historical observations began in the nineteenth century. It is reasonable to suspect that such an event would be extremely unlikely in the absence of anthropogenic warming, but it is worthwhile to ask just how unlikely such events actually are both with and without anthropogenic influence on climate. 56) 57)
Temperature records were first broken in 2014, when that year became the hottest year since global temperature records began in 1880. These temperatures were then surpassed in 2015 and 2016, making last year the hottest year ever recorded. In 2016, the average global temperature across land and ocean surface areas was 0.94 degrees Celsius above the 20th century average of 13.9 degrees Celsius, according to NOAA.
Winds Trigger Pond Growth
July 6, 2017: Wind is a force to be reckoned with. It can stir up monsoons, carry dust thousands of miles, and sculpt rock into sinuous arches. But sometimes, the effects of wind go unnoticed for years, like when it carves away slowly at the edges of a pond. 58)
A new study shows that winds are responsible for the widespread growth of ponds in three watersheds along the Mississippi River. The paper, published in April 2017 in Geophysical Research Letters, shows that wind-driven waves can erode pond banks, leading them to migrate in the direction of the wind. In effect, researchers have shown that wind-driven erosion, which nibbles away coastlines and the edges of larger bodies of water, can also happen inland on small scales. 59)
The researchers analyzed roughly 10,000 satellite images taken between 1982 and 2016, examining land and water pixels to look for inland change across the Mississippi River Delta. "Up until now, a lot of focus has been on coastal retreat," said Alejandra Ortiz, a marine geologist at Indiana University, Bloomington. Instead, Ortiz and colleagues focused on internal fragmentation; that is, what happens when land becomes subdivided by inland erosion processes. "Our thinking was, can you see this on large scale?"
Ortiz and her co-authors found that ponds in the Mississippi Delta tended to expand in a southwesterly direction, which is the same direction as the prevailing winds (which blow out of the northeast). This was especially true in Terrebonne and Barataria basins, where 80 percent of the ponds are expanding. The other study basin, the Atchafalaya-Vermillion, was deemed stable, with nearly as many ponds contracting as expanding—roughly 30 percent.
The false-color image of Figure 37 shows the area of study along the Atchafalaya Delta. It was captured on December 1, 2016, by OLI (Operational Land Imager) on Landsat 8. The colors emphasize the difference between land and water while allowing viewers to observe waterborne sediment, which is typically absent from false-color imagery.
Figure 37: OLI image on Landsat-8 of ponds in three watersheds along the Mississippi River south of New Orleans, acquired on December 1, 2016 (image credit: NASA Earth Observatory, images by Joshua Stevens, using Landsat data from the USGS and data from Ortiz, A. C., Roy, S., & Edmonds, D. A. (2017), story by Pola Lem)
The images of Figures 38 and 39 illustrate ponds that have grown (blue) or receded (orange) near the delta. In areas like Houma, Louisiana, the size of ponds increased significantly. The Terrebonne and Barataria basins have much higher pond density, making them more susceptible to pond merging—when two or more ponds migrate toward each other and produce one larger body of water.
Figure 38: Landsat images (Landsat-7, Landsat-4, Landsat-5 and Landsat-8) of ponds that have grown (blue) or receded (orange) near the delta, acquired in the period 1982 - 2016 (image credit: NASA Earth Observatory, images by Joshua Stevens, using Landsat data from the USGS and data from Ortiz, A. C., Roy, S., & Edmonds, D. A. (2017), story by Pola Lem)
Some ponds were too small to generate waves strong enough to erode the shoreline. The researchers found that the critical pond width was about 300 meters. Ponds at least that wide offer enough open water (the distance sailors and meteorologists call "fetch") for wind-driven waves to grow big enough to nibble away the shore.
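The role of fetch can be illustrated with the standard empirical JONSWAP fetch-limited wave relation. This is not a calculation from the paper itself; the wind speed and fetch values below are illustrative assumptions:

```python
import math

def jonswap_hs(wind_speed, fetch, g=9.81):
    """Fetch-limited significant wave height (m) from the empirical
    JONSWAP relation Hs ~ 0.0016 * U * sqrt(F / g),
    with wind speed U in m/s and fetch F in m."""
    return 0.0016 * wind_speed * math.sqrt(fetch / g)

# A 10 m/s wind over the ~300 m critical pond width found in the study
# yields waves of roughly 9 cm: small, but plausibly enough to erode
# soft marsh banks over decades.
for fetch in (100, 300, 1000):
    print(f"fetch {fetch:5d} m -> Hs ~ {jonswap_hs(10.0, fetch):.3f} m")
```

The cubic-root-like growth with fetch is why ponds below the ~300 m threshold stay stable: the waves they can raise are simply too small to do work on the banks.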
Ortiz said the findings could affect the management of erosion-prone water bodies. For instance, managers could create physical barriers to prevent ponds from growing. "One possibility is thinking about putting in something that stops wave generation," she said.
Figure 39: Landsat images (Landsat-7, Landsat-4, Landsat-5 and Landsat-8) of ponds that have grown (blue) or receded (orange) near the delta, acquired in the period 1982 - 2016 (image credit: NASA Earth Observatory, images by Joshua Stevens, using Landsat data from the USGS and data from Ortiz, A. C., Roy, S., & Edmonds, D. A. (2017), story by Pola Lem)
Increasing rate of GMSL (Global Mean Sea Level) rise during 1993-2014
June 2017: Ocean levels rose 50 percent faster in 2014 than in 1993, with meltwater from the Greenland ice sheet now supplying 25 percent of total sea level increase compared with just five percent 20 years earlier, researchers reported on June 26, 2017. The findings add to growing concern among scientists that the global watermark is climbing more rapidly than forecast only a few years ago, with potentially devastating consequences. 60)
- Hundreds of millions of people around the world live in low-lying deltas that are vulnerable, especially when rising seas are combined with land sinking due to depleted water tables, or a lack of ground-forming silt held back by dams.
- Major coastal cities are also threatened, while some small island states are already laying plans for the day their drowning nations will no longer be livable.
- "This result is important because the IPCC (Intergovernmental Panel on Climate Change), the UN science advisory body, makes a very conservative projection of total sea level rise by the end of the century," at 60 to 90 cm, said Peter Wadhams, a professor of ocean physics at the University of Cambridge who did not take part in the research.
- That estimate, he added, assumes that the rate at which ocean levels rise will remain constant. "Yet there is convincing evidence — including accelerating losses of mass from Greenland and Antarctica — that the rate is actually increasing, and increasing exponentially."
- Greenland alone contains enough frozen water to lift oceans by about 7 m, though experts disagree on the global warming threshold for irreversible melting, and how long that would take once set in motion.
- "Most scientists now expect total rise to be well over 1 m by the end of the century," Wadhams said.
- The new study, published in Nature Climate Change, reconciles for the first time two distinct measurements of sea level rise. 61)
- GMSL has been rising at a faster rate during the satellite altimetry period (1993–2014) than in previous decades, and is expected to accelerate further over the coming century. However, the accelerations observed over century and longer periods have not been clearly detected in altimeter data spanning the past two decades. Here we show that the rise, from the sum of all observed contributions to GMSL, increases from 2.2 ± 0.3 mm/yr in 1993 to 3.3 ± 0.3 mm/yr in 2014. This is in approximate agreement with the observed increase in GMSL rise, from 2.4 ± 0.2 mm/yr (1993) to 2.9 ± 0.3 mm/yr (2014), in satellite observations that have been adjusted for a small systematic drift, particularly affecting the first decade of the record. 62)
- The mass contributions to GMSL increase from about 50% in 1993 to 70% in 2014, with the largest, and statistically significant, increase coming from the Greenland ice sheet, which accounts for less than 5% of the GMSL rate in 1993 but more than 25% in 2014. The suggested acceleration and improved closure of the sea-level budget highlight the importance and urgency of mitigating climate change and of formulating coastal adaptation plans to cope with the impacts of ongoing sea-level rise.
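A back-of-the-envelope check of the rates quoted above; this is only a sketch using the numbers reported in this section, whereas the study derives them from full time series:

```python
# Rates reported in the study (total GMSL rise, mm/yr) and the
# approximate Greenland fractions quoted for each year.
rate_1993, rate_2014 = 2.2, 3.3
greenland_frac_1993, greenland_frac_2014 = 0.05, 0.25

# Implied mean acceleration if the rate increased steadily over 1993-2014
accel = (rate_2014 - rate_1993) / (2014 - 1993)   # mm/yr^2
print(f"implied acceleration ~ {accel:.3f} mm/yr^2")

# Greenland's contribution in absolute terms
gl_1993 = greenland_frac_1993 * rate_1993         # ~0.11 mm/yr
gl_2014 = greenland_frac_2014 * rate_2014         # ~0.83 mm/yr
print(f"Greenland: {gl_1993:.2f} -> {gl_2014:.2f} mm/yr")
```

The implied acceleration of roughly 0.05 mm/yr² is small in any single year, but compounds to a markedly higher century-end total than a constant-rate extrapolation, which is the point Wadhams makes above.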
Lightning Sparking More Boreal Forest Fires
June 26, 2017: A new NASA-funded study finds that lightning storms were the main driver of recent massive fire years in Alaska and northern Canada, and that these storms are likely to move farther north with climate warming, potentially altering northern landscapes. 63)
The study, led by Vrije Universiteit Amsterdam and the University of California, Irvine, examined the cause of the fires, which have been increasing in number in recent years. There was a record number of lightning-ignited fires in the Canadian Northwest Territories in 2014 and in Alaska in 2015. The team found increases of between two and five percent a year in the number of lightning-ignited fires since 1975. 64)
To study the fires, the team analyzed data from NASA's Terra and Aqua satellites and from ground-based lightning networks.
Lead author Sander Veraverbeke of Vrije Universiteit Amsterdam, who conducted the work while at UC Irvine, said that while the drivers of large fire years in the high north are still poorly understood, the observed trends are consistent with climate change. "We found that it is not just a matter of more burning with higher temperatures. The reality is more complex: higher temperatures also spur more thunderstorms. Lightning from these thunderstorms is what has been igniting many more fires in these recent extreme events," Veraverbeke said.
Study co-author Brendan Rogers at Woods Hole Research Center in Falmouth, Massachusetts, said these trends are likely to continue. "We expect an increasing number of thunderstorms, and hence fires, across the high latitudes in the coming decades as a result of climate change." The study corroborates this expectation with output from several climate models.
Study co-author Charles Miller of NASA's Jet Propulsion Laboratory in Pasadena, California, said while data from the lightning networks were critical to this study, it is challenging to use these data for trend detection because of continuing network upgrades. "A spaceborne sensor that provides high northern latitude lightning data that can be linked with fire dynamics would be a major step forward," he said.
The researchers found that the fires are creeping farther north, near the transition from boreal forests to Arctic tundra. "In these high-latitude ecosystems, permafrost soils store large amounts of carbon that become vulnerable after fires pass through," said co-author James Randerson of the University of California, Irvine. "Exposed mineral soils after tundra fires also provide favorable seedbeds for trees migrating north under a warmer climate."
"Taken together, we discovered a complex feedback loop between climate, lightning, fires, carbon and forests that may quickly alter northern landscapes," Veraverbeke concluded. "A better understanding of these relationships is critical to better predict future influences from climate on fires, and from fires on climate."
Figure 40: A lightning-caused wildfire burns in Alberta, Canada (image credit: The Government of Alberta)
Sea level as a metronome of Earth's history
May 19, 2017: Sedimentary layers contain stratigraphic cycles and patterns that precisely reveal the succession of climatic and tectonic conditions that have occurred over millennia. Researchers have been working on an analytical method that combines observing deep-water sedimentary strata and measuring in them the isotopic ratio between heavy and light carbon. They have discovered that the cycles that punctuate these sedimentary successions are ascribable to sea level changes. 65) 66) 67)
Sedimentary layers record the history of the Earth. They contain stratigraphic cycles and patterns that precisely reveal the succession of climatic and tectonic conditions that have occurred over millennia, thereby enhancing our ability to understand and predict the evolution of our planet.
Researchers at the University of Geneva (UNIGE), Switzerland, - together with colleagues at the University of Lausanne (UNIL) and American and Spanish scientists - have been working on an analytical method that combines observing deep-water sedimentary strata and measuring in them the isotopic ratio between heavy and light carbon.
They have discovered that the cycles that punctuate these sedimentary successions are not, as one might think, due solely to the erosion of mountains that surround the basin, but are more ascribable to sea level changes. This research, which you can read in the journal Geology, paves the way for new uses of isotopic methods in exploration geology.
The area south of the Pyrenees is particularly suitable for studying sedimentary layers. Rocks are exposed over large distances, allowing researchers to undertake direct observation. Turbidites can be seen here: large sediment deposits formed in the past by underwater avalanches consisting of sand and gravel.
The ups and downs of oceans regulate sedimentation cycles:
Measuring the variations in the isotopic ratio allowed the team to explore a possible link with the sea level. They found that the turbidite-rich intervals were associated with high 12C levels and almost always corresponded to periods when the sea level was low. It appears that sedimentary cycles are mainly driven by the rise and fall of the sea level, not by the episodic growth of mountains.
When the sea level is high, continental margins are flooded under a layer of shallow water. Rivers lose the gradient needed to carry their load any farther and deposit their sediments there, which is why so little material reaches the deep basins downstream. When the sea level is low, however, rivers erode their beds to lower the elevation of their mouths; they transfer their sediment directly to the continental slopes of the deep basins, creating avalanches of sand and gravel.
Consequently, if the variations of the sea level are known, it is possible to predict the presence of large sedimentary accumulations created by turbidites, which often contain large volumes of hydrocarbons, one of the holy grails of exploration geology.
Measuring stable carbon isotopes: a new indicator of reservoir rocks:
In addition, this measurement is relatively simple to perform and it provides accurate data - a real asset for science and mining companies. The study also highlights the importance of sea levels, which are a real metronome for the Earth's sedimentary history.
"Of course," concludes Honegger, "tectonic deformation and erosion are important factors in the formation of sedimentary layers; but they play a secondary role in the formation of turbidite accumulations, which are mainly linked to changes in the sea level".
Study of Ice-shelf Channel Formation in Antarctica
May 15, 2017: A team of scientists led by the ULB (Université Libre de Bruxelles), Belgium, and the Bavarian Academy of Sciences (Munich, Germany) has discovered an active hydrological system of water conduits and sediment ridges below the Antarctic ice sheet. Their study reveals that these subglacial features are five times bigger than those seen in today's deglaciated landscapes. 68)
The newly discovered, oversized sediment ridges actively shape the ice hundreds of kilometers downstream by carving deep incisions into the bottom of the ice. This is of interest for the stability of the floating ice shelves, as numerous studies show that ice-shelf thinning has major consequences for ice-sheet stability.
Subglacial conduits form under large ice sheets as part of their basal hydrological system. These tunnels have a typical diameter of several meters to tens of meters, and they funnel the subglacial meltwater towards the ocean. However, new geophysical observations by the Laboratoire de Glaciologie of the ULB show that these conduits widen considerably the closer they come to the ocean. A new mathematical model explains this widening by the vanishing overburden pressure at the location where the ice goes afloat on the ocean.
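One way to see why vanishing overburden pressure produces widening is the classic Röthlisberger-channel balance, sketched below under strongly simplified assumptions; the constants K and melt_opening are arbitrary illustrative values, not parameters from the authors' model:

```python
# In a Rothlisberger channel, melt opening M is balanced by viscous
# creep closure ~ K * S * N**3 (Glen's flow law, exponent n = 3),
# where S is the conduit cross-section and N the effective pressure
# (ice overburden minus water pressure). Steady state: S = M / (K * N**3).
def steady_state_area(melt_opening, effective_pressure, K=1e-24):
    """Steady-state conduit cross-section (illustrative units)."""
    return melt_opening / (K * effective_pressure**3)

# As the ice approaches flotation near the grounding line, N -> 0,
# creep closure collapses, and the conduit balloons.
for N in (1e6, 1e5, 1e4):   # effective pressure in Pa
    print(f"N = {N:8.0e} Pa -> S ~ {steady_state_area(1e-6, N):.2e}")
```

The cubic dependence on N is what makes the effect so dramatic: a tenfold drop in effective pressure lets the steady-state cross-section grow a thousandfold, consistent with the oversized conduits inferred near the grounding line.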
As the conduits widen, the outflow velocity of the subglacial water decreases, which leads to increased sediment deposition at the conduit's portal. Over thousands of years, this process builds up giant sediment ridges, comparable in height to the Eiffel tower, below the ice. Active sedimentation in subglacial water conduits seems to drive the formation of eskers, the elongated ridges of gravel commonly observed today in areas where former ice sheets have retreated. However, the remnants of today's eskers are considerably smaller than the ridges now discovered in Antarctica.
Ice-shelf channels are long curvilinear tracts of thin ice found on Antarctic ice shelves. Many of them originate near the grounding line, but their formation mechanisms remain poorly understood. The study team uses ice-penetrating radar data from the Roi Baudouin Ice Shelf, East Antarctica, to infer that the morphology of several ice-shelf channels is seeded upstream of the grounding line by large basal obstacles indenting the ice from below. The team interprets each obstacle as an esker ridge formed from sediments deposited by subglacial water conduits, and calculates that the eskers' size grows towards the grounding line where deposition rates are maximum. Relict features on the shelf indicate that these linked systems of subglacial conduits and ice-shelf channels have been changing over the past few centuries. Because ice-shelf channels are loci where intense melting occurs to thin an ice shelf, these findings expose a novel link between subglacial drainage, sedimentation and ice-shelf stability. 69)
Water beneath the Antarctic Ice Sheet promotes the formation of ice streams that rapidly slide over wet sediments and a lubricated base. Ice streams discharge the majority of Antarctic ice into floating ice shelves, which surround about 74% of the Antarctic perimeter. Ice shelves occupying embayments buttress the continental mass flux. The buttressing strength depends on the pattern of basal mass balance (i.e., the sum of melting and refreezing), which in turn influences ice-shelf geometry. Measurements show that basal melting is concentrated by ice-shelf channels, which are typically a few kilometers wide and extend for up to hundreds of kilometers along the shelf flow. Ice is thinnest along their central axes (sometimes thinner than half of the ice thickness), and basal melt rates are elevated at their onsets near the grounding line. Theory and satellite observations suggest that such 'subglacially sourced' ice-shelf channels are formed by buoyant meltwater plumes forced by basal meltwater exiting from subglacial conduits at the grounding line. Hitherto, no such conduits have been observed, presumably because they are too small to be detected with ice-penetrating radar.
The study team surveyed three hydrologically predicted subglacial water-outlet locations at the Roi Baudouin Ice Shelf in Dronning Maud Land, Antarctica, all with corresponding ice-shelf channels seawards (Sites A–C, Figure 41a,b). Airborne radar data collected upstream of the satellite-inferred grounding line show distinct radar reflectors situated several hundred meters above the adjacent ice-bed interface (reflectors A–C, Figure 42 c). Using additional ground-based radar data from 2016, the team examined the reflectors' geometry in order to deduce their identity and evaluate three different scenarios for ice-shelf channel formation.
Figure 41: Overview of the study area: (a) Location of airborne (2011) and ground-based (2016) radar profiles of the Roi Baudouin Ice Shelf, East Antarctica, with a Landsat image in the background. Grounding lines are marked for 1996, 2007 and 2016. The dashed white box delineates the area in b, where radar-profile locations are shown with TanDEM-X surface elevation (5 m contours), image credit: Study Team
Figure 42: Overview of the study area: (c) Airborne radar profile EuA-EuA' covering the grounded ice sheet. Internal reflection hyperbolas reaching hundreds of meters above the ice-bed interface are evident (reflectors A–C), and are aligned with ice-shelf channels located seawards (into page). Reflectors A and C are beneath surface ridges (image credit: Study Team)
Giant conduits that can sap the ice from below: The evolving sediment ridges leave scars at the bottom of the ice as the ice flows over them. These scars are transmitted to the floating ice shelves farther downstream, forming ice-shelf channels. Ice in these channels can be only half as thick as the surrounding shelf, making the channels weak spots when exposed to melting from the warmer ocean.
It was originally thought that ice-shelf channels are carved by melting due to the ocean only, but this seems only part of the story: "Our study shows that ice-shelf channels can already be initiated on land, and that the size of the channels significantly depends on sedimentation processes occurring over hundreds to thousands of years" indicates Reinhard Drews, lead author of the study.
The novel link between the subglacial hydrological system, sedimentation, and ice-shelf stability, offers new opportunities to unravel key processes beneath the Antarctic ice sheet, and also improves our ability to reconstruct the ice-sheet extent in the Northern Hemisphere during the last ice ages.
More information on this topic is provided in Ref. 69).
Glacial lakes grow in the Himalayas as well as the risks
• May 9, 2017: For people living around the Himalayas, the effects of global warming are anything but distant or abstract. As air temperatures have risen in the past half-century, glaciers have melted and retreated in these mountains. Between 1990 and 2015, Landsat satellites documented a significant increase in both the number and average size of glacial lakes throughout the range. 70) 71) 72)
- Expanding lakes mean greater risks for the people living in valleys downstream. Specifically, there is a greater risk of GLOFs (Glacial Lake Outburst Floods)—a type of flash flood that occurs when ice or sediment dams collapse beneath glacial lakes. Landslides, avalanches, earthquakes, and volcanic eruptions often trigger GLOFs.
- After analyzing hundreds of satellite images, a research team from the Chinese Academy of Sciences and UCLA (University of California, Los Angeles) concluded that the number of Himalayan glacial lakes increased from 4,459 in 1990 to 4,950 in 2015, with a total area gain of 56 km², or 14 percent.
- The degree of change varied by region. The size and number of lakes in the southern central Himalayas increased the most, particularly in Nepal, at elevations between 4,200 and 5,800 meters. In the map of Figure 43, regions where lakes expanded the most (20 percent or more) are shown with dark blue; regions where lakes grew only slightly (10 percent or less) are light blue. Lakes in the western Himalayas are generally more stable. Some glaciers in the Karakorum, for instance, are advancing. In contrast, rapid warming in the central Himalayas—as well as more soot being deposited on ice—may explain the rapid retreat of glaciers there.
- The researchers observed changes to lakes at both the terminus of glaciers (proglacial lakes) and on top of them (supraglacial lakes). Between 1990 and 2015, the number of proglacial lakes increased by 227; the number of supraglacial lakes rose by 144. About 81 percent of the expansion in lake area was caused by changes to proglacial lakes.
- The researchers also identified 118 proglacial lakes that pose a particularly high risk to people living downstream. These lakes—many of them in the central Himalayas near Kathmandu, Nepal, and in the eastern Himalayas near Thimphu, Bhutan—grew by more than 1 percent each year.
- One rapidly expanding proglacial lake, Cirenmaco, highlights the risks. It stands at the base of Amaciren Glacier in the Zhangzangbo Valley of Nepal, and it was the scene of outburst floods in 1964 and 1981. The flood in 1981 was particularly destructive, killing hundreds of people, knocking out a power plant, and destroying bridges and roads. As seen in the false-color Landsat images (Figure 44), Cirenmaco's size more than doubled between 1988 and 2015.
- While proglacial lakes generally grew steadily each year, the lakes emerging on top of the glaciers were small, short-lived, and fast-changing. Many supraglacial lakes are perched on debris-covered ice, meaning lake water can quickly drain deeper into the glacier when cracks emerge.
Figure 44: Left: Image of Lake Cirenmaco acquired with Landsat-5 on Oct. 12, 1988; Right: Image of the same region acquired with Landsat-8 on Oct. 7, 2015 (image credit: NASA Earth Observatory, images by Jesse Allen, using Landsat data from the USGS, caption by Adam Voiland)
- Nepal's Ngozumpa Glacier (Figure 45), which lies about 25 km west of Mount Everest, has seen a significant increase in the number and size of supraglacial lakes on its surface. The pair of Landsat images below shows the surface of the glacier in 1989 and 2015. Many of the lakes that existed in 1989 had drained by 2015, while many new lakes emerged in other areas.
- Researchers point to rising temperatures and melting glaciers as the primary cause for the increase in the size and number of lakes in the Himalayas. Rates of warming vary by region, but match the increases in total lake area. For instance, the team noted that between 1979 and 2014, temperatures rose by 5.9°C at Nyalam in the central Himalaya; 1.0°C at Shiquanhe in the western Himalaya; and 0.4°C at Yadong in the eastern Himalaya.
- "As the region continues to warm, it is urgent that scientists continue to monitor the most rapidly expanding glacial lakes with satellites because many are remote and very difficult to access," said Yong Nie, lead author of the study. "If we know which lakes pose the greatest risk, then authorities can take steps to develop early warning systems, drain specific high-risk lakes, and educate people on how to minimize their exposure to flash floods."
- Yongwei Sheng, a geography professor at UCLA and one of the study authors, added: "While we have reported on widespread lake expansion across the Himalayas, region-specific driving mechanisms, lake-specific risk assessments, as well as future change prediction all deserve further investigation."
Figure 45: Left: Landsat image of the Ngozumpa Glacier, acquired on Nov. 9, 1989; Right: Landsat-8 image of the same region acquired on Sept. 30, 2015 (image credit: NASA Earth Observatory, images by Jesse Allen, using Landsat data from the USGS, caption by Adam Voiland)
Regional Sea Level Scenarios for Coastal Risk Management
• May 2017: Sea level rise is occurring worldwide, but not at the same rate everywhere. Differences will also likely continue in the future, so decision-makers need local information to assess their community's vulnerability. These new scenarios integrate updated global sea level rise scenarios with regional factors, such as changes in land elevations and ocean circulation, that influence sea level regionally.
"The ocean is not rising like water would in a bathtub," said William Sweet, a NOAA oceanographer and lead author of the report detailing the scenarios. "For example, in some scenarios sea levels in the Pacific Northwest are expected to rise slower than the global average, but in the Northeast they are expected to rise faster. These scenarios will help communities better understand local trends and make decisions about adaptation that are best for them." 73) 74)
In the USA, the "Sea Level Rise and Coastal Flood Hazard Scenarios and Tools Interagency Task Force", jointly convened by the USGCRP (U.S. Global Change Research Program) and the NOC (National Ocean Council), began its work in August 2015. The Task Force has focused its efforts on three primary tasks: 1) updating scenarios of GMSL (Global Mean Sea Level) rise, 2) integrating the global scenarios with regional factors contributing to sea level change for the entire U.S. coastline, and 3) incorporating these regionally appropriate scenarios within coastal risk management tools and capabilities deployed by individual agencies in support of the needs of specific stakeholder groups and user communities.
Long-term sea level rise driven by global climate change presents clear and highly consequential risks to the United States over the coming decades and centuries. Today, millions of people in the United States already live in areas at risk of coastal flooding, with more moving to the coasts every year. Rising seas will dramatically increase the vulnerability of this growing population, along with critical infrastructure related to transportation, energy, trade, military readiness, and coastal ecosystems and the supporting services they provide.
GMSL (Global Mean Sea Level) has increased by about 21 cm to 24 cm since 1880, with about 8 cm occurring since 1993. In addition, the rate of GMSL rise since 1900 has been faster than during any comparable period over at least the last 2800 years. Scientists expect that GMSL will continue to rise throughout the 21st century and beyond, because of global warming that has already occurred and warming that is yet to occur due to the still-uncertain level of future emissions. GMSL rise is a certain impact of climate change; the questions are when, and how much, rather than if. There is also a long-term commitment (persistent trend); even if society sharply reduces emissions in the coming decades, sea level will most likely continue to rise for centuries.
While the long-term, upward shift in sea level is an underlying driver of changes to the nation's coasts, impacts are generally expressed through extreme water levels (short-period, lower-probability events both chronic and acute in nature) occurring against the background of this shifting baseline. Higher sea levels worsen the impacts of storm surge, high tides, and wave action, even absent any changes in storm frequency and intensity. Even the relatively small increases in sea level over the last several decades have led to greater storm impacts at many places along the U.S. coast. Similarly, the frequency of intermittent flooding associated with unusually high tides has increased rapidly (accelerating in many locations) in response to increases in RSL (Relative Sea Level), as shown in Figure 46. At some locations in the United States, the frequency of tidal flooding (events typically without a local storm present) has increased by an order of magnitude over the past several decades, turning it from a rare event into a recurrent and disruptive problem.

Significant, direct impacts of long-term RSL rise, including loss of life, damage to infrastructure and the built environment, permanent loss of land, ecological regime shifts in coastal wetlands and estuary systems, and water quality impairment, also occur when key thresholds in the coastal environment are crossed. Some of these impacts have the potential to 'feed back' and influence wave impacts and coastal flooding. For example, there is evidence that wave action and flooding of beaches and marshes can induce changes in coastal geomorphology, such as sediment build-up, that may iteratively modify the future flood risk profile of communities and ecosystems.
Figure 46: a) Multi-year empirical (smoothed) distributions for daily highest water levels in Norfolk, VA for the 1960s and 2010s, showing the extent to which local RSL rise has increased the flood probability relative to impact thresholds defined locally by the National Weather Service for minor (~0.5 m nuisance level), moderate (~0.9 m) and major (~1.2 m, the local level of Hurricane Sandy in 2012) impacts, relative to the MHHW (Mean Higher High Water) tidal datum of the National Tidal Datum Epoch (1983–2001). b) Due to RSL rise, annual flood frequencies (based upon 5-year averages) in Norfolk for recurrent nuisance tidal floods with minor impacts are accelerating, as shown by the quadratic trend fit (goodness of fit R² = 0.84). Flood rates are rapidly increasing in similar fashion in dozens of U.S. coastal cities (image credit: NOAA, USGS, EPA)
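The kind of quadratic (accelerating) trend fit described in the caption above can be sketched as follows; the flood counts here are hypothetical illustrative values, whereas the real analysis uses NOAA tide-gauge exceedance records for Norfolk:

```python
import numpy as np

# Hypothetical annual nuisance-flood counts (days/yr), one value per
# 5-year bin from 1960 to 2010, showing an accelerating pattern.
years = np.arange(1960, 2015, 5)
floods = np.array([1, 1, 2, 2, 3, 5, 6, 8, 10, 13, 16], dtype=float)

# Center the time axis for numerical stability, then fit a quadratic.
x = years - years.mean()
coeffs = np.polyfit(x, floods, deg=2)
fitted = np.polyval(coeffs, x)

# Goodness of fit (coefficient of determination, R^2)
ss_res = np.sum((floods - fitted) ** 2)
ss_tot = np.sum((floods - floods.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
print(f"quadratic coefficient = {coeffs[0]:.4f}, R^2 = {r2:.2f}")
```

A positive leading coefficient is the signature of acceleration: the year-on-year increase in flood days itself grows over time, which a linear trend cannot capture.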
In this context, there is a clear need—and a clear call from states and coastal communities (White House, 2014)—to support preparedness planning with consistent, accessible, authoritative and more locally appropriate knowledge, data, information, and tools about future changes in sea level and associated coastal risks. In response to this need, the White House Council on Climate Preparedness and Resilience in 2015 called for the establishment of the Federal Interagency Sea Level Rise and Coastal Flood Hazard Scenarios and Tools Task Force, a joint task force of the NOC (National Ocean Council) and the USGCRP (U.S. Global Change Research Program). The Task Force's charge is to develop and disseminate, through interagency coordination and collaboration, future RSL and associated coastal flood hazard scenarios and tools for the entire United States. These scenarios and tools are intended to serve as a starting point for on-the-ground coastal preparedness planning and risk management processes, including compliance with the new FFRMS (Federal Flood Risk Management Standard).
The Task Force is charged with leveraging the best available science; incorporating regional science and expertise where appropriate; building this information into user-friendly mapping, visualization, and analysis tools; and making it easily accessible through established Federal web portals. Part of the motivation for forming the Task Force was to bring together key efforts within individual agencies, such as the FEMA (Federal Emergency Management Agency), NOAA (National Oceanic and Atmospheric Administration), USACE (U.S. Army Corps of Engineers), USGS (U.S. Geological Survey), DoD (Department of Defense), EPA (Environmental Protection Agency) and NASA (National Aeronautics and Space Administration), that could serve as building blocks of an overall Federal system of sea level information and decision support, and to provide synthesis and coverage of the entire United States coastline.
• May 4, 2017: Thanks in large part to satellite measurements, scientists' skill in measuring how much sea levels are rising on a global scale - currently 3.4 mm per year - has improved dramatically over the past quarter century. But at the local level, it's been harder to estimate specific regional sea level changes 10 or 20 years away - the critical timeframe for regional planners and decision makers. 75)
That's because sea level changes for many reasons, on differing timescales, and is not the same from one place to the next. Developing more accurate regional forecasts of sea level rise will therefore have far-reaching benefits for the more than 30 percent of Americans who currently reside along the Pacific, Atlantic or Gulf Coasts of the contiguous United States.
New research published this week in the Journal of Climate reveals that one key measurement — large-scale upper-ocean temperature changes caused by natural cycles of the ocean — is a good indicator of regional coastal sea level changes on these decadal timescales. Such data may give planners and decision makers a new tool to identify key regions of U.S. coastlines that may be vulnerable to sea level changes on 10- to 20-year timescales. 76)
"Decision makers need a diverse set of tools with different informational needs," said lead author Veronica Nieves of UCLA and NASA's Jet Propulsion Laboratory in Pasadena, California. "Having a better understanding of the chances of local flood damage from rising seas in coastal areas is a key factor in being able to assess vulnerability, risk and adaptation options." Such tools could help planners decide whether a given part of a coastline would be better served by "soft" techniques, such as beach replenishment or preservation of wetlands, or by "hard" techniques, such as construction of sea walls or levees.
Nieves' team, which included participation from IMEDEA (Mediterranean Institute for Advanced Studies) in Esporles, Spain, set out to detect decadal sea level changes over large U.S. coastal ocean regions. They compared existing NOAA records of upper-ocean temperatures in coastal waters for each U.S. ocean coastline with records of actual sea level changes from 1955 to 2012, and data from U.S./European satellite altimeter missions since 1992. They identified those sea level changes that have a large impact at regional scales in many locations, including heavily populated cities. Sea level along the U.S. East Coast and West Coast can rise and fall by an inch or two (several centimeters) over the course of a decade or two because of fluctuations in upper ocean temperatures.
Their method was able to explain about 70 percent of regional sea level variability on decadal time scales for the West Coast, about 80 percent for the East Coast, and about 45 percent for the Gulf Coast. Along the Gulf Coast, the authors say other factors, such as tidal effects and the ongoing subsidence, or sinking, of the land, can play a more important role.
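The "percent of variability explained" figures above correspond to a squared correlation between an upper-ocean temperature index and the coastal sea level record. The sketch below illustrates that relationship with purely synthetic data (the series, coefficients, and noise level are invented for illustration; this is not the study's actual method or data):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical decadal-scale series (synthetic, for illustration only):
# an upper-ocean temperature index and a coastal sea level anomaly (cm)
# that partially tracks it, plus noise standing in for other drivers
# such as tides and land subsidence.
temp_index = rng.standard_normal(58)                 # e.g. one value per year, 1955-2012
sea_level = 2.0 * temp_index + rng.standard_normal(58)

# The fraction of sea level variance "explained" by the temperature index
# is the squared Pearson correlation coefficient.
r = np.corrcoef(temp_index, sea_level)[0, 1]
print(f"variance explained: {r**2:.0%}")
```

With a weaker link between the two series (as on the Gulf Coast, where subsidence plays a larger role), the same calculation yields a lower explained fraction.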
"Our study shows that large-scale upper-ocean temperature changes provide a good way to distinguish decade-long natural ocean signals from longer-term global warming signals," said Nieves. "This is important for regional planning, because it allows policymakers to identify places where climate change dominates the observed sea level rise and places where the climate change signal is masked by shorter-term regional variability caused by natural ocean climate cycles."
Nieves said an example is the U.S. West Coast, where the phase of a multi-decadal ocean climate pattern, called the Pacific Decadal Oscillation, has helped keep sea level rise lower during the past two decades. With the recent shift of this oscillation to its opposite phase, scientists expect sea level rise along the West Coast to accelerate in coming years.
"Scientists have worked hard to understand the really fast changes in sea level, such as storm surges, because they cause major damage, and the really slow changes because long-term sea level rise will shape the coastlines of the future," said study co-author Josh Willis of JPL. "But in between these fast and slow changes, there's a gap in our understanding. The results of our study help fill that gap."
Figure 47: Correlations in U.S. coastal sea level rise between the new sea level indicator tool and reconstructed decade-scale estimates of sea level (image credit: NASA/JPL-Caltech/UCLA/IMEDEA)
Sea Ice Extent Sinks to Record Lows at Both Poles
• March 22, 2017: Arctic sea ice appears to have reached on March 7 a record low wintertime maximum extent, according to scientists at NASA and the NASA-supported NSIDC (National Snow and Ice Data Center) in Boulder, Colorado. And on the opposite side of the planet, on March 3 sea ice around Antarctica hit its lowest extent ever recorded by satellites at the end of summer in the Southern Hemisphere, a surprising turn of events after decades of moderate sea ice expansion. 77) 78) 79)
Figure 48: On March 7, 2017, Arctic sea ice hit a record low wintertime maximum extent. At 14.4 million km2, it is the lowest maximum extent in the satellite record, and 1.18 million km2 below the 1981 to 2010 average maximum extent (image credit: NASA/GSFC Scientific Visualization Studio, L. Perkins)
On Feb. 13, 2017, the combined Arctic and Antarctic sea ice numbers were at their lowest point since satellites began to continuously measure sea ice in 1979. Total polar sea ice covered 16.21 million km2, which is 2 million km2 less than the average global minimum extent for 1981-2010 – the equivalent of having lost a chunk of sea ice larger than Mexico.
Figure 49: These line graphs plot monthly deviations and overall trends in polar sea ice from October 1978 to March 7, 2017 as measured by satellites. The top line shows the Arctic; the middle shows Antarctica; and the third shows the global, combined total. The graphs depict how much the sea ice concentration moved above or below the long-term average (they do not plot total sea ice concentration). Arctic and global sea ice totals have moved consistently downward over 38 years. Antarctic trends are more muddled, but they do not offset the great losses in the Arctic (image credit: Joshua Stevens, NASA Earth Observatory)
The ice floating on top of the Arctic Ocean and surrounding seas shrinks in a seasonal cycle from mid-March until mid-September. As the Arctic temperatures drop in the autumn and winter, the ice cover grows again until it reaches its yearly maximum extent, typically in March. The ring of sea ice around the Antarctic continent behaves in a similar manner, with the calendar flipped: it usually reaches its maximum in September and its minimum in February.
This winter, a combination of warmer-than-average temperatures, winds unfavorable to ice expansion, and a series of storms halted sea ice growth in the Arctic. This year's maximum extent, reached on March 7 at 14.42 million km2, is 96,000 km2 below the previous record low, which occurred in 2015, and 1.22 million km2 smaller than the average maximum extent for 1981-2010.
Figure 50: On March 3, 2017, the sea ice cover around the Antarctic continent shrunk to its lowest yearly minimum extent in the satellite record, in a dramatic shift after decades of moderate sea ice expansion (image credit: NASA/GSFC Scientific Visualization Studio, L. Perkins)
Figure 51: Alternate view of Arctic sea ice extent, acquired on March 7, 2017 (image credit: NASA Earth Observatory, image by Joshua Stevens using AMSR-2 sensor data on GCOM-W1)
"We started from a low September minimum extent," said Walt Meier, a sea ice scientist at NASA/GSFC in Greenbelt, Maryland. "There was a lot of open ocean water and we saw periods of very slow ice growth in late October and into November, because the water had a lot of accumulated heat that had to be dissipated before ice could grow. The ice formation got a late start and everything lagged behind – it was hard for the sea ice cover to catch up."
The Arctic's sea ice maximum extent has dropped by an average of 2.8 % per decade since 1979. The summertime minimum extent losses are nearly five times larger: 13.5% per decade. Besides shrinking in extent, the sea ice cap is also thinning and becoming more vulnerable to the action of ocean waters, winds and warmer temperatures.
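The quoted rates can be checked with back-of-envelope arithmetic. The sketch below applies the 2.8%-per-decade decline to the 1981-2010 average maximum extent (reconstructed from the figures given above: 14.42 million km2 in 2017, which is 1.22 million km2 below that average). This is a rough illustration, not NSIDC's actual trend computation, which fits a line to the full monthly record:

```python
# Baseline 1981-2010 average maximum, reconstructed from the text:
avg_max_1981_2010 = 14.42 + 1.22          # million km2
decades_since_1979 = (2017 - 1979) / 10

# A 2.8%-per-decade decline applied to that baseline implies roughly:
expected_loss = avg_max_1981_2010 * 0.028 * decades_since_1979
print(f"baseline max ~{avg_max_1981_2010:.2f} million km2, "
      f"implied loss ~{expected_loss:.2f} million km2 over "
      f"{decades_since_1979:.1f} decades")
```

The implied loss of roughly 1.5 million km2 since 1979 is consistent in magnitude with the 2017 maximum sitting about 1.2 million km2 below the 1981-2010 average.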
This year's record low sea ice maximum extent might not necessarily lead to a new record low summertime minimum extent, since weather has a great impact on the melt season's outcome, Meier said. "But it's guaranteed to be below normal."
In Antarctica, this year's record low annual sea ice minimum of 2.11 million km2 was 184,000 km2 below the previous lowest minimum extent in the satellite record, which occurred in 1997.
Antarctic sea ice saw an early maximum extent in 2016, followed by a very rapid loss of ice starting in early September. Since November, the daily Antarctic sea ice extent has continuously been at its lowest levels in the satellite record. The ice loss slowed down in February.
This year's record low happened just two years after several monthly record high sea ice extents in Antarctica and decades of moderate sea ice growth. "There's a lot of year-to-year variability in both Arctic and Antarctic sea ice, but overall, until last year, the trends in the Antarctic for every single month were toward more sea ice," said Claire Parkinson, a senior sea ice researcher at Goddard. "Last year was stunningly different, with prominent sea ice decreases in the Antarctic. To think that now the Antarctic sea ice extent is actually reaching a record minimum, that's definitely of interest."
"It is tempting to say that the record low we are seeing this year is global warming finally catching up with Antarctica," Meier said. "However, this might just be an extreme case of pushing the envelope of year-to-year variability. We'll need to have several more years of data to be able to say there has been a significant change in the trend."
NASA, NOAA data show 2016 warmest year on record globally
• January 18, 2017: Earth's 2016 surface temperatures were the warmest since modern record-keeping began in 1880, according to independent analyses by NASA and NOAA (National Oceanic and Atmospheric Administration). Globally-averaged temperatures in 2016 were 0.99 degrees Celsius warmer than the mid-20th century mean. This makes 2016 the third year in a row to set a new record for global average surface temperatures. 80)
The 2016 temperatures continue a long-term warming trend, according to analyses by scientists at NASA/GISS (Goddard Institute for Space Studies) in New York. NOAA scientists concur with the finding that 2016 was the warmest year on record based on separate, independent analyses of the data.
Figure 52: Global temperature anomalies averaged from 2012 through 2016 in degrees Celsius (image credit: NASA/GSFC Scientific Visualization Studio; data provided by Robert B. Schmunk, NASA/GSFC GISS)
Because weather station locations and measurement practices change over time, there are uncertainties in the interpretation of specific year-to-year global mean temperature differences. However, even taking this into account, NASA estimates 2016 was the warmest year with greater than 95 percent certainty. "2016 is remarkably the third record year in a row in this series," said GISS Director Gavin Schmidt. "We don't expect record years every year, but the ongoing long-term warming trend is clear." The planet's average surface temperature has risen about 1.1 degrees Celsius since the late 19th century, a change driven largely by increased carbon dioxide and other human-made emissions into the atmosphere.
Most of the warming occurred in the past 35 years, with 16 of the 17 warmest years on record occurring since 2001. Not only was 2016 the warmest year on record, but eight of the 12 months that make up the year – from January through September, with the exception of June – were the warmest on record for those respective months. October, November, and December of 2016 were the second warmest of those months on record – in all three cases, behind records set in 2015.
Phenomena such as El Niño or La Niña, which warm or cool the upper tropical Pacific Ocean and cause corresponding variations in global wind and weather patterns, contribute to short-term variations in global average temperature. A warming El Niño event was in effect for most of 2015 and the first third of 2016. Researchers estimate the direct impact of the natural El Niño warming in the tropical Pacific increased the annual global temperature anomaly for 2016 by 0.12 degrees Celsius.
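Putting the two figures quoted above together is a simple subtraction: removing the estimated El Niño contribution from the 2016 anomaly indicates what the year would have looked like without that natural boost.

```python
# Separating the quoted El Niño contribution from the 2016 anomaly
# (simple subtraction of the figures given in the text).
anomaly_2016 = 0.99      # °C above the mid-20th-century mean
el_nino_part = 0.12      # °C attributed to the 2015-16 El Niño

residual = anomaly_2016 - el_nino_part
print(f"2016 anomaly without the El Niño contribution: ~{residual:.2f} °C")
```

Even without the El Niño contribution, 2016 would still have been warmer than any year before 2015.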
Weather dynamics often affect regional temperatures, so not every region on Earth experienced record average temperatures last year. For example, both NASA and NOAA found the 2016 annual mean temperature for the contiguous 48 United States was the second warmest on record. In contrast, the Arctic experienced its warmest year ever, consistent with record low sea ice found in that region for most of the year.
Figure 53: The planet's long-term warming trend is seen in this chart of every year's annual temperature cycle from 1880 to the present, compared to the average temperature from 1880 to 2015. Record warm years are listed in the column on the right (image credit: NASA/Earth Observatory, Joshua Stevens)
NASA's analyses incorporate surface temperature measurements from 6,300 weather stations, ship- and buoy-based observations of sea surface temperatures, and temperature measurements from Antarctic research stations. These raw measurements are analyzed using an algorithm that considers the varied spacing of temperature stations around the globe and urban heating effects that could skew the conclusions. The result of these calculations is an estimate of the global average temperature difference from a baseline period of 1951 to 1980.
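The core of such an anomaly calculation can be sketched in a few lines. The code below is a minimal illustration, not the actual GISTEMP algorithm (which additionally handles uneven station spacing, sea surface data, and urban-heating adjustments): it subtracts a 1951-1980 baseline per latitude band and then area-weights by the cosine of latitude, so that each band counts in proportion to its surface area. All temperature values are invented.

```python
import numpy as np

# 5-degree latitude bands and their area weights.
lats = np.linspace(-87.5, 87.5, 36)
weights = np.cos(np.radians(lats))

# Hypothetical per-band temperatures (°C): a crude pole-to-equator
# gradient for the 1951-1980 baseline, and a current year running
# about 1 °C warmer with some band-to-band scatter.
rng = np.random.default_rng(1)
baseline = 15.0 - 0.3 * np.abs(lats)
current = baseline + 1.0 + 0.1 * rng.standard_normal(36)

# Global mean anomaly: area-weighted average of per-band differences.
anomaly = np.average(current - baseline, weights=weights)
print(f"global mean anomaly: {anomaly:+.2f} °C")
```

Weighting matters because a simple unweighted mean would overcount the polar bands, which cover far less area than equatorial ones.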
NOAA scientists used much of the same raw temperature data, but with a different baseline period, and different methods to analyze Earth's polar regions and global temperatures.
GISS is a laboratory within the Earth Sciences Division of NASA's Goddard Space Flight Center in Greenbelt, Maryland. The laboratory is affiliated with Columbia University's Earth Institute and School of Engineering and Applied Science in New York.
Changing rainfall patterns linked to water security in India
• January 11, 2017: Changing rainfall is the key factor driving changes in groundwater storage in India, according to a new study led by the IIT (Indian Institute of Technology) Gandhinagar published in the journal Nature Geoscience. The study shows that changing monsoon patterns - which are tied to higher temperatures in the Indian Ocean - are an even greater driver of change in groundwater storage than the pumping of groundwater for agriculture. 81) 82)
Agriculture in India relies heavily on groundwater for irrigation, particularly in the dry northern regions where precipitation is scarce. Groundwater withdrawals in the country have increased over tenfold since the 1950s, from 10-20 km3 per year in 1950, to 240-260 km3 per year in 2009. And satellite measurements have shown major declines in groundwater storage in some parts of the country, particularly in northern India.
"Groundwater plays a vital role in food and water security in India. Sustainable use of groundwater resources for irrigation is the key for future food grain production," says study leader Vimal Mishra, of the IIT Gandhinagar. "And with a fast-growing population, managing groundwater sustainably is going become even more important. The linkage between monsoon rainfall and groundwater can suggest ways to enhance groundwater recharge in India and especially in the regions where rainfall has been declining, such as the Indo-Gangetic Plain."
Groundwater acts like a bank for water storage, receiving deposits from surface water and precipitation, and withdrawals as people pump out water for drinking, industry, and irrigating fields. If withdrawals add up to more than the deposits, eventually the accounts could run dry, which could have disastrous consequences.
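The bank-account analogy can be made concrete with a toy storage balance. The sketch below is purely illustrative: the starting storage and yearly recharge values are invented, with only the withdrawal rate taken from the ~240 km3/year 2009 figure quoted above.

```python
# Toy groundwater "bank account": storage rises with monsoon recharge
# (deposits) and falls with pumping (withdrawals). Units: km3.
storage = 1000.0                                 # hypothetical starting storage
recharge_per_year = [250, 180, 240, 150, 200]    # varies with monsoon strength
withdrawal_per_year = 240                        # roughly the 2009 pumping rate

for recharge in recharge_per_year:
    storage += recharge - withdrawal_per_year
    print(f"recharge {recharge:3d} km3 -> storage {storage:7.1f} km3")
```

Years in which recharge falls short of pumping drain the account, which is why declining monsoon rainfall matters as much as rising withdrawals.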
"This study adds another dimension to the existing water management framework. We need to consider not just the withdrawals, but also the deposits in the system," says Yoshihide Wada, a study coauthor and the deputy director of the Water program at the IIASA (International Institute for Applied Systems Analysis) in Austria.
The issue of groundwater depletion has been a topic of much discussion in India, but most planning has focused on pumping, or the demand side, rather than the deposit side. By looking at water levels in wells around the country, the researchers could track groundwater replenishment following the monsoons. They found that in fact, variability in the monsoons is the key factor driving the changing groundwater storage levels across the country, even as withdrawals increase.
In addition, the researchers found that the monsoon precipitation is correlated with Indian Ocean temperature, a finding which could potentially help to improve precipitation forecasts and aid in water resource planning.
"Weather is uncertain by nature, and the impacts of climate change are extremely difficult to predict at a regional level," says Wada "But our research suggests that we must focus more attention on this side of the equation if we want to sustainably manage water resources for the future."
Figure 54: Changing precipitation in India in the period 1980-2013 (image credit: IIT)
Ozone Hole 2016, and a Historic Climate Agreement
• October 27, 2016: The size and depth of the ozone hole over Antarctica were not remarkable in 2016. As expected, ozone levels have stabilized, but full recovery is still decades away. What is remarkable is that the same international agreement that successfully put the ozone layer on the road to recovery is now being used to address climate change. 83)
The stratospheric ozone layer protects life on Earth by absorbing ultraviolet light, which damages DNA in plants and animals (including humans) and leads to health issues like skin cancer. Prior to 1979, scientists had never observed ozone concentrations below 220 Dobson Units. But in the early 1980s, through a combination of ground-based and satellite measurements, scientists began to realize that Earth's natural sunscreen was thinning dramatically over the South Pole. This large, thin spot in the ozone layer each southern spring came to be known as the ozone hole.
The image of Figure 55 shows the Antarctic ozone hole on October 1, 2016, as observed by the OMI (Ozone Monitoring Instrument) on NASA's Aura satellite. On that day, the ozone layer reached its annual minimum concentration, which measured 114 Dobson Units. For comparison, the ozone layer in 2015 reached a minimum of 101 Dobson Units. During the 1960s, long before the Antarctic ozone hole occurred, average ozone concentrations above the South Pole ranged from 260 to 320 Dobson Units.
The area of the ozone hole in 2016 peaked on September 28, 2016, at about 23 million km2. "This year we saw an ozone hole that was just below average size," said Paul Newman, ozone expert and chief scientist for Earth Science at NASA's Goddard Space Flight Center. "What we're seeing is consistent with our expectation and our understanding of ozone depletion chemistry and stratospheric weather."
The image of Figure 56 was acquired on October 2 by the OMPS (Ozone Mapping Profiler Suite) instrumentation during a single orbit of the Suomi-NPP satellite. It reveals the density of ozone at various altitudes, with dark orange areas having more ozone and light orange areas having less. Notice that the word hole isn't literal; ozone is still present over Antarctica, but it is thinner and less dense in some areas.
Figure 56: An edge-on (limb) view of Earth's ozone layer, acquired with OMPS on the Suomi-NPP on October 2, 2016 (image credit: NASA Earth Observatory, image by Jesse Allen, using Suomi-NPP OMPS data)
In 2014, an assessment by 282 scientists from 36 countries found that the ozone layer is on track for recovery within the next few decades. Ozone-depleting chemicals such as chlorofluorocarbons (CFCs)—which were once used for refrigerants, aerosol spray cans, insulation foam, and fire suppression—were phased out years ago. The existing CFCs in the stratosphere will take many years to decay, but if nations continue to follow the guidelines of the Montreal Protocol, global ozone levels should recover to 1980 levels by 2050 and the ozone hole over Antarctica should recover by 2070.
The replacement of CFCs with hydrofluorocarbons (HFCs) during the past decade has saved the ozone layer but created a new problem for climate change. HFCs are potent greenhouse gases, and their use — particularly in refrigeration and air conditioning — has been quickly increasing around the world. The HFC problem was recently on the agenda at a United Nations meeting in Kigali, Rwanda. On October 15, 2016, a new amendment greatly expanded the Montreal Protocol by targeting HFCs, the so-called "grandchildren" of the Montreal Protocol.
"The Montreal Protocol is written so that we can control ozone-depleting substances and their replacements," said Paul Newman, who participated in the meeting in Kigali. "This agreement is a huge step forward because it is essentially the first real climate mitigation treaty that has bite to it. It has strict obligations for bringing down HFCs, and is forcing scientists and engineers to look for alternatives."