EO-Topics3 (Time frame: 2018)
World of Change: The shrinking Aral Sea
• December 28, 2019: The Aral Sea was once the fourth-largest lake in the world, with a surface area of 68,000 km2 and a salinity of 10 grams of salt per liter. It was fed by two rivers, the Amu Darya and the Syr Darya, reaching the Sea from the south and the north, respectively (Figure 1). 1)
Figure 1: The Aral Sea is situated in Central Asia, between the southern part of Kazakhstan and northern Uzbekistan. It is fed by two rivers, the Amu Darya and the Syr Darya, reaching the Sea from the south and the north, respectively. This image portrays the size of the Aral Sea in the 1950s, prior to any irrigation projects.
Figure 2: The Aral Sea was once the fourth-largest lake in the world. But in the 1960s, the Soviet Union diverted two major rivers to irrigate farmland, cutting off the inland sea from its source. The Aral Sea has been slowly disappearing ever since. These images show how the Aral Sea and its surrounding landscape have changed over the past few decades (video link: images by Jesse Allen, Lauren Dauphin, Robert Simmon, and Joshua Stevens) 2) 3)
In the 1960s, the Soviet Union undertook a major water diversion project on the arid plains of Kazakhstan, Uzbekistan, and Turkmenistan. The region's two major rivers, fed by snowmelt and precipitation in faraway mountains, were used to transform the desert into farms for cotton and other crops. Before the project, the Syr Darya and the Amu Darya rivers flowed down from the mountains, cut northwest through the Kyzylkum Desert, and finally pooled together in the lowest part of the basin. The lake they made, the Aral Sea, was once the fourth largest in the world.
Although irrigation made the desert bloom, it devastated the Aral Sea. This series of images from the MODIS (Moderate Resolution Imaging Spectroradiometer) instrument on NASA's Terra satellite documents the changes. At the start of the series in 2000, the lake was already a fraction of its 1960 extent (yellow line of Figure 3). The North Aral Sea (sometimes called the Small Aral Sea) had separated from the South (Large) Aral Sea.
The South Aral Sea had split into eastern and western lobes that remained tenuously connected at both ends. By 2001, the southern connection had been severed, and the shallower eastern part retreated rapidly over the next several years. Especially large retreats in the eastern lobe of the South Aral Sea appear to have occurred between 2005 and 2009, when drought limited and then cut off the flow of the Amu Darya. Water levels then fluctuated annually between 2009 and 2018 in alternately dry and wet years. In 2014, the eastern lobe of the South Aral Sea completely disappeared. Water levels in summer 2018 were not as low as they might have been, following a round of seasonal snowmelt in the spring.
As the Aral Sea has dried up, fisheries and the communities that depended on them collapsed. The increasingly salty water became polluted with fertilizer and pesticides. The blowing dust from the exposed lakebed, contaminated with agricultural chemicals, became a public health hazard. The salty dust blew off the lakebed and settled onto fields, degrading the soil. Croplands had to be flushed with larger and larger volumes of river water. The loss of the moderating influence of such a large body of water made winters colder and summers hotter and drier.
In a last-ditch effort to save some of the lake, Kazakhstan built a dam between the northern and southern parts of the Aral Sea. The Kok-Aral dike and dam, finished in 2005, separates the two water bodies and prevents flow out of the North Aral into the lower-elevation South Aral. The dam has led fisheries in the North Aral to rebound, even as it has limited flow into the South Aral. Between 2005 and 2006, water levels in the North Aral rebounded significantly and very small increases are visible throughout the rest of the time period. The differences in water color are due to changes in sediments and water depth.
The lasting implication of this human alteration of the environment is that certain characteristics of the region, from its geography to its population growth, account for dramatic consequences since the canals were dug. Those consequences range from unexpected climate feedbacks to public health issues, affecting the lives of millions of people in and beyond the region.
Global warming did not pause as researchers disentangle hiatus confusion
• December 19, 2018: The reality of ongoing climate warming might seem plainly obvious today, after the four warmest years on record and a summer of weather extremes in the whole northern hemisphere. A few years back however, some media and some experts were entangled in debates about an alleged pause in global warming - even though there never has been statistical evidence of any "hiatus", as new research now confirms. 4)
In two recent studies, a group of international scientists joined forces to thoroughly disentangle any possible "hiatus" confusion, affirming that there was no evidence for a significant pause or even slowdown of global warming in the first place. 5)
"Claims of a presumed slowdown or pause in global warming during the first decade of the 21st century and an alleged divergence between projections from climate models and observations have attracted considerable research attention, even though the Earth's climate has long been known to fluctuate on a range of temporal scales," says James S. Risbey from CSIRO (Commonwealth Science and Industrial Research Organization) in Australia, lead author of one of the new studies.
"However, our findings show there is little or no statistical evidence for a pause in global warming. Neither current nor historical data support it. The alleged pause in global warming was at no time statistically conspicuous or significant, but fully in line with the usual fluctuations", explains Stefan Rahmstorf from the Potsdam Institute for Climate Impact Research, a co-author to both studies.
"The results of our rigorous investigation in both studies are as simple as they are unambiguous: There was no pause in global warming. And global warming did not fall short of what climate models predicted. Warming continued as predicted, together with the normal short-term variability. There has been no unusual slowing of warming, as our comprehensive data analysis shows."
There was no pause in global warming: Published in Environmental Research Letters, the first new study analyzes variations in global surface temperature in historical context, while the second compares model projections to observations.
They scrutinized all available global temperature data sets in all available earlier and current versions and for all alleged time periods of a "hiatus", looking for statistical significance. In no data set and for no time period could a significant pause or slowing of global warming be detected, nor any discrepancy to climate models.
Statements claiming the contrary were based on premature conclusions, in some cases without considering statistics at all, in others because the statistical analyses were faulty.
A common problem for instance was the so-called selection bias. Simple significance tests generally only apply to randomly drawn samples. But when a particular time interval is chosen out of many possibilities specifically because of its small trend, then this is not a random sample.
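This selection effect can be illustrated with a small simulation: even when the true warming trend is constant, scanning a long noisy record for its flattest decade reliably turns up intervals that look like a "pause". A minimal sketch with synthetic data and illustrative parameters only:

```python
import numpy as np

rng = np.random.default_rng(0)

def flattest_decade_trend(years=50, true_trend=0.02, noise=0.15):
    """Simulate an annual temperature series with a steady warming trend
    plus noise, then return the smallest 10-year trend found by scanning
    every possible decade -- i.e. the cherry-picked, non-random sample."""
    t = np.arange(years)
    series = true_trend * t + rng.normal(0, noise, years)
    trends = []
    for start in range(years - 10 + 1):
        slope = np.polyfit(t[start:start + 10], series[start:start + 10], 1)[0]
        trends.append(slope)
    return min(trends)

# Across many simulated worlds, the flattest decade usually shows less
# than half the true warming rate -- despite a constant underlying trend.
picked = [flattest_decade_trend() for _ in range(500)]
share_flat = float(np.mean(np.array(picked) < 0.01))
print(f"fraction of cherry-picked decades with trend < half the true rate: {share_flat:.2f}")
```

A standard significance test applied to such a hand-picked interval treats it as if it were randomly drawn, which is exactly the selection bias the studies describe.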
"Very few articles on the 'pause' account for or even mention this effect, yet it has profound implications for the interpretation of the statistical results," explains Stephan Lewandowsky from University of Bristol in the UK, lead author of the second study.
Reduced momentum for action to prevent climate change: One reason for the attention that the alleged "global warming pause" received in the public may have been that interest groups used this idea to argue against the urgency of ambitious climate policies to reduce CO2-emissions from the burning of fossil fuels. This in turn may have contributed to delays in action to halt global warming, the scientists argue.
"A final point to consider is why scientists put such emphasis on the 'pause' when the evidence for it was so scant. An explanation lies in the constant public and political pressure from climate contrarians", adds Naomi Oreskes from Harvard University in the USA and co-author of the second study. "This may have caused scientists to take positions they would not have done without such opposition."
Studying climate threats with Sentinel
• December 14, 2018: Recent sea-level trends mean the low-lying Camargue region of southern France could be submerged by the sea by the end of the century. The sea walls built along the coast in the 1980s have already been breached by the waves, as a combination of rising waters, slowly sinking landmass, and reduced amounts of sediment from the Rhone river spell trouble for this environment. Anis Guelmami uses Copernicus Sentinel satellites to study wetlands like the Camargue, and he tells us the latest news from space. 6)
Figure 4: Studying climate threats with Sentinel (video credit: ESA/Euronews)
Snow over Antarctica Buffered Sea Level Rise during Last Century
• December 13, 2018: A new NASA-led study has determined that an increase in snowfall accumulation over Antarctica during the 20th century mitigated sea level rise by 0.4 inches (1 cm). However, Antarctica's additional ice mass gained from snowfall makes up for just about a third of its current ice loss. 7)
Figure 5: A new NASA-led study has determined that an increase in snowfall accumulation over Antarctica during the 20th century mitigated sea level rise by 1.0 cm. However, Antarctica's additional ice mass gained from snowfall only makes up for about a third of its current ice loss. These findings don't necessarily mean that Antarctica is growing; it's still losing mass, even with the extra snowfall. However, without these gains, the planet would have experienced even more sea level rise in the 20th century. The polar ice sheets grow via snow accumulation and shrink through melting and the production of icebergs. Presently, both ice sheets are imbalanced –losing more ice annually than they are gaining– and their ice loss is estimated to be currently causing about a half of the observed sea level rise (video credit: NASA Goddard/ L. K. Ward)
"Our findings don't mean that Antarctica is growing; it's still losing mass, even with the extra snowfall," said Brooke Medley, a glaciologist with NASA Goddard Space Flight Center in Greenbelt, Maryland, and lead author of the study, which was published in Nature Climate Change on 10 December. "What it means, however, is that without these gains, we would have experienced even more sea level rise in the 20th century." 8)
The polar ice sheets grow via snow accumulation and shrink through melting and the production of icebergs. Presently, both ice sheets are imbalanced –losing more ice annually than they are gaining– and their ice loss is estimated to be currently causing about a half of the observed sea level rise. Sea level adjusts to changes in snowfall, which modulates how much water is locked into the ice sheets.
Snowfall is very difficult to measure over Antarctica. For starters, there are very few weather stations on the frozen continent, and most of them are installed along the coastline. Secondly, satellites have a hard time measuring snow from space – they basically confuse the snow that's falling with the snow that's already on the ground. Climate models struggle to replicate the total amount of snow that falls over Antarctica each year. So scientists often have to rely on ice cores, cylinders of ice drilled from the ice sheet whose layers store a trove of information; among it, how much snow fell in a certain year or decade. But drilling ice cores is logistically challenging, so they are sparse and do not cover the entire continent.
Medley and her colleague, British Antarctic Survey's Elizabeth Thomas, reconstructed how much snow fell over the entire Antarctic continent and nearby islands from 1801 to 2000 using 53 ice cores and three atmospheric reanalyses –climate models informed by satellite observations. Ice cores are only point measurements of snow accumulation, but by comparing them to the reanalyses' simulations of Antarctic snowfall across the ice sheet, the researchers were able to determine the area of Antarctica each ice core was representative of.
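One way to picture the point-to-region step described above is to correlate a single point record with every grid cell of a field and treat the high-correlation cells as the area the core represents. This is a simplified illustration on synthetic data, not the authors' exact method:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for a reanalysis snowfall field: a 20x20 grid over
# 100 years. Cells near the "core site" share the core's interannual signal.
ny, nx, nyears = 20, 20, 100
core_iy, core_ix = 10, 10
signal = rng.normal(0, 1, nyears)            # interannual snowfall anomaly

yy, xx = np.mgrid[0:ny, 0:nx]
dist = np.hypot(yy - core_iy, xx - core_ix)
weight = np.exp(-dist / 4.0)                 # signal decays with distance
field = weight[..., None] * signal + rng.normal(0, 0.5, (ny, nx, nyears))

core_series = field[core_iy, core_ix]        # the "ice core" record

# Correlate the core with every grid cell; cells above a threshold form
# the footprint the point measurement is taken to represent.
anom = field - field.mean(axis=-1, keepdims=True)
core_anom = core_series - core_series.mean()
corr = (anom * core_anom).sum(-1) / (
    np.sqrt((anom ** 2).sum(-1)) * np.sqrt((core_anom ** 2).sum()))
footprint = corr > 0.5
print(f"grid cells represented by this core: {footprint.sum()} of {ny * nx}")
```

With 53 cores, each footprint covers part of the continent, which is how a sparse network can still yield near-continental coverage.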
The scientists found that the distribution of ice cores gave a good coverage of most of Antarctica, with some gaps in portions of East Antarctica due to the fact that this area of the continent sees extremely little snowfall, making it difficult to measure.
"Antarctica is bigger than the contiguous United States. You wouldn't say that because you're in New York City and it's snowing, it must mean that it's also snowing in San Diego. It's the same with Antarctica; you can't just stand in one spot, take one measurement and say ‘okay, I think I have a good handle on all of Antarctica.' It requires a lot of measurements," Medley said.
"From the ice cores we know that the current rate of change in snowfall is unusual in the context of the past 200 years," Thomas said.
The researchers also investigated what caused the increase of snowfall and its distribution pattern over the ice sheet from 1901 to 2000. They found that it was consistent with a warming atmosphere, which holds more moisture, combined with changes in the Antarctic circumpolar westerly winds that are related to the ozone hole. A related paper published in Geophysical Research Letters on 10 December confirms the relationship between stratospheric ozone depletion and increased snowfall over Antarctica.
"The fact that changes in westerly winds due to ozone depletion play a role in Antarctic snow accumulation variability indicates that even this remote, uninhabited land has been affected by human activity," Medley said.
"The increased snowfall is a symptom of the same changes in atmospheric circulation that are causing the melt of Antarctic ice," Thomas said.
"Snowfall plays a critical role in Antarctic mass balance and it will continue to do so in the future," Medley said. "Currently it is helping mitigate ice losses, but it's not entirely compensating for them. We expect snowfall will continue to increase into the 21st century and beyond, but our results show that future increases in snowfall cannot keep pace with oceanic-driven ice losses in Antarctica."
Medley hopes that their results will also help evaluate existing climate models so that ice sheet modelers can pick the most reliable ones to use in their predictions of how the Antarctic ice sheet will behave in the future.
Study Finds Asian Glaciers Slowed by Ice Loss
• December 13, 2018: A NASA-led, international study finds Asia's high mountain glaciers are flowing more slowly in response to widespread ice loss, affecting freshwater availability downstream in India, Pakistan and China. Researchers analyzed almost 2 million satellite images of the glaciers and found that 94 percent of the differences in flow rates could be explained by changes in ice thickness. 9)
Figure 6: Glaciers in the Karakoram Range of Pakistan, one of the mountain regions studied in the new research (image credit: Université Grenoble Alpes/IRD/Patrick Wagnon)
For more than a decade, satellite data have documented that the glaciers were thinning as the melt rates on their top surfaces increased. However, "It has not been entirely clear how these glaciers are responding to this ice loss," said the lead author of the new study, Amaury Dehecq of NASA's Jet Propulsion Laboratory in Pasadena, California. "The rate at which they will disappear in the future depends on how they adjust to a warming climate."
Figure 7: Animation of satellite images revealing the flow of the Baltoro Glacier in the Karakoram Range, Pakistan (image credit: NASA/EO/Joshua Stevens)
Asia's mountain glaciers flow from the cold heights of the world's tallest mountains down to warmer climate zones, where they melt much faster, feeding major rivers such as the Indus and Yangtze. Scientists need to understand what is regulating the glaciers' flow speeds to project how glacial meltwater will contribute to the region's water resources and to sea level rise. Observing the glaciers from ground level is difficult because of their huge geographic expanse and inaccessibility, so the researchers turned to satellite images.
Dehecq and his colleagues developed algorithms to analyze almost 2 million pairs of U.S. Geological Survey/NASA Landsat satellite images from 1985 to 2017. The algorithms enabled automatic feature tracking to measure the distance that distinctive spots on the glaciers, such as crevasses or patches of dirt, traveled between an earlier and a later image. "We do this millions of times and average through the noise (errors and random disturbances) to see changes in velocity on the order of 1 meter a year," said study coauthor Alex Gardner of JPL.
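The feature-tracking idea can be sketched with a toy example: cross-correlate two images and read the displacement off the correlation peak. The 30 m pixel size and one-year interval below are assumed, Landsat-like values for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

# Two synthetic "satellite images": the second is the first shifted by a
# known glacier displacement (in pixels). Real inputs would be Landsat
# patches acquired months or years apart.
img1 = rng.normal(size=(64, 64))
true_dy, true_dx = 3, 5
img2 = np.roll(img1, (true_dy, true_dx), axis=(0, 1))

# Cross-correlate via FFT; the correlation peak marks how far surface
# features (crevasses, dirt patches) moved between the two acquisitions.
f1 = np.fft.fft2(img1)
f2 = np.fft.fft2(img2)
xcorr = np.fft.ifft2(np.conj(f1) * f2).real
dy, dx = np.unravel_index(np.argmax(xcorr), xcorr.shape)

# Convert the pixel offset to a flow speed (assumed scale and time gap).
pixel_size_m = 30.0
dt_years = 1.0
speed = float(np.hypot(dy, dx)) * pixel_size_m / dt_years
print(f"recovered offset: ({dy}, {dx}) px -> {speed:.0f} m/yr")
```

Repeating this for millions of image pairs and averaging, as the study did, beats down the noise of any single match to velocity changes of about a meter per year.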
"What's surprising about this study is that the relationship between thinning and flow speed is so consistent," said coauthor Noel Gourmelen of the University of Edinburgh in Scotland. In the few locations where glaciers have been stable or thickening rather than thinning, the study found that flow speeds also have been increasing slightly.
The reason a glacier flows down a slope at all is that gravity pulls on its mass. The pull makes a glacier both slide on its base and deform, or "creep" - a slow movement caused by ice crystals slipping past one another under the pressure of the glacier's weight. As the glacier thins and loses mass, both sliding and creeping become more difficult, and the glacier's flow slows as a result.
However, other factors also affect a glacier's rate of flow, such as whether water is lubricating the glacier's base so that it can slide more easily. Scientists were unsure of the relative importance of these different factors. The new study shows that ice thickness far outweighs any other factor in regulating flow speed over the long term.
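The strength of the thickness effect follows from textbook glacier physics (not from the study itself): for ice deforming under its own weight, Glen's flow law with exponent n of about 3 makes the deformation velocity scale roughly as thickness to the fourth power, so modest thinning produces a large slowdown:

```python
# Glen's flow law scaling sketch: deformation velocity u ~ H**(n + 1),
# with flow-law exponent n ~= 3 (a standard textbook value).
N_GLEN = 3

def relative_speed(h_ratio, n=N_GLEN):
    """Deformation speed relative to the original, given the thickness
    ratio H_new / H_old."""
    return h_ratio ** (n + 1)

# A 20% thinning cuts deformation speed by more than half:
print(f"relative speed after 20% thinning: {relative_speed(0.8):.2f}")
```

This steep sensitivity is consistent with thinning alone explaining most of the observed variation in flow rates.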
The study published this week in Nature Geoscience is titled "Twenty-first Century Glacier Slowdown Driven by Mass Loss in High Mountain Asia." Coauthors are from JPL; the Université Savoie Mont-Blanc in Annecy, France; the University of Edinburgh in Scotland; the Université de Strasbourg in France; the Université Grenoble Alpes in Grenoble, France; and the Université de Toulouse in France. Caltech in Pasadena, California, manages JPL for NASA. 10)
Greenland ice loss quickening
• December 6, 2018: Using a 25-year record of ESA satellite data, recent research shows that the pace at which Greenland is losing ice is getting faster. The research, published in Earth and Planetary Science Letters, uses radar altimetry data gathered by the ERS, Envisat and CryoSat missions between 1992 and 2016. 11) 12)
- Radar altimeters record the height of the surface topography along the satellite's ground track. They precisely measure the height of ice, water and land by timing the interval between the transmission and reception of very short radar pulses.
- Over time, these measurements are used, for example, to work out how the height – or elevation – of huge ice sheets is changing, which, in turn, can be used to monitor ice loss.
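The measurement principle above reduces to simple arithmetic: the one-way range is half the two-way travel time multiplied by the speed of light, and surface height is the satellite's altitude minus that range. A minimal sketch (the orbit altitude is an assumed round number, not a mission specification):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def surface_height(satellite_altitude_m, two_way_time_s):
    """Height of the reflecting surface above the reference ellipsoid:
    satellite altitude minus the one-way radar range c * t / 2."""
    range_m = C * two_way_time_s / 2.0
    return satellite_altitude_m - range_m

# Example with a CryoSat-like ~717 km orbit (assumed value): a slightly
# shorter echo delay on a repeat pass means the ice surface has risen.
alt = 717_000.0
t_two_way = 0.004781  # seconds
print(f"surface height: {surface_height(alt, t_two_way):.1f} m")
```

Differencing such heights between repeat passes over years is what turns raw travel times into the elevation-change maps discussed here.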
- Although the research team, working through ESA's Climate Change Initiative, found only modest elevation changes in the early 1990s, the pace of thinning is clear in the satellite observations from 2003 onwards.
- "A pattern of thinning appears to dominate a large fraction of the ice sheet margins at the beginning of the millennium, with individual outlet glaciers exhibiting large thinning rates," says Louise Sandberg Sørensen, the paper's lead author.
- "Over the full 25-year period, the general picture shows much larger volume losses are experienced in west, northwest and southeast basins of Greenland compared to the more steady-state situations in the colder far north."
- This, according to Dr Sørensen, highlights the strong climate sensitivity of the outlet glaciers of Greenland as well as the ongoing need for reliable, long-term monitoring of climate variables that help to improve climate models and inform policy responses.
- The Greenland ice sheet is an important cog in the global climate system, with its meltwater, for example, influencing ocean circulation in the North Atlantic. Ongoing monitoring of the ice sheet is equally important for understanding its contribution to the extent and changing rate of sea-level rise.
- The more recent Copernicus Sentinel-3 mission is also being used to monitor changing ice height.
- ESA's Climate Change Initiative is a research program that uses four decades of Earth observation archives established by ESA and its Member States to support the climate information requirements of the United Nations Framework Convention on Climate Change.
- In addition to the Greenland ice sheet, the program is developing long-term, consistent data products based on satellite derived observations for a further 22 essential climate variables required by the international science community to understand the Earth system.
Figure 8: Greenland ice change, 2015: The image shows Greenland ice-sheet annual elevation change in 2015, but using a 25-year record of ESA satellite data, recent research shows that the pace at which Greenland is losing ice is getting faster. The research, published in Earth and Planetary Science Letters, uses radar altimetry data gathered by the ERS, Envisat and CryoSat missions between 1992 and 2016 (image credit: ESA/Planetary Visions)
Figure 9: Greenland ice height from Sentinel-3B. Launched on 25 April 2018, the Sentinel-3B satellite has already delivered impressive first images from its ocean and land color instrument and from its radiometer. It has now also delivered data from its altimeter – which means that all of the instruments are working well (image credit: ESA, the image contains modified Copernicus Sentinel data (2018), processed by ESA)
Greenhouse gas ‘detergent' recycles itself in atmosphere: NASA Study
• December 3, 2018: A simple molecule in the atmosphere that acts as a "detergent" to break down methane and other greenhouse gases has been found to recycle itself to maintain a steady global presence in the face of rising emissions, according to new NASA research. Understanding its role in the atmosphere is critical for determining the lifetime of methane, a powerful contributor to climate change. 13)
Figure 10: Model output of OH primary production over a 24-hour period in July tracks with sunlight across the globe. Higher levels of OH over populated land are likely from OH recycling in the presence of NO and NO2, which are common pollutants from cars and industry (image credit: NASA / Julie Nicely)
- The hydroxyl (OH) radical, a molecule made up of one hydrogen atom and one oxygen atom with a free (or unpaired) electron, is one of the most reactive gases in the atmosphere and regularly breaks down other gases, effectively ending their lifetimes. In this way, OH is the main check on the concentration of methane, a potent greenhouse gas that is second only to carbon dioxide in contributing to increasing global temperatures.
- As methane emissions into the atmosphere rise, scientists have historically expected the supply of hydroxyl radicals to be used up on the global scale, which would, as a result, extend methane's lifetime, currently estimated to be nine years. However, in addition to looking globally at primary sources of OH and at the amount of methane and other gases it breaks down, this new research takes into account secondary OH sources: recycling, previously observed on regional scales, in which OH reforms in the presence of other gases after breaking down methane.
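The nine-year lifetime quoted above can be roughly sanity-checked from a first-order loss rate, lifetime = 1 / (k x [OH]), using representative literature values (the rate-constant fit, effective temperature and mean OH abundance below are typical textbook numbers, not taken from this study):

```python
import math

# Back-of-envelope methane lifetime against OH oxidation.
T = 272.0                                # effective tropospheric temperature, K (assumed)
k = 2.45e-12 * math.exp(-1775.0 / T)     # CH4 + OH rate constant, cm^3/s (JPL-style fit)
oh = 1.0e6                               # global-mean OH, molecules/cm^3 (typical estimate)

lifetime_s = 1.0 / (k * oh)
lifetime_yr = lifetime_s / (3600 * 24 * 365)
print(f"methane lifetime vs OH: ~{lifetime_yr:.0f} years")
```

The result lands near nine years, which shows why even small sustained changes in OH would matter for methane's atmospheric burden.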
- "OH concentrations are pretty stable over time," said atmospheric chemist and lead author Julie Nicely at NASA's Goddard Space Flight Center in Greenbelt, Maryland. "When OH reacts with methane, it doesn't necessarily go away in the presence of other gases, especially nitrogen oxides (NO and NO2). The breakdown products of its reaction with methane react with NO or NO2 to reform OH. So OH can recycle back into the atmosphere."
- Nitrogen oxides are one set of several gases that contribute to recycling OH back into the atmosphere, according to Nicely's research, published in the Journal of Geophysical Research: Atmospheres. She and her colleagues used a computer model informed by satellite observations of various gases from 1980 to 2015 to simulate the possible sources for OH in the atmosphere. These include reactions with the aforementioned nitrogen oxides, water vapor and ozone. They also tested an unusual potential source of new OH: the enlargement of the tropical regions on Earth. 14)
- OH in the atmosphere also forms when ultraviolet sunlight reaches the lower atmosphere and reacts with water vapor (H2O) and ozone (O3) to form two OH molecules. Over the tropics, water vapor and ultraviolet sunlight are plentiful. The tropics, which span the region of Earth to either side of the equator, have shown some evidence of widening farther north and south of their current range, possibly due to rising temperatures affecting air circulation patterns. This means that the tropical region primed for creating OH will potentially increase over time, leading to a higher amount of OH in the atmosphere. This tropical widening process is slow, however, expanding only 0.5 to 1 degree in latitude every 10 years. But the small effect may still be important, according to Nicely.
- She and her team found that, individually, the tropical widening effect and OH recycling through reactions with other gases each comprise a relatively small source of OH, but together they essentially replace the OH used up in the breaking down of methane.
- "The absence of a trend in global OH is surprising," said atmospheric chemist Tom Hanisco at Goddard who was not involved in the research. "Most models predict a ‘feedback effect' between OH and methane. In the reaction of OH with methane, OH is also removed. The increase in NO2 and other sources of OH, such as ozone, cancel out this expected effect." But since this study looks only at the past 35 years, it is not guaranteed that OH levels will continue to recycle in the same way as the atmosphere continues to evolve with global climate change, he said.
- Ultimately, Nicely views the results as a way to fine-tune and update the assumptions made by researchers and climate modelers who describe and predict how OH and methane interact throughout the atmosphere. "This could add clarification on the question: Will methane concentrations continue rising in the future? Or will they level off, or perhaps even decrease? This is a major question regarding future climate that we really don't know the answer to," she said.
- The study used data from NASA's TOMS (Total Ozone Mapping Spectrometer) instrument, the OMI (Ozone Monitoring Instrument) aboard NASA's Aura satellite, the AIRS (Atmospheric Infrared Sounder) aboard NASA's Aqua satellite, the National Oceanic and Atmospheric Administration Cooperative Global Air Sampling Network, the Modern-Era Retrospective analysis for Research and Applications, Version 2 data set and the Global Model Initiative chemical transport model.
Study reveals substantial water loss in global landlocked regions
• Affected by climate change and human activities, recent water storage in global landlocked basins has undergone a widespread decline. A new study reveals that this decline has aggravated local water stress and potentially contributed to sea level rise. The study was carried out by a team of scientists from six countries and appears in the current issue of Nature Geoscience. 15) 16)
Figure 11: This illustration shows terrestrial water storage changes in global endorheic basins from GRACE satellite observations, April 2002 to March 2016. In the top image, terrestrial water storage trends — in millimeters of equivalent water thickness per year — for each endorheic unit are highlighted, followed by animated monthly terrestrial water storage anomalies, also in millimeters. The bottom image shows monthly net terrestrial water storage anomalies in gigatons, in global endorheic and exorheic systems — excluding Greenland, Antarctica and the oceans — and linkage to the El Niño-Southern Oscillation, right axis. Terrestrial water storage anomalies are relative to the time-mean baseline in each unit or system, with removal of seasonality. For comparison, 360 gigatons of terrestrial water storage equals 1 millimeter of sea level equivalent (image credit: Jida Wang)
Figure 12: This is the animation to Figure 11 (video credit: Jida Wang)
"Water resources are extremely limited in the continental hinterlands where streamflow does not reach the ocean. Scientifically, these regions are called endorheic basins," said Jida Wang, a Kansas State University geographer and the study's lead author.
"Over the past few decades, we have seen increasing evidence of perturbations to the endorheic water balance," said Wang, an assistant professor of geography. "This includes, for example, the desiccating Aral Sea, the depleting Arabian aquifer and the retreating Eurasian glaciers. This evidence motivated us to ask: Is the total water storage across the global endorheic system, about one-fifth of the continental surface, undergoing a net decline?"
Using gravity observations from NASA/German Aerospace Center's Gravity Recovery and Climate Experiment, or GRACE, satellites, Wang and his colleagues quantified a net water loss in global endorheic basins of approximately 100 billion tons of water per year since the start of the current millennium. This means a water mass equivalent to five Great Salt Lakes or three Lake Meads is gone every year from the arid endorheic regions.
Surprisingly, this amount of endorheic water loss is double the rate of concurrent water changes across the remaining landmass, excluding Greenland and Antarctica, Wang said. In contrast to endorheic basins, the remaining regions are exorheic, meaning streamflow originating in these basins drains to the ocean. Exorheic basins account for most of the continental surface and are home to many of the world's greatest rivers, such as the Nile, Amazon, Yangtze and Mississippi.
Wang noted that the signature of water storage changes in exorheic basins resembles some prominent oscillations of the climate system, such as El Niño and La Niña in multiyear cycles. However, the water loss in endorheic basins appears less responsive to such short-term natural variability. This contrast may suggest a profound impact of longer-term climate conditions and direct human water management, such as river diversion, damming and groundwater withdrawal, on the water balance in the dry hinterlands.
This endorheic water loss has dual ramifications, according to the researchers. Not only does it aggravate water stress in the arid endorheic regions, but it could also contribute to a significant global environmental concern: sea level rise. Sea level rise has two main causes: thermal expansion of seawater as global temperature increases, and the addition of water mass to the ocean.
"The hydrosphere is mass conserved," said Chunqiao Song, researcher with the Nanjing Institute of Geography and Limnology, Chinese Academy of Sciences, and a co-lead author of the study. "When water storage in endorheic basins is in deficit, the reduced water mass doesn't disappear. It was reallocated chiefly through vapor flux to the exorheic system. Once this water is no longer landlocked, it has the potential to affect the sea level budget."
Despite an observation period of only 14 years, the endorheic water loss already equals an additional sea level rise of 4 millimeters, the study found. The researchers said this impact is nontrivial: it accounts for approximately 10 percent of the observed sea level rise during the same period; compares to nearly half of the concurrent loss in mountain glaciers, excluding Greenland and Antarctica; and matches the entire contribution of global groundwater consumption.
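The 4-millimeter figure follows directly from the numbers reported earlier: roughly 100 gigatons of net loss per year over the 14-year GRACE record, converted with the Figure 11 caption's equivalence of 360 gigatons per millimeter of sea level:

```python
# Converting the reported endorheic mass loss to sea level equivalent.
GT_PER_MM_SLE = 360.0            # 360 Gt of water = 1 mm of sea level (from the study)

loss_rate_gt_per_yr = 100.0      # net endorheic loss from GRACE, ~100 Gt/yr
years = 14                       # April 2002 to March 2016 record

total_gt = loss_rate_gt_per_yr * years
sle_mm = total_gt / GT_PER_MM_SLE
print(f"{total_gt:.0f} Gt over {years} yr -> {sle_mm:.1f} mm sea level equivalent")
```

The result, about 3.9 mm, rounds to the 4 mm quoted in the text.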
"We are not saying the recent endorheic water loss has completely ended up in the ocean," said Yoshihide Wada, deputy director of the water program at the International Institute for Applied Systems Analysis in Austria and a co-author of the study. "Instead, we are showing a perspective of how substantial the recent endorheic water loss has been. If it persists, such as beyond the decadal timescale, the water surplus added to the exorheic system may signify an important source of sea level rise."
By synergizing multi-mission satellite observations and hydrological modeling, Wang and his colleagues attributed this global endorheic water loss to comparable contributions from the surface — such as lakes, reservoirs and glaciers — as well as soil moisture and aquifers.
"Such comparable losses are, however, an aggregation of distinct regional variations," Wang said. "In endorheic Central Eurasia, for instance, about half of the water loss came from the surface, particularly large terminal lakes such as the Aral Sea, the Caspian Sea and Lake Urmia, and retreating glaciers in High Mountain Asia."
While glacial retreat was a response to warming temperature, water losses in the terminal lakes were a combined result of meteorological droughts and long-term water diversions from the feeding rivers.
The net water loss in the endorheic Sahara and Arabia, on the other hand, was dominated by unsustainable groundwater withdrawal, according to the researchers. In endorheic North America, including the Great Basin of the U.S., drought-induced soil moisture loss was likely responsible for most of the regional water loss. Though smaller in extent, the surface water loss in the Great Salt Lake and the Salton Sea proceeded at a substantial rate of 300 million tons per year, partially induced by mineral mining and diversion-based irrigation.
"The water losses from the world's endorheic basins are yet another example of how climate change is further drying the already dry arid and semi-arid regions of the globe. Meanwhile, human activities such as groundwater depletion are significantly accelerating this drying," said Jay Famiglietti, director of the Global Institute of Water Security, Canada 150 research chair in hydrology and remote sensing at the University of Saskatchewan, Canada and co-author of the study.
Wang said the team wants to convey three takeaway messages from their research.
- "First, water storage in the endorheic system, albeit limited in total mass, can dominate the water storage trend in the entire land surface during at least decadal timescales," Wang said.
- "Second, the recent endorheic water loss is less sensitive to natural variability of the climate system, suggesting a possible response to longer-term climate conditions and human water management.
- "Third, such a water loss in the endorheic system has dual ramifications, both to regional water sustainability and to global sea level rise," he said. "These messages highlight the underrated importance of endorheic basins in the water cycle and the need for an improved understanding of water storage changes in the global hinterlands."
This research was supported by Kansas State University faculty start-up fund, NASA SWOT (Surface Water and Ocean Topography) Grant; China's Thousand Young Talents Program; and the NASA Sea Level Change team. A portion of this research was conducted at the Jet Propulsion Laboratory, California Institute of Technology, under contract with NASA.
Seismic study reveals huge amount of water dragged into Earth's interior
• November 14, 2018: Slow-motion collisions of tectonic plates under the ocean drag about three times more water down into the deep Earth than previously estimated, according to a first-of-its-kind seismic study that spans the Mariana Trench, a crescent-shaped trench in the Western Pacific that measures 1,500 miles (2,400 km) long and is the deepest ocean trench in the world. 17)
The observations from the trench have important implications for the global water cycle, according to researchers at Washington University in St. Louis whose work is supported by the National Science Foundation (NSF).
"People knew that subduction zones could bring down water, but they didn't know how much water," said Chen Cai, lead author of the study, which was published in this week's issue of the journal Nature. 18)
"This research shows that subduction zones move far more water into Earth's deep interior — many miles below the surface — than previously thought," said Candace Major, a program director in NSF's Division of Ocean Sciences. "The results highlight the important role of subduction zones in Earth's water cycle."
Researchers listened to more than a year's worth of Earth's rumblings, from ambient noise to actual earthquakes, using a network of 19 ocean-bottom seismographs deployed across the Mariana Trench, along with seven island-based seismographs.
"Previous estimates vary widely on the amount of water that is subducted deeper than 60 miles," said Doug Wiens, a geoscientist at Washington University and co-author of the paper. "The main source of uncertainty in these calculations was the initial water content of the subducting uppermost mantle."
The Mariana Trench is where the western Pacific Ocean plate slides beneath the Mariana Plate and sinks deep into the Earth's mantle as the plates slowly converge.
The new seismic observations paint a more detailed picture of the Pacific Plate bending into the trench, resolving its 3D structure and tracking the relative speeds of types of rock that have different capabilities for holding water.
Rock can grab and hold onto water in a variety of ways. Ocean water atop the tectonic plate runs down into the Earth's crust and upper mantle along fault lines that lace the area where the plates collide and bend. Then it gets trapped.
Under certain temperatures and pressure conditions, chemical reactions force the water into a non-liquid form — hydrous minerals (wet rocks) — locking the water into the plate. Then the plate continues to crawl ever deeper into the Earth's mantle, ferrying the water with it.
Previous studies at subduction zones like the Mariana Trench have noted that the subducting plate could hold water. But they didn't determine how much water it held or how deep it went.
The seismic images Cai and Wiens obtained show that the area of hydrated rock at the Mariana Trench extends almost 20 miles (~30 km) beneath the seafloor — much deeper than previously thought. The amount of water in this block of hydrated rock is considerable, the scientists said.
Figure 13: Distribution of seismic stations and bathymetry. a) Station distribution. Red circles show ocean-bottom seismographs deployed from January 2012 to February 2013. White squares represent the temporary island-based stations. Red squares indicate stations from the USGS (US Geological Survey) Northern Mariana Islands Seismograph Network used in our study. Open triangles show the locations of three large serpentine seamounts within the study area. The dashed white line is the trench axis. The arrow labelled APM indicates the direction of absolute plate motion. Thin solid white lines show magnetic lineations (M22, M23, M24) 19)
For the Mariana Trench region alone, four times more water subducts than previously calculated. These results can be extrapolated to predict the conditions in other ocean trenches worldwide.
"If other old, cold subducting slabs contain similarly thick layers of hydrous mantle, then estimates of the global water flux into the mantle at depths greater than 60 miles must be increased by a factor of about three," said Wiens.
For water in the Earth, what goes down must come up. All the water going into the Earth at subduction zones must be coming back up somehow, not continuously piling up inside the Earth.
Scientists believe that most of the water that goes down at a trench comes back into the atmosphere as water vapor when volcanoes erupt. But with the revised estimates of water from the new study, the amount of water going in seems to greatly exceed the amount of water coming out.
"The estimates of water coming back out through the volcanic arc are probably very uncertain," said Wiens, who hopes that this study will encourage other researchers to reconsider their models for how water moves out.
"Does the amount of water vary substantially from one subduction zone to another, based on the kind of faulting where the plate bends?" Wiens asked. "There have been suggestions of that in Alaska and in Central America. But no one has yet looked at deeper structures like we were able to do in the Mariana Trench."
Figure 14: Recovery of seismographs on uninhabited islands in the Commonwealth of the Northern Mariana Islands (image credit: Douglas Wiens)
Long-term ozone measurements in the period 1979 to 2018
• November 14, 2018: The hole in the ozone layer that forms over Antarctica each September and October was slightly above average size in 2018, but smaller than expected for the weather conditions. Colder-than-average temperatures in the Antarctic stratosphere created ideal conditions for destroying ozone, NASA and NOAA scientists said, but declining levels of ozone-depleting chemicals prevented the hole from growing as large as it might have been 20 years ago. 21)
- "Chlorine levels in the Antarctic stratosphere have fallen about 11 percent from the peak year," said Paul A. Newman, chief scientist for Earth Sciences at NASA's Goddard Space Flight Center. "This year's colder temperatures would have given us a much larger ozone hole if chlorine was still at levels we saw back in the year 2000."
- The ozone hole reached an average area of 22.9 million km2 (8.8 million square miles) in 2018, almost three times the size of the contiguous United States. It ranks 13th largest out of 40 years of NASA satellite observations.
- The maps of Figures 15 and 16 show the state of the ozone hole on the day of its maximum depth; that is, the day that the lowest ozone concentrations were measured in those years. The two maps of Figure 15 show the years 2000 and 2018, when ozone concentrations were 89 Dobson units and 102 Dobson units, respectively. The map series of Figure 16 shows the day of minimum concentration in every year since 1979 (except 1995, when no data was available).
- Stratospheric ozone is measured in Dobson units (DU), the number of molecules required to create a layer of pure ozone 0.01 mm thick at a temperature of 0º Celsius and an air pressure of 1 atmosphere (the pressure at the surface of the Earth). The average amount of ozone in Earth's atmosphere is 300 Dobson units, equivalent to a layer 3 mm thick.
- The ozone hole in 2018 was strongly influenced by a stable and cold Antarctic vortex, the stratospheric low-pressure system that flows clockwise in the atmosphere over the continent. These colder conditions—among the coldest since 1979—helped support the formation of more polar stratospheric clouds. Particles in such clouds activate ozone-destroying forms of chlorine and bromine compounds in the stratosphere.
- Ozone-depleting chemicals in the air are abundant enough to cause significant losses. According to NOAA scientist Bryan Johnson, conditions in 2018 allowed for a significant elimination of ozone in a deep, 5 km layer over the South Pole. The South Pole saw an ozone minimum of 104 Dobson units on October 12, making it the 12th lowest year out of 33 years of ozonesonde (balloon) measurements at the Pole.
- "Even with this year's optimum conditions, ozone loss was less severe in the upper altitude layers," Johnson said, "which is what we would expect given the declining chlorine concentrations we're seeing in the stratosphere."
Figure 15: The ozone hole, measured with OMI (Ozone Monitoring Instrument) on NASA's Aura satellite, was quite large in 2018 because of the cold conditions, but less severe than it might have been in previous decades. The difference is a long-term reduction in ozone-depleting substances (such as CFCs) that were phased out of commercial production by the Montreal Protocol. Atmospheric levels of CFCs and similar compounds increased up to the year 2000, but have slowly declined since then (image credit: NASA Earth Observatory image by Joshua Stevens, using data courtesy of NASA Ozone Watch. Edited by Mike Carlowicz using a story by Ellen Gray, NASA's Earth Science News Team, and Theo Stein, NASA)
Legend to Figure 15: Ozone measurements prior to 2004 were made with TOMS (Total Ozone Mapping Spectrometer) instruments. TOMS on Nimbus-7 and Meteor-3-5 provided global measurements of total column ozone on a daily basis, together yielding a complete daily ozone data set from November 1978 to December 1994. After an 18-month period in which the program had no on-orbit capability, Japan's ADEOS-1 satellite was launched on August 17, 1996, and its TOMS instrument provided data until the satellite lost power on June 29, 1997. TOMS-EP (Earth Probe), launched on 2 July 1996 to provide supplemental measurements, was later boosted to a higher orbit to replace the failed ADEOS-1. The transmitter of TOMS-EP failed on 2 December 2006; since 1 January 2006, OMI on Aura has replaced the TOMS-EP data.
Figure 16: This map series shows the day of minimum concentration in every year since 1979, except 1995, when no data was available (image credit: NASA Earth Observatory image by Joshua Stevens, using data courtesy of NASA Ozone Watch, edited by Mike Carlowicz using a story by Ellen Gray, NASA's Earth Science News Team, and Theo Stein, NASA)
Scientific Assessment of Ozone Depletion: 2018
November 5, 2018: The 2018 WMO/UNEP assessment contains the most up-to-date understanding of ozone depletion, reflecting the thinking of hundreds of international scientific experts who contributed to its preparation and review. Cochairs of the 2018 Scientific Assessment Panel (SAP) of the Montreal Protocol on Substances that Deplete the Ozone Layer are Dr. David W. Fahey of the NOAA Earth System Research Lab (ESRL) Chemical Sciences Division, USA, Dr. Paul A. Newman of the NASA Goddard Space Flight Center, USA, Dr. John A. Pyle of the University of Cambridge, UK, and Dr. Bonfils Safari of the National University of Rwanda, Butare. Sarah Doherty of the University of Washington, USA is the Assessment Coordinator. Other members of ESRL are making substantial contributions to the report, serving as coauthors, contributors, reviewers, and editorial and support staff. 22)
Highlights Scientific Assessment of Ozone Depletion: 2018 23)
The Assessment documents the advances in scientific understanding of ozone depletion reflecting the thinking of the many international scientific experts who have contributed to its preparation and review. These advances add to the scientific basis for decisions made by the Parties to the Montreal Protocol. It is based on longer observational records, new chemistry-climate model simulations, and new analyses. Highlights since the 2014 Assessment are:
Actions taken under the Montreal Protocol have led to decreases in the atmospheric abundance of controlled ozone-depleting substances (ODSs) and the start of the recovery of stratospheric ozone. The atmospheric abundances of both total tropospheric chlorine and total tropospheric bromine from long-lived ODSs controlled under the Montreal Protocol have continued to decline since the 2014 Assessment. The weight of evidence suggests that the decline in ODSs made a substantial contribution to the following observed ozone trends:
- The Antarctic ozone hole is recovering, while continuing to occur every year. As a result of the Montreal Protocol, much more severe ozone depletion in the polar regions has been avoided.
- Outside the polar regions, upper stratospheric ozone has increased by 1–3% per decade since 2000.
- No significant trend has been detected in global (60°S–60°N) total column ozone over the 1997–2016 period with average values in the years since the last Assessment remaining roughly 2% below the 1964–1980 average.
- Ozone layer changes in the latter half of this century will be complex, with projected increases and decreases in different regions. Northern Hemisphere mid-latitude total column ozone is expected to return to 1980 abundances in the 2030s, and Southern Hemisphere mid-latitude ozone to return around mid-century. The Antarctic ozone hole is expected to gradually close, with springtime total column ozone returning to 1980 values in the 2060s.
- The Kigali Amendment is projected to reduce future global average warming in 2100 due to hydrofluorocarbons (HFCs) from a baseline of 0.3–0.5 ºC to less than 0.1 ºC. The magnitude of the avoided temperature increase due to the provisions of the Kigali Amendment (0.2 to 0.4 ºC) is substantial in the context of the 2015 Paris Agreement, which aims to keep global temperature rise this century to well below 2 ºC above pre-industrial levels.
- There has been an unexpected increase in global total emissions of CFC-11. Global CFC-11 emissions derived from measurements by two independent networks increased after 2012, thereby slowing the steady decrease in atmospheric concentrations reported in previous Assessments. The global concentration decline over 2014 to 2016 was only two-thirds as fast as it was from 2002 to 2012. While the emissions of CFC-11 from eastern Asia have increased since 2012, the contribution of this region to the global emission rise is not well known. The country or countries in which emissions have increased have not been identified.
- Sources of significant carbon tetrachloride emissions, some previously unrecognized, have been quantified. These sources include inadvertent by-product emissions from the production of chloromethanes and perchloroethylene, and fugitive emissions from the chlor-alkali process. The global budget of carbon tetrachloride is now much better understood than was the case in previous Assessments, and the previously identified gap between observation-based and industry-based emission estimates has been substantially reduced.
- Continued success of the Montreal Protocol in protecting stratospheric ozone depends on continued compliance with the Protocol. Options available to hasten the recovery of the ozone layer are limited, mostly because actions that could help significantly have already been taken. Remaining options such as complete elimination of controlled and uncontrolled emissions of substances such as carbon tetrachloride and dichloromethane; bank recapture and destruction of CFCs, halons, and HCFCs; and elimination of HCFC and methyl bromide production would individually lead to small-to-modest ozone benefits. Future emissions of carbon dioxide, methane, and nitrous oxide will be extremely important to the future of the ozone layer through their effects on climate and on atmospheric chemistry. Mitigation of nitrous oxide emissions would also have a small-to-modest ozone benefit.
Ozone Hole Modest Despite Optimum Conditions for Ozone Depletion
November 2, 2018: The ozone hole that forms in the upper atmosphere over Antarctica each September was slightly above average size in 2018, NOAA and NASA scientists reported today. 24)
Figure 17: Scientists from NASA and NOAA work together to track the ozone layer throughout the year and determine when the hole reaches its annual maximum extent. This year, the South Pole region of Antarctica was slightly colder than the previous few years, so the ozone hole grew larger (image credit: NASA Goddard/ Katy Mersmann)
Colder-than-average temperatures in the Antarctic stratosphere created ideal conditions for destroying ozone this year, but declining levels of ozone-depleting chemicals prevented the hole from being as large as it would have been 20 years ago.
"Chlorine levels in the Antarctic stratosphere have fallen about 11 percent from the peak year in 2000," said Paul A. Newman, chief scientist for Earth Sciences at NASA's Goddard Space Flight Center in Greenbelt, Maryland. "This year's colder temperatures would have given us a much larger ozone hole if chlorine was still at levels we saw back in the year 2000."
According to NASA, the annual ozone hole reached an average area coverage of 22.9 million km2 in 2018, almost three times the size of the contiguous United States. It ranks 13th largest out of 40 years of NASA satellite observations. Nations of the world began phasing out the use of ozone-depleting substances in 1987 under an international treaty known as the Montreal Protocol.
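As a quick sanity check on the size comparison (the contiguous-US area below is an approximate reference value, not from the article):

```python
# Ratio of the 2018 ozone hole's average area to that of the contiguous
# United States; "almost three times" checks out.

OZONE_HOLE_KM2 = 22.9e6       # 2018 average area (8.8 million square miles)
CONTIGUOUS_US_KM2 = 8.08e6    # approximate area of the lower 48 states

print(round(OZONE_HOLE_KM2 / CONTIGUOUS_US_KM2, 2))  # 2.83
```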
The 2018 ozone hole was strongly influenced by a stable and cold Antarctic vortex — the stratospheric low pressure system that flows clockwise in the atmosphere above Antarctica. These colder conditions — among the coldest since 1979 — helped support formation of more polar stratospheric clouds, whose cloud particles activate ozone-destroying forms of chlorine and bromine compounds.
In 2016 and 2017, warmer temperatures in September limited the formation of polar stratospheric clouds and slowed the ozone hole's growth. In 2017, the ozone hole reached a size of 19.7 million km2 before starting to recover. In 2016, the hole grew to 20.7 million km2.
However, the current ozone hole area is still large compared to the 1980s, when the depletion of the ozone layer above Antarctica was first detected. Atmospheric levels of man-made ozone-depleting substances increased up to the year 2000. Since then, they have slowly declined but remain high enough to produce significant ozone loss.
NOAA scientists said colder temperatures in 2018 allowed for near-complete elimination of ozone in a deep 5 km layer over the South Pole. This layer is where the active chemical depletion of ozone occurs on polar stratospheric clouds. The amount of ozone over the South Pole reached a minimum of 104 Dobson units on Oct. 12 — making it the 12th lowest year out of 33 years of NOAA ozonesonde measurements at the South Pole, according to NOAA scientist Bryan Johnson.
"Even with this year's optimum conditions, ozone loss was less severe in the upper altitude layers, which is what we would expect given the declining chlorine concentrations we're seeing in the stratosphere," Johnson said.
A Dobson unit is the standard measurement for the total amount of ozone in the atmosphere above a point on Earth's surface, and it represents the number of ozone molecules required to create a layer of pure ozone 0.01 mm thick at a temperature of 32º Fahrenheit (0º Celsius) at an atmospheric pressure equivalent to Earth's surface. A value of 104 Dobson units would be a layer that is 1.04 mm thick at the surface, less than the thickness of a dime.
Prior to the emergence of the Antarctic ozone hole in the 1970s, the average amount of ozone above the South Pole in September and October ranged from 250 to 350 Dobson units.
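The Dobson unit definition translates directly into a one-line conversion, which reproduces the figures quoted above:

```python
# 1 DU corresponds to a 0.01 mm layer of pure ozone at 0º C and 1 atm,
# so a column value in Dobson units maps to a surface-layer thickness.

MM_PER_DU = 0.01

def column_thickness_mm(dobson_units):
    """Thickness (mm) of the pure-ozone layer equivalent to a column in DU."""
    return dobson_units * MM_PER_DU

print(round(column_thickness_mm(104), 2))  # 1.04  (2018 South Pole minimum)
print(round(column_thickness_mm(300), 2))  # 3.0   (typical global average)
```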
What is ozone and why does it matter? Ozone comprises three oxygen atoms and is highly reactive with other chemicals. In the stratosphere, roughly 11 to 40 km above Earth's surface, a layer of ozone acts like a sunscreen, shielding the planet from ultraviolet radiation that can cause skin cancer and cataracts, suppress immune systems and damage plants. Ozone can also be created by photochemical reactions between the Sun and pollution from vehicle emissions and other sources, forming harmful smog in the lower atmosphere.
NASA and NOAA use three complementary instrumental methods to monitor the growth and breakup of the ozone hole each year. Satellite instruments like the Ozone Monitoring Instrument on NASA's Aura satellite and the OMPS (Ozone Mapping Profiler Suite) on the NASA-NOAA SuomiNPP satellite measure ozone across large areas from space. The Aura satellite's MLS (Microwave Limb Sounder) also measures certain chlorine-containing gases, providing estimates of total chlorine levels.
NOAA scientists monitor the thickness of the ozone layer and its vertical distribution above the South Pole by regularly releasing weather balloons carrying ozone-measuring "sondes" up to ~34 km in altitude, and with a ground-based instrument called a Dobson spectrophotometer.
Multiyear Study of Arctic Sea Ice Coverage
October 11, 2018: The Arctic Ocean's blanket of sea ice has changed since 1958 from predominantly older, thicker ice to mostly younger, thinner ice, according to new research published by NASA scientist Ron Kwok of the Jet Propulsion Laboratory, Pasadena, California. With so little thick, old ice left, the rate of decrease in ice thickness has slowed. New ice grows faster but is more vulnerable to weather and wind, so ice thickness is now more variable, rather than dominated by the effect of global warming. 25) 26)
Kwok's research combined decades of declassified U.S. Navy submarine measurements with more recent data from four satellites to create the 60-year record of changes in Arctic sea ice thickness. He found that since 1958, Arctic ice cover has lost about two-thirds of its thickness, as averaged across the Arctic at the end of summer. Older ice has shrunk in area by >2 million km2. Today, 70 percent of the ice cover consists of ice that forms and melts within a single year, which scientists call seasonal ice.
Sea ice of any age is frozen ocean water. However, as sea ice survives through several melt seasons, its characteristics change. Multiyear ice is thicker, stronger and rougher than seasonal ice. It is much less salty than seasonal ice; Arctic explorers used it as drinking water. Satellite sensors observe enough of these differences that scientists can use spaceborne data to distinguish between the two types of ice.
Thinner, weaker seasonal ice is innately more vulnerable to weather than thick, multiyear ice. It can be pushed around more easily by wind, as happened in the summer of 2013. During that time, prevailing winds piled up the ice cover against coastlines, which made the ice cover thicker for months.
The ice's vulnerability may also be demonstrated by the increased variation in Arctic sea ice thickness and extent from year to year over the last decade. In the past, sea ice rarely melted in the Arctic Ocean. Each year, some multiyear ice flowed out of the ocean into the East Greenland Sea and melted there, and some ice grew thick enough to survive the melt season and become multiyear ice. As air temperatures in the polar regions have warmed in recent decades, however, large amounts of multiyear ice now melt within the Arctic Ocean itself. Far less seasonal ice now thickens enough over the winter to survive the summer. As a result, not only is there less ice overall, but the proportions of multiyear ice to seasonal ice have also changed in favor of the young ice.
Seasonal ice now grows to a depth of about two meters in winter, and most of it melts in summer. That basic pattern is likely to continue, Kwok said. "The thickness and coverage in the Arctic are now dominated by the growth, melting and deformation of seasonal ice."
The increase in seasonal ice also means record-breaking changes in ice cover such as those of the 1990s and 2000s are likely to be less common, Kwok noted. In fact, there has not been a new record sea ice minimum since 2012, despite years of warm weather in the Arctic. "We've lost so much of the thick ice that changes in thickness are going to be slower due to the different behavior of this ice type," Kwok said.
Kwok used data from U.S. Navy submarine sonars from 1958 to 2000; satellite altimeters on NASA's ICESat and the European CryoSat-2, which span from 2003 to 2018; and scatterometer measurements from NASA's QuikSCAT and the European ASCAT from 1999 to 2017.
Figure 18: Small remnants of thicker, multiyear ice float with thinner, seasonal ice in the Beaufort Sea on 30 September, 2016 (image credit: NASA/GSFC/Alek Petty)
NASA Study Connects Southern California, Mexico Faults
October 8, 2018: A multiyear study has uncovered evidence that a 34-kilometer-long section of a fault links known, longer faults in Southern California and northern Mexico into a much longer continuous system. The entire system is at least 350 km long. Knowing how faults are connected helps scientists understand how stress transfers between faults. Ultimately, this helps researchers understand whether an earthquake on one section of a fault would rupture multiple fault sections, resulting in a much larger earthquake. 27) 28)
A team led by scientist Andrea Donnellan of NASA's Jet Propulsion Laboratory in Pasadena, California, recognized that the south end of California's Elsinore fault is linked to the north end of the Laguna Salada fault system, just north of the international border with Mexico. The short length of the connecting fault segment, which they call the Ocotillo section, is consistent with an immature fault zone that is still developing, where repeated earthquakes have not yet created a smoother, single fault instead of several strands.
The Ocotillo section was the site of a magnitude 5.7 aftershock that ruptured on an 8-kilometer-long fault buried under the California desert two months after the 2010 El Mayor-Cucapah earthquake in Baja California, Mexico. The magnitude 7.2 earthquake caused severe damage in the Mexican city of Mexicali and was felt throughout Southern California. It and its aftershocks caused dozens of faults in the region — including many not previously identified — to move.
Seismic activity in the region is a sign of its complex geology. The Pacific and North American plates are grinding past each other in Southern California. In the Gulf of California, there's a spreading zone where plates are moving apart. "The plate boundary is still sorting itself out," Donnellan said.
In the new study, Donnellan's team was also able to better define where Earth's crust continued slipping or deforming following the El Mayor-Cucapah earthquake and where other factors are important. "The shaking is only part of the earthquake process," she said. "The Earth keeps on moving for years [after the shaking stops]. What's cool about UAVSAR (Uninhabited Aerial Vehicle Synthetic Aperture Radar), an L-band InSAR platform, and GPS is that you can see the rest of the process."
Figure 19: The approximate location of the newly mapped Ocotillo section, which ties together California's Elsinore fault and Mexico's Laguna Salada fault into one continuous fault system (image credit: NASA/JPL-Caltech)
Arctic sea ice extent arrives at its minimum for 2018
September 27, 2018: Arctic sea ice likely reached its lowest seasonal extent for the year on 19 and 23 September 2018, according to NASA and the NASA-supported NSIDC (National Snow and Ice Data Center) at the University of Colorado Boulder. Analysis of satellite data by NSIDC and NASA showed that, at 1.77 million square miles (4.59 million km2), 2018 effectively tied with 2008 and 2010 for the sixth lowest summertime minimum extent in the satellite record. 29) 30)
This appears to be the lowest extent of the year. In response to the setting sun and falling temperatures, ice extent will begin expanding through autumn and winter. However, a shift in wind patterns or a period of late season melt could still push the ice extent lower.
The minimum extent was reached 5 and 9 days later than the 1981 to 2010 median minimum date of September 14. The interquartile range of minimum dates is September 11 to September 19. This year's minimum date of September 23 is one of the latest dates to reach the minimum in the satellite record, tying with 1997. The lateness of the minimum appears to be at least partially caused by southerly winds from the East Siberian Sea, which brought warm air into the region and prevented ice from drifting or growing southward.
Figure 20: Arctic sea ice extent for September 23, 2018 was 4.59 million km2 (1.77 million square miles). The orange line shows the 1981 to 2010 average extent for that day (image credit: NSIDC)
Figure 21: The map above compares Arctic sea ice extent on September 19, 2018 and September 23, 2018, when Arctic sea ice reached its minimum extent for the year (image credit: NSIDC)
"This year's minimum is relatively high compared to the record low extent we saw in 2012, but it is still low compared to what it used to be in the 1970s, 1980s and even the 1990s," said Claire Parkinson, a climate change senior scientist at NASA's Goddard Space Flight Center in Greenbelt, Maryland.
Parkinson and her colleague Nick DiGirolamo calculated that, since the late 1970s, the Arctic sea ice extent has shrunk by an average of about 21,000 square miles (54,000 km2) per year. That is equivalent to losing a chunk of sea ice the size of Maryland and New Jersey combined every year for the past four decades.
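The Maryland-plus-New-Jersey comparison can be checked with a couple of lines of arithmetic. The two state areas below are assumed approximate figures, not taken from the article:

```python
# Sanity check of the "Maryland plus New Jersey" comparison.
maryland_sq_mi = 12_406      # total area of Maryland, approximate (assumption)
new_jersey_sq_mi = 8_723     # total area of New Jersey, approximate (assumption)
SQ_MI_TO_KM2 = 2.58999       # square miles to square kilometers

combined_sq_mi = maryland_sq_mi + new_jersey_sq_mi
combined_km2 = combined_sq_mi * SQ_MI_TO_KM2

# Both values land close to the quoted annual loss of
# 21,000 square miles (54,000 km2)
print(f"{combined_sq_mi:,} sq mi = {combined_km2:,.0f} km2")
```

With these assumed areas the sum comes out near 21,100 square miles (about 54,700 km2), consistent with the quoted loss rate.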
This summer, the weather conditions across the Arctic have been a mixed bag, with some areas experiencing warmer than average temperatures and rapid melt and other regions remaining cooler than normal, leading to persistent patches of sea ice. Still, the 2018 minimum sea ice extent is 629,000 square miles (1.63 million km2) below the 1981-2010 average of yearly minimum extents.
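The two figures quoted above also pin down the 1981-2010 average minimum itself; a quick back-of-the-envelope check using only the article's numbers:

```python
# 2018 minimum extent and its deficit relative to the 1981-2010 average,
# both taken from the article
minimum_2018_km2 = 4.59e6
deficit_km2 = 1.63e6
minimum_2018_sq_mi = 1.77e6
deficit_sq_mi = 0.629e6

# Implied 1981-2010 average of yearly minimum extents
avg_min_km2 = minimum_2018_km2 + deficit_km2        # ~6.22 million km2
avg_min_sq_mi = minimum_2018_sq_mi + deficit_sq_mi  # ~2.40 million square miles
print(f"{avg_min_km2/1e6:.2f} million km2, {avg_min_sq_mi/1e6:.2f} million sq mi")
```

So the long-term average minimum implied by the article is roughly 6.22 million km2 (2.40 million square miles).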
One of the most unusual features of this year's melt season has been the reopening of a polynya-like hole in the icepack north of Greenland, where the oldest and thickest sea ice of the Arctic typically resides. In February of this year, a similar opening appeared in the same area, catching the attention of sea ice scientists everywhere. The first appearance of the hole raised concerns that the region could become vulnerable if the original, thicker ice cover was replaced with thinner ice as the exposed seawater refroze. NASA's Operation IceBridge mission probed the area in March, finding that the ice was indeed thinner and thus more susceptible to being pushed around by winds and ocean currents.
"This summer, the combination of thin ice and southerly warm winds helped break up and melt the sea ice in the region, reopening the hole," said Melinda Webster, a sea ice researcher with Goddard. "This opening matters for several reasons; for starters, the newly exposed water absorbs sunlight and warms up the ocean, which affects how quickly sea ice will grow in the following autumn. It also affects the local ecosystem; for example, it impacts seal and polar bear populations that rely on thicker, snow-covered sea ice for denning and hunting."
Measurements of sea ice thickness, an important additional factor in determining the mass and volume changes of the sea ice cover, have been far less complete than the measurements of ice extent and distribution over the past four decades. Now, with the successful launch of NASA's ICESat-2 (Ice, Cloud and land Elevation Satellite-2) on 15 September, scientists will be able to use the data from the spacecraft's advanced laser altimeter to create detailed maps of sea ice thickness in both the Arctic and the Antarctic.
Figure 22: Lowest sea ice minimum extents on record (satellite record, 1979 to present), image credit: NASA
Contrasting effects on deep convective clouds by different types of aerosols
September 24, 2018: Convective clouds produce a significant proportion of the global precipitation and play an important role in the energy and water cycles. A new NASA-led study helps answer decades-old questions about the role of smoke and human-caused air pollution on clouds and rainfall. Looking specifically at deep convective clouds — tall clouds like thunderclouds, formed by warm air rising — the study shows that smoky air makes it harder for these clouds to grow. Pollution, on the other hand, energizes their growth, but only if the pollution isn't heavy. Extreme pollution is likely to shut down cloud growth. 31)
Researchers led by scientist Jonathan Jiang of NASA's Jet Propulsion Laboratory in Pasadena, California, used observational data from two NASA satellites to investigate the effects of smoke and human-made air pollutants at different concentrations on deep convective clouds. 32)
The two satellites — CALIPSO (Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation) and CloudSat — orbited on the same track only a few seconds apart from 2006 until this year. CloudSat uses a radar to measure cloud locations and heights worldwide, and CALIPSO uses an instrument called a LIDAR to measure smoke, dust, pollution and other microscopic particles in the air, which are collectively referred to as aerosols, at the same locations at almost the same time. The combined data sets allow scientists to study how aerosol particles affect clouds.
CALIPSO is able to classify aerosols into several types, a capability that improved two years ago when the CALIPSO mission team developed new data-processing techniques. At about the same time, the CloudSat team also improved its classification of cloud types. Jiang's team knew that these improvements had the potential to clarify how different aerosols affect the ability of clouds to grow. It took him and his colleagues about two years to go through both data sets, choose the best five-year period and Earth regions to study, and do the analysis.
Clouds typically cannot form without some aerosols, because water vapor in the air does not easily condense into liquid water or ice unless it comes in contact with an aerosol particle. But there are many types of aerosols — not only the ones studied here but volcanic ash, sea salt and pollen, for example — with a wide range of sizes, colors, locations and other characteristics. All of these characteristics affect the way aerosols interact with clouds. Even the same type of aerosol may have different effects at different altitudes in the atmosphere or at different concentrations of particles.
Smoke particles absorb heat radiation emitted by the ground. This increases the temperature of the smoke particles, which can then warm the air. At the same time they block incoming sunlight, which keeps the ground cooler. That reduces the temperature difference between the ground and the air. For clouds to form, the ground needs to be warmer and the air cooler so that moisture on the ground can evaporate, rise and condense higher in the atmosphere. By narrowing the temperature gap between the ground and the air, smoke suppresses cloud formation and growth.
Human-pollutant aerosols like sulfates and nitrates, on the other hand, do not absorb much heat radiation. In moderate concentrations, they add more particles to the atmosphere for water to condense onto, enabling clouds to grow taller. If pollution is very heavy, however, the sheer number of particles in the sky blocks incoming sunlight — an effect often visible in the world's most polluted cities. That cools the ground just as smoke aerosols do, inhibiting the formation of clouds.
The scientists also studied dust aerosols and found that their characteristics varied so much from place to place that they could either suppress or energize cloud formation. "It's about the complexity in dust color and size," Jiang said. "Sahara dust may be lighter, while dust from an Asian desert might likely be darker." A blanket of lighter-colored or smaller dust scatters incoming sunlight while not warming the air. Larger or darker dust particles absorb sunlight and warm the air.
Study links natural climate oscillations in North Atlantic to Greenland Ice Sheet Melt
September 18, 2018: Scientists have known for years that warming global climate is melting the Greenland Ice Sheet, the second largest ice sheet in the world. A new study from the Woods Hole Oceanographic Institution (WHOI), however, shows that the rate of melting might be temporarily increased or decreased by two existing climate patterns: the North Atlantic Oscillation (NAO), and the Atlantic Multidecadal Oscillation (AMO). 33)
Both patterns can have a major impact on regional climate. The NAO, which is measured as the atmospheric pressure difference between the Azores and Iceland, can affect the position and strength of the westerly storm track. The study, published in Geophysical Research Letters, found that when the NAO stays in its negative phase (meaning that air pressure is high over Greenland) it can trigger extreme ice melt in Greenland during the summer season. Likewise, the AMO, which alters sea surface temperatures in the North Atlantic, can cause major melting events when it is in its warm phase, raising the temperature of the region as a whole. 34)
If global climate change continues at its current rate, the Greenland ice sheet may eventually melt entirely—but whether it meets this fate sooner rather than later could be determined by these two oscillations, says Caroline Ummenhofer, a climate scientist at WHOI and co-author on the study. Depending on how the AMO and NAO interact, excess melting could happen two decades earlier than expected, or two decades later this century.
"We know the Greenland ice sheet is melting in part because of warming climate, but that's not a linear process," Ummenhofer said. "There are periods where it will accelerate, and periods where it won't."
Scientists like Ummenhofer see a pressing need to understand how natural variability can play a role in speeding up or slowing down the melting process. "The consequences go beyond just the Greenland Ice Sheet—predicting climate on the scale of the next few decades will also be useful for resource management, city planners and other people who will need to adapt to those changes," she added.
Actually forecasting environmental conditions on a decadal scale isn't easy. The NAO can switch between positive and negative phases over the course of a few weeks, but the AMO can take more than 50 years to go through a full cycle. Since scientists first started tracking climate in the late 19th century, only a handful of AMO cycles have been recorded, making it extremely difficult to identify reliable patterns. To complicate things even more, the WHOI scientists needed to tease out how much of the melting effect is caused by human-related climate change, and how much can be attributed to the AMO and NAO.
Figure 23: Scientists stand on the edge of a crevasse formed by meltwater flowing across the top of the Greenland Ice Sheet during a WHOI-led expedition in 2007 (image credit: Sarah Das, Woods Hole Oceanographic Institution)
To do so, the team relied on data from the Community Earth System Model's Large Ensemble, a massive set of climate model simulations at the NCAR (National Center for Atmospheric Research) in Boulder, CO. From that starting point, the researchers looked at 40 different iterations of the model covering 180 years over the 20th and 21st centuries, with each one using slightly different starting conditions.
Although the simulations all included identical human factors, such as the rise of greenhouse gases over two centuries, they used different conditions at the start—a particularly cold winter, for example, or a powerful Atlantic storm season—that led to distinct variability in the results. The team could then compare those results to each other and statistically remove the effects caused by climate change, letting them isolate the effects of the AMO and NAO.
"Using a large ensemble of model output gave more statistical robustness to our findings," said Lily Hahn, the paper's lead author. "It provided many more data points than a single model run or observations alone. That's very helpful when you're trying to investigate something as complex as atmosphere-ocean-ice interactions."
Hahn was formerly a Summer Student Fellow (SSF) and guest student at WHOI while she was an undergraduate at Yale University. She is currently working on her Ph.D. at the University of Washington. Also collaborating on the study was Young-Oh Kwon, a physical oceanographer at WHOI. This research was supported by WHOI's SSF and guest student programs, and by the U.S. National Science Foundation.
The Woods Hole Oceanographic Institution is a private, non-profit organization on Cape Cod, Mass., dedicated to marine research, engineering, and higher education. Established in 1930 on a recommendation from the National Academy of Sciences, its primary mission is to understand the oceans and their interaction with the Earth as a whole, and to communicate a basic understanding of the oceans' role in the changing global environment.
Three Causes of Earth's Spin Axis Drift Identified
September 19, 2018: A typical desk globe is designed to be a geometric sphere and to rotate smoothly when you spin it. Our actual planet is far less perfect — in both shape and in rotation.
Earth is not a perfect sphere. When it rotates on its spin axis — an imaginary line that passes through the North and South Poles — it drifts and wobbles. These spin-axis movements are scientifically referred to as "polar motion." Measurements for the 20th century show that the spin axis drifted about 10 cm per year. Over the course of a century, that becomes more than 10 meters. 35)
Using observational and model-based data spanning the entire 20th century, NASA scientists have for the first time identified three broadly-categorized processes responsible for this drift — contemporary mass loss primarily in Greenland, glacial rebound, and mantle convection.
"The traditional explanation is that one process, glacial rebound, is responsible for this motion of Earth's spin axis. But recently, many researchers have speculated that other processes could have potentially large effects on it as well," said first author Surendra Adhikari of NASA's Jet Propulsion Laboratory in Pasadena, California. "We assembled models for a suite of processes that are thought to be important for driving the motion of the spin axis. We identified not one but three sets of processes that are crucial — and melting of the global cryosphere (especially Greenland) over the course of the 20th century is one of them."
In general, the redistribution of mass on and within Earth — like changes to land, ice sheets, oceans and mantle flow — affects the planet's rotation. As temperatures increased throughout the 20th century, Greenland's ice mass decreased. In fact, a total of about 7,500 gigatons — the weight of more than 20 million Empire State Buildings — of Greenland's ice melted into the ocean during this time period. This makes Greenland one of the top contributors of mass being transferred to the oceans, causing sea level to rise and, consequently, a drift in Earth's spin axis.
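The Empire State Building comparison can be sanity-checked with one line of arithmetic; the building's mass below is an assumed outside figure (roughly 365,000 tons), not stated in the article:

```python
greenland_ice_loss_gt = 7_500     # gigatons of ice lost (from the article)
esb_mass_tons = 365_000           # mass of the Empire State Building (assumption)

ice_loss_tons = greenland_ice_loss_gt * 1e9   # 1 gigaton = 1e9 tons
n_buildings = ice_loss_tons / esb_mass_tons
# Lands just above 20 million, matching "more than 20 million"
print(f"~{n_buildings/1e6:.1f} million Empire State Buildings")
```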
While ice melt is occurring in other places (like Antarctica), Greenland's location makes it a more significant contributor to polar motion.
"There is a geometrical effect that if you have a mass that is 45 degrees from the North Pole — which Greenland is — or from the South Pole (like Patagonian glaciers), it will have a bigger impact on shifting Earth's spin axis than a mass that is right near the Pole," said coauthor Eric Ivins, also of JPL.
Previous studies identified glacial rebound as the key contributor to long-term polar motion. And what is glacial rebound? During the last ice age, heavy glaciers depressed Earth's surface much like a mattress depresses when you sit on it. As that ice melts, or is removed, the land slowly rises back to its original position. In the new study, which relied heavily on a statistical analysis of such rebound, scientists figured out that glacial rebound is likely to be responsible for only about a third of the polar drift in the 20th century.
Figure 24: The observed direction of polar motion, shown as a light blue line, compared with the sum (pink line) of the influence of Greenland ice loss (blue), postglacial rebound (yellow) and deep mantle convection (red). The contribution of mantle convection is highly uncertain (image credit: NASA/ JPL-Caltech)
The authors argue that mantle convection makes up the final third. Mantle convection is responsible for the movement of tectonic plates on Earth's surface. It is basically the circulation of material in the mantle caused by heat from Earth's core. Ivins describes it as similar to a pot of soup placed on the stove. As the pot, or mantle, heats, the pieces of the soup begin to rise and fall, essentially forming a vertical circulation pattern — just like the rocks moving through Earth's mantle.
With these three broad contributors identified, scientists can distinguish mass changes and polar motion caused by long-term Earth processes over which we have little control from those caused by climate change. They now know that if Greenland's ice loss accelerates, polar motion likely will, too.
The paper in Earth and Planetary Science Letters is titled "What drives 20th century polar motion?" 36) Besides JPL, coauthor institutions include the German Research Centre for Geosciences, Potsdam; the University of Oslo, Norway; Technical University of Denmark, Kongens Lyngby; the Geological Survey of Denmark and Greenland, Copenhagen, Denmark; and the University of Bremen, Germany. An interactive simulation of how multiple processes contribute to the wobbles in Earth's spin axis is available at: https://vesl.jpl.nasa.gov/sea-level/polar-motion/
A World on Fire
August 23, 2018: The world is on fire. Or so it appears in this image from NASA's Worldview (Figure 25). The red points overlaid on the image mark areas where thermal bands detected actively burning fires. Africa has the most concentrated fires, most likely because they are agricultural fires. The location, widespread nature, and number of fires suggest that they were deliberately set to manage land. Farmers often use fire to return nutrients to the soil and to clear the ground of unwanted plants. While fire helps enhance crops and grasses for pasture, the fires also produce smoke that degrades air quality. 37)
Elsewhere, such as in North America, the fires are for the most part wildfires. In South America, Chile in particular has had a horrendous number of wildfires this year. A study conducted by Montana State University found that: "Besides low humidity, high winds and extreme temperatures—some of the same factors contributing to fires raging across the United States—central Chile is experiencing a mega drought and large portions of its diverse native forests have been converted to more flammable tree plantations, the researchers said." 38)
In Brazil, however, the fires are both wildfires and man-made fires set to clear crop fields of detritus from the last growing season. Fires are also commonly used during Brazil's dry period to deforest land and clear it for raising cattle or other agricultural or extraction purposes. The problem with these fires is that they quickly grow out of control: hot, dry conditions coupled with wind drive fires far from their originally intended burn area. The Global Fire Watch site recorded 30,964 fire alerts between 15 and 22 August.
Australia also tends to see large bushfires in its more remote areas. Hotter, drier summers in Australia will mean longer fire seasons, and urban sprawl into bushland is putting more people at risk when those fires break out. For large areas in the north and west, the bushfire season has been brought forward a whole two months to August – well into winter, which officially began 1 June. According to the Australian BoM (Bureau of Meteorology), the January to July period of 2018 was the warmest in NSW (New South Wales) since 1910. As the climate continues to change and areas become hotter and drier, more and more extreme bushfires will break out across the entire Australian continent.
NASA's Earth Observing System Data and Information System (EOSDIS) Worldview application provides the capability to interactively browse over 700 global, full-resolution satellite imagery layers and then download the underlying data. Many of the available imagery layers are updated within three hours of observation, essentially showing the entire Earth as it looks "right now." This satellite image was collected on August 22, 2018. Actively burning fires, detected by thermal bands, are shown as red points. Image Courtesy: NASA Worldview, Earth Observing System Data and Information System (EOSDIS).
Figure 25: NASA's Worldview image of fires on a global scale. - In particular, Chile has had horrendous numbers of wildfires this year according to a study by Montana State University (image credit: EOSDIS Worldview) 39)
The results of the Montana State study team were published in PLoS (Public Library of Science) ONE. 40) Chile has replaced many of its native forests with plantation forests to supply pulp and timber mills that produce paper and wood products. According to David McWethy of Montana State University, the lead author of the study, highly flammable non-native pine and eucalypt forests now cover the region. Eucalypt trees, which are native to Australia, and pine trees native to the United States contain oils and resins in their leaves that, when dry, can easily ignite.
"Chile replaced more heterogeneous, less flammable native forests with structurally homogenous, flammable exotic forest plantations at a time when the climate is becoming warmer and drier," said McWethy. "This situation will likely facilitate future fires to spread more easily and promote more large fires into the future."
Co-author Anibal Pauchard, professor at the University of Concepcion and researcher at the Institute of Ecology and Biodiversity in Chile, said wildfires have been a part of the Chilean landscape for centuries, but they have grown larger and more intense in recent decades, despite costly government efforts to control them. "Unfortunately, fires in central Chile are promoted by increasing human ignitions, drier and hotter climate, and the availability of abundant flammable fuels associated with pine plantations and degraded shrublands dominated by invasive species," Pauchard said.
In 2016-2017 alone, fires burned nearly 1.5 million acres (607,000 hectares)—almost twice the area of the U.S. state of Rhode Island. It was the largest area burned during a single fire season since detailed recordkeeping began in the early 1960s. In 2014, major fires near the cities of Valparaiso and Santiago destroyed thousands of homes and forced more than 10,000 people to evacuate.
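The acreage figures convert as stated; Rhode Island's total area below is an assumed outside figure used only to check the comparison:

```python
burned_acres = 1_500_000       # area burned in 2016-2017 (from the article)
ACRE_TO_HECTARE = 0.404686     # 1 acre in hectares

burned_hectares = burned_acres * ACRE_TO_HECTARE
print(f"{burned_hectares:,.0f} hectares")   # ~607,000 ha, matching the article

rhode_island_acres = 777_000   # total area of Rhode Island, incl. water (assumption)
ratio = burned_acres / rhode_island_acres
print(f"{ratio:.1f}x Rhode Island")          # ~1.9, i.e. "almost twice"
```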
The devastation prompted the Chilean government to ask what land-use policies and environmental factors were behind these fires, McWethy said. That led to a national debate about preventing and reducing the consequences of future fires and to the involvement of McWethy and his collaborators.
To better understand the Chilean fires, the researchers compared satellite information with records from the Chilean Forest Service for 2001 through 2017. They studied eight types of vegetation as well as climate conditions, elevation, slope and population density across a wide range of latitudes in Chile.
"Now we have compelling evidence that after climate, landscape composition is crucial in determining fire regimes. In particular, exotic forest plantations need to be managed to purposely reduce fire hazard," Pauchard said. "Which forestry species we plant and how we manage them matters in terms of fire frequency and intensity."
Among other things, the researchers recommended in the paper that Chile try to move away from exotic plantations toward more heterogeneous, less flammable native forests.
Figure 26: A forest of Nothofagus antarctica trees that burned in fire that covered 40,000 acres in Torres del Paine National Park, Chile in 2012 (image credit: David McWethy)
Scientists trace atmospheric rise in CO2 during deglaciation to the deep Pacific Ocean
August 13, 2018: How carbon made it out of the ocean and into the atmosphere has remained one of the most important mysteries of science. A new study provides some of the most compelling evidence for how it happened — a 'flushing' of the deep Pacific Ocean caused by the acceleration of water circulation patterns that begin around Antarctica. 41)
Long before humans started injecting carbon dioxide into the atmosphere by burning fossil fuels like oil, gas, and coal, the level of atmospheric CO2 rose significantly as the Earth came out of its last ice age. Many scientists have long suspected that the source of that carbon was the deep sea, but researchers had not been able to document just how the carbon made it out of the ocean and into the atmosphere.
A new study, published today in the journal Nature Geoscience, provides some of the most compelling evidence for how it happened — a "flushing" of the deep Pacific Ocean caused by the acceleration of water circulation patterns that begin around Antarctica. 42)
The concern, researchers say, is that it could happen again, potentially magnifying and accelerating human-caused climate change. "The Pacific Ocean is big and you can store a lot of stuff down there — it's kind of like Grandma's root cellar — stuff accumulates there and sometimes doesn't get cleaned out," said Alan Mix, an Oregon State University oceanographer and co-author on the study.
"We've known that CO2 in the atmosphere went up and down in the past, we know that it was part of big climate changes, and we thought it came out of the deep ocean. But it has not been clear how the carbon actually got out of the ocean to cause the CO2 rise."
Lead author Jianghui Du, a doctoral student in oceanography at Oregon State, said there is a circulation pattern in the Pacific that begins with water around Antarctica sinking and moving northward at great depth a few miles below the surface. It continues all the way to Alaska, where it rises, turns back southward, and flows back to Antarctica where it mixes back up to the sea surface.
It takes a long time for the water's round trip journey in the abyss — almost 1,000 years, Du said. Along with the rest of the OSU team, Du found that flow slowed down during glacial maximums but sped up during deglaciation, as the Earth warmed. This faster flow flushed the carbon from the deep Pacific Ocean — "cleaning out Grandma's root cellar" — and brought the CO2 to the surface near Antarctica. There it was released into the atmosphere.
"It happened roughly in two steps during the last deglaciation — an initial phase from 18,000 to 15,000 years ago, when CO2 rose by about 50 parts per million, and a second pulse later added another 30 parts per million," Du said. That total is just a bit less than the amount by which CO2 has risen since the industrial revolution. So the ocean can be a powerful source of carbon.
Brian Haley, also an Oregon State University oceanographer and co-author on the study, noted that carbon is always falling down into the deep ocean. Up near the surface, plankton grow, but when they die they sink and decompose. That is a biological pump that is always sending carbon to the bottom. "The slower the circulation," Haley said, "the more time the water spends down there, and carbon can build up."
Du said that during a glacial maximum, the water slows down and accumulates lots of carbon. "When the Earth began warming, the water movement sped up by about a factor of three," he noted, "and that carbon came back to the surface."
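Under the simple assumption that transit time scales inversely with flow speed, the factor-of-three speed-up Du describes shortens the deep water's roughly 1,000-year round trip accordingly:

```python
glacial_round_trip_yr = 1_000   # approximate round-trip time quoted above
speedup_factor = 3              # flow sped up "by about a factor of three"

# Assumed inverse scaling: triple the speed, one third the transit time
deglacial_round_trip_yr = glacial_round_trip_yr / speedup_factor
print(f"~{deglacial_round_trip_yr:.0f} years")  # roughly 330 years
```

This is only a scaling sketch: real residence times depend on mixing and path changes, not just mean flow speed.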
The key to the researchers' discovery is the analysis of neodymium isotopes in North Pacific sediment cores. Haley noted that the isotopes are "like a return address label on a letter from the deep ocean." When the ratio of isotope 143 to 144 is higher in the sediments, the water movement during that period was slower. When water movement speeds up during warming events, the ratio of neodymium isotopes reflects that too.
"This finding that the deep circulation sped up is the smoking gun in this mystery story about how CO2 got out to the deep sea," Mix said. "We now know how it happened, and the deep Pacific is the culprit — a partner in crime with Antarctica."
What concerns the researchers is that it could happen again as the climate continues to warm. "We don't know that the circulation will speed up and bring that carbon to the surface, but it seems like a reasonable thing to think about," Du said. "Our evidence that this actually happened in the past will help the people who run climate models figure out whether it is a real risk for the future."
The researchers say their findings should be considered from a policy perspective. "So far the ocean has absorbed about a third of the total carbon emitted from fossil fuels," Mix said. "That has helped slow down warming. The Paris Climate Agreement has set goals of containing warming to 1.5 to 2 degrees (Celsius) and we know pretty well how much carbon can be released to the atmosphere while keeping to that level. But if the ocean stops absorbing the excess CO2, and instead releases more from the deep sea, that spells trouble. Ocean release would subtract from our remaining emissions budget and that means we're going to have to get our emissions down a heck of a lot faster. We need to figure out how much."
The authors are from the College of Earth, Ocean, and Atmospheric Sciences at OSU (Oregon State University), and from USGS (United States Geological Survey). The study was supported by the NSF (National Science Foundation).
Climate change is making night-shining clouds more visible
July 2, 2018: Increased water vapor in Earth's atmosphere due to human activities is making shimmering high-altitude clouds more visible, a new study finds. The results suggest these strange but increasingly common clouds seen only on summer nights are an indicator of human-caused climate change, according to the study's authors. 43)
Noctilucent, or night-shining, clouds are the highest clouds in Earth's atmosphere. They form in the middle atmosphere, or mesosphere, roughly 80 km (50 miles) above Earth's surface. The clouds form when water vapor freezes around specks of dust from incoming meteors.
Humans first observed noctilucent clouds in 1885, two years after the eruption of the Krakatoa volcano in Indonesia spewed massive amounts of water vapor into the air. Sightings of the clouds became more common during the 20th century, and in the 1990s scientists began to wonder whether climate change was making them more visible.
In a new study, researchers used satellite observations and climate models to simulate how the effects of increased greenhouse gases from burning fossil fuels have contributed to noctilucent cloud formation over the past 150 years. Extracting and burning fossil fuels delivers carbon dioxide, methane and water vapor into the atmosphere, all of which are greenhouse gases.
The study's results suggest methane emissions have increased water vapor concentrations in the mesosphere by about 40 percent since the late 1800s, which has more than doubled the amount of ice that forms in the mesosphere. They conclude human activities are the main reason why noctilucent clouds are significantly more visible now than they were 150 years ago.
"We speculate that the clouds have always been there, but the chance to see one was very, very poor, in historical times," said Franz-Josef Lübken, an atmospheric scientist at the Leibniz Institute of Atmospheric Physics in Kühlungsborn, Germany and lead author of the new study in Geophysical Research Letters, a journal of the American Geophysical Union. 44)
The results suggest noctilucent clouds (NLCs) are a sign that human-caused climate change is affecting the middle atmosphere, according to the authors. Whether thicker, more visible noctilucent clouds could influence Earth's climate themselves is the subject of future research, Lübken said.
"Our methane emissions are impacting the atmosphere beyond just temperature change and chemical composition," said Ilissa Seroka, an atmospheric scientist at the Environmental Defense Fund in Washington, D.C. who was not connected to the new study. "We now detect a distinct response in clouds."
Figure 27: Noctilucent clouds form only in the summertime and are only visible at dawn and dusk. New research suggests they are becoming more visible and forming more frequently due to climate change (image credit: NASA)
Studying cloud formation over time: Conditions must be just right for noctilucent clouds to be visible. The clouds can only form at mid to high latitudes in the summertime, when mesospheric temperatures are cold enough for ice crystals to form. And they're only visible at dawn and dusk, when the Sun illuminates them from below the horizon.
Humans have injected massive amounts of greenhouse gases into the atmosphere by burning fossil fuels since the start of the industrial period 150 years ago. Researchers have wondered what effect, if any, this has had on the middle atmosphere and the formation of noctilucent clouds.
Figure 28: This diagram shows the major layers of Earth's atmosphere. Noctilucent clouds form in the mesosphere, high above where normal weather clouds form (image credit: Randy Russel/UCAR)
In the new study, Lübken and colleagues ran computer simulations to model the Northern Hemisphere's atmosphere and noctilucent clouds from 1871 to 2008. They wanted to simulate the effects of increased greenhouse gases, including water vapor, on noctilucent cloud formation over this time period.
The researchers found the presence of noctilucent clouds fluctuates from year to year and even from decade to decade, depending on atmospheric conditions and the solar cycle. But over the whole study period, the clouds have become significantly more visible.
The reasons for this increased visibility were surprising, according to Lübken. Carbon dioxide warms Earth's surface and the lower part of the atmosphere, but actually cools the middle atmosphere where noctilucent clouds form. In theory, this cooling effect should make noctilucent clouds form more readily.
But the study's results showed increasing carbon dioxide concentrations since the late 1800s have not made noctilucent clouds more visible. It seems counterintuitive, but when the middle atmosphere becomes colder, more ice particles form but they are smaller and therefore harder to see, Lübken explained. "Keeping water vapor constant and making it just colder means that we would see less ice particles," he said.
Figure 29: Noctilucent clouds over the city of Wismar, Germany in July 2015. Tropospheric clouds are visible as dark patches near the horizon (image credit: Leibniz Institute of Atmospheric Physics)
Instead, the study found that more water vapor in the middle atmosphere is making ice crystals larger and noctilucent clouds more visible. Water vapor in the middle atmosphere comes from two sources: water vapor from Earth's surface that is transported upward, and methane, a potent greenhouse gas that produces water vapor through chemical reactions in the middle atmosphere.
The study found the increase in atmospheric methane since the late 1800s has significantly increased the amount of water vapor in the middle atmosphere. This more than doubled the amount of mesospheric ice present in the mid latitudes from 1871 to 2008, according to the study.
People living in the mid to high latitudes now have a good chance of seeing noctilucent clouds several times each summer, Lübken said. In the 19th century, they were probably visible only once every several decades or so, he said. "The result was rather surprising that, yes, on these time scales of 100 years, we would expect to see a big change in the visibility of clouds," according to Lübken.
NASA study solves glacier puzzle in northwest Greenland
June 21, 2018: A new NASA study explains why the Tracy and Heilprin glaciers, which flow side by side into Inglefield Gulf in northwest Greenland, are melting at radically different rates. 45)
Using ocean data from NASA's OMG (Oceans Melting Greenland) campaign, the study documents a plume of warm water flowing up Tracy's underwater face, and a much colder plume in front of Heilprin. Scientists have assumed plumes like these exist for glaciers all around Greenland, but this is the first time their effects have been measured.
The finding highlights the critical role of oceans in glacial ice loss and their importance for understanding future sea level rise. A paper on the research was published June 21 in the journal Oceanography. 46)
Tracy and Heilprin were first observed by explorers in 1892 and have been measured sporadically ever since. Even though the adjoining glaciers experience the same weather and ocean conditions, Heilprin has retreated upstream less than 4 km in 125 years, while Tracy has retreated more than 15 km. That means Tracy is losing ice almost four times faster than its next-door neighbor.
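The "almost four times faster" figure follows directly from the quoted retreat distances; a minimal sketch of the arithmetic, treating 15 km and 4 km as the bounds given above:

```python
# Compare the two glaciers' average retreat rates over the same 125 years.
# The quoted distances are bounds ("more than 15 km", "less than 4 km"),
# so the true ratio is at least this value.
tracy_retreat_km = 15.0     # Tracy: more than 15 km since 1892
heilprin_retreat_km = 4.0   # Heilprin: less than 4 km since 1892

ratio = tracy_retreat_km / heilprin_retreat_km
print(ratio)  # 3.75, consistent with "almost four times faster"
```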
This is the kind of puzzle OMG was designed to explain. The five-year campaign is quantifying ice loss from all glaciers that drain the Greenland Ice Sheet with an airborne survey of ocean and ice conditions around the entire coastline, collecting data through 2020. OMG is making additional boat-based measurements in areas where the seafloor topography and depths are inadequately known.
About a decade ago, NASA's Operation IceBridge campaign used ice-penetrating radar to document a major difference between the glaciers: Tracy is seated on bedrock at a depth of about 610 m below the ocean surface, while Heilprin extends only 350 m beneath the waves.
Scientists would expect this difference to affect the melt rates, because the top ocean layer around Greenland is colder than the deep water, which has traveled north from the midlatitudes in ocean currents. The warm water layer starts about 200 m down from the surface, and the deeper the water, the warmer it is. Naturally, a deeper glacier would be exposed to more of this warm water than a shallower glacier would.
When OMG Principal Investigator Josh Willis of NASA's Jet Propulsion Laboratory in Pasadena, California, looked for more data to quantify the difference between Tracy and Heilprin, "I couldn't find any previous observations of ocean temperature and salinity in the fjord at all," he said. There was also no map of the seafloor in the gulf.
OMG sent a research boat into the Inglefield Gulf in the summer of 2016 to fill in the data gap. The boat's soundings of ocean temperature and salinity showed a river of meltwater draining out from under Tracy. Because freshwater is more buoyant than the surrounding seawater, as soon as the water escapes from under the glacier, it swirls upward along the glacier's icy face. The turbulent flow pulls in surrounding subsurface water, which is warm for a polar ocean at about 0.5 degree Celsius. As it gains volume, the plume spreads like smoke rising from a smokestack.
"Most of the melting happens as the water rises up Tracy's face," Willis said. "It eats away at a huge chunk of the glacier."
Figure 30: Tracy and Heilprin glaciers in northwest Greenland. The two glaciers flow into a fjord that appears black in this image (image credit: NASA)
Heilprin also has a plume, but its shallower depth limits the plume's damage in two ways: the plume has a shorter distance to rise and gathers less seawater; and the shallow seawater it pulls in has a temperature of only about minus 0.5 degree Celsius. As a result, even though Heilprin is a bigger glacier and more water drains from underneath it than from Tracy, its plume is smaller and colder.
The study produced another surprise by first mapping a ridge, called a sill, only about 250 m below the ocean surface in front of Tracy, and then proving that this sill did not keep warm water from the ocean depths away from the glacier. "In fact, quite a lot of warm water comes in from offshore, mixes with the shallower layers and comes over the sill," Willis said. Tracy's destructive plume is evidence of that.
Figure 31: This figure shows estimated ice flow velocities of Tracy and Heilprin glaciers (right) and the depths of the fjord in front of the glaciers. The approximate location of the sill in front of Tracy is shown as a dashed yellow line. Research ship cruise tracks are shown in orange (image credit: NASA/JPL-Caltech)
Ice losses from Antarctica speed Sea Level Rise - IMBIE Study
June 13, 2018: Ice losses from Antarctica have tripled since 2012, increasing global sea levels by 3 mm in that timeframe alone, according to a major new international climate assessment funded by NASA and ESA (European Space Agency). 47)
In a major collaborative effort, scientists from around the world have used information from satellites to reveal that ice melting in Antarctica has not only raised sea levels by 7.6 mm since 1992, but, critically, almost half of this rise has occurred in the last five years. 48)
According to the study, ice losses from Antarctica are causing sea levels to rise faster today than at any time in the past 25 years. Results of the IMBIE (Ice Sheet Mass Balance Inter-comparison Exercise) study were published on 13 June 2018 in the journal Nature. 49)
"This is the most robust study of the ice mass balance of Antarctica to date," said assessment team co-lead Erik Ivins at NASA's Jet Propulsion Laboratory (JPL). "It covers a longer period than our 2012 IMBIE study, has a larger pool of participants, and incorporates refinements in our observing capability and an improved ability to assess uncertainties." Andrew Shepherd from the University of Leeds in the UK and Erik Ivins from NASA/JPL led a group of 84 scientists from 44 international organizations. Combining 24 satellite surveys of Antarctica, this latest IMBIE is the most complete assessment of Antarctic ice mass changes to date, providing the clearest picture yet of how the continent's ice sheet is changing.
ESA's CryoSat-2 and the Copernicus Sentinel-1 mission were particularly useful for the study assessment. Carrying a radar altimeter, CryoSat-2 is designed to measure changes in the height of the ice, which are used to calculate changes in the volume of the ice. It is also especially designed to measure changes around the margins of ice sheets, where ice is calved as icebergs. The two-satellite Sentinel-1 radar mission, which is used to monitor ice motion, can image Earth regardless of the weather and whether it is day or night – which is essential during the dark polar winters.
Figure 32: ESA's Earth Explorer CryoSat mission is dedicated to precise monitoring of changes in the thickness of marine ice floating in the polar oceans and variations in the thickness of the vast ice sheets that blanket Greenland and Antarctica (image credit: ESA/AOES Medialab)
The team looked at the mass balance of the Antarctic ice sheet from 1992 to 2017 and found ice losses from Antarctica raised global sea levels by 7.6 mm, with a sharp uptick in ice loss in recent years. They attribute the threefold increase in ice loss from the continent since 2012 to a combination of increased rates of ice melt in West Antarctica and the Antarctic Peninsula, and reduced growth of the East Antarctic ice sheet.
Prior to 2012, ice was lost at a steady rate of about 76 billion metric tons per year, contributing about 0.2 mm a year to sea level rise. Since 2012, the amount of ice loss per year has tripled to 219 billion metric tons – equivalent to about 0.6 mm of sea level rise.
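These mass-to-sea-level conversions can be checked with the widely used rule of thumb that roughly 360 gigatons of ice corresponds to 1 mm of global mean sea level (the exact conversion constant depends on the assumed ocean area, so treat it as approximate):

```python
# Rough conversion of ice-mass loss to sea-level rise.
# Assumption: ~360 Gt of ice per mm of global mean sea level
# (ocean area ~3.6e8 km^2; 1 Gt of water is ~1 km^3).

GT_PER_MM = 360.0  # gigatons of ice per mm of sea-level rise (approximate)

def sea_level_rise_mm(gigatons: float) -> float:
    """Convert an annual ice-mass loss in gigatons to mm of sea-level rise."""
    return gigatons / GT_PER_MM

# Figures quoted in the study:
print(round(sea_level_rise_mm(76), 2))   # pre-2012 rate: ~0.21 mm/yr
print(round(sea_level_rise_mm(219), 2))  # post-2012 rate: ~0.61 mm/yr
```

Both results agree with the "about 0.2 mm" and "about 0.6 mm" per year quoted above.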
West Antarctica experienced the greatest recent change, with ice loss rising from 53 billion metric tons per year in the 1990s, to 159 billion metric tons a year since 2012. Most of this loss came from the huge Pine Island and Thwaites Glaciers, which are retreating rapidly due to ocean-induced melting.
At the northern tip of the continent, ice-shelf collapse at the Antarctic Peninsula has driven an increase of 25 billion metric tons in ice loss per year since the early 2000s. Meanwhile, the team found the East Antarctic ice sheet has remained relatively balanced during the past 25 years, gaining an average of 5 billion metric tons of ice per year.
Figure 33: Changes in the Antarctic ice sheet's contribution to global sea level, 1992 to 2017 (image credit: IMBIE/Planetary Visions)
Antarctica's potential contribution to global sea level rise from its land-held ice is almost 7.5 times greater than all other sources of land-held ice in the world combined. The continent stores enough frozen water to raise global sea levels by 58 meters, if it were to melt entirely. Knowing how much ice it's losing is key to understanding the impacts of climate change now and its pace in the future.
"The datasets from IMBIE are extremely valuable for the ice sheet modeling community," said study co-author Sophie Nowicki of NASA's Goddard Space Flight Center. "They allow us to test whether our models can reproduce present-day change and give us more confidence in our projections of future ice loss."
The satellite missions providing data for this study are NASA's Ice, Cloud and land Elevation Satellite (ICESat); the joint NASA/German Aerospace Center Gravity Recovery and Climate Experiment (GRACE); ESA's first and second European Remote Sensing satellites (ERS-1 and -2), Envisat and CryoSat-2; the European Union's Sentinel-1 and Sentinel-2 missions; the Japan Aerospace Exploration Agency's Advanced Land Observing Satellite (ALOS); the Canadian Space Agency's RADARSAT-1 and RADARSAT-2 satellites; the Italian Space Agency's COSMO-SkyMed satellites; and the German Aerospace Center's TerraSAR-X satellite.
Tom Wagner, cryosphere program manager at NASA Headquarters, hopes to welcome a new era of Antarctic science with the May 2018 launch of the Gravity Recovery and Climate Experiment Follow-on (GRACE-FO) mission and the upcoming launch of NASA's Ice, Cloud and land Elevation Satellite-2 (ICESat-2). "Data from these missions will help scientists connect the environmental drivers of change with the mechanisms of ice loss to improve our projections of sea level rise in the coming decades," Wagner said.
Invisible barrier on ocean surface reduces carbon uptake by half
Scientists from Exeter, Heriot-Watt and Newcastle universities (UK) published their research in the journal Nature Geoscience, and say the findings have major implications for predicting our future climate. 52)
The world's oceans currently absorb around a quarter of all anthropogenic carbon dioxide emissions, making them the largest long-term sink of carbon on Earth. Atmosphere-ocean gas exchange is controlled by turbulence at the sea surface, the main cause of which is waves generated by wind. Greater turbulence means increased gas exchange and, until now, it was difficult to calculate the effect of biological surfactants on this exchange.
The team, funded by the Natural Environment Research Council (NERC, Swindon, UK), the Leverhulme Trust and ESA (European Space Agency), developed a novel experimental system that directly compares "the surfactant effect" between different sea waters collected during oceanographic cruises, in real time. Using this and satellite observations, the team then found that surfactants can reduce carbon dioxide exchange by up to 50 percent.
Dr Ryan Pereira, a Lyell Research Fellow at Heriot-Watt University in Edinburgh, said: "As surface temperatures rise, so too do surfactants, which is why this is such a critical finding. The warmer the ocean surface gets, the more surfactants we can expect, and an even greater reduction in gas exchange. What we discovered at 13 sites across the Atlantic Ocean is that biological surfactants suppress the rate of gas exchange caused by the wind. We made unique measurements of gas transfer using a purpose-built tank that could measure the relative exchange of gases impacted only by surfactants present at these sites. These natural surfactants aren't necessarily visible like an oil slick, or a foam, and they are even difficult to identify from the satellites monitoring our ocean's surface. We need to be able to measure and identify the organic matter on the surface microlayer of the ocean so that we can reliably estimate rates of gas exchange of climate active gases, such as carbon dioxide and methane."
Professor Rob Upstill-Goddard, professor of marine biogeochemistry at Newcastle University, said: "These latest results build on our previous findings that, contrary to conventional wisdom, large sea surface enrichments of natural surfactants counter the effects of high winds. The suppression of carbon dioxide uptake across the ocean basin due to surfactants, as revealed by our work, implies slower removal of anthropogenic carbon dioxide from the atmosphere and thus has implications for predicting future global climate."
The University of Exeter team, Drs Jamie Shutler (Geography) and Ian Ashton (Renewable Energy) led the satellite component of the work. Ian Ashton said: "Combining this new research with a wealth of satellite data available allows us to consider the effect of surfactants on gas exchange across the entire Atlantic Ocean, helping us to monitor carbon dioxide on a global scale."
The team collected samples across the Atlantic Ocean in 2014, during a NERC study on the Atlantic Meridional Transect (AMT). Each year the AMT cruise undertakes biological, chemical and physical oceanographic research between the UK and the Falkland Islands, South Africa or Chile, a distance of up to 13,500 km, to study the health and function of our oceans.
The research cruise crosses a range of ecosystems from sub-polar to tropical and from coastal and shelf seas and upwelling systems to oligotrophic mid-ocean gyres.
NOAA finds rising emissions of ozone-destroying chemical banned by Montreal Protocol
• May 16, 2018: CFCs (Chlorofluorocarbons) were once considered a triumph of modern chemistry. Stable and versatile, these chemicals were used in hundreds of products, from military systems to the ubiquitous can of hairspray. Then in 1987, NOAA scientists were part of an international team that proved this family of wonder chemicals was damaging Earth's protective ozone layer and creating the giant hole in the ozone layer that forms over Antarctica each September. The Montreal Protocol, signed later that year, committed the global community to phasing out their use. Production of the second-most abundant CFC, CFC-11, would end completely by 2010. 53)
A new analysis of long-term atmospheric measurements by NOAA scientists shows emissions of the chemical CFC-11 are rising again, most likely from new, unreported production from an unidentified source in East Asia. The results are published today in the journal Nature. 54)
"We're raising a flag to the global community to say, ‘This is what's going on, and it is taking us away from timely recovery of the ozone layer,'" said NOAA scientist Stephen Montzka, the study's lead author. "Further work is needed to figure out exactly why emissions of CFC-11 are increasing, and if something can be done about it soon."
The findings of Montzka and his team of researchers from CIRES (Cooperative Institute for Research in Environmental Sciences, University of Colorado Boulder and NOAA, Boulder, CO, USA), the UK, and the Netherlands represent the first time that emissions of one of the three most abundant, long-lived CFCs have increased for a sustained period since production controls took effect in the late 1980s.
CFC-11 is the second-most abundant ozone-depleting gas in the atmosphere because of its long life and continuing emissions from a large reservoir of the chemical in foam building insulation and appliances manufactured before the mid-1990s. A smaller amount of CFC-11 also exists today in older refrigerators and freezers.
The Montreal Protocol has been effective in reducing ozone-depleting gases in the atmosphere because all countries in the world agreed to legally binding controls on the production of most human-produced gases known to destroy ozone. As a result, CFC-11 concentrations have declined by 15% from peak levels measured in 1993.
Though concentrations of CFC-11 in the atmosphere are still declining, they're declining more slowly than they would if there were no new sources, Montzka said.
The results from the new analysis of NOAA atmospheric measurements explain why. From 2014 to 2016, emissions of CFC-11 increased by 25 percent above the average measured from 2002 to 2012.
Scientists had been predicting that by the mid- to late century, the abundance of ozone-depleting gases would fall to levels last seen before the Antarctic ozone hole began to appear in the early 1980s.
Montzka said the new analysis can't definitively explain why emissions of CFC-11 are increasing, but in the paper, the team discusses potential reasons why. "In the end, we concluded that it's most likely that someone may be producing the CFC-11 that's escaping to the atmosphere," he said. "We don't know why they might be doing that and if it is being made for some specific purpose, or inadvertently as a side product of some other chemical process."
If the source of these new emissions can be identified and controlled soon, the damage to the ozone layer should be minor, Montzka said. If not remedied soon, however, substantial delays in ozone layer recovery could be expected.
Emerging Trends in Global Freshwater Availability
• May 16, 2018: In a first-of-its-kind study, scientists have combined an array of NASA satellite observations of Earth with data on human activities to map locations where freshwater is changing around the globe and to determine why. 55)
The study, published on 16 May in the journal Nature, finds that Earth's wet land areas are getting wetter and dry areas are getting drier due to a variety of factors, including human water management, climate change and natural cycles. 56)
Figure 34: This map depicts a time series of data collected by NASA's GRACE (Gravity Recovery and Climate Experiment) mission from 2002 to 2016, showing where freshwater storage was higher (blue) or lower (red) than the average for the 14-year study period (image credit: GRACE study team, NASA)
A team led by Matt Rodell of NASA/GSFC (Goddard Space Flight Center) in Greenbelt, Maryland, used 14 years of observations from the U.S./German-led GRACE spacecraft mission to track global trends in freshwater in 34 regions around the world (Figure 35). To understand why these trends emerged, they needed to pull in satellite precipitation data from the Global Precipitation Climatology Project, NASA/USGS (U.S. Geological Survey) Landsat imagery, irrigation maps, and published reports of human activities related to agriculture, mining and reservoir operations. Only through analysis of the combined data sets were the scientists able to get a full understanding of the reasons for Earth's freshwater changes as well as the sizes of those trends.
"This is the first time that we've used observations from multiple satellites in a thorough assessment of how freshwater availability is changing, everywhere on Earth," said Rodell. "A key goal was to distinguish shifts in terrestrial water storage caused by natural variability – wet periods and dry periods associated with El Niño and La Niña, for example – from trends related to climate change or human impacts, like pumping groundwater out of an aquifer faster than it is replenished."
"What we are witnessing is major hydrologic change," said co-author Jay Famiglietti of NASA/JPL in Pasadena, California. "We see a distinctive pattern of the wet land areas of the world getting wetter – those are the high latitudes and the tropics – and the dry areas in between getting drier. Embedded within the dry areas we see multiple hotspots resulting from groundwater depletion."
Famiglietti noted that while water loss in some regions, like the melting ice sheets and alpine glaciers, is clearly driven by warming climate, it will require more time and data to determine the driving forces behind other patterns of freshwater change. "The pattern of wet-getting-wetter, dry-getting-drier during the rest of the 21st century is predicted by the Intergovernmental Panel on Climate Change models, but we'll need a much longer dataset to be able to definitively say whether climate change is responsible for the emergence of any similar pattern in the GRACE data," he said.
The twin GRACE satellites, launched in 2002 as a joint mission with DLR (German Aerospace Center), precisely measured the distance between the two spacecraft to detect changes in Earth's gravity field caused by movements of mass on the planet below. Using this method, GRACE tracked monthly variations in terrestrial water storage until its science mission ended in October 2017.
Groundwater, soil moisture, surface waters, snow and ice are dynamic components of the terrestrial water cycle. Although they are not static on an annual basis (as early water-budget analyses supposed), in the absence of hydroclimatic shifts or substantial anthropogenic stresses they typically remain range-bound. Recent studies have identified locations where TWS (Terrestrial Water Storage) appears to be trending below previous ranges, notably where ice sheets or glaciers are diminishing in response to climate change and where groundwater is being withdrawn at an unsustainable rate.
Figure 35: Trends in TWS (Terrestrial Water Storage, in cm/year) obtained on the basis of GRACE observations from April 2002 to March 2016. The cause of the trend in each outlined study region is briefly explained and color-coded by category. The trend map was smoothed with a 150-km-radius Gaussian filter for the purpose of visualization; however, all calculations were performed at the native 3º resolution of the data product (image credit: GRACE study team, NASA)
However, the GRACE satellite observations alone couldn't tell Rodell, Famiglietti and their colleagues what was causing the apparent trends. "We examined information on precipitation, agriculture and groundwater pumping to find a possible explanation for the trends estimated from GRACE," said co-author Hiroko Beaudoing of Goddard and the University of Maryland in College Park, MD.
For instance, although pumping groundwater for agricultural uses is a significant contributor to freshwater depletion throughout the world, groundwater levels are also sensitive to cycles of persistent drought or rainy conditions. Famiglietti noted that such a combination was likely the cause of the significant groundwater depletion observed in California's Central Valley from 2007 to 2015, when decreased groundwater replenishment from rain and snowfall combined with increased pumping for agriculture.
Southwestern California lost 4 gigatons (equivalent to 4 x 10⁹ m³, or 4 km³) of freshwater per year during the same period. A gigaton of water would fill 400,000 Olympic swimming pools. A majority of California's freshwater comes in the form of rainfall and snow that collect in the Sierra Nevada snowpack and then is managed as it melts into surface waters through a series of reservoirs. When natural cycles led to less precipitation and caused diminished snowpack and surface waters, people relied on groundwater more heavily.
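The swimming-pool comparison is simple volume arithmetic; a short sketch, assuming the standard ~2,500 m³ volume of an Olympic pool:

```python
# Sanity check on the Olympic-pool comparison.
# Assumptions: 1 gigaton of water is ~1 km^3 = 1e9 m^3 (density ~1000 kg/m^3),
# and an Olympic swimming pool holds ~2,500 m^3 (50 m x 25 m x 2 m).

M3_PER_GIGATON = 1e9
M3_PER_OLYMPIC_POOL = 2_500

pools_per_gigaton = M3_PER_GIGATON / M3_PER_OLYMPIC_POOL
print(int(pools_per_gigaton))  # 400000 pools per gigaton, as quoted
```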
Downward trends in freshwater seen in Saudi Arabia also reflect agricultural pressures. From 2002 to 2016, the region lost 6.1 gigatons per year of stored groundwater. Imagery from Landsat satellites shows an explosive growth of irrigated farmland in the arid landscape from 1987 to the present, which may explain the increased drawdown.
The team's analyses also identified large, decade-long trends in terrestrial freshwater storage that do not appear to be directly related to human activities. Natural cycles of high or low rainfall can cause a trend that is unlikely to persist, Rodell said. An example is Africa's western Zambezi basin and Okavango Delta, a vital watering hole for wildlife in northern Botswana. In this region, water storage increased at an average rate of 29 gigatons per year from 2002 to 2016. This wet period during the GRACE mission followed at least two decades of dryness. Rodell believes it is a case of natural variability that occurs over decades in this region of Africa.
The researchers found that a combination of natural and human pressures can lead to complex scenarios in some regions. Xinjiang province in northwestern China, about the size of Kansas, is bordered by Kazakhstan to the west and the Taklamakan desert to the south and encompasses the central portion of the Tien Shan Mountains. During the first decades of this century, previously undocumented water declines occurred in Xinjiang.
Rodell and his colleagues pieced together multiple factors to explain the loss of 5.5 gigatons of terrestrial water storage per year in Xinjiang province. Reduced rainfall was not the culprit. Surface water was actually being added by climate change-induced glacier melt and by the pumping of groundwater out of coal mines. But these additions were more than offset by depletions caused by an increase in water consumption by irrigated cropland and evaporation of river water from the desert floor.
The successor to GRACE, called GRACE-FO (GRACE Follow-On), a joint mission with the GFZ (German Research Center for Geosciences), currently is at Vandenberg Air Force Base in California undergoing final preparations for launch no earlier than 22 May 2018.
Earth's magnetic field is NOT about to reverse
April 30, 2018: A study of the most recent near-reversals of the Earth's magnetic field by an international team of researchers, including the University of Liverpool, has found it is unlikely that such an event will take place anytime soon. 57)
There has been speculation that the Earth's geomagnetic field may be about to reverse, with substantial implications. This is because the magnetic field has weakened over at least the last two hundred years, and because an identified weak area in the Earth's magnetic field, called the South Atlantic Anomaly, which stretches from Chile to Zimbabwe, has been expanding.
In a paper published in the Proceedings of the National Academy of Sciences (PNAS), a team of international researchers model observations of the geomagnetic field of the two most recent geomagnetic excursion events, the Laschamp, approximately 41,000 years ago, and Mono Lake, around 34,000 years ago, where the field came close to reversing but recovered its original structure (Figure 36). 58)
The model reveals field structures comparable to the current geomagnetic field at both approximately 49,000 and 46,000 years ago, with an intensity structure similar to, but much stronger than, today's South Atlantic Anomaly (SAA); their timing and severity are confirmed by records of cosmogenic nuclides. However, neither of these SAA-like fields developed into an excursion or a reversal.
Richard Holme, Professor of Geomagnetism at the University of Liverpool, said: "There has been speculation that we are about to experience a magnetic polar reversal or excursion. However, by studying the two most recent excursion events, we show that neither bear resemblance to current changes in the geomagnetic field and therefore it is probably unlikely that such an event is about to happen. - Our research suggests instead that the current weakened field will recover without such an extreme event, and therefore is unlikely to reverse."
The strength and structure of the Earth's magnetic field has varied at different times throughout geological history. At certain periods, the geomagnetic field has weakened to such an extent that it was able to swap the positions of magnetic north and magnetic south, whilst geographic north and geographic south remain the same.
Called a geomagnetic reversal, the last time this happened was 780,000 years ago. However, geomagnetic excursions, where the field comes close to reversing but recovers its original structure, have occurred more recently.
The magnetic field shields the Earth from solar winds and harmful cosmic radiation. It also aids in human navigation, animal migrations and protects telecommunication and satellite systems. It is generated deep within the Earth in a fluid outer core of iron, nickel and other metals that creates electric currents, which in turn produce magnetic fields.
Figure 36: Intensity at Earth's surface (left) and radial field (Br) at the CMB (right). Top: mid-point of the Laschamp excursion; bottom: mid-point of the Mono Lake excursion. The field is truncated at spherical harmonic degree five (image credit: University of Liverpool)
Legend to Figure 36: The geomagnetic field has been decaying at a rate of ~5% per century from at least 1840, with indirect observations suggesting a decay since 1600 or even earlier. This has led to the assertion that the geomagnetic field may be undergoing a reversal or an excursion. The study team has derived a model of the geomagnetic field spanning 30–50 ka (where ka stands for kilo anni; hence, 40 ka are 40,000 years), constructed to study the behavior of the two most recent excursions: the Laschamp and Mono Lake, centered at 41 and 34 ka, respectively.
The research also involved the University of Iceland and GFZ German Research Centre for Geosciences.
West Greenland Ice Sheet melting at the fastest rate in centuries
April 3, 2018: The West Greenland Ice Sheet melted at a dramatically higher rate over the last twenty years than at any other time in the modern record, according to a study led by Dartmouth College (Hanover, NH, USA). The research, appearing in the journal Geophysical Research Letters, shows that melting in west Greenland since the early 1990s is at the highest levels in at least 450 years. 59) 60)
While natural patterns of certain atmospheric and ocean conditions are already known to influence Greenland melt, the study highlights the importance of a long-term warming trend in accounting for the unprecedented west Greenland melt rates in recent years. The researchers suggest that climate change associated with human greenhouse gas emissions is the probable cause of the additional warming.
"We see that west Greenland melt really started accelerating about twenty years ago," said Erich Osterberg, assistant professor of earth sciences at Dartmouth and the lead scientist on the project. "Our study shows that the rapid rise in west Greenland melt is a combination of specific weather patterns and an additional long-term warming trend over the last century."
According to research cited in the study, loss of ice from Greenland is one of the largest contributors to global sea level rise. Although glaciers calving into the ocean cause much of the ice loss in Greenland, other research cited in the study shows that the majority of ice loss in recent years is from increased surface melt and runoff.
Figure 37: Record of melt from two west Greenland ice cores showing that modern melt rates (red) are higher than at any time in the record since at least 1550 CE (black). The record is plotted as the percent of each year's layer represented by refrozen melt water (image credit: Erich Osterberg)
While satellite measurements and climate models have detailed this recent ice loss, there are far fewer direct measurements of melt collected from the ice sheet itself. For this study, researchers from Dartmouth and Boise State University spent two months on snowmobiles to collect seven ice cores from the remote "percolation zone" of the West Greenland Ice Sheet.
When warm temperatures melt snow on the surface of the percolation zone, the melt water trickles down into the deeper snow and refreezes into ice layers. Researchers were easily able to distinguish these ice layers from the surrounding compacted snow in the cores, preserving a history of how much melt occurred back through time. The more melt, the thicker the ice layers.
"Most ice cores are collected from the middle of the ice sheet where it rarely ever melts, or on the ice sheet edge where the meltwater flows into the ocean. We focused on the percolation zone because that's where we find the best record of Greenland melt going back through time in the form of the refrozen ice layers," said Karina Graeter, the lead author of the study as a graduate student in Dartmouth's Department of Earth Sciences.
The cores, some as long as 30 m, were transported to Dartmouth where the research team used a light table to measure the thickness and frequency of the ice layers. The cores were also sampled for chemical measurements in Dartmouth's Ice Core Laboratory to determine the age of each ice layer.
The cores reveal that the ice layers became thicker and more frequent beginning in the 1990s, with recent melt levels that are unmatched since at least the year 1550 CE (Common Era).
"The ice core record ends about 450 years ago, so the modern melt rates in these cores are the highest of the whole record that we can see," said Osterberg. "The advantage of the ice cores is that they show us just how unusual it is for Greenland to be melting this fast."
Year-to-year changes in Greenland melt since 1979 were already known to be closely tied to North Atlantic ocean temperatures and high-pressure systems that sit above Greenland during the summer — known as summer blocking highs. The new study extends the record back in time to show that these were important controls on west Greenland melt going back to at least 1870.
The study also shows that an additional summertime warming factor of 1.2 ºC is needed to explain the unusually strong melting observed since the 1990s. The additional warming caused a near-doubling of melt rates in the twenty-year period from 1995 to 2015 compared to previous times when the same blocking and ocean conditions were present.
"It is striking to see how a seemingly small warming of only 1.2 ºC can have such a large impact on melt rates in west Greenland," said Graeter.
The study concludes that North Atlantic ocean temperatures and summer blocking activity will continue to control year-to-year changes in Greenland melt into the future. Some climate models suggest that summer blocking activity and ocean temperatures around Greenland might decline in the next several decades, but it remains uncertain. However, the study points out that continued warming from human activities would overwhelm those weather patterns over time to further increase melting.
"Cooler North Atlantic ocean temperatures and less summer blocking activity might slow down Greenland melt for a few years or even a couple decades, but it would not help us in the long run," said Osterberg. "Beyond a few decades, Greenland melting will almost certainly increase and raise sea level as long as we continue to emit greenhouse gases."
Landslide Threats in Near Real-Time During Heavy Rains
February/March 2018: For the first time, scientists can look at landslide threats anywhere around the world in near real-time, thanks to satellite data and a new model developed by NASA. The model, developed at NASA/GSFC (Goddard Space Flight Center) in Greenbelt, Maryland, estimates potential landslide activity triggered by rainfall. Rainfall is the most widespread trigger of landslides around the world. If conditions beneath Earth's surface are already unstable, heavy rains act as the last straw that causes mud, rocks or debris — or all combined — to move rapidly down mountains and hillsides. 61)
The model is designed to increase our understanding of where and when landslide hazards are present and improve estimates of long-term patterns. A global analysis of landslides over the past 15 years using the new open source Landslide Hazard Assessment for Situational Awareness model was published in a study released online on March 22 in the journal Earth's Future. 62)
Determining where, when, and how landslide hazards may vary and affect people at the global scale is fundamental to formulating mitigation strategies, appropriate and timely responses, and robust recovery plans. While monitoring systems exist for other hazards, no such system exists for landslides. A near-global LHASA (Landslide Hazard Assessment model for Situational Awareness) has been developed to provide an indication of potential landslide activity at the global scale every 30 minutes. This model uses surface susceptibility and satellite rainfall data to issue "nowcasts" of moderate or high landslide hazard. This research describes the global LHASA currently running in near real-time and discusses the performance and potential applications of this system. LHASA is intended to provide situational awareness of landslide hazards in near real-time. This system can also leverage nearly two decades of satellite precipitation data to better understand long-term trends in potential landslide activity.
"Landslides can cause widespread destruction and fatalities, but we really don't have a complete sense of where and when landslides may be happening to inform disaster response and mitigation," said Dalia Kirschbaum, a landslide expert at Goddard and co-author of the study. "This model helps pinpoint the time, location and severity of potential landslide hazards in near real-time all over the globe. Nothing has been done like this before."
The model estimates potential landslide activity by first identifying areas with heavy, persistent and recent precipitation. Rainfall estimates are provided by a multi-satellite product of the NASA and JAXA (Japan Aerospace Exploration Agency) GPM (Global Precipitation Measurement) mission, which provides precipitation estimates around the world every 30 minutes. The model flags areas where GPM rainfall accumulated over the last seven days exceeds a critical threshold.
In places where precipitation is unusually high, the model then uses a susceptibility map to determine if the area is prone to landslides. This global susceptibility map is developed using five features that play an important role in landslide activity: if roads have been built nearby, if trees have been removed or burned, if a major tectonic fault is nearby, if the local bedrock is weak and if the hillsides are steep.
If the susceptibility map shows the area with heavy rainfall is vulnerable, the model produces a "nowcast" identifying the area as having a high or moderate likelihood of landslide activity. The model produces new nowcasts every 30 minutes.
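The decision logic described above can be sketched in a few lines of Python. This is a hypothetical illustration of the thresholding idea, not the actual LHASA code: the antecedent-rainfall index, the decay weighting, the threshold value and the susceptibility classes below are all invented for demonstration.

```python
# Illustrative sketch of LHASA-style nowcast logic (not the actual NASA code).
# The rainfall index, threshold, and susceptibility classes are hypothetical.

def antecedent_rainfall_index(rain_mm, decay=0.8):
    """Weighted sum of the last 7 days of rainfall; recent days weigh more."""
    return sum(r * decay**age for age, r in enumerate(reversed(rain_mm)))

def nowcast(rain_mm_7day, susceptibility, threshold=100.0):
    """Return a nowcast level for one grid cell, re-evaluated every 30 minutes."""
    ari = antecedent_rainfall_index(rain_mm_7day)
    if ari < threshold or susceptibility == "low":
        return "none"
    return "high" if susceptibility == "high" else "moderate"

# A cell with heavy recent rain on highly susceptible terrain:
print(nowcast([5, 0, 10, 40, 60, 80, 90], "high"))  # high
# The same rain on low-susceptibility terrain produces no nowcast:
print(nowcast([5, 0, 10, 40, 60, 80, 90], "low"))   # none
```

In the published system, the rainfall threshold is tied to the local rainfall history of each grid cell rather than being a single global number, and the inputs are gridded GPM fields rather than scalars.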
Figure 38: This animation shows the potential landslide activity by month averaged over the last 15 years as evaluated by NASA's LHASA (Landslide Hazard Assessment for Situational Awareness) model. Here, you can see landslide trends across the world (image credit: NASA/GSFC Scientific Visualization Studio)
The study shows long-term trends when the model's output was compared to landslide databases dating back to 2007. The team's analysis showed a global "landslide season" with a peak in the number of landslides in July and August, most likely associated with the Asian monsoon and tropical cyclone seasons in the Atlantic and Pacific oceans.
"The model has been able to help us understand immediate potential landslide hazards in a matter of minutes," said Thomas Stanley, landslide expert with the Universities Space Research Association at Goddard and co-author of the study. "It also can be used to retroactively look at how potential landslide activity varies on the global scale seasonally, annually or even on decadal scales in a way that hasn't been possible before."
Study of Antarctic ice loss
February 20, 2018: A NASA study based on an innovative technique for crunching torrents of satellite data provides the clearest picture yet of changes in Antarctic ice flow into the ocean. The findings confirm accelerating ice losses from the West Antarctic Ice Sheet and reveal surprisingly steady rates of flow from its much larger neighbor to the east. 63)
The computer-vision technique crunched data from hundreds of thousands of NASA-USGS (U.S. Geological Survey) Landsat satellite images to produce a high-precision picture of changes in ice-sheet motion.
The new work provides a baseline for future measurement of Antarctic ice changes and can be used to validate numerical ice sheet models that are necessary to make projections of sea level. It also opens the door to faster processing of massive amounts of data.
"We're entering a new age," said the study's lead author, cryospheric researcher Alex Gardner of NASA's Jet Propulsion Laboratory in Pasadena, California. "When I began working on this project three years ago, there was a single map of ice sheet flow that was made using data collected over 10 years, and it was revolutionary when it was published back in 2011. Now we can map ice flow over nearly the entire continent, every year. With these new data, we can begin to unravel the mechanisms by which the ice flow is speeding up or slowing down in response to changing environmental conditions."
The innovative approach by Gardner and his international team of scientists largely confirms earlier findings, though with a few unexpected twists. Among the most significant: a previously unmeasured acceleration of glacier flow into Antarctica's Getz Ice Shelf, on the southwestern part of the continent — likely a result of ice-shelf thinning.
Speeding up in the west, steady flow in the east: The research, published in the journal "The Cryosphere," also identified the fastest speed-up of Antarctic glaciers during the seven-year study period. The glaciers feeding Marguerite Bay, on the western Antarctic Peninsula, increased their rate of flow by 400 to 800 m/year, probably in response to ocean warming. 64)
Perhaps the research team's biggest discovery, however, was the steady flow of the East Antarctic Ice Sheet. During the study period, from 2008 to 2015, the sheet had essentially no change in its rate of ice discharge — ice flow into the ocean. While previous research inferred a high level of stability for the ice sheet based on measurements of volume and gravitational change, the lack of any significant change in ice discharge had never been measured directly.
Figure 39: The speed of Antarctic ice flow, derived from Landsat imagery over a seven-year period (image credit: NASA)
The study also confirmed that the flow of West Antarctica's Thwaites and Pine Island glaciers into the ocean continues to accelerate, though the rate of acceleration is slowing.
In all, the study found an overall ice discharge for the Antarctic continent of 1,929 gigatons per year in 2015, with an uncertainty of plus or minus 40 gigatons. That represents an increase of 36 gigatons per year, plus or minus 15, since 2008. A gigaton is one billion tons (10⁹ tons).
The study found that ice flow from West Antarctica — the Amundsen Sea sector, the Getz Ice Shelf and Marguerite Bay on the western Antarctic Peninsula — accounted for 89 percent of the increase.
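West Antarctica's contribution follows directly from the two figures just quoted; a one-line check (illustrative arithmetic only):

```python
# Combining the reported numbers: West Antarctica's share of the increase
# in Antarctic ice discharge between 2008 and 2015.
increase_gt = 36        # Gt/yr increase in total discharge since 2008
west_share = 0.89       # fraction attributed to West Antarctica
print(round(increase_gt * west_share))  # 32 Gt/yr
```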
Computer vision: The science team developed software that processed hundreds of thousands of pairs of images of Antarctic glacier movement from Landsat-7 and Landsat-8, captured from 2013 to 2015. These were compared to earlier radar satellite measurements of ice flow to reveal changes since 2008.
"We're applying computer vision techniques that allow us to rapidly search for matching features between two images, revealing complex patterns of surface motion," Gardner said.
Instead of researchers comparing small sets of very high-quality images from a limited region to look for subtle changes, the novelty of the new software is that it can track features across hundreds of thousands of images/year — even those of varying quality or obscured by clouds — over an entire continent. "We can now automatically generate maps of ice flow annually — a whole year — to see what the whole continent is doing," Gardner said.
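As a rough, self-contained illustration of the feature-matching idea (not the team's actual software), FFT-based phase correlation can recover the pixel offset between two patches of repeat imagery; glacier velocity then follows from the offset divided by the time separation between acquisitions.

```python
# Toy example of offset tracking between repeat images via phase correlation.
# A real pipeline adds preprocessing, subpixel refinement and quality filtering.
import numpy as np

def displacement(patch1, patch2):
    """Integer pixel shift (dy, dx) that maps patch1 onto patch2."""
    f1, f2 = np.fft.fft2(patch1), np.fft.fft2(patch2)
    cross = np.conj(f1) * f2
    # Normalized cross-power spectrum -> sharp correlation peak at the shift
    corr = np.fft.ifft2(cross / (np.abs(cross) + 1e-12)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap indices from [0, N) to signed offsets
    if dy > patch1.shape[0] // 2: dy -= patch1.shape[0]
    if dx > patch1.shape[1] // 2: dx -= patch1.shape[1]
    return int(dy), int(dx)

# Synthetic test: a random "surface texture" shifted by (3, -2) pixels
rng = np.random.default_rng(0)
img = rng.random((64, 64))
moved = np.roll(img, (3, -2), axis=(0, 1))
print(displacement(img, moved))  # (3, -2)
```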
The new Antarctic baseline should help ice sheet modelers better estimate the continent's contribution to future sea level rise. "We'll be able to use this information to target field campaigns, and understand the processes causing these changes," Gardner said. "Over the next decade, all this is going to lead to rapid improvement in our knowledge of how ice sheets respond to changes in ocean and atmospheric conditions, knowledge that will ultimately help to inform projections of sea level change."
Seismic footprint study to track Hurricanes and Typhoons
February 15, 2018: Climatologists are often asked, "Is climate change making hurricanes stronger?" but they can't give a definitive answer because the global hurricane record only goes back to the dawn of the satellite era. But now, an intersection of disciplines—seismology, atmospheric sciences, and oceanography—offers an untapped data source: the continuous seismic record, which dates back to the early 20th century.
An international team of researchers has found a new way to identify the movement and intensity of hurricanes, typhoons and other tropical cyclones by tracking the way they shake the seafloor, as recorded on seismometers on islands and near the coast. After looking at 13 years of data from the northwest Pacific Ocean, they have found statistically significant correlations between seismic data and storms. Their work was published Feb. 15 in the journal Earth and Planetary Science Letters. 65) 66)
The group of experts was assembled by Princeton University's Lucia Gualtieri, a postdoctoral research associate in geosciences, and Salvatore Pascale, an associate research scholar in atmospheric and oceanic sciences.
Most people associate seismology with earthquakes, said Gualtieri, but the vast majority of the seismic record shows low-intensity movements from a different source: the oceans. "A seismogram is basically the movement of the ground. It records earthquakes, because an earthquake makes the ground shake. But it also records all the tiny other movements," from passing trains to hurricanes. "Typhoons show up very well in the record," she said.
Because there is no way to know when an earthquake will hit, seismometers run constantly, always poised to record an earthquake's dramatic arrival. In between these earth-shaking events, they track the background rumbling of the planet. Until about 20 years ago, geophysicists dismissed this low-intensity rumbling as noise, Gualtieri said.
"What is noise? Noise is a signal we don't understand," said Pascale, who is also an associate research scientist at the National and Oceanic and Atmospheric Administration's Geophysical Fluid Dynamics Laboratory.
Just as astronomers have discovered that the static between radio stations gives us information about the cosmic background, seismologists have discovered that the low-level "noise" recorded by seismograms is the signature of wind-driven ocean storms, the cumulative effect of waves crashing on beaches all over the planet or colliding with each other in the open sea.
One ocean wave acting alone is not strong enough to generate a seismic signature at the frequencies she was examining, explained Gualtieri, because typical ocean waves only affect the upper few feet of the sea. "The particle motion decays exponentially with depth, so at the seafloor you don't see anything," she said. "The main mechanism to generate seismic abnormalities from a typhoon is to have two ocean waves interacting with each other." When two waves collide, they generate vertical pressure that can reach the seafloor and jiggle a nearby seismometer.
When a storm is large enough—and storms classified as hurricanes or typhoons are—it will leave a seismic record lasting several days. Previous researchers have successfully traced individual large storms on a seismogram, but Gualtieri came at the question from the opposite side: can a seismogram find any large storm in the area?
Figure 40: Lucia Gualtieri, a postdoctoral researcher in geosciences at Princeton University, superimposed an image of the seismogram recording a tropical cyclone above a satellite image showing the storm moving across the northwest Pacific Ocean. Gualtieri and her colleagues have found a way to track the movement and intensity of typhoons and hurricanes by looking at seismic data, which has the potential to extend the global hurricane record by decades and allow a more definitive answer to the question, "Are hurricanes getting stronger?" (image credit: Photo illustration by Lucia Gualtieri, satellite image courtesy of NASA/NOAA)
Gualtieri and her colleagues found a statistically significant agreement between the occurrence of tropical cyclones and large-amplitude, long-lasting seismic signals with short periods, between three and seven seconds, called "secondary microseisms." They were also able to calculate the typhoons' strength from these "secondary microseisms," or tiny fluctuations, which they successfully correlated to the observed intensity of the storms.
In short, the seismic record had enough data to identify when typhoons happened and how strong they were (Figure 40).
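A minimal sketch of the detection idea on synthetic data: band-pass a trace to the 3–7 s secondary-microseism band and flag intervals of sustained high amplitude. All signal parameters and thresholds below are invented for illustration; the published analysis is far more involved, and additionally correlates the microseism amplitude against recorded storm intensity.

```python
# Synthetic one-hour trace: background noise plus a "typhoon" microseism
# burst with a 5 s period between t = 1200 s and t = 2400 s.
import numpy as np

fs = 10.0                              # sampling rate, Hz
t = np.arange(0, 3600, 1 / fs)
trace = 0.1 * np.random.default_rng(1).standard_normal(t.size)
storm = (t > 1200) & (t < 2400)
trace += storm * np.sin(2 * np.pi * t / 5.0)

# Band-pass to the 3-7 s period band (0.14-0.33 Hz) with an FFT mask
freqs = np.fft.rfftfreq(t.size, d=1 / fs)
spec = np.fft.rfft(trace)
spec[(freqs < 1 / 7.0) | (freqs > 1 / 3.0)] = 0.0
band = np.fft.irfft(spec, n=t.size)

# 60 s moving-average envelope; flag where it exceeds a threshold
env = np.convolve(np.abs(band), np.ones(600) / 600, mode="same")
detected = env > 0.3
print(t[detected].min(), t[detected].max())  # close to 1200 and 2400
```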
So far, the researchers have focused on the ocean off the coast of Asia because of its powerful typhoons and good network of seismic stations. Their next steps include refining their method and examining other storm basins, starting with the Caribbean and the East Pacific.
And then they will tackle the historic seismic record: "When we have a very defined method and have applied this method to all these other regions, we want to start to go back in time," said Gualtieri.
While global storm information goes back only to the early days of the satellite era, in the late 1960s and early 1970s, the first modern seismograms date from the 1880s. Unfortunately, the oldest records exist only on paper, and few historical records have been digitized.
"If all this data can be made available, we could have records going back more than a century, and then we could try to see any trend or change in intensity of tropical cyclones over a century or more," said Pascale. "It's very difficult to establish trends in the intensity of tropical cyclones—to see the impact of global warming. Models and theories suggest that they should become more intense, but it's important to find observational evidence."
"This new technique, if it can be shown to be valid across all tropical-cyclone prone basins, effectively lengthens the satellite era," said Morgan O'Neill, a T. C. Chamberlin Postdoctoral Fellow in geosciences at the University of Chicago who was not involved in this research. "It extends the period of time over which we have global coverage of tropical cyclone occurrence and intensity," she said.
The researchers' ability to correlate seismic data with storm intensity is vital, said Allison Wing, an assistant professor of earth, ocean and atmospheric science at Florida State University, who was not involved in this research. "When it comes to understanding tropical cyclones—what controls their variability and their response to climate and climate change—having more data is better, in particular data that can tell us about intensity, which their method seems to do. ... It helps us constrain the range of variability that hurricane intensity can have."
This connection between storms and seismicity began when Gualtieri decided to play with hurricane data in her free time, she said. But when she superimposed the hurricane data over the seismic data, she knew she was on to something. "I said, 'Wow, there's something more than just play. Let's contact someone who can help.'"
Her research team ultimately grew to include a second seismologist, two atmospheric scientists and a statistician. "The most challenging part was establishing communications with scientists coming from different backgrounds," said Pascale. "Often, in different fields in science, we speak different dialects, different scientific dialects." Once they developed a "shared dialect," he said, they began to make exciting discoveries. "This is how science evolves," said Pascale. "Historically, it's always been like that. Disciplines first evolve within their own kingdom, then a new field is born."
New Study Finds Sea Level Rise Accelerating
February 13, 2018: The rate of global sea level rise has been accelerating in recent decades, rather than increasing steadily, according to a new study based on 25 years of NASA and European satellite data. 67) 68) 69)
This acceleration, driven mainly by increased melting in Greenland and Antarctica, has the potential to double the total sea level rise projected for 2100 when compared to projections that assume a constant rate of sea level rise, according to lead author Steve Nerem. Nerem is a professor of Aerospace Engineering Sciences at the University of Colorado Boulder, a fellow at Colorado's CIRES (Cooperative Institute for Research in Environmental Sciences), and a member of NASA's Sea Level Change team.
If the rate of ocean rise continues to change at this pace, sea level will rise 65 cm by 2100 — enough to cause significant problems for coastal cities, according to the new assessment by Nerem and colleagues from NASA/GSFC (Goddard Space Flight Center) in Greenbelt, Maryland; CU Boulder; the University of South Florida in Tampa; and Old Dominion University in Norfolk, Virginia. The team, driven to understand and better predict Earth's response to a warming world, published their work Feb. 12 in the journal PNAS (Proceedings of the National Academy of Sciences). 70)
"This is almost certainly a conservative estimate," Nerem said. "Our extrapolation assumes that the sea level continues to change in the future as it has over the last 25 years. Given the large changes we are seeing in the ice sheets today, that's not likely."
Figure 41: NASA Scientific Visualization Studio image by Kel Elkins, using data from Jason-1, Jason-2, and TOPEX/Poseidon. Story by Katie Weeman, CIRES, and Patrick Lynch, NASA GSFC. Edited by Mike Carlowicz.
Rising concentrations of greenhouse gases in Earth's atmosphere increase the temperature of air and water, which causes sea level to rise in two ways. First, warmer water expands, and this "thermal expansion" of the ocean has contributed about half of the 7 cm of global mean sea level rise we've seen over the last 25 years, Nerem said. Second, melting land ice flows into the ocean, also increasing sea level across the globe.
These increases were measured using satellite altimeter measurements since 1992, including the Topex/Poseidon, Jason-1, Jason-2 and Jason-3 satellite missions, which have been jointly managed by multiple agencies, including NASA, CNES (Centre National d'Etudes Spatiales), EUMETSAT (European Organisation for the Exploitation of Meteorological Satellites), and NOAA (National Oceanic and Atmospheric Administration). NASA's Jet Propulsion Laboratory in Pasadena, California, manages the U.S. portion of these missions for NASA's Science Mission Directorate. The rate of sea level rise in the satellite era has risen from about 2.5 mm/year in the 1990s to about 3.4 mm/year today.
"The Topex/Poseidon/Jason altimetry missions have been essentially providing the equivalent of a global network of nearly half a million accurate tide gauges, providing sea surface height information every 10 days for over 25 years," said Brian Beckley, of NASA Goddard, second author on the new paper and lead of a team that processes altimetry observations into a global sea level data record. "As this climate data record approaches three decades, the fingerprints of Greenland and Antarctic land-based ice loss are now being revealed in the global and regional mean sea level estimates."
Table 1: Significance of global sea level rise
Ozone layer not recovering in lower latitudes, despite ozone hole healing at the poles
February 8, 2018: The ozone layer - which protects us from harmful ultraviolet radiation - is recovering at the poles, but unexpected decreases in part of the atmosphere may be preventing recovery at lower latitudes. Global ozone has been declining since the 1970s owing to certain man-made chemicals. Since these were banned, parts of the layer have been recovering, particularly at the poles. 71)
However, the new result, published in the EGU (European Geosciences Union) journal Atmospheric Chemistry and Physics, finds that the bottom part of the ozone layer at more populated latitudes is not recovering. The cause is currently unknown. 72)
Ozone is a substance that forms in the stratosphere - the region of the atmosphere between about 10 and 50 km altitude, above the troposphere that we live in. It is produced in tropical latitudes and distributed around the globe. A large portion of the resulting ozone layer resides in the lower part of the stratosphere. The ozone layer absorbs much of the UV radiation from the Sun, which, if it reaches the Earth's surface, can cause damage to DNA in plants, animals and humans.
In the 1970s, it was recognized that chemicals called CFCs (Chlorofluorocarbons), used for example in refrigeration and aerosols, were destroying ozone in the stratosphere. The effect was worst in the Antarctic, where an ozone 'hole' formed.
In 1987, the Montreal Protocol, an international treaty, was agreed, leading to the phase-out of CFCs and, recently, the first signs of recovery of the Antarctic ozone layer. The upper stratosphere at lower latitudes is also showing clear signs of recovery, proving that the Montreal Protocol is working well.
However, despite this success, scientists have evidence that stratospheric ozone is likely not recovering at lower latitudes, between 60º N and 60º S, due to unexpected decreases in ozone in the lower part of the stratosphere.
Study co-author Professor Joanna Haigh, Co-Director of the Grantham Institute for Climate Change and the Environment at Imperial College London, said: "Ozone has been seriously declining globally since the 1980s, but while the banning of CFCs is leading to a recovery at the poles, the same does not appear to be true for the lower latitudes. The potential for harm in lower latitudes may actually be worse than at the poles. The decreases in ozone are less than we saw at the poles before the Montreal Protocol was enacted, but UV radiation is more intense in these regions and more people live there."
The cause of this decline is not certain, although the authors suggest a couple of possibilities. One is that climate change is altering the pattern of atmospheric circulation, causing more ozone to be carried away from the tropics.
The other possibility is that very short-lived substances (VSLSs), which contain chlorine and bromine, could be destroying ozone in the lower stratosphere. VSLSs include chemicals used as solvents, paint strippers, and as degreasing agents. One is even used in the production of an ozone-friendly replacement for CFCs.
Dr William Ball from ETH Zürich [Eidgenoessische Technische Hochschule, Zürich (Swiss Federal Institute of Technology, Zürich)] and PMOD/WRC [Physikalisch-Meteorologisches Observatorium Davos, World Radiation Center (Switzerland)], who led the analysis, said: "The finding of declining low-latitude ozone is surprising, since our current best atmospheric circulation models do not predict this effect. Very short-lived substances could be the missing factor in these models."
It was thought that very short-lived substances would not persist long enough in the atmosphere to reach the height of the stratosphere and affect ozone, but more research may be needed.
To conduct the analysis, the team developed new algorithms to combine the efforts of multiple international teams that have worked to connect data from different satellite missions since 1985 and create a robust, long time series.
William Ball said: "The study is an example of the concerted international effort to monitor and understand what is happening with the ozone layer; many people and organizations prepared the underlying data, without which the analysis would not have been possible."
Although individual datasets had previously hinted at a decline, the application of advanced merging techniques and time series analysis has revealed a longer term trend of ozone decrease in the stratosphere at lower altitudes and latitudes.
The researchers say the focus now should be on getting more precise data on the ozone decline, and determining what the cause most likely is, for example by looking for the presence of VSLSs in the stratosphere.
Dr Justin Alsing from the Flatiron Institute in New York, who took on a major role in developing and implementing the statistical technique used to combine the data, said: "This research was only possible because of a great deal of cross-disciplinary collaboration. My field is normally cosmology, but the technique we developed can be used in any science looking at complex datasets."
Table 2: Summary of the published paper (Ref. 72)
Heat loss from Earth's interior triggers Greenland's ice sheet slide towards the sea
January 30, 2018: In North-East Greenland, researchers have measured the loss of heat that comes up from the interior of the Earth. This enormous area is a geothermal 'hot spot' that melts the ice sheet from below and triggers the sliding of glaciers towards the sea. The melting is occurring more strongly and more rapidly than any model had previously predicted. 73)
As reported in the journal Scientific Reports, researchers from the Arctic Research Center, Aarhus University (Aarhus, Denmark), and the Greenland Institute of Natural Resources (Nuuk, Greenland) present results that, for the first time, show that the deep bottom water of the north-eastern Greenland fjords is being warmed up by heat gradually lost from the Earth's interior. And the researchers point out that this heat loss triggers the sliding of glaciers from the ice sheet towards the sea. 74)
Icelandic conditions: "North-East Greenland has several hot springs where the water reaches temperatures of up to 60ºC and, like Iceland, the area has abundant underground geothermal activity," explains Professor Søren Rysgaard, who headed the investigations.
For more than ten years (2005-2015), the researchers have measured the temperature and salinity in the fjord Young Sound, located at Daneborg, north of Scoresbysund, which has many hot springs, and south of the glacier Nioghalvfjerdsfjorden, which melts rapidly and is connected to the North-East Greenland Ice Stream (NEGIS).
By focusing on an isolated basin in the fjord with a depth range between 200 and 340 m, the researchers have measured how the deep water is heated over a ten-year period. Based on the extensive data, researchers have estimated that the loss of heat from the Earth's interior to the fjord is about 100 mW/m2. This corresponds to a 2 MW wind turbine sending electricity to a large heater at the bottom of the fjord all year round.
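The flux-to-turbine comparison can be checked with simple arithmetic. The basin area below is an assumed figure chosen to make the numbers concrete, not a value quoted in the study:

```python
# Back-of-envelope check (illustrative, not from the paper): a geothermal
# heat flux of ~100 mW/m^2 integrated over a fjord basin of roughly
# 20 km^2 (assumed area) yields a power comparable to a 2 MW wind
# turbine running continuously.
heat_flux = 0.100     # W/m^2 (100 mW/m^2, from the study)
basin_area = 20e6     # m^2 (~20 km^2, illustrative assumption)

power_watts = heat_flux * basin_area
print(f"Geothermal power into the basin: {power_watts / 1e6:.1f} MW")
```

Any basin area near 20 km2 reproduces the 2 MW figure; the point of the comparison is the scale, not the precise number.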
Heat from the Earth's interior — an important influence: It is not easy to measure the geothermal heat flux — heat emanating from the Earth's interior — below a glacier, but within the area there are several large glaciers connected directly to the ice sheet. If the Earth releases heat to a fjord, heat also seeps up to the bottom part of the glaciers. This means that the glaciers melt from below and thus slide more easily over the terrain on which they sit when moving to the sea.
"It is a combination of higher temperatures in the air and the sea, precipitation from above, local dynamics of the ice sheet and heat loss from the Earth's interior that determines the mass loss from the Greenland ice sheet," explains Søren Rysgaard.
The researchers expect that the new discoveries will improve the models of ice sheet dynamics, allowing better predictions of the stability of the Greenland ice sheet, its melting and the resulting global water rise.
Figure 42: Geothermal vent localities and ice surface speeds (2008–2009) for Greenland. Shown are geothermal vent localities on land with temperatures >10ºC, boreholes, hydrothermal vent complexes offshore, and the present study site. Reconstructed geothermal anomalies are shown as contours in the inset box. Ice drilling localities are indicated by CC, NGRIP, GRIP and Dye (image credit: Research Team of Aarhus University)
Dust on Snow Controls Springtime River Rise
January 23, 2018: A new study has found that dust, not spring warmth, controls the pace of spring snowmelt that feeds the headwaters of the Colorado River. Contrary to conventional wisdom, the amount of dust on the mountain snowpack controls how fast the Colorado Basin's rivers rise in the spring regardless of air temperature, with more dust correlated with faster spring runoff and higher peak flows. 75)
The finding is valuable for western water managers and advances our understanding of how freshwater resources, in the form of snow and ice, will respond to warming temperatures in the future. By improving knowledge of what controls the melting of snow, it improves understanding of the controls on how much solar heat Earth reflects back into space and how much it absorbs — an important factor in studies of weather and climate.
When snow gets covered by a layer of windblown dust or soot, the dark topcoat increases the amount of heat the snow absorbs from sunlight. Tom Painter of NASA's Jet Propulsion Laboratory in Pasadena, California, has been researching the consequences of dust on snowmelt worldwide. This is the first study to focus on which has a stronger influence on spring runoff: warmer air temperatures or a coating of dust on the snow.
Windblown dust has increased in the U.S. Southwest as a result of changing climate patterns and human land-use decisions. With rainfall decreasing and more disturbances of the land, protective crusts on soil are removed and more bare soil is exposed. Winter and spring winds pick up the dusty soil and drop it on the Colorado Rockies to the northeast. Historical lake sediment analyses show there is currently an annual average of five to seven times more dust falling on the Rocky Mountain snowpack than there was before the mid-1800s.
Painter and colleagues looked at data on air temperature and dust in a mountain basin in southwestern Colorado from 2005 to 2014, and streamflow from three major tributary rivers that carry snowmelt from these mountains to the Colorado River. The Colorado River's basin spans about 246,000 square miles (637,000 km2) in parts of seven western states.
The researchers found that the effects of dust dominated the pace of the spring runoff even in years with unusually warm spring air temperatures. Conversely, there was almost no statistical correlation between air temperature and the pace of runoff.
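The contrast the researchers describe is a correlation comparison across melt seasons. A minimal sketch, using invented data that mimics the qualitative result (the values are not from the Painter et al. dataset):

```python
import numpy as np

# Illustrative (synthetic) data for ten melt seasons: relative dust
# loading on the snowpack, mean spring air temperature, and days from
# snowmelt onset to peak streamflow. Values are invented to mimic the
# qualitative finding: dustier snow melts out faster.
dust = np.array([2, 8, 5, 9, 3, 7, 4, 10, 6, 1], float)
temp = np.array([4.1, 3.8, 5.0, 4.4, 4.9, 3.6, 4.2, 4.7, 3.9, 4.5])  # deg C
noise = np.array([1, -2, 0, 2, -1, 1, 0, -2, 1, 0], float)
days_to_peak = 60 - 3.5 * dust + noise

r_dust = np.corrcoef(dust, days_to_peak)[0, 1]
r_temp = np.corrcoef(temp, days_to_peak)[0, 1]
print(f"correlation with dust:        {r_dust:+.2f}")   # strongly negative
print(f"correlation with temperature: {r_temp:+.2f}")   # near zero
```

A strong negative correlation with dust (dustier snowpack, faster rise to peak) alongside a weak temperature correlation is the pattern the study reports for the real hydrographs.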
"We found that when it's clean, the rise to the peak streamflow is slower, and generally you get a smaller peak," Painter said. "When the snowpack is really dusty, water just blasts out of the mountains." The finding runs contrary to the widely held assumption that spring air temperature determines the likelihood of flooding.
Coauthor McKenzie Skiles, an assistant professor in the University of Utah Department of Geography, said that while the impacts of dust in the air, such as reduced air quality, are well known, the impacts of the dust once it's been deposited on the land surface are not as well understood. "Given the reliance of the western U.S. on the natural snow reservoir, and the Colorado River in particular, it is critical to evaluate the impact of increasing dust deposition on the mountain snowpack," she said.
Figure 43: A coating of dust on snow speeds the pace of snowmelt in the spring (image credit: NASA)
Painter pointed out that the new finding doesn't mean air temperatures in the region can be ignored in considering streamflows and flooding, especially in the future. "As air temperature continues to climb, it's going to have more influence," he said. Temperature controls whether precipitation falls as snow or as rain, for example, so ultimately it controls how much snow there is to melt. But, he said, "temperature is unlikely to control the variability in snowmelt rates. That will still be controlled by how dirty or clean the snowpack is."
Skiles noted, "Dust on snow does not only impact the mountains that make up the headwaters of Colorado River. Surface darkening has been observed in mountain ranges all over the world, including the Alps and the Himalaya. What we learn about the role of dust deposition for snowmelt timing and intensity here in the western U.S. has global implications for improved snowmelt forecasting and management of snow water resources."
The study, titled "Variation in rising limb of Colorado River snowmelt runoff hydrograph controlled by dust radiative forcing in snow," was published today in the journal Geophysical Research Letters. Coauthors are from the University of Utah, Salt Lake City; University of Colorado, Boulder; and University of California, Santa Barbara. 76)
Study of Extreme Wintertime Arctic Warm Event
January 16, 2018: In the winter of 2015/16, something happened that had never before been seen on this scale: at the end of December, temperatures rose above zero degrees Celsius for several days in parts of the Arctic. Temperatures of up to eight degrees were registered north of Svalbard. Temperatures this high have not been recorded in the winter half of the year since the beginning of systematic measurements at the end of the 1970s. As a result of this unusual warmth, the sea ice began to melt. 77)
"We heard about this from the media," says Heini Wernli, Professor of Atmospheric Dynamics at ETH Zurich. The news aroused his scientific curiosity, and a team led by his then doctoral student Hanin Binder investigated the issue. In December 2017, they published their analysis of this exceptional event in the journal Geophysical Research Letters. 78)
The researchers show in their paper how these unusual temperatures arose: three different air currents met over the North Sea between Scotland and southern Norway, carrying warm air northwards at high speed as though on a "highway" (Figure 44).
One air current originated in the Sahara and brought near-surface warm air with it. To begin with, the temperature of this air was about 20º Celsius. While it cooled off on its way to the Arctic, it was still above zero when it arrived. "It's extremely rare for warm, near-surface subtropical air to be transported as far as the Arctic," says Binder.
The second air current originated in the Arctic itself, a fact that astonished the scientists. To begin with, this air was very cold. However, the air mass – which also lay close to the ground – moved towards the south along a curved path and, while above the Atlantic, was warmed significantly by the heat flux from the ocean before joining the subtropical air current.
The third warm air current started as a cold air mass in the upper troposphere, from an altitude above 5 km. These air masses were carried from west to east and descended in a stationary high-pressure area over Scandinavia. Compression thereby warmed the originally cold air, before it entered the "highway to the Arctic".
Poleward warm air transport: This highway of air currents was made possible by a particular constellation of pressure systems over northern Europe. During the period in question, intense low-pressure systems developed over Iceland while an extremely stable high-pressure area formed over Scandinavia. This created a kind of funnel above the North Sea, between Scotland and southern Norway, which channelled the various air currents and steered them northwards to the Arctic.
This highway lasted approximately a week. The pressure systems then decayed and the Arctic returned to its typical frozen winter state. However, the warm period sufficed to reduce the thickness of the sea ice in parts of the Arctic by 30 cm – during a period in which ice usually becomes thicker and more widespread.
"These weather conditions and their effect on the sea ice were really exceptional," says Binder. The researchers were not able to identify a direct link to global warming. "We only carried out an analysis of a single event; we didn't research the long-term climate aspects," emphasizes Binder.
However, the melting of Arctic sea ice during summer is a different story. The long-term trend is clear: the minimum extent and thickness of the sea ice in late summer has been shrinking continually since the end of the 1970s. Sea ice melted particularly severely in 2007 and 2012 – a fact which climate researchers have thus far been unable to fully explain. Along with Lukas Papritz from the University of Bergen, Wernli investigated the causes of these outliers.
According to their research, the severe melting in the aforementioned years was caused by stable high-pressure systems that formed repeatedly throughout the summer months. Under these cloud-free weather conditions, the high level of direct sunlight – the sun shines 24 hours a day at this time of year – particularly intensified the melting of the sea ice.
The extreme event was the result of a very unusual large-scale flow configuration in early winter 2015/2016 that came along with overall anomalously warm conditions in Europe (National Oceanic and Atmospheric Administration, 2016) and other regional extremes, for example, flooding in the UK. 79) In this study (Ref. 78), we focus on the Arctic. At the North Pole, buoys measured maximum surface temperatures of -0.8ºC on 30 December 80), and at the Svalbard airport station values of 8.7ºC were observed, the warmest temperatures ever recorded at that station between November and April (The Norwegian Meteorological Institute, 2016). According to operational analyses from the ECMWF (European Center for Medium-Range Weather Forecasts), the maximum 2 m temperature (T2m) north of 82ºN reached values larger than 0ºC during three short episodes between 29 December 2015 and 4 January 2016, almost 30 K above the winter climatological mean in this region (Figure 45a). They occurred in the Eurasian Arctic sector in the region around Svalbard and over the Kara Sea (purple contour in Figure 45b) and were the highest winter values since 1979 (Figure 45c). The warm event led to a thinning of the sea ice by more than 30 cm in the Barents and Kara Seas, and contributed to the record low Northern Hemisphere sea ice extent observed in January and February 2016 (National Snow and Ice Data Center, 2016).
Figure 45: Illustration of the Arctic warm event and its extremeness. (a) Temporal evolution of the domain maximum (red) and mean (blue) T2m (ºC) between 20 December 2015 and 10 January 2016 at latitudes ≥82ºN and between 120ºW and 120ºE, derived from operational analyses. Also shown are the domain mean December–February 1979–2014 climatological mean T2m (black), and the corresponding ±1 standard deviation envelope (grey) from ERA-Interim reanalysis data. (b) Maximum T2m (ºC) between 00 UTC 30 December 2015 and 18 UTC 4 January 2016 from operational analyses, with the purple contour highlighting the regions ≥82ºN with maximum T2m ≥ 0ºC. (c) Rank of maximum T2m shown in Figure 45b among all 6-hourly values in winter 1979–2014 in the ERA-Interim reanalyses (consisting of a total of 13,232 values), image credit: study team
Long-Term Warming Trend Continued in 2017
January 18, 2018: Earth's global surface temperatures in 2017 ranked as the second warmest since 1880, according to an analysis by NASA. Continuing the planet's long-term warming trend, globally averaged temperatures in 2017 were 1.62º Fahrenheit (0.90º Celsius) warmer than the 1951 to 1980 mean, according to scientists at NASA's GISS (Goddard Institute for Space Studies) in New York. That is second only to global temperatures in 2016. 81)
In a separate, independent analysis, scientists at NOAA (National Oceanic and Atmospheric Administration) concluded that 2017 was the third-warmest year in their record. The minor difference in rankings is due to the different methods used by the two agencies to analyze global temperatures, although over the long-term the agencies' records remain in strong agreement. Both analyses show that the five warmest years on record all have taken place since 2010.
Because weather station locations and measurement practices change over time, there are uncertainties in the interpretation of specific year-to-year global mean temperature differences. Taking this into account, NASA estimates that 2017's global mean change is accurate to within 0.1º Fahrenheit, with a 95 percent certainty level.
"Despite colder than average temperatures in any one part of the world, temperatures over the planet as a whole continue the rapid warming trend we've seen over the last 40 years," said GISS Director Gavin Schmidt.
The planet's average surface temperature has risen about 2 degrees Fahrenheit (a little more than 1 degree Celsius) during the last century or so, a change driven largely by increased carbon dioxide and other human-made emissions into the atmosphere. Last year was the third consecutive year in which global temperatures were more than 1.8 degrees Fahrenheit (1 degree Celsius) above late nineteenth-century levels.
Phenomena such as El Niño or La Niña, which warm or cool the upper tropical Pacific Ocean and cause corresponding variations in global wind and weather patterns, contribute to short-term variations in global average temperature. A warming El Niño event was in effect for most of 2015 and the first third of 2016. Even without an El Niño event – and with a La Niña starting in the later months of 2017 – last year's temperatures ranked between 2015 and 2016 in NASA's records.
Figure 46: This map shows Earth's average global temperature from 2013 to 2017, as compared to a baseline average from 1951 to 1980, according to an analysis by NASA's Goddard Institute for Space Studies. Yellows, oranges, and reds show regions warmer than the baseline (image credit: NASA's Scientific Visualization Studio)
In an analysis where the effects of the recent El Niño and La Niña patterns were statistically removed from the record, 2017 would have been the warmest year on record.
Weather dynamics often affect regional temperatures, so not every region on Earth experienced similar amounts of warming. NOAA found the 2017 annual mean temperature for the contiguous 48 United States was the third warmest on record.
Warming trends are strongest in the Arctic regions, where 2017 saw the continued loss of sea ice.
NASA's temperature analyses incorporate surface temperature measurements from 6,300 weather stations, ship- and buoy-based observations of sea surface temperatures, and temperature measurements from Antarctic research stations.
These raw measurements are analyzed using an algorithm that considers the varied spacing of temperature stations around the globe and urban heating effects that could skew the conclusions. These calculations produce the global average temperature deviations from the baseline period of 1951 to 1980.
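At the core of such an analysis is area weighting: a grid cell near the pole covers far less of the sphere than one at the equator, so each cell's anomaly is weighted by the cosine of its latitude. A minimal sketch with invented values (the real GISTEMP algorithm also interpolates stations to grid cells and adjusts for urban heating):

```python
import math

# Minimal sketch of an area-weighted global mean temperature anomaly.
# Each grid cell is weighted by cos(latitude), proportional to its
# surface area on the sphere. The (latitude, anomaly) pairs below are
# invented for illustration only.
cells = [
    (75.0, 2.1), (45.0, 0.9), (15.0, 0.6),
    (-15.0, 0.5), (-45.0, 0.7), (-75.0, 1.2),
]

weights = [math.cos(math.radians(lat)) for lat, _ in cells]
anomalies = [a for _, a in cells]
global_mean = sum(w * a for w, a in zip(weights, anomalies)) / sum(weights)
print(f"area-weighted global mean anomaly: {global_mean:.2f} deg C")
```

Note how the large polar anomalies contribute less to the global mean than their raw values suggest, which is why Arctic amplification does not dominate the headline global number.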
NOAA scientists used much of the same raw temperature data, but with a different baseline period, and different methods to analyze Earth's polar regions and global temperatures. The full 2017 surface temperature data set and the complete methodology used to make the temperature calculation are available.
GISS is a laboratory within the Earth Sciences Division of NASA's Goddard Space Flight Center in Greenbelt, Maryland. The laboratory is affiliated with Columbia University's Earth Institute and School of Engineering and Applied Science in New York.
NASA uses the unique vantage point of space to better understand Earth as an interconnected system. The agency also uses airborne and ground-based monitoring, and develops new ways to observe and study Earth with long-term data records and computer analysis tools to better see how our planet is changing. NASA shares this knowledge with the global community and works with institutions in the United States and around the world that contribute to understanding and protecting our home planet.
Study of Antarctic Ozone Hole Recovery
January 5, 2018: For the first time, scientists have shown through direct observations of the ozone hole by an instrument on NASA's Aura mission, that levels of ozone-destroying chlorine are declining, resulting in less ozone depletion. Measurements show that the decline in chlorine, resulting from an international ban on chlorine-containing human-produced chemicals called chlorofluorocarbons (CFCs), has resulted in about 20 percent less ozone depletion during the Antarctic winter than there was in 2005 — the first year that measurements of chlorine and ozone during the Antarctic winter were made by the Aura satellite. 82)
- "We see very clearly that chlorine from CFCs is going down in the ozone hole, and that less ozone depletion is occurring because of it," said lead author Susan Strahan, an atmospheric scientist from NASA's Goddard Space Flight Center in Greenbelt, Maryland. The study was published in the journal Geophysical Research Letters. 83)
- CFCs are long-lived chemical compounds that eventually rise into the stratosphere, where they are broken apart by the Sun's ultraviolet radiation, releasing chlorine atoms that go on to destroy ozone molecules. Stratospheric ozone protects life on the planet by absorbing potentially harmful ultraviolet radiation that can cause skin cancer and cataracts, suppress immune systems and damage plant life.
- Two years after the discovery of the Antarctic ozone hole in 1985, nations of the world signed the Montreal Protocol on Substances that Deplete the Ozone Layer, which regulated ozone-depleting compounds. Later amendments to the Montreal Protocol completely phased out production of CFCs.
- Past studies have used statistical analyses of changes in the ozone hole's size to argue that ozone depletion is decreasing. This study is the first to use measurements of the chemical composition inside the ozone hole to confirm that not only is ozone depletion decreasing, but that the decrease is caused by the decline in CFCs.
- The Antarctic ozone hole forms during September in the Southern Hemisphere's winter as the returning Sun's rays catalyze ozone destruction cycles involving chlorine and bromine that come primarily from CFCs. To determine how ozone and other chemicals have changed year to year, scientists used data from JPL's MLS (Microwave Limb Sounder) aboard the Aura satellite, which has been making measurements continuously around the globe since mid-2004. While many satellite instruments require sunlight to measure atmospheric trace gases, MLS measures microwave emissions and, as a result, can measure trace gases over Antarctica during the key time of year: the dark southern winter, when the stratospheric weather is quiet and temperatures are low and stable.
Figure 47: Using measurements from NASA's Aura satellite, scientists studied chlorine within the Antarctic ozone hole over the last several years, watching as the amount slowly decreased (image credit: NASA/GSFC, Katy Mersmann)
The change in ozone levels above Antarctica from the beginning to the end of southern winter — early July to mid-September — was computed daily from MLS measurements every year from 2005 to 2016. "During this period, Antarctic temperatures are always very low, so the rate of ozone destruction depends mostly on how much chlorine there is," Strahan said. "This is when we want to measure ozone loss."
They found that ozone loss is decreasing, but they needed to know whether a decrease in CFCs was responsible. When ozone destruction is ongoing, chlorine is found in many molecular forms, most of which are not measured. But after chlorine has destroyed nearly all the available ozone, it reacts instead with methane to form hydrochloric acid, a gas measured by MLS. "By around mid-October, all the chlorine compounds are conveniently converted into one gas, so by measuring hydrochloric acid we have a good measurement of the total chlorine," Strahan said.
Nitrous oxide is a long-lived gas that behaves just like CFCs in much of the stratosphere. The CFCs are declining at the surface but nitrous oxide is not. If CFCs in the stratosphere are decreasing, then over time, less chlorine should be measured for a given value of nitrous oxide. By comparing MLS measurements of hydrochloric acid and nitrous oxide each year, they determined that the total chlorine levels were declining on average by about 0.8 percent annually.
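A rate like "about 0.8 percent annually" comes from fitting a trend to the yearly chlorine estimates. A sketch with synthetic numbers (the starting mixing ratio and the perfectly exponential decline are assumptions for illustration, not MLS data):

```python
import numpy as np

# Illustrative fit (synthetic values, not MLS retrievals): if total
# chlorine at a fixed nitrous-oxide level declines ~0.8% per year over
# 2005-2016, a log-linear fit recovers that rate.
years = np.arange(2005, 2017)
chlorine = 3.2 * (1 - 0.008) ** (years - 2005)  # ppb; start value assumed

# Fit log(chlorine) vs time; the slope is the fractional decline per year.
slope, _ = np.polyfit(years - 2005, np.log(chlorine), 1)
print(f"fitted decline: {-slope * 100:.2f} % per year")
```

Fitting in log space makes the slope directly interpretable as a percentage change per year, which is how the study quotes the result.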
The 20 percent decrease in ozone depletion during the winter months from 2005 to 2016 as determined from MLS ozone measurements was expected. "This is very close to what our model predicts we should see for this amount of chlorine decline," Strahan said. "This gives us confidence that the decrease in ozone depletion through mid-September shown by MLS data is due to declining levels of chlorine coming from CFCs. But we're not yet seeing a clear decrease in the size of the ozone hole because that's controlled mainly by temperature after mid-September, which varies a lot from year to year."
Looking forward, the Antarctic ozone hole should continue to recover gradually as CFCs leave the atmosphere, but complete recovery will take decades. "CFCs have lifetimes from 50 to 100 years, so they linger in the atmosphere for a very long time," said Anne Douglass, a fellow atmospheric scientist at Goddard and the study's co-author. "As far as the ozone hole being gone, we're looking at 2060 or 2080. And even then there might still be a small hole."
Study solves a conflict in the post-2006 atmospheric methane budget
January 2, 2018: A new NASA-led study has solved a puzzle involving the recent rise in atmospheric methane, a potent greenhouse gas, with a new calculation of emissions from global fires. The new study resolves what looked like irreconcilable differences in explanations for the increase. 84)
Methane emissions have been rising sharply since 2006. Different research teams have produced viable estimates for two known sources of the increase: emissions from the oil and gas industry, and microbial production in wet tropical environments like marshes and rice paddies. But when these estimates were added to estimates of other sources, the sum was considerably more than the observed increase. In fact, each new estimate was large enough to explain the whole increase by itself.
John Worden of NASA's Jet Propulsion Laboratory in Pasadena, California, and colleagues focused on fires because they're also changing globally. The area burned each year decreased about 12 percent between the early 2000s and the more recent period of 2007 to 2014, according to a new study using observations by NASA's MODIS (Moderate Resolution Imaging Spectroradiometer) satellite instrument. The logical assumption would be that methane emissions from fires have decreased by about the same percentage. Using satellite measurements of methane and carbon monoxide, Worden's team found the real decrease in methane emissions was almost twice as much as that assumption would suggest.
When the research team subtracted this large decrease from the sum of all emissions, the methane budget balanced correctly, with room for both fossil fuel and wetland increases. The research is published in the journal Nature Communications. 85)
Most methane molecules in the atmosphere don't have identifying features that reveal their origin. Tracking down their sources is a detective job involving multiple lines of evidence: measurements of other gases, chemical analyses, isotopic signatures, observations of land use, and more. "A fun thing about this study was combining all this different evidence to piece this puzzle together," Worden said.
Carbon isotopes in the methane molecules are one clue. Of the three methane sources examined in the new study, emissions from fires contain the largest percentage of heavy carbon isotopes, microbial emissions have the smallest, and fossil fuel emissions are in between. Another clue is ethane, which (like methane) is a component of natural gas. An increase in atmospheric ethane indicates increasing fossil fuel sources. Fires emit carbon monoxide as well as methane, and measurements of that gas are a final clue.
Worden's team used carbon monoxide and methane data from the Measurements of Pollutants in the Troposphere instrument on NASA's Terra satellite and the Tropospheric Emission Spectrometer instrument on NASA's Aura to quantify fire emissions of methane. The results show these emissions have been decreasing much more rapidly than expected.
Combining isotopic evidence from ground surface measurements with the newly calculated fire emissions, the team showed that about 17 teragrams per year of the increase is due to fossil fuels, another 12 is from wetlands or rice farming, while fires are decreasing by about 4 teragrams per year. The three numbers combine to a net emissions increase of ~25 Tg/year of CH4 — the same as the observed increase.
The magnitude of the global CH4 masses involved is illustrated by: 1 Tg (teragram) = 10^12 g = 1 million metric tons. Methane emissions are increasing by about 25 Tg/year, against total current emissions of ~550 Tg/year.
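The budget closure described above is plain bookkeeping, using the values quoted in the text:

```python
# Methane budget bookkeeping (values in Tg CH4/yr, as quoted in the text).
fossil_fuel_increase = 17   # oil and gas industry
wetland_rice_increase = 12  # wetlands and rice farming
fire_decrease = 4           # decline in biomass burning emissions

# Net change: the two source increases minus the fire decrease.
net_increase = fossil_fuel_increase + wetland_rice_increase - fire_decrease
print(f"net methane emissions increase: ~{net_increase} Tg/yr")
```

The sum matches the ~25 Tg/year observed increase, which is precisely what earlier estimates (lacking the fire decrease) could not do.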
Worden's coauthors are at the NCAR (National Center for Atmospheric Research), Boulder, Colorado; and the Netherlands Institute for Space Research and University of Utrecht, both in Utrecht, the Netherlands.
Figure 48: This time series was created using data from the MODIS instrument data onboard NASA's Terra and Aqua satellites. The burned area is estimated by applying an algorithm that detects rapid changes in visible and infrared surface reflectance imagery. Fires typically darken the surface in the visible part of the electromagnetic spectrum, and brighten the surface in several wavelength bands in the shortwave infrared that are sensitive to the surface water content of vegetation (image credit: NASA/GSFC/SVS)
Legend to Figure 48: Thermal emissions from actively burning fires also are measured by MODIS and are used to improve the burned area estimates in croplands and other areas where the fire sizes are relatively small. This animation portrays burned area between September 2000 and August 2015 as a percent of the 1/4 degree grid cell that was burned each month. The values on the color bar are on a log scale, so the regions shown in blue and green shades indicate small burned areas while those in red and orange represent a larger percent of the region burned. Beneath the burned area, the seasonal Blue Marble landcover shows the advance and retreat of snow in the northern hemisphere.
Trend in CH4 emissions from fires. Figure 49 shows the time series of CH4 emissions that were obtained from GFEDv4s (Global Fire Emissions Database, version 4s) and top-down estimates based on CO emission estimates and GFED4s-based emission ratios. The CO-based fire CH4 emissions estimates amount to 14.8 ± 3.8 Tg CH4 per year for the 2001–2007 time period and 11.1 ± 3 Tg CH4 per year for the 2008–2014 time period, with a 3.7 ± 1.4 Tg CH4 per year decrease between the two time periods. The mean burnt area (a priori)-based estimate from GFED4s is slightly larger and shows a slightly smaller decrease (2.3 Tg CH4 per year) in fire emissions after 2007 relative to the 2001–2006 time period. The range of uncertainties (shown as blue error bars in Figure 49) is determined by the uncertainty in top-down CO emission estimates that are derived empirically using the approaches discussed in the Methods. The red shading describes the range of uncertainty stemming from uncertainties in CH4/CO emission factors (Methods). By assuming temporally constant sector-specific CH4/CO emission factors, we find that mean 2001–2014 emissions average to 12.9 ± 3.3 Tg CH4 per year, and the decrease averages to 3.7 ± 1.4 Tg CH4 per year for 2008–2014, relative to 2001–2007. This decrease is largely accounted for by a 2.9 ± 1.2 Tg CH4 per year decrease during 2006–2008, which is primarily attributable to a biomass burning decrease in Indonesia and South America.
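The quoted 3.7 Tg CH4/yr decrease is just the difference between the two period means, which is easy to verify:

```python
# Period-mean difference for CO-based fire CH4 emissions
# (values in Tg CH4/yr, as quoted in the text).
mean_2001_2007 = 14.8
mean_2008_2014 = 11.1

decrease = mean_2001_2007 - mean_2008_2014
print(f"decrease between periods: {decrease:.1f} Tg CH4/yr")
```

The ±1.4 Tg CH4/yr uncertainty on this difference comes from propagating the uncertainties of the two period means, which the simple subtraction above does not capture.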
Figure 49: Trend of methane emissions from biomass burning. Expected methane emissions from fires based on the Global Fire Emissions Database (black) and the CO emissions plus CH4/CO ratios shown here (red). The range of uncertainties in blue is due to the calculated errors from the CO emissions estimate and the shaded red describes the range of error from uncertainties in the CH4/CO emission factors (image credit: Methane Study Team)