
Summer no sweat for Aussies but winter freeze fatal

Australians are more likely to die during unseasonably cold winters than hotter than average summers, QUT research has found.

Across the country, severe winters that are colder and drier than normal are a far bigger risk to health than sweltering summers that are hotter than average.

QUT Associate Professor Adrian Barnett, a statistician with the Institute of Health and Biomedical Innovation and the lead researcher of the study, said death rates in Australian cities were up to 30 per cent higher in winter than summer.

The researchers analyzed temperature, humidity and mortality data from 1988 to 2009 for Adelaide, Brisbane, Melbourne, Perth and Sydney.

Professor Barnett said the finding that hotter or more humid summers had no effect on mortality was "surprising."

"We know that heatwaves kill people in the short-term, but our study did not find any link between hotter summers and higher deaths," he said.

"The increase in deaths during colder winter could be because Australians are well-prepared for whatever summer throws at them, but are less able to cope with cold weather. There isn't the same focus on preparing for cold weather as there is for hot weather, for example through public health campaigns or even wearing the right sort of clothes.

"The strongest increase in deaths during a colder winter was in Brisbane, the city with the warmest climate, with an extra 59 deaths a month on average for a one degree decrease in mean winter temperature."

"Brisbane has the mildest winter of the five cities but has the greatest vulnerability. We believe this is because most homes are designed to lose heat in summer, which also allows cold outdoor air to get inside during winter."

Professor Barnett said the findings of the study, published in the journal Environmental Research, could trigger more prevention programs to help reduce the future burden on the health system.

"Excess winter deaths have a significant impact on health systems across Australia," he said.

"There are extra demands on doctors, hospitals and emergency departments in winter months, especially for cardiovascular and respiratory diseases which are triggered by exposure to cold weather.

"Our findings show the winter increases in mortality are predictable so ramping up public health measures, such as influenza vaccinations and insulating homes, particularly for vulnerable groups, should be considered to try to reduce the impact of severe winters."

Journal Reference:

Cunrui Huang, Cordia Chu, Xiaoming Wang, Adrian G. Barnett. Unusually cold and dry winters increase mortality in Australia. Environmental Research, 2015; 136: 1 DOI: 10.1016/j.envres.2014.08.046


The legend of the Kamikaze typhoons

In the late 13th century, Kublai Khan, ruler of the Mongol Empire, launched one of the largest armadas of its time in an attempt to conquer Japan. Early narratives describe the decimation and dispersal of these fleets by the "Kamikaze" of CE 1274 and CE 1281 -- a pair of intense typhoons divinely sent to protect Japan from invasion.

These historical accounts are prone to exaggeration, and significant questions remain regarding the occurrence and true intensity of these legendary typhoons. For independent insight, we provide a new 2,000-year sedimentary reconstruction of typhoon overwash from a coastal lake near the location of the Mongol invasions. Two prominent storm deposits date to the timing of the Kamikaze typhoons and support the typhoons having been of significant intensity.

Our new storm reconstruction also indicates that events of this nature were more frequent in the region during the timing of the Mongol invasions. Results support the view that the paired Kamikaze typhoons played an important role in preventing the early conquest of Japan by Mongol fleets. In doing so, the events may provide one of the earliest historical cases for the shaping of a major geopolitical boundary by an increased probability of extreme weather due to changing atmospheric and oceanic conditions.

Journal Reference:

J. D. Woodruff, K. Kanamaru, S. Kundu, T. L. Cook. Depositional evidence for the Kamikaze typhoons and links to changes in typhoon climatology. Geology, 2014; DOI: 10.1130/G36209.1


Deep Space Climate Observatory to provide 'EPIC' views of Earth

NASA has contributed two Earth science instruments to NOAA's space weather observing satellite, the Deep Space Climate Observatory (DSCOVR), set to launch in January 2015. One of the instruments, the Earth Polychromatic Imaging Camera (EPIC), will image the entire Earth in a single picture, something that hasn't been done before from a satellite. EPIC will also provide valuable atmospheric data.

Currently, to get an entire Earth view, scientists have to piece together images from satellites in orbit. With the launch of the National Oceanic and Atmospheric Administration's (NOAA) DSCOVR and the EPIC instrument, scientists will get pictures of the entire sunlit side of Earth. To get that view, EPIC will orbit the first sun-Earth Lagrange point (L1), 1 million miles from Earth. At this location, four times farther away than the orbit of the Moon, the gravitational pulls of the sun and Earth cancel out, providing a stable orbit for DSCOVR. Most other Earth-observing satellites circle the planet within 22,300 miles.
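
As a rough illustration of why L1 is special, the balance point can be located numerically from Newtonian gravity. The sketch below is a minimal calculation under simplifying assumptions (rounded constants, circular orbits); it is not mission analysis, just the balance condition the article describes.

```python
# Minimal sketch: numerically locate the sun-Earth L1 point, where the sun's
# pull, Earth's pull, and the centripetal requirement of co-orbiting with
# Earth balance. Constants are rounded; this is only an illustration.
from scipy.optimize import brentq

G   = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_s = 1.989e30       # solar mass, kg
M_e = 5.972e24       # Earth mass, kg
R   = 1.496e11       # mean sun-Earth distance, m
w2  = G * (M_s + M_e) / R**3   # square of Earth's orbital angular velocity

def net_accel(d):
    """Residual acceleration at distance d sunward of Earth (m/s^2)."""
    r_sun = R - d                     # distance from the sun
    return G * M_s / r_sun**2 - G * M_e / d**2 - w2 * r_sun

d_L1 = brentq(net_accel, 1e8, 1e10)   # bracket the root between 1e8 and 1e10 m
print(f"L1 is ~{d_L1 / 1.609e3 / 1e6:.2f} million miles from Earth")  # ~0.93
```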

"Unlike personal cameras, EPIC will take images in 10 very narrow wavelength ranges," said Adam Szabo, DSCOVR project scientist at NASA's Goddard Space Flight Center, Greenbelt, Maryland. "Combining these different wavelength images allows the determination of physical quantities like ozone, aerosols, dust and volcanic ash, cloud height, or vegetation cover. These results will be distributed as different publicly available data products allowing their combination with results from other missions."

These data products are of interest to climate science, as well as hydrology, biogeochemistry, and ecology. Data will also provide insight into Earth's energy balance.

EPIC was built by Lockheed Martin's Advanced Technology Center in Palo Alto, California. It is a 30 centimeter (11.8 inch) telescope that measures in the ultraviolet and visible areas of the spectrum. EPIC images will have a resolution of between 25 and 35 kilometers (15.5 to 21.7 miles).



Glacier beds can get slipperier at higher sliding speeds

As a glacier's sliding speed increases, the bed beneath the glacier can grow slipperier, according to laboratory experiments conducted by Iowa State University glaciologists.

They say including this effect in efforts to calculate future increases in glacier speeds could improve predictions of ice volume lost to the oceans and the rate of sea-level rise.

The glaciologists -- Lucas Zoet, a postdoctoral research associate, and Neal Iverson, a professor of geological and atmospheric sciences -- describe the results of their experiments in the Journal of Glaciology. The paper uses data collected from a newly constructed laboratory tool, the Iowa State University Sliding Simulator, to investigate glacier sliding. The device was used to explore the relationship between drag and sliding speed for comparison with the predictions of theoretical models.

"We really have a unique opportunity to study the base of glaciers with these experiments," said Zoet, the lead author of the paper. "The other tactic you might take is studying these relationships with field observations, but with field data so many different processes are mixed together that it becomes hard to untangle the relevant data from the noise."

Data collected by the researchers show that resistance to glacier sliding -- the drag that the bed exerts on the ice -- can decrease in response to increasing sliding speed. This decrease in drag with increasing speed, although predicted by some theoreticians as long as 45 years ago, is the opposite of what is usually assumed in mathematical models of the flow of ice sheets.

These are the first empirical results demonstrating that as ice slides at an increasing speed -- perhaps in response to changing weather or climate -- the bed can become slipperier, which could promote still faster glacier flow.

The response of glaciers to changing climate is one of the largest potential contributors to sea-level rise. Predicting glacier response to climate change depends on properly characterizing the way a glacier slides over its bed. There has been a half-century debate among theoreticians as to how to do that.

The simulator features a ring of ice about 8 inches thick and about 3 feet across that is rotated over a model glacier bed. Below the ice is a hydraulic press that can simulate the weight of a glacier several hundred yards thick. Above are motors that can rotate the ice ring over the bed at either a constant speed or a constant stress. A circulating, temperature-regulated fluid keeps the ice at its melting temperature -- a necessary condition for significant sliding.

"About six years were required to design, construct, and work the bugs out of the new apparatus," Iverson said, "but it is performing well now and allowing hypothesis tests that were formerly not possible."



Hurricane Sandy increased incidence of heart attacks, stroke in hardest-hit New Jersey counties

Heart attacks and strokes are more likely to occur during extreme weather and natural disasters such as earthquakes and floods. Researchers at the Cardiovascular Institute of New Jersey at Rutgers Robert Wood Johnson Medical School have found evidence that Hurricane Sandy, commonly referred to as a superstorm, had a significant effect on cardiovascular events, including myocardial infarction (heart attack) and stroke, in the high-impact areas of New Jersey two weeks following the 2012 storm. The study, led by Joel N. Swerdel, MS, MPH, an epidemiologist at the Cardiovascular Institute and the Rutgers School of Public Health, was published in the Journal of the American Heart Association.

Utilizing the Myocardial Infarction Data Acquisition System (MIDAS), the researchers examined changes in the incidence of and mortality from myocardial infarctions and strokes from 2007 to 2012 for two weeks prior to and two weeks after October 29, the date of Hurricane Sandy. MIDAS is an administrative database containing hospital records of all patients discharged from non-federal hospitals in New Jersey with a cardiovascular disease diagnosis or invasive cardiovascular procedure.
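
The comparison logic is straightforward to sketch: count events in the two-week window after the storm date and compare them with the same calendar window in prior years. The counts in the Python sketch below are made up purely to illustrate the calculation; they are not MIDAS data.

```python
# Illustrative sketch of the before/after comparison described above: event
# counts in the two weeks after October 29 are compared with the same
# calendar window in the five prior years. All counts here are hypothetical.
baseline_counts = [820, 845, 860, 838, 852]   # hypothetical MI counts, 2007-2011
post_storm_2012 = 1030                        # hypothetical count after Sandy

baseline_mean = sum(baseline_counts) / len(baseline_counts)
pct_increase = 100 * (post_storm_2012 - baseline_mean) / baseline_mean
excess_events = post_storm_2012 - baseline_mean

print(f"baseline mean: {baseline_mean:.0f}")
print(f"increase: {pct_increase:.1f}%  (~{excess_events:.0f} excess events)")
```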

In the two weeks following Hurricane Sandy, the researchers found that in the eight counties determined to be high-impact areas, there was a 22 percent increase in heart attacks as compared with the same time period in the previous five years. In the low-impact areas (the remaining 13 counties), the increase was less than one percent. Thirty-day mortality from heart attacks also increased, by 31 percent, in the high-impact areas.

"We estimate that there were 69 more deaths from myocardial infarction during the two weeks following Sandy than would have been expected. This is a significant increase over typical non-emergency periods," said Swerdel. "Our hope is that the research may be used by the medical community, particularly emergency medical services, to prepare for the change in volume and severity of health incidents during extreme weather events."

In regard to stroke, the investigators found an increase of 7 percent compared to the same time period in the prior five years in areas of the state impacted the most. There was no change in the incidence of stroke in low-impact areas. There also was no change in the rate of 30-day mortality due to stroke in either the high- or low-impact areas.

"Hurricane Sandy had unprecedented environmental, financial and health consequences on New Jersey and its residents, all factors that can increase the risk of cardiovascular events," said John B. Kostis, MD, director of the Cardiovascular Institute of New Jersey and associate dean for cardiovascular research at Rutgers Robert Wood Johnson Medical School. "Increased stress and physical activity, dehydration and a decreased attention or ability to manage one's own medical needs probably caused cardiovascular events during natural disasters or extreme weather. Also, the disruption of communication services, power outages, gas shortages, and road closures, also were contributing factors to efficiently obtaining medical care."

Journal Reference:

J. N. Swerdel, T. M. Janevic, N. M. Cosgrove, J. B. Kostis. The Effect of Hurricane Sandy on Cardiovascular Events in New Jersey. Journal of the American Heart Association, 2014; 3 (6): e001354 DOI: 10.1161/JAHA.114.001354


Birds sensed severe storms and fled before tornado outbreak

Golden-winged warblers apparently knew in advance that a storm -- one that would spawn 84 confirmed tornadoes and kill at least 35 people last spring -- was coming, according to a report in the Cell Press journal Current Biology on December 18. The birds left the scene well before devastating supercell storms blew in.

The discovery was made quite by accident while researchers were testing whether the warblers, which weigh "less than two nickels," could carry geolocators on their backs. It turns out they can, and much more. With a big storm brewing, the birds took off from their breeding ground in the Cumberland Mountains of eastern Tennessee, where they had only just arrived, for an unplanned migratory event. All told, the warblers travelled 1,500 kilometers in 5 days to avoid the historic tornado-producing storms.

"The most curious finding is that the birds left long before the storm arrived," says Henry Streby of the University of California, Berkeley. "At the same time that meteorologists on The Weather Channel were telling us this storm was headed in our direction, the birds were apparently already packing their bags and evacuating the area."

The birds fled from their breeding territories more than 24 hours before the arrival of the storm, Streby and his colleagues report. The researchers suspect that the birds did it by listening to infrasound associated with the severe weather, at a level well below the range of human hearing.

"Meteorologists and physicists have known for decades that tornadic storms make very strong infrasound that can travel thousands of kilometers from the storm," Streby explains. While the birds might pick up on some other cue, he adds, the infrasound from severe storms travels at exactly the same frequency the birds are most sensitive to hearing.

The findings show that birds that follow annual migratory routes can also take off on unplanned trips at other times of the year when conditions require it. That's probably good news for birds, as climate change is expected to produce storms that are both stronger and more frequent. But there surely must be a downside as well, the researchers say.

"Our observation suggests [that] birds aren't just going to sit there and take it with regards to climate change, and maybe they will fare better than some have predicted," Streby says. "On the other hand, this behavior presumably costs the birds some serious energy and time they should be spending on reproducing." The birds' energy-draining journey is just one more pressure human activities are putting on migratory songbirds.

In the coming year, Streby's team will deploy hundreds of geolocators on the golden-winged warblers and related species across their entire breeding range to find out where they spend the winter and how they get there and back.

"I can't say I'm hoping for another severe tornado outbreak," Streby says, "but I am eager to see what unpredictable things happen this time."



When it comes to variations in crop yield, climate has a big say

What impact will future climate change have on food supply? That depends in part on the extent to which variations in crop yield are attributable to variations in climate. A new report from researchers at the University of Minnesota Institute on the Environment has found that climate variability historically accounts for one-third of yield variability for maize, rice, wheat and soybeans worldwide -- the equivalent of 36 million metric tons of food each year. This provides valuable information planners and policy makers can use to target efforts to stabilize farmer income and food supply and so boost food security in a warming world.

The work was published in the journal Nature Communications by Deepak Ray, James Gerber, Graham MacDonald and Paul West of IonE's Global Landscapes Initiative. The researchers looked at newly available production statistics for maize, rice, wheat and soybean from 13,500 political units around the world between 1979 and 2008, along with precipitation and temperature data. The team used these data to calculate year-to-year fluctuations and estimate how much of the yield variability could be attributed to climate variability.

About 32 to 39 percent of year-to-year variability for the four crops could be explained by climate variability. This is substantial -- the equivalent of 22 million metric tons of maize, 3 million metric tons of rice, 9 million metric tons of wheat, and 2 million metric tons of soybeans per year.

The links between climate and yield variability differed among regions. Climate variability explained much of yield variability in some of the most productive regions, but far less in low-yielding regions. "This means that really productive areas contribute to food security by having a bumper crop when the weather is favorable but can be hit really hard when the weather is bad and contribute disproportionately to global food insecurity," says Ray. "At the other end of the spectrum, low-yielding regions seem to be more resilient to bad-weather years but don't see big gains when the weather is ideal." Some regions, such as in parts of Asia and Africa, showed little correlation between climate variability and yield variability.

More than 60 percent of the yield variability can be explained by climate variability in regions that are important producers of major crops, including the Midwestern U.S., the North China Plains, western Europe and Japan. Depicted as global maps, the results show where and how much climate variability explains yield variability.
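
One common way to estimate the share of yield variability attributable to climate is to regress detrended yield anomalies on temperature and precipitation anomalies and read off the explained variance. The sketch below illustrates that idea on synthetic data; the study's actual statistical models are more elaborate.

```python
# Sketch of the attribution idea: regress year-to-year yield anomalies on
# temperature and precipitation anomalies, and take R^2 as the share of
# variability explained by climate. All data below are synthetic.
import numpy as np

rng = np.random.default_rng(0)
n_years = 30
temp_anom = rng.normal(0, 1.0, n_years)     # detrended temperature anomalies
precip_anom = rng.normal(0, 1.0, n_years)   # detrended precipitation anomalies
noise = rng.normal(0, 1.0, n_years)         # management, pests, etc.
yield_anom = -0.4 * temp_anom + 0.5 * precip_anom + noise

X = np.column_stack([np.ones(n_years), temp_anom, precip_anom])
beta, *_ = np.linalg.lstsq(X, yield_anom, rcond=None)
fitted = X @ beta
r2 = 1 - np.var(yield_anom - fitted) / np.var(yield_anom)
print(f"share of yield variability explained by climate: {r2:.0%}")
```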

The research team is now looking at historical records to see whether the variability attributable to climate has changed over time -- and if so, what aspects of climate are most pertinent.

"Yield variability can be a big problem from both economic and food supply standpoints," Ray said. "The results of this study and our follow-up work can be used to improve food system stability around the world by identifying hot spots of food insecurity today as well as those likely to be exacerbated by climate change in the future."



In the mood to trade? Weather may influence institutional investors' stock decisions

Weather changes may affect how institutional investors decide on stock plays, according to a new study by a team of finance researchers. Their findings suggest sunny skies put professional investors more in a mood to buy, while cloudy conditions tend to discourage stock purchases.

The researchers conclude that cloudier days increase the perception that individual stocks and the Dow Jones Industrials are overpriced, increasing the inclination for institutions to sell.

The research paper, "Weather-Induced Mood, Institutional Investors, and Stock Returns," has been published in the January 2015 issue of The Review of Financial Studies. The research was a collaboration between Case Western Reserve University's Dasol Kim and three other finance professors (William Goetzmann of Yale University, Alok Kumar of the University of Miami and Qin Wang of the University of Michigan-Dearborn).

Institutional investors represent large organizations, such as banks, mutual funds, labor union funds and finance or insurance companies that make substantial investments in stocks. Kim said the results of the study are surprising, given that professional investors are well regarded for their financial sophistication.

"We focus on institutional investors because of the important role they have in how stock prices are formed in the markets," said Kim, assistant professor of banking and finance at Case Western Reserve's Weatherhead School of Management. "Other studies have already shown that ordinary retail investors are susceptible to psychological biases in their investment decisions. Trying to evaluate similar questions for institutional investors is challenging, because relevant data is hard to come by."

Building on previous findings from psychological studies about the effect of sunshine on mood, the researchers wanted to learn how mood affects professional investor opinions on their stock market investments.

By linking responses to a survey of investors from the Yale Investor Behavior Project of Nobel Prize-winning economist Robert Shiller, along with institutional stock trade data, to historical weather data from the National Oceanic and Atmospheric Administration, the researchers concluded that, in aggregate, seasonally sunnier weather leads to optimistic responses and a willingness to buy.

The analysis accounts for differences in weather across regions of the country and across seasons. The researchers show that these documented mood effects also influence stock prices, and that the observed impact does not persist for long periods of time.

A summary of the research was also recently featured at The Harvard Law School Forum on Corporate Governance and Financial Regulation.

Journal Reference:

W. N. Goetzmann, D. Kim, A. Kumar, Q. Wang. Weather-Induced Mood, Institutional Investors, and Stock Returns. Review of Financial Studies, 2014; 28 (1): 73 DOI: 10.1093/rfs/hhu063


NASA, NOAA find 2014 warmest year in modern record

The year 2014 ranks as Earth's warmest since 1880, according to two separate analyses by NASA and National Oceanic and Atmospheric Administration (NOAA) scientists.

The 10 warmest years in the instrumental record, with the exception of 1998, have now occurred since 2000. This trend continues a long-term warming of the planet, according to an analysis of surface temperature measurements by scientists at NASA's Goddard Institute for Space Studies (GISS) in New York.

In an independent analysis of the raw data, also released Friday, NOAA scientists also found 2014 to be the warmest on record.

"NASA is at the forefront of the scientific investigation of the dynamics of the Earth's climate on a global scale," said John Grunsfeld, associate administrator for the Science Mission Directorate at NASA Headquarters in Washington. "The observed long-term warming trend and the ranking of 2014 as the warmest year on record reinforces the importance for NASA to study Earth as a complete system, and particularly to understand the role and impacts of human activity."

Since 1880, Earth's average surface temperature has warmed by about 1.4 degrees Fahrenheit (0.8 degrees Celsius), a trend that is largely driven by the increase in carbon dioxide and other human emissions into the planet's atmosphere. The majority of that warming has occurred in the past three decades.

"This is the latest in a series of warm years, in a series of warm decades. While the ranking of individual years can be affected by chaotic weather patterns, the long-term trends are attributable to drivers of climate change that right now are dominated by human emissions of greenhouse gases," said GISS Director Gavin Schmidt.

While 2014 temperatures continue the planet's long-term warming trend, scientists still expect to see year-to-year fluctuations in average global temperature caused by phenomena such as El Niño or La Niña. These phenomena warm or cool the tropical Pacific and are thought to have played a role in the flattening of the long-term warming trend over the past 15 years. However, 2014's record warmth occurred during an El Niño-neutral year.

"NOAA provides decision makers with timely and trusted science-based information about our changing world," said Richard Spinrad, NOAA chief scientist. "As we monitor changes in our climate, demand for the environmental intelligence NOAA provides is only growing. It's critical that we continue to work with our partners, like NASA, to observe these changes and to provide the information communities need to build resiliency."

Regional differences in temperature are more strongly affected by weather dynamics than the global mean. For example, in the U.S. in 2014, parts of the Midwest and East Coast were unusually cool, while Alaska and three western states -- California, Arizona and Nevada -- experienced their warmest year on record, according to NOAA.

The GISS analysis incorporates surface temperature measurements from 6,300 weather stations, ship- and buoy-based observations of sea surface temperatures, and temperature measurements from Antarctic research stations. This raw data is analyzed using an algorithm that takes into account the varied spacing of temperature stations around the globe and urban heating effects that could skew the calculation. The result is an estimate of the global average temperature difference from a baseline period of 1951 to 1980.
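
The anomaly-and-weighting step can be sketched compactly. The Python example below uses synthetic gridded data, not the GISTEMP algorithm itself: each cell is expressed relative to its own 1951-1980 mean, then cells are averaged with cosine-of-latitude weights so equal areas count equally.

```python
# Minimal sketch of the anomaly approach described above. Array shapes and
# values are illustrative stand-ins for gridded station data.
import numpy as np

years = np.arange(1880, 2015)
lats = np.linspace(-87.5, 87.5, 36)              # cell-center latitudes
rng = np.random.default_rng(1)
temps = (14 + 0.008 * (years - 1880)[:, None]    # slow warming trend
         + rng.normal(0, 0.3, (years.size, lats.size)))  # weather noise

base = (years >= 1951) & (years <= 1980)
anomalies = temps - temps[base].mean(axis=0)     # per-cell baseline removed

weights = np.cos(np.deg2rad(lats))               # area weighting by latitude
global_anom = (anomalies * weights).sum(axis=1) / weights.sum()
print(f"2014 anomaly: {global_anom[years == 2014][0]:+.2f} C")
```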

NOAA scientists used much of the same raw temperature data, but a different baseline period. They also employ their own methods to estimate global temperatures.

GISS is a NASA laboratory managed by the Earth Sciences Division of the agency's Goddard Space Flight Center, in Greenbelt, Maryland. The laboratory is affiliated with Columbia University's Earth Institute and School of Engineering and Applied Science in New York.

NASA monitors Earth's vital signs from land, air and space with a fleet of satellites, as well as airborne and ground-based observation campaigns. NASA develops new ways to observe and study Earth's interconnected natural systems with long-term data records and computer analysis tools to better see how our planet is changing. The agency shares this unique knowledge with the global community and works with institutions in the United States and around the world that contribute to understanding and protecting our home planet.

The data set of 2014 surface temperature measurements is available at:

http://data.giss.nasa.gov/gistemp/

The methodology used to make the temperature calculation is available at:

http://data.giss.nasa.gov/gistemp/sources_v3/

For more information about NASA's Earth science activities, visit:

http://www.nasa.gov/earthrightnow



Atmospheric rivers, cloud-creating aerosol particles, and California reservoirs

In the midst of the California rainy season, scientists are embarking on a field campaign designed to improve the understanding of the natural and human-caused phenomena that determine when and how the state gets its precipitation. They will do so by studying atmospheric rivers, meteorological events that include the famous rainmaker known as the Pineapple Express.

CalWater 2015 is an interagency, interdisciplinary field campaign starting January 14, 2015. It will entail four research aircraft flying through major storms while a ship outfitted with additional instruments cruises below. The research team includes scientists from Scripps Institution of Oceanography at UC San Diego, the Department of Energy's Pacific Northwest National Laboratory, NOAA, and NASA, and uses resources from the DOE's Atmospheric Radiation Measurement (ARM) Climate Research Facility -- a national scientific user facility.

The study will help provide a better understanding of how California gets its rain and snow, how human activities are influencing precipitation, and how the new science provides potential to inform water management decisions relating to drought and flood.

"After several years in the making by an interdisciplinary science team, and through support from multiple agencies, the CalWater 2015 field campaign is set to observe the key conditions offshore and over California like has never been possible before," said Scripps climate researcher Marty Ralph, a CalWater lead investigator. "These data will ultimately help develop better climate projections for water and will help test the potential of using existing reservoirs in new ways based on atmospheric river forecasts."

Like land-based rivers, atmospheric rivers carry massive amounts of moisture long distances -- in California's case, from the tropics to the U.S. West Coast. When an atmospheric river hits the coast, it releases its moisture as precipitation. How much and whether it falls as rain or snow depends on aerosols -- tiny particles made of dust, sea salt, volatile molecules, and pollution.

The researchers will examine the strength of atmospheric rivers, which produce up to 50 percent of California's precipitation and can transport 10-20 times the flow of the Mississippi River. They will also explore how to predict when and where atmospheric rivers will hit land, as well as the role of ocean evaporation and how the ocean changes after a river passes.

"Climate and weather models have a hard time getting precipitation right," said Ralph. "In fact, the big precipitation events that are so important for water supply and can cause flooding, mostly due to atmospheric rivers, are some of the most difficult to predict with useful accuracy. The severe California drought is essentially a result of a dearth of atmospheric rivers, while, conversely, the risk of Katrina-like damages for California due to severe ARs has also been quantified in previous research."

For the next month or more, instrument teams will gather data from the NOAA research vessel Ronald H. Brown and two NOAA, one DOE, and one NASA research aircraft with a coordinated implementation strategy when weather forecasters see atmospheric rivers developing in the Pacific Ocean off the coast of California. NASA will also provide remote sensing data for the project.

"Improving our understanding of atmospheric rivers will help us produce better forecasts of where they will hit and when, and how much rain and snow they will deliver," said Allen White, NOAA research meteorologist and CalWater 2015 mission scientist. "Better forecasts will give communities the environmental intelligence needed to respond to droughts and floods."

Most research flights will originate at McClellan Airfield in Sacramento. Ground-based instruments in Bodega Bay, Calif., and scattered throughout the state will also collect data on natural and human contributions to the atmosphere such as dust and pollution. This data-gathering campaign follows the 2009-2011 CalWater1 field campaign, which yielded new insights into how precipitation processes in the Sierra Nevada can be influenced by different sources of aerosols that seed the clouds.

"This will be an extremely important study in advancing our overall understanding of aerosol impacts on clouds and precipitation," said Kimberly Prather, a CalWater lead investigator and Distinguished Chair in Atmospheric Chemistry with appointments at Scripps Oceanography and the Department of Chemistry and Biochemistry at UC San Diego. "It will build upon findings from CalWater1, adding multiple aircraft to directly probe how aerosols from different sources, local, ocean, as well as those from other continents, are influencing clouds and precipitation processes over California."

"We are collecting this data to improve computer models of rain that represent many complex processes and their interactions with the environment," said PNNL's Leung. "Atmospheric rivers contribute most of the heavy rains along the coast and mountains in the West. We want to capture those events better in our climate models used to project changes in extreme events in the future."

Prather's group showed during CalWater1 that aerosols can have competing effects, depending on their source. Intercontinental mineral dust and biological particles possibly from the ocean corresponded to events with more precipitation, while aerosols produced by local air pollution correlated with less precipitation.

The CalWater 2015 campaign comprises two interdependent efforts. Major investments in facilities, including aircraft, ship time, and sensors, come from NOAA. Marty Ralph, Kim Prather, and Dan Cayan from Scripps, and Chris Fairall, Ryan Spackman, and Allen White of NOAA lead CalWater-2. The DOE-funded ARM Cloud Aerosol Precipitation Experiment (ACAPEX) is led by Ruby Leung from PNNL. NSF and NASA have also provided major support for aspects of CalWater, leveraging the NOAA and DOE investments.



Global warming's influence on extreme weather

Understanding the cause-and-effect relationship between global warming and record-breaking weather requires asking precisely the right questions.

Extreme climate and weather events such as record high temperatures, intense downpours and severe storm surges are becoming more common in many parts of the world. But because high-quality weather records go back only about 100 years, most scientists have been reluctant to say if global warming affected particular extreme events.

On Wednesday, Dec. 17, at the American Geophysical Union's Fall Meeting in San Francisco, Noah Diffenbaugh, an associate professor of environmental Earth system science at the Stanford School of Earth Sciences, will discuss approaches to this challenge in a talk titled "Quantifying the Influence of Observed Global Warming on the Probability of Unprecedented Extreme Climate Events." He will focus on weather events that -- at the time they occur -- are more extreme than any other event in the historical record.

Diffenbaugh emphasizes that asking precisely the right question is critical for finding the correct answer.

"The media are often focused on whether global warming caused a particular event," said Diffenbaugh, who is a senior fellow at the Stanford Woods Institute for the Environment. "The more useful question for real-world decisions is: 'Is the probability of a particular event statistically different now compared with a climate without human influence?'"

Diffenbaugh said the research requires three elements: a long record of climate observations; a large collection of climate model experiments that accurately simulate the observed variations in climate; and advanced statistical techniques to analyze both the observations and the climate models.
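
Framed that way, the calculation reduces to comparing exceedance probabilities in two ensembles. The sketch below uses synthetic Gaussian distributions as stand-ins for the "with" and "without human influence" climates; actual attribution studies rely on large climate-model ensembles and careful statistics.

```python
# Sketch of the probability question posed above: how often is a record
# threshold exceeded with and without human influence? Distributions here
# are synthetic stand-ins, in standard-deviation units.
import numpy as np

rng = np.random.default_rng(2)
natural = rng.normal(0.0, 1.0, 100_000)   # counterfactual climate
forced  = rng.normal(0.8, 1.0, 100_000)   # climate with observed warming
record = 2.5                              # record-breaking event threshold

p_nat = (natural > record).mean()
p_obs = (forced > record).mean()
print(f"P(event) without human influence: {p_nat:.4f}")
print(f"P(event) with observed warming:   {p_obs:.4f}")
print(f"probability ratio: {p_obs / max(p_nat, 1e-9):.1f}x")
```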

One research challenge involves having just a few decades or a century of high-quality weather data with which to make sense of events that might occur once every 1,000 or 10,000 years in a theoretical climate without human influence.

But decision makers need to appreciate the influence of global warming on extreme climate and weather events.

"If we look over the last decade in the United States, there have been more than 70 events that have each caused at least $1 billion in damage, and a number of those have been considerably more costly," said Diffenbaugh. "Understanding whether the probability of those high-impact events has changed can help us to plan for future extreme events, and to value the costs and benefits of avoiding future global warming."



Even in restored forests, extreme weather strongly influences wildfire's impacts

The 2013 Rim Fire, the largest wildland fire ever recorded in the Sierra Nevada region, is still fresh in the minds of Californians, as is the urgent need to bring forests back to a more resilient condition. Land managers are using fire as a tool to mimic past fire conditions, restore fire-dependent forests, and reduce fuels in an effort to lessen the potential for large, high-intensity fires, like the Rim Fire. A study led by the U.S. Forest Service's Pacific Southwest Research Station (PSW) and recently published in the journal Forest Ecology and Management examined how the Rim Fire burned through forests with restored fire regimes in Yosemite National Park to determine whether they were as resistant to high-severity fire as many scientists and land managers expected.

Since the late 1960s, land managers in Yosemite National Park have used prescribed fire and let lower intensity wildland fires burn in an attempt to bring back historical fire regimes after decades of fire suppression. For this study, researchers seized a unique opportunity to study data on forest structure and fuels collected in 2009 and 2010 in Yosemite's old-growth, mixed-conifer forests that had previously burned at low to moderate severity. Using post-Rim Fire data and imagery, researchers found that areas burned on days the Rim Fire was dominated by a large pyro-convective plume -- a powerful column of smoke, gases, ash, and other debris -- burned at moderate to high severity regardless of the number of prior fires, topography, or forest conditions.

"The specific conditions leading to large plume formation are unknown, but what is clear from many observations is that these plumes are associated with extreme burning conditions," says Jamie Lydersen, PSW biological science technician and the study's lead author. "Plumes often form when atmospheric conditions are unstable, and result in erratic fire behavior driven by its own local effect on surface wind and temperatures that override the influence of more generalized climate factors measured at nearby weather stations."

When the extreme conditions caused by these plumes subsided during the Rim Fire, other factors influenced burn severity. "There was a strong influence of elapsed time since the last burn, where forests that experienced fire within the last 14 years burned mainly at low severity in the Rim Fire. Lower elevation areas and those with greater shrub cover tended to burn at higher severity," says Lydersen.

When driven by extreme weather, which often coincides with wildfires that escape initial containment efforts, fires can severely burn large swaths of forest regardless of ownership and fire history. These fires may only be controlled if more forests across the landscape have been managed for fuel reduction to allow early stage suppression before weather- and fuels-driven fire intensity makes containment impossible. Coordination of fire management activities by land management agencies across jurisdictions could favor burning under more moderate weather conditions when wildfires start and reduce the occurrences of harmful, high-intensity fires.



Hurricane-forecast satellites will keep close eyes on the tropics

A set of eight hurricane-forecast satellites being developed at the University of Michigan is expected to give deep insights into how and where storms suddenly intensify--a little-understood process that's becoming more crucial to figure out as the climate changes, U-M researchers say.

The Cyclone Global Navigation Satellite System is scheduled to launch in fall 2016. At the American Geophysical Union Meeting in San Francisco this week, U-M researchers released estimates of how significantly CYGNSS could improve wind speed and storm intensity forecasts.

CYGNSS--said like the swan constellation--is a $173-million NASA mission that U-M is leading with Texas-based Southwest Research Institute. Each of its eight observatories is about the size of a microwave oven. That's much smaller than a typical weather satellite, which is about the size of a van.

The artificial CYGNSS "constellation," as researchers refer to it, will orbit at tropical, hurricane-belt latitudes. Its coverage will stretch from the 38th parallel north near Delaware's latitude to its counterpart in the south just below Buenos Aires.

Because of their arrangement and number, the observatories will be able to measure the same spot on the globe much more often than the weather satellites flying today can. CYGNSS's revisit time will average between four and six hours, and at times, it can be as fast as 12 minutes.

Conventional weather satellites only cross over the same point once or twice a day. Meteorologists can use ground-based Doppler radar to help them make predictions about storms near land, but hurricanes, which form over the open ocean, present a tougher problem.

"The rapid refresh CYGNSS will offer is a key element of how we'll be able to improve hurricane forecasts," said CYGNSS lead investigator Christopher Ruf, director of the U-M Space Physics Research Lab and professor of atmospheric, oceanic and space sciences.

"CYGNSS gets us the ability to measure things that change fast, like extreme weather. Those are the hardest systems to measure with today's satellites. And because the world is warmer and there's more energy to feed storm systems, there's more likelihood of extreme weather."

Through simulations, the researchers quantified the improvement CYGNSS could have on storm intensity predictions. They found that for a wind speed forecast that is off by 33 knots, or 38 miles per hour--the average error with current capabilities--CYGNSS could reduce that by 9 knots, or about 10 mph.

Considering that the categories of hurricane strength ratchet up, on average, every 20 mph, the accuracy boost is "a very significant number," Ruf said.

"I'd describe the feeling about it as guarded excitement," he said. "It's preliminary and it's all based on models. People will be really excited when we get up there and it works."

The numbers could also improve as scientists update weather prediction tools to better use the new kind of information that CYGNSS will provide.

For people who live in common hurricane or typhoon paths, closer wind speed predictions could translate into more accurate estimates of the storm surge at landfall, Ruf said. That's the main way these systems harm people and property.

"The whole ocean gets higher because the wind pushes the water. That's really hard to forecast now and it's an area we hope to make big improvements in," Ruf said.

Researchers expect the satellite system to give them new insights into storm processes. Hurricanes evolve slowly at first, but then they reach a tipping point, says Aaron Ridley, a professor of atmospheric, oceanic and space sciences.

"The hurricane could be meandering across the Atlantic Ocean and then something happens." Ridley said. "It kicks up a notch and people aren't exactly sure why. A lot of scientists would like to study this rapid intensification in more detail. With a normal mission, you might not be able to see it, but with CYGNSS, you have a better chance."

The satellites will operate in a fundamentally different way than their counterparts do. Rather than transmit a signal and read what reflects back, they'll measure how GPS signals from other satellites bounce off the ocean surface. Each of the eight CYGNSS nodes will measure signals from four of the 32 Global Positioning System satellites.

They'll also be able to take measurements through heavy rain--something other weather satellites are, surprisingly, not very good at.



Giant atmospheric rivers add mass to Antarctica's ice sheet

Extreme weather phenomena called atmospheric rivers were behind intense snowstorms recorded in 2009 and 2011 in East Antarctica. The resulting snow accumulation partly offset recent ice loss from the Antarctic ice sheet, report researchers from KU Leuven.

Atmospheric rivers are long, narrow water vapour plumes stretching thousands of kilometres across the sky over vast ocean areas. They are capable of rapidly transporting large amounts of moisture around the globe and can cause devastating precipitation when they hit coastal areas.

Although atmospheric rivers are notorious for their flood-inducing impact in Europe and the Americas, their importance for Earth's polar climate -- and for global sea levels -- is only now coming to light.

In this study, an international team of researchers led by Irina Gorodetskaya of KU Leuven's Regional Climate Studies research group used a combination of advanced modelling techniques and data collected at Belgium's Princess Elisabeth polar research station in East Antarctica's Dronning Maud Land to produce the first ever in-depth look at how atmospheric rivers affect precipitation in Antarctica.

The researchers studied two particular instances of heavy snowfall in the East Antarctic region in detail, one in May 2009 and another in February 2011, and found that both were caused by atmospheric rivers slamming into the East Antarctic coast.

The Princess Elisabeth polar research station recorded snow accumulation equivalent to up to 5 centimetres of water for each of these weather events, amounting to 22 per cent of the total annual snow accumulation in those years.

The findings point to atmospheric rivers' impressive snow-producing power. "When we looked at all the extreme weather events that took place during 2009 and 2011, we found that the nine atmospheric rivers that hit East Antarctica in those years accounted for 80 per cent of the exceptional snow accumulation at Princess Elisabeth station," says Irina Gorodetskaya.

And this can have important consequences for Antarctica's diminishing ice sheet. "There is a need to understand how the flow of ice within Antarctica's ice sheet responds to warming and to gain insight into atmospheric processes, cloud formation and snowfall," adds Nicole Van Lipzig, co-author of the study and professor of geography at KU Leuven.

A separate study found that the Antarctic ice sheet has lost substantial mass in the last two decades -- at an average rate of about 68 gigatons per year during the period 1992-2011.

"The unusually high snow accumulation in Dronning Maud Land in 2009 that we attributed to atmospheric rivers added around 200 gigatons of mass to Antarctica, which alone offset 15 per cent of the recent 20-year ice sheet mass loss," says Irina Gorodetskaya.

"This study represents a significant advance in our understanding of how the global water cycle is affected by atmospheric rivers. It is the first to look at the effect of atmospheric rivers on Antarctica and to explore their role in cryospheric processes of importance to the global sea level in a changing climate," says Martin Ralph, contributor to the study and Director of the Center for Western Weather and Water Extremes at the University of California, San Diego.

"Moving forward, we aim to explore the impact of atmospheric rivers on precipitation in all Antarctic coastal areas using data records covering the longest possible time period. We want to determine exactly how this phenomenon fits into climate models," says Irina Gorodetskaya.

"Our results should not be misinterpreted as evidence that the impacts of global warming will be small or reversed due to compensating effects. On the contrary, they confirm the potential of Earth's warming climate to manifest itself in anomalous regional responses. Thus, our understanding of climate change and its worldwide impact will strongly depend on climate models' ability to capture extreme weather events, such as atmospheric rivers and the resulting anomalies in precipitation and temperature," she concludes.



Muddy forests, shorter winters present challenges for loggers

Stable, frozen ground has long been recognized as a logger's friend, capable of supporting equipment and trucks in marshy or soggy forests. Now, a comprehensive look at weather from 1948 onward shows that the logger's friend is melting.

The study, published in the current issue of the Journal of Environmental Management, finds that the period of frozen ground has declined by an average of two or three weeks since 1948. Over that time, in years with more variability in freezing and thawing, wood harvests have shifted toward red pine and jack pine -- species that grow in sandy, well-drained soil that can support trucks and heavy equipment even when not frozen.

Jack pine, a characteristic north woods Wisconsin species, is declining, and areas that have been harvested are often replaced with a different species, changing the overall ecosystem.

The study was an effort to look at how long-term weather trends affect forestry, says author Adena Rissman, an assistant professor of forest and wildlife ecology at the University of Wisconsin-Madison. "When my co-author, Chad Rittenhouse, and I began this project, we wanted to know how weather affects our ability to support sustainable working forests. We found a significant decline in the duration of frozen ground over the past 65 years, and at the same time, a significant change in the species being harvested."

"This study identifies real challenges facing forest managers, loggers, landowners, and industry," says Rittenhouse, now an assistant research professor of natural resources and the environment at the University of Connecticut. "Once we understood the trends in frozen ground, we realized how pulling out that issue tugged on economics, livelihoods, forest ecology, wildlife habitat and policy."

Mud can make forests impassable in fall, and even more so after the snow melts in spring, making life difficult for companies that buy standing trees, Rittenhouse says. "Nobody wants to get stuck; you lose time and have to get hauled out or wait for the ground to firm up again."

Shorter winters and uncertainty complicate management for logging companies, Rissman adds. "They often need to plan out their jobs for the next six months or year." The same is true for managers of state and county forests, which typically allow two years for a cut to be completed. "In some cases," she says, "they are going to three-year contracts to give more time to get the timber out."

Even if equipment can traverse muddy roads, the ruts it leaves may ruin the road and cause unacceptable erosion. "There is increased attention to rutting on public land, and on private land that is in the state's managed forest program or in a form of sustainable forest certification," says Rissman. "Excessively wet and muddy ground during harvest is a lose-lose-lose for the logger, the landowner and the environment."

The study drew data from weather records from airports, used to model when the ground was frozen; Department of Natural Resources records on harvest levels for various tree species; and interviews with forest managers and loggers.
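
As a toy version of the frozen-ground modeling step, one simple proxy is to count days on which a running mean of daily air temperature stays below freezing. The sketch below assumes that proxy and a made-up annual temperature cycle; the study's actual ground-frost model is more sophisticated.

```python
# Toy proxy for the frozen-ground season: count days whose trailing 7-day
# mean air temperature is below 0 C. Temperatures below are synthetic.
import numpy as np

def frozen_days(daily_temp_c, window=7):
    """Number of days whose trailing `window`-day mean temp is below 0 C."""
    kernel = np.ones(window) / window
    running = np.convolve(daily_temp_c, kernel, mode="valid")
    return int((running < 0).sum())

# Hypothetical year: sinusoidal annual cycle plus day-to-day weather noise.
rng = np.random.default_rng(3)
days = np.arange(365)
temps = 8 - 18 * np.cos(2 * np.pi * (days - 15) / 365) + rng.normal(0, 3, 365)
print(f"frozen-ground days (proxy): {frozen_days(temps)}")
```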

"People in the forestry industry say this is a big deal; winter is normally the most profitable time," Rissman observes. "It's more and more difficult to make a profit in forestry (with) more loggers (taking) on a lot of debt -- they are heavily mechanized, have heavy labor and insurance expenses, and these costs don't end when they don't have work."

The uncertainty about when and where they can work emerged during an interview with a veteran logger, who is quoted as follows in the study: "When I started in the business ... the typical logger ... would shut down and not do anything for the month or two months that the spring break up would last for. Nowadays, with the cost of equipment, and just the cost of insurance on that equipment alone, you're looking for work almost 12 months out of the year."

The shorter winters seem linked to climate change, Rissman acknowledges. "For many people, climate change is something that happens, or not, in places that are far away, at scales that are difficult to see or understand through personal experience. Here's an example of something we can clearly document, of a trend that is having an impact on how forests are managed, right here at home."



Average temperature in Finland has risen by more than two degrees

Over the past 166 years, the average temperature in Finland has risen by more than two degrees. During the observation period, the average increase was 0.14 degrees per decade, which is nearly twice as much as the global average.

According to a recent University of Eastern Finland and Finnish Meteorological Institute study, the rise in temperature has been especially fast over the past 40 years, with the temperature rising by more than 0.2 degrees per decade. "The biggest temperature rise has coincided with November, December and January. Temperatures have also risen faster than the annual average in the spring months, i.e., March, April and May. In the summer months, however, the temperature rise has not been as significant," says Professor Ari Laaksonen of the University of Eastern Finland and the Finnish Meteorological Institute.

As a result of the rising temperature, lakes in Finland get their ice cover later than before, and the ice cover also melts away earlier in the spring. Although the temperature rise during the actual growing season has been moderate, Finnish trees have been observed to blossom earlier than before.

Temperature has risen in leaps

The annual average temperature has risen in two phases, the first from the beginning of the observation period to the late 1930s, and the second from the late 1960s to the present. Since the 1960s, the temperature has risen faster than ever before, with the rise varying between 0.2 and 0.4 degrees per decade. Between the late 1930s and late 1960s, the temperature remained nearly steady. "The pause in the temperature rise can be explained by several factors, including long-term changes in solar activity and the post-World War II growth of human-derived aerosols in the atmosphere. When looking at recent years' observations from Finland, it seems that the temperature rise is not slowing down," University of Eastern Finland researcher Santtu Mikkonen explains.

The temperature time series was created by averaging the data produced by all Finnish weather stations across the country. Furthermore, as the Finnish weather station network wasn't comprehensive nationwide in the early years, data obtained from measurement stations in Finland's neighbouring countries was also used.

Finland is located between the Atlantic Ocean and continental Eurasia, which causes great variability in the country's weather. In the time series of the average temperature, this is visible in the form of strong noise, which makes it very challenging to detect statistically significant trends. The temperature time series for Finland was analysed using a dynamic regression model. The method allows the division of the time series into components representing mean changes, i.e. trends, periodic variation, observation inter-dependence and noise. It makes it possible to take into consideration the seasonal changes typical of Nordic conditions, as well as significant annual variation.
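
The decomposition idea can be illustrated with a standard trend-plus-seasonal split. The sketch below uses STL from statsmodels as a simple stand-in for the paper's dynamic regression model, applied to a synthetic monthly series with a built-in 0.14-degree-per-decade trend.

```python
# Sketch of the decomposition described above: split a monthly temperature
# series into trend + seasonal cycle + residual noise. STL is a stand-in
# for the paper's dynamic regression model; the data are synthetic.
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import STL

rng = np.random.default_rng(4)
months = pd.date_range("1847-01", periods=166 * 12, freq="MS")
t = np.arange(months.size)
series = (0.014 / 12 * t                       # ~0.14 C/decade warming trend
          - 12 * np.cos(2 * np.pi * t / 12)    # strong Nordic seasonal cycle
          + rng.normal(0, 2.5, months.size))   # weather noise
ts = pd.Series(series, index=months)

result = STL(ts, period=12).fit()
trend_per_decade = np.polyfit(t, result.trend, 1)[0] * 120  # 120 months
print(f"estimated trend: {trend_per_decade:.2f} C per decade")
```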

Journal Reference:

S. Mikkonen, M. Laine, H. M. Mäkelä, H. Gregow, H. Tuomenvirta, M. Lahtinen, A. Laaksonen. Trends in the average temperature in Finland, 1847–2013. Stochastic Environmental Research and Risk Assessment, 2014; DOI: 10.1007/s00477-014-0992-2


Small volcanic eruptions partly explain 'warming hiatus'

The "warming hiatus" that has occurred over the last 15 years has been caused in part by small volcanic eruptions.

Scientists have long known that volcanoes cool the atmosphere because of the sulfur dioxide that is expelled during eruptions. Droplets of sulfuric acid that form when the gas combines with oxygen in the upper atmosphere can persist for many months, reflecting sunlight away from Earth and lowering temperatures at the surface and in the lower atmosphere.

Previous research suggested that early 21st-century eruptions might explain up to a third of the recent warming hiatus.

New research available online in the journal Geophysical Research Letters (GRL) further identifies observational climate signals caused by recent volcanic activity. This new research complements an earlier GRL paper published in November, which relied on a combination of ground, air and satellite measurements, indicating that a series of small 21st-century volcanic eruptions deflected substantially more solar radiation than previously estimated.

"This new work shows that the climate signals of late 20th- and early 21st-century volcanic activity can be detected in a variety of different observational data sets," said Benjamin Santer, a Lawrence Livermore National Laboratory scientist and lead author of the study.

After 1998, one of the warmest years on record, the steep climb in global surface temperatures observed over the 20th century appeared to level off. This "hiatus" received considerable attention, despite the fact that the full observational surface temperature record shows many instances of slowing and acceleration in warming rates. Scientists had previously suggested that factors such as weak solar activity and increased heat uptake by the oceans could be responsible for the recent lull in temperature increases. After publication of a 2011 paper in the journal Science by Susan Solomon of the Massachusetts Institute of Technology (MIT), it was recognized that an uptick in volcanic activity might also be implicated in the warming hiatus.

Prior to the 2011 Science paper, the prevailing scientific thinking was that only very large eruptions -- on the scale of the cataclysmic 1991 Mount Pinatubo eruption in the Philippines, which ejected an estimated 20 million metric tons (44 billion pounds) of sulfur -- were capable of impacting global climate. This conventional wisdom was largely based on climate model simulations. But according to David Ridley, an atmospheric scientist at MIT and lead author of the November GRL paper, these simulations were missing an important component of volcanic activity.

Ridley and colleagues found the missing piece of the puzzle at the intersection of two atmospheric layers, the stratosphere and the troposphere -- the lowest layer of the atmosphere, where all weather takes place. Those layers meet between 10 and 15 kilometers (six to nine miles) above Earth.

Satellite measurements of the sulfuric acid droplets and aerosols produced by erupting volcanoes are generally restricted to above 15 km. Below 15 km, cirrus clouds can interfere with satellite aerosol measurements. This means that toward the poles, where the lower stratosphere can reach down to 10 km, the satellite measurements miss a significant chunk of the total volcanic aerosol loading.

To get around this problem, the study by Ridley and colleagues combined observations from ground-, air- and space-based instruments to better observe aerosols in the lower portion of the stratosphere. They used these improved estimates of total volcanic aerosols in a simple climate model, and estimated that volcanoes may have caused cooling of 0.05 degrees to 0.12 degrees Celsius since 2000.
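
To illustrate the logic of such an estimate, the sketch below pushes a hypothetical volcanic aerosol series through a one-box energy balance model. The forcing rule of thumb, the aerosol numbers and the model parameters are all assumptions chosen for illustration; they are not the study's model or data.

```python
# Sketch: convert stratospheric aerosol optical depth (AOD) into a rough
# global temperature response with a one-box energy balance model.
import numpy as np

years = np.arange(2000, 2014)
# Hypothetical AOD contribution from a series of small eruptions (assumed).
aod = np.linspace(0.002, 0.008, years.size)

# Rule of thumb (assumed here): roughly -25 W/m^2 of radiative forcing per
# unit of stratospheric sulfate AOD.
forcing = -25.0 * aod

heat_capacity = 8.0  # W yr m^-2 K^-1, effective mixed-layer heat capacity (assumed)
feedback = 1.2       # W m^-2 K^-1, climate feedback parameter (assumed)

# Integrate C dT/dt = F - lambda * T with one-year steps.
temp = 0.0
for year, f in zip(years, forcing):
    temp += (f - feedback * temp) / heat_capacity
    print(f"{year}: forcing {f:+.3f} W/m^2 -> response {temp:+.3f} K")
```

With these made-up numbers the response grows to roughly a tenth of a degree, the same order as the 0.05 to 0.12 degrees Celsius of cooling quoted above.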

The second Livermore-led study shows that the signals of these late 20th- and early 21st-century eruptions can be positively identified in atmospheric temperature, moisture and the reflected solar radiation at the top of the atmosphere. A vital step in detecting these volcanic signals is the removal of the "climate noise" caused by El Niños and La Niñas.
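
One common way to remove ENSO-related noise is to regress the temperature record onto a lagged ENSO index and subtract the fitted component. The sketch below does this with synthetic data; the index, the lag and the coefficients are illustrative assumptions, and the study's own noise-removal procedure may differ in detail.

```python
# Sketch: remove the ENSO-congruent component of a temperature series by
# regressing it onto a lagged Nino 3.4 index and subtracting the fit.
import numpy as np

rng = np.random.default_rng(1)
n_months = 240
nino34 = rng.normal(0.0, 1.0, n_months)  # hypothetical ENSO index
lag = 4                                  # months temperature lags ENSO (assumed)

# Synthetic temperatures that partly follow the lagged index.
lagged = np.roll(nino34, lag)
temps = 0.1 * lagged + rng.normal(0.0, 0.05, n_months)

# Least-squares fit of temperature onto the lagged index (skip wrap-around).
slope, intercept = np.polyfit(lagged[lag:], temps[lag:], 1)

# Residuals: the temperature record with the ENSO signal removed.
enso_removed = temps - (slope * lagged + intercept)
print(f"ENSO-congruent coefficient: {slope:.3f} K per index unit")
```

After this subtraction, slower signals such as volcanic cooling stand out more clearly against the remaining variability.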

"The fact that these volcanic signatures are apparent in multiple independently measured climate variables really supports the idea that they are influencing climate in spite of their moderate size," said Mark Zelinka, another Livermore author. "If we wish to accurately simulate recent climate change in models, we cannot neglect the ability of these smaller eruptions to reflect sunlight away from Earth."


View the original article here

Electromagnetic waves linked to particle fallout in Earth's atmosphere, new study finds

In a new study that sheds light on space weather's impact on Earth, Dartmouth researchers and their colleagues show for the first time that plasma waves buffeting the planet's radiation belts are responsible for scattering charged particles into the atmosphere.

The study is the most detailed analysis so far of the link between these waves and the fallout of electrons from the planet's radiation belts. The belts are impacted by fluctuations in "space weather" caused by solar activity that can disrupt GPS satellites, communication systems, power grids and manned space exploration.

The results appear in the journal Geophysical Research Letters.

The Dartmouth space physicists are part of a NASA-sponsored team that studies the Van Allen radiation belts, which are donut-shaped belts of charged particles held in place by Earth's magnetosphere, the magnetic field surrounding our planet. In a quest to better predict space weather, the Dartmouth researchers study the radiation belts from above and below in complementary approaches -- through satellites (the twin NASA Van Allen Probes) high over Earth and through dozens of instrument-laden balloons (BARREL, or Balloon Array for Radiation belt Relativistic Electron Losses) at lower altitudes to assess the particles that rain down.

The Van Allen Probes measure particles and electric and magnetic fields -- essentially everything in the radiation belt environment -- including the electrons, which descend along Earth's magnetic field lines that converge at the poles. This is why the balloons are launched from Antarctica, where some of the best observations can be made. As the falling electrons collide with the atmosphere, they produce X-rays, and that is what the balloon instruments actually record.

"We are measuring those atmospheric losses and trying to understand how the particles are getting kicked into the atmosphere," says co-author Robyn Millan, an associate professor in Dartmouth's Department of Physics and Astronomy and the principal investigator of BARREL. "Our main focus has been really on the processes that are occurring out in space. Particles in the Van Allen belts never reach the ground, so they don't constitute a health threat. Even the X-rays get absorbed, which is why we have to go to balloon altitudes to see them."

In their new study, the BARREL researchers' major objective was to obtain simultaneous measurements of the scattered particles and of the ionized gas, called plasma, out in space near Earth's equator. They were particularly interested in simultaneous measurements of a particular kind of plasma wave, called electromagnetic ion cyclotron waves, and in whether these waves were responsible for scattering the particles, which had been an open question for years.

The researchers obtained measurements in Antarctica in 2013, when the balloons and both the Geostationary Operational Environmental Satellite (GOES) and the Van Allen Probes were near the same magnetic field line. They fed the satellite data into a model that tests the wave-particle interaction theory, and the results suggest that wave scattering was the cause of the particle fallout. "This is the first real quantitative test of the theory," Millan says.
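
As an aside on what "near the same magnetic field line" means in practice, the sketch below maps two spacecraft positions to their dipole L-shells, L = r / cos^2(magnetic latitude), and compares them. Real conjunction studies use full magnetic field models; the positions and the tolerance here are illustrative assumptions.

```python
# Sketch: test for a magnetic conjunction using the dipole L-shell,
# L = r / cos^2(magnetic latitude), with r in Earth radii.
import math

EARTH_RADIUS_KM = 6371.0

def dipole_l_shell(geocentric_r_km, mag_lat_deg):
    """Dipole L-shell of a point from its geocentric distance and magnetic latitude."""
    r_earth_radii = geocentric_r_km / EARTH_RADIUS_KM
    return r_earth_radii / math.cos(math.radians(mag_lat_deg)) ** 2

# Hypothetical positions: a Van Allen Probe near apogee and a GOES satellite
# in geostationary orbit.
probe_l = dipole_l_shell(geocentric_r_km=42000.0, mag_lat_deg=5.0)
goes_l = dipole_l_shell(geocentric_r_km=42164.0, mag_lat_deg=0.0)

TOLERANCE = 0.5  # allowed L-shell mismatch (assumed)
status = "conjunction" if abs(probe_l - goes_l) < TOLERANCE else "no conjunction"
print(f"Probe L = {probe_l:.2f}, GOES L = {goes_l:.2f}: {status}")
```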


View the original article here

NASA's Fermi Mission brings deeper focus to thunderstorm gamma rays

Each day, thunderstorms around the world produce about a thousand quick bursts of gamma rays, some of the highest-energy light naturally found on Earth. By merging records of events seen by NASA's Fermi Gamma-ray Space Telescope with data from ground-based radar and lightning detectors, scientists have completed the most detailed analysis to date of the types of thunderstorms involved.

"Remarkably, we have found that any thunderstorm can produce gamma rays, even those that appear to be so weak a meteorologist wouldn't look twice at them," said Themis Chronis, who led the research at the University of Alabama in Huntsville (UAH).

The outbursts, called terrestrial gamma-ray flashes (TGFs), were discovered in 1992 by NASA's Compton Gamma-Ray Observatory, which operated until 2000. TGFs occur unpredictably and fleetingly, with durations less than a thousandth of a second, and remain poorly understood.

In late 2012, Fermi scientists employed new techniques that effectively upgraded the satellite's Gamma-ray Burst Monitor (GBM), making it 10 times more sensitive to TGFs and allowing it to record weak events that were overlooked before.

"As a result of our enhanced discovery rate, we were able to show that most TGFs also generate strong bursts of radio waves like those produced by lightning," said Michael Briggs, assistant director of the Center for Space Plasma and Aeronomic Research at UAH and a member of the GBM team.

Previously, TGF positions could be roughly estimated based on Fermi's location at the time of the event. The GBM can detect flashes within about 500 miles (800 kilometers), but this is too imprecise to definitively associate a TGF with a specific storm.

Ground-based lightning networks use radio data to pin down strike locations. The discovery of similar signals from TGFs meant that scientists could use the networks to determine which storms produce gamma-ray flashes, opening the door to a deeper understanding of the meteorology powering these extreme events.

Chronis, Briggs and their colleagues sifted through 2,279 TGFs detected by Fermi's GBM to derive a sample of nearly 900 events accurately located by the Total Lightning Network operated by Earth Networks in Germantown, Maryland, and the World Wide Lightning Location Network, a research collaboration run by the University of Washington in Seattle. These systems can pinpoint the location of lightning discharges -- and the corresponding signals from TGFs -- to within 6 miles (10 km) anywhere on the globe.
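
To see how such a pairing might work, the sketch below matches a TGF to lightning strokes by requiring both a short time offset and a plausible great-circle separation. The events, the time window and the distance cutoff are illustrative assumptions, not the actual matching pipeline used by either network.

```python
# Sketch: associate a TGF with lightning-network strokes using a time window
# and the haversine (great-circle) distance.
import math
from datetime import datetime, timedelta

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * 6371.0 * math.asin(math.sqrt(a))

# Hypothetical events as (time, latitude, longitude). The TGF position is the
# coarse sub-satellite point; the strokes come from the ground network.
tgf_time, tgf_lat, tgf_lon = datetime(2013, 7, 1, 12, 0, 0, 400), 27.9, -82.5
strokes = [
    (datetime(2013, 7, 1, 12, 0, 0, 150), 28.1, -82.3),  # same flash, nearby
    (datetime(2013, 7, 1, 12, 0, 5), 25.0, -80.0),       # unrelated stroke
]

TIME_WINDOW = timedelta(milliseconds=5)  # assumed matching window
MAX_DISTANCE_KM = 800.0                  # roughly the GBM detection radius

for stroke_time, lat, lon in strokes:
    if (abs(stroke_time - tgf_time) <= TIME_WINDOW
            and haversine_km(tgf_lat, tgf_lon, lat, lon) <= MAX_DISTANCE_KM):
        print(f"TGF matched to stroke at ({lat}, {lon})")
```

A matched stroke inherits the network's location accuracy of a few miles, which is what lets a TGF be tied to a specific storm cell on radar.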

From this group, the team identified 24 TGFs that occurred within areas covered by Next Generation Weather Radar (NEXRAD) sites in Florida, Louisiana, Texas, Puerto Rico and Guam. For eight of these storms, the researchers obtained additional information about atmospheric conditions through sensor data collected by the Department of Atmospheric Science at the University of Wyoming in Laramie.

"All told, this study is our best look yet at TGF-producing storms, and it shows convincingly that storm intensity is not the key," said Chronis, who will present the findings Wed., Dec. 17, in an invited talk at the American Geophysical Union meeting in San Francisco. A paper describing the research has been submitted to the Bulletin of the American Meteorological Society.

Scientists suspect that TGFs arise from strong electric fields near the tops of thunderstorms. Updrafts and downdrafts within the storms force rain, snow and ice to collide and acquire electrical charge. Usually, positive charge accumulates in the upper part of the storm and negative charge accumulates below. When the storm's electrical field becomes so strong it breaks down the insulating properties of air, a lightning discharge occurs.

Under the right conditions, the upper part of an intracloud lightning bolt disrupts the storm's electric field in such a way that an avalanche of electrons surges upward at high speed. When these fast-moving electrons are deflected by air molecules, they emit gamma rays and create a TGF.
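
The word "avalanche" can be read quite literally: each accelerated electron liberates more electrons, so the population grows exponentially with distance through the strong-field region. The sketch below shows that growth law; the seed count, e-folding length and region depth are illustrative assumptions, not measured values.

```python
# Sketch: exponential growth of a relativistic electron avalanche,
# N = N0 * exp(d / L), over a strong-field region of depth d.
import math

seed_electrons = 100.0   # initial energetic electrons (assumed)
efold_length_m = 100.0   # avalanche e-folding length (assumed)
region_depth_m = 1000.0  # depth of the strong-field region (assumed)

electrons = seed_electrons * math.exp(region_depth_m / efold_length_m)
print(f"{seed_electrons:.0f} seed electrons become ~{electrons:.2e} in the avalanche")
```

Even with these toy numbers, ten e-foldings turn a hundred seed electrons into millions, enough for the upward-surging population to radiate a detectable burst of gamma rays.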

About 75 percent of lightning stays within the storm, and about 2,000 of these intracloud discharges occur for each TGF Fermi detects.

The new study confirms previous findings indicating that TGFs tend to occur near the highest parts of a thunderstorm, between about 7 and 9 miles (11 to 14 kilometers) high. "We suspect this isn't the full story," explained Briggs. "Lightning often occurs at lower altitudes and TGFs probably do too, but traveling the greater depth of air weakens the gamma rays so much the GBM can't detect them."

Based on current Fermi statistics, scientists estimate that some 1,100 TGFs occur each day, but the number may be much higher if low-altitude flashes are being missed.

While it is too early to draw conclusions, Chronis notes, there are a few hints that gamma-ray flashes may prefer storm areas where updrafts have weakened and the aging storm has become less organized. "Part of our ongoing research is to track these storms with NEXRAD radar to determine if we can relate TGFs to the thunderstorm life cycle," he said.

Video: https://www.youtube.com/watch?v=JgK4Ds_Sj6Q#t=66


View the original article here