Nuclear power plant shuts down due to faulty water pump

One of the two nuclear reactors at Indian Point in Westchester County has been shut down due to a faulty water pump.


Be the first to comment - What do you think?  Posted by Editor - June 28, 2017 at 9:28 am

Categories: General   Tags:

Coal power jumps in China, India after 2016 downturn

Coal’s fortunes had appeared to hit a new low less than two weeks ago.


Posted by Editor - June 27, 2017 at 7:35 am


Warming Brews Big Trouble in Coffee Birthplace Ethiopia

By Damian Carrington, The Guardian

Global warming is likely to wipe out half of the coffee growing area in Ethiopia, the birthplace of the bean, according to a groundbreaking new study. Rising temperatures have already damaged some special areas of origin, with these losses being likened to France losing one of its great wine regions.

Ethiopia’s highlands also host a unique treasure trove of wild coffee varieties, meaning new flavour profiles and growing traits could be lost before having been discovered. However, the new research also reveals that if a massive programme of moving plantations up hillsides to cooler altitudes were feasible, coffee production could actually increase.

Coffee cherries, hand-picked in Africa.
Credit: rogiro/flickr

Coffee vies with tea as the world’s favorite beverage and employs 100 million people worldwide in farming the beans alone. But climate change is coffee’s greatest long-term threat, killing plantations or reducing bean quality and allowing the deadly coffee leaf rust fungus to thrive. Without major action both in the coffee industry and in slashing greenhouse gas emissions, coffee is predicted to become more expensive and worse-tasting.

The research combined climate-change computer modelling with detailed measurements of current ground conditions, gathered in fieldwork that covered a total distance of 30,000km within Ethiopia. It found that 40-60 percent of today’s coffee growing areas in Ethiopia would be unsuitable by the end of the century under a range of likely warming scenarios.

But the study, published in the journal Nature Plants, also shows that major relocation programmes could preserve or even expand the country’s coffee-growing areas. “There is a pathway to resilience, even under climate change,” said Aaron Davis, at the Royal Botanic Gardens Kew in the UK, who conducted the work with Ethiopian scientists. “But it is a hugely daunting task. Millions of farmers would have to change.”

However, by 2040, such moves uphill will have reached the top of Ethiopia’s mountains. “It literally reaches the ceiling, because you don’t have any higher place to go,” Davis said.

The impacts of global warming are already being seen as temperatures have been rising steadily in Ethiopia for decades. Farmers report a longer, more extreme dry season and more intense rain in the wet season, with good harvests much less frequent than in their parents’ and grandparents’ time.

Coffee trees in Africa.
Credit: carsten ten brink/flickr

One famous coffee location likely to be lost is Harar. “In one area, there are hundreds if not thousands of hectares of dead trees,” said Davis. “It is a world renowned name and has been grown in that area for many centuries. But under all [climate change] scenarios, it’s going to get worse.

“Some of the origins, what you would call terroir in the wine industry, will disappear, unless serious intervention is undertaken,” he said. “It would be like losing the Burgundy wine region. Those areas are found nowhere else but Ethiopia, and because of the genetic diversity, the diversity of flavor profiles is globally unique.”

Both arabica and robusta coffee originated in Ethiopia and wild arabica plants are virtually unknown outside the country. The wild arabica varieties may well harbor traits for disease and drought resistance that could prove vital for the future health of coffee crops.

Prof Sebsebe Demissew, from the University of Addis Ababa and one of the research team, said: “Coffee originates from the highland forests of Ethiopia, and it is our gift to the world. As Ethiopia is the main natural storehouse of arabica genetic diversity, what happens in Ethiopia could have long-term impacts for coffee farming globally.”

The new research is a “brilliant piece of work”, according to Tim Schilling, chief executive of the World Coffee Research programme: “This is the only comprehensive, country-specific study I have seen that uses some of the best methods in climate modelling coupled to very rigorous ground-truthing — extremely useful for governments and industry and a model to be repeated.”

Just-picked coffee beans at farm in Cauca, southwestern Colombia.
Credit: CIAT/flickr

Schilling led an expedition into South Sudan in 2013 to confirm wild arabica coffee was also present in the Boma forest: “What we found was major degradation caused by climate change on the forest and the wild coffee under its canopy. That is pretty much what I think we can expect if nothing is done to preserve the arabica genetic treasure chest in Ethiopia.”

Schilling said new varieties and growing methods must be developed and that plantation “migration will have to be part of a plan B”. He added: “Plan C might be moving up in latitude and growing coffee in Southern France and Texas!” But he said funding all this is difficult when coffee producers are not making much money at present.

The Intergovernmental Panel on Climate Change concluded in 2014: “The overall predictions are for a reduction in area suitable for coffee production by 2050 in all countries studied. In many cases, the area suitable for production would decrease considerably with increases of temperature of only 2-2.5°C.” It said that in Brazil, the world’s biggest coffee producer, a temperature rise of 3°C would slash the area suitable for coffee by two-thirds in the principal growing states. In 2016, other researchers predicted climate change will halve the world’s coffee-growing area.

“People should also be thinking about the millions of smallholder farmers who put their coffee on the table,” said Davis. “The coffee farmers of Ethiopia are really on the frontline [of climate change] — they are the people who will pay the price first. In the longer term, the only truly sustainable solution is to combat the root causes of climate change.”

Reprinted with permission from The Guardian.


Posted by Editor - June 26, 2017 at 6:46 am


Nuclear Decommissioning Threatens Climate Targets

By Geert De Clercq, Reuters

Decommissioning nuclear plants in Europe and North America from 2020 threatens global plans to cut carbon emissions unless governments build new nuclear plants or expand the use of renewables, a top International Energy Agency official said.

Nuclear and wind in Drôme, France.
Credit: Jeanne Menjoulet/flickr

Nuclear is now the largest low-carbon power source in Europe and the United States, about three times bigger than wind and solar combined, according to IEA data. But most reactors were built in the 1970s and early 80s, and will reach the end of their life around 2020.

With the average nuclear plant running for 8,000 hours a year versus 1,500-2,000 hours for a solar plant, governments must expand renewable investments to replace old nuclear plants if they are to meet decarbonization targets, IEA Chief Economist Laszlo Varro told Reuters.
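
Varro’s comparison can be made concrete with some back-of-the-envelope arithmetic. The sketch below is illustrative only: the 1 GW plant size is a hypothetical, and the full-load hours are the figures quoted above. It shows how much solar capacity would be needed to match a retiring reactor’s annual energy output, ignoring when that energy is actually produced.

```python
# Illustrative sketch: solar capacity needed to replace the annual
# energy of a hypothetical 1 GW nuclear plant, using the running
# hours quoted in the article (energy parity only, not timing).

NUCLEAR_HOURS = 8000          # full-load hours/year for a nuclear plant
SOLAR_HOURS = (1500, 2000)    # full-load hours/year for a solar plant

nuclear_mw = 1000             # hypothetical 1 GW reactor
annual_mwh = nuclear_mw * NUCLEAR_HOURS  # 8,000,000 MWh/year

for hours in SOLAR_HOURS:
    solar_mw = annual_mwh / hours
    print(f"At {hours} full-load hours, {solar_mw:,.0f} MW of solar "
          f"matches the reactor's annual energy")
```

On these assumptions, roughly four to five times as much solar capacity is needed per megawatt of retired nuclear, which is why the IEA frames decommissioning as a decarbonization challenge.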

“The ageing of the nuclear fleet is a considerable challenge for energy security and decarbonization objectives,” he said on the sidelines of the Eurelectric utilities conference in Portugal. 

Renewables have grown rapidly in the past decade but about 20 percent of new low-carbon capacity has been lost from the decommissioning of nuclear plants in the same period, he said.

“This is just a taste of things to come,” Varro said.

Russia and India were building new plants, while China was bringing a new plant online every quarter, Varro said.

Kalinin Nuclear Power Plant in Udomlya, Russia.
Credit: IAEA Imagebank/flickr

However, he said future projects in Japan were uncertain after the 2011 Fukushima disaster, while there was less appetite for new nuclear projects in Europe and the United States.

Financing new nuclear plants has become more challenging, partly because several new builds were running over budget, while in the United States nuclear has been struggling to compete against plants run on cheap shale gas.

Governments that choose the renewables route would have to consider upgrading power grids and investing in power storage to offset the variable nature of renewable generation, while those choosing nuclear would need to offer financial support as Britain has done for its plans, Varro said.

Wind and solar generation was expanding rapidly, but the pace needed to increase to meet climate stabilization goals. “At the moment it is not quickly enough,” he said.

The biggest challenge was in Europe and the United States, where nuclear energy capacity was steady or falling, he said.

“If we do not keep nuclear in the energy mix and do not accelerate wind and solar deployment, the loss of nuclear capacity will knock us back by 15 to 20 years. We do not have that much time to lose,” he said.

Reporting by Geert De Clercq; Editing by Edmund Blair


Posted by Editor - June 25, 2017 at 6:01 am


3 Takeaways From the Renewable Energy Finance Forum: Taxes, Merchant Solar and Customer Choice

It’s tough to get large-scale solar deals done in 2017, but there was little gloom in the crowd of financiers at Renewable Energy Finance Forum this year.

Sure, volatility still defines many sectors of the solar market, the U.S. pulled out of the Paris climate agreement and the federal government is skeptical of intermittent renewable energy resources. But the speakers and audience at REFF-Wall St. were focused on the opportunities in front of them — and those remain plentiful.

Corporations in the driver’s seat

“You can get a great price if you can get something done,” Ray Henger, SVP of M&A and structured finance for sPower, said of the falling rates in solar. He also noted that nontraditional drivers, including the C&I sector and even community-choice aggregators, are starting to drive new deals.

“At some point, it needs to be the customer demanding solar — like MGM,” said Yuri Horwitz, CEO of Sol Systems. “It’s munis, corporations and other large entities that are making the decision to go with solar or wind,” he said. “If you are involved in finance and want volume, you have to look at [contracts for difference], synthetic PPA swaps and remote PPAs. You need to get comfortable with these.”

Michael Silvestrini, president of Greenskies Renewable Energy, told financiers interested in getting deeper into C&I that there is a need to find efficiencies when putting together the complex transactions that can involve multiple offtakers and locations. “To limit structural complexity, we’re asked to bring fully executed contracts to financiers,” he noted. “So it’s a little clunky. We’d like to wash, rinse and repeat.”

Valuing the merchant tail of renewable energy

“Financial innovation is happening in equity,” said Henger. “The piece people argue about is what the value of the asset is at the end of its hedge.”

That is an open argument, one that makes Dan Benoit, chief investment officer for North America at Brookfield Asset Management, a bit queasy. “We struggle with this question,” he said. “I don’t think we’ve found the magic bullet.”

Margins are already thin in the rest of the merchant generation business, and with falling prices, a good answer about how to make money on projects once they get to the merchant tail after their initial contract is hard to find. “It’s a tough business, and it’s getting tougher,” Tom O’Flynn, EVP and CFO of AES Corporation, said of the merchant business.

Beyond merchant solar, repowering contracts is a topic that’s warming up in the wind industry, although not many deals are happening just yet. Earlier this year, GE announced it repowered 300 turbines in a deal with NextEra Energy. Once people work through technical and tax issues, “I think they’ll come fast,” Kevin Walsh, managing director of renewable energy for GE Energy Financial Services, said of future repowering deals. Brookfield Asset Management and AES said they have not executed repowering contracts in the U.S. yet, but they are keeping an eye on opportunities.

The tax conundrum

The issue of PTC and ITC reform was mostly a non-starter. “There is little appetite with the House and Senate to revisit wind and solar phase-down schedules,” Greg Wetstone, CEO of the American Council on Renewable Energy, said about potential for the ITC and PTC to be revisited, after having spoken with members of Congress and White House staffers.

Even so, there are still questions that need to be answered. There is not yet clear guidance for solar in terms of which projects will be granted safe harbor under the ITC and PTC. For wind that uses the PTC, the IRS says as long as a developer has excavated a foundation at the site, that’s enough, said Katherine Breaks, managing tax director within tax credit and energy advisory services at KPMG. As for solar-plus-storage guidance from the IRS, “[It] would be helpful,” said Walsh, “but I’m not hopeful.”

Another potential issue is the possibility of changes to the corporate tax rate. “In the upside-down world we live in, corporate tax rate reductions are a bit of a mixed bag for renewable energy,” said Breaks. She noted that because wind and solar can get a five-year tax write-off, “a major component of tax equity return is the write-off of these losses.” Therefore, any reduction in the corporate tax rate would be a reduction in those returns. “Developers will have to grapple with, ‘Where do we find the cash to fill that hole?’” she said.
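
Breaks’ point about the tax rate and tax-equity returns reduces to simple arithmetic: the cash value of a depreciation write-off is the depreciable basis multiplied by the tax rate, so a lower rate shrinks the return. The numbers below are hypothetical, chosen only to illustrate the mechanism she describes.

```python
# Illustrative sketch of how a corporate tax rate cut reduces the
# value of depreciation write-offs to tax equity investors.
# All figures are hypothetical, not from the article.

def writeoff_value(depreciable_basis, tax_rate):
    """Tax savings from writing off the full depreciable basis."""
    return depreciable_basis * tax_rate

basis = 100_000_000                  # hypothetical $100M project basis
old = writeoff_value(basis, 0.35)    # value at a 35% corporate rate
new = writeoff_value(basis, 0.20)    # value at a hypothetical 20% rate
print(f"Lost tax-equity value: ${old - new:,.0f}")
```

On these assumed numbers, a 15-point rate cut removes $15 million of tax-equity value from a single project — the “hole” developers would have to fill with cash from elsewhere.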

If the renewable financiers in a New York midtown ballroom are any gauge, however, developers will not have to grapple with that question any time soon, if at all. When one moderator asked the room how many saw tax reform coming this year, not a single hand went up. When the question was pushed out to 2018, only a few raised hands could be counted amongst a few hundred conference attendees.


Posted by Editor - June 24, 2017 at 6:55 am


The Most Important Solar Charts of 2017, H1 Edition [GTM Squared]


Posted by Editor - at 6:45 am


The Big Problem Facing Offshore Wind in Australia

Recently unveiled plans for an offshore wind farm in the Australian state of Victoria face a major hurdle: onshore projects are far, far cheaper.

“Right now, in Australia it’s a very competitive price market,” said Robert Liew, senior analyst for Asia-Pacific at MAKE Consulting, which is owned by GTM’s parent company Wood Mackenzie. “The price of onshore wind is even more competitive than, say, a new-build coal project.”

Onshore projects are delivering power at between USD $45 and $56 per megawatt-hour, he said. Offshore wind in Australia might struggle to come in at twice that level.

In Europe, offshore wind is getting close to Australia’s onshore price range because countries such as Germany and the U.K. have spent decades building an industry to support their projects. Europe also boasts several major offshore wind turbine manufacturers. 

But the lack of native turbine-makers or an established supply chain makes it hard for offshore generation to come anywhere close to the price of onshore projects in Australia.

Nevertheless, Victoria’s government this month welcomed a proposal from Offshore Energy, a little-known developer, to carry out a feasibility study for a 250-turbine project between 10 and 25 kilometers off the coast of Gippsland, in the southeast of the state.

“A new renewable power generator of this size would drive down electricity prices, and we’ll support offshore energy wherever we can to progress this study,” said Victoria’s Minister for Energy, Environment and Climate Change, Lily D’Ambrosio, in a press release.

If the AUD $8 billion (USD $6 billion) project goes ahead, “It is hoped the wind farm could be generating power in time to contribute to the Labor Government’s Renewable Energy Target of 40 percent by 2025,” the press note said.

It’s a big “if,” though.

Jack-up barges, which are just one vital element of the offshore wind supply chain, can cost $165,000 a day. There are almost certainly none in Australia, nor, quite possibly, in the whole of the southern hemisphere.

If a barge has to be chartered all the way from Europe, along with all the other vessels needed for construction, support, cable-laying and more, and the turbines and other components also have to be shipped around the world, “It’s going to be difficult to get the cost down,” said Liew. 

Onshore wind, in contrast, is cheap and easy. The average size of onshore wind farms in Australia is 130 megawatts, and the projects have capacity factors of between 35 percent and 45 percent.

Add in the low cost of plots in Australia’s vast open landscape, and the country emerges as one of the best places on the planet to build onshore wind farms. “Whether offshore can offer better value is the million-dollar question,” Liew commented.

And it’s not just costs that could pose a problem for offshore wind in Australia.

According to Robert Bates, assistant underwriter at the renewable energy insurer GCube, “Earthquakes and cyclones, while infrequent in Australia, are natural-catastrophe-type risks that developers in Australian offshore wind may have to contend with.”

The seabed surrounding Australia is “diverse and complex,” he said. “Moreover, different soil types require different foundation types. Detailed geotechnical studies will be crucial in determining what will be best for each site.”

Finally, given that there are more than 1,100 offshore oil and gas platforms around the country, “safely circumnavigating existing marine infrastructure is especially challenging.”

Australia does not appear likely to gain an industrial advantage by planting turbines off Gippsland. It has no original equipment manufacturers that would benefit, or nearby markets to exploit. 

That said, it is too early to completely write off the prospect of Australian offshore wind. Liew said he spoke to developers “curious” about investigating offshore projects in the country.

Australia also has a history of welcoming foreign companies to build infrastructure projects, he said. And with high electricity prices, there might be an opportunity to introduce technologies that would not be viable elsewhere.

Finally, the timeframe for the Gippsland project may leave enough room for further cost reductions. Beyond 2020, a low-cost offshore supply chain might be accessible from Asian markets such as Japan or South Korea. 

Turbines, meanwhile, might be supplied by firms such as Siemens, Vestas or Senvion, which already have a significant presence in the Australian onshore market.

“I wouldn’t rule it out,” said Liew. “Maybe the conditions [in Gippsland] are just perfect. But it’s a real tough sell.”


Posted by Editor - at 6:40 am


Nuclear Can Be Friends With Renewables—If It’s Modular

As the wind gusts across the rural plains of Idaho rise and fall, a new type of nuclear plant could react in kind, generating more and less power in tandem with the wind farm. 

That’s the vision laid out in a new paper from nuclear startup NuScale Power, which used computer models of a nuclear plant planned near the Horse Butte Wind Project in Idaho.

The company, founded a decade ago, recently looked at how its modular nuclear reactors could follow variable clean energy output, lowering and raising electricity generation as needed.

The surge of wind and solar in grids around the country is creating more variability in generation. As a result, power companies are starting to look at how traditional baseload energy sources like coal and nuclear can be more flexible, and lower their energy generation to avoid wasting power or overloading the grid.

In Germany, France, and Canada, some nuclear plants are already doing this, but it’s more out of necessity than design. Because NuScale’s reactors are much smaller than traditional nuclear reactors, the design can enable a power company to switch off individual modules, enabling load following (adjusting power output) in a more efficient way.

“We concluded that yes, we can load follow and we should be able to do it better and more responsibly than large nuclear plants,” said Daniel Ingersoll, NuScale Power’s director of research collaborations and lead author on the paper.

“Baseload plants are being forced into situations where they need to load-follow, and that’s not really operating those plants in their best form,” Ingersoll explained.

The findings are important because they provide a new way for nuclear energy to adapt and be more flexible as the grid mix changes. Some U.S. nuclear plants are getting shut down early, removing large sources of carbon-free baseload energy from the grid, and a handful of new nuclear plants are in danger of not getting built. It’s now cheaper to build natural gas and renewables in most states.

The U.S. has 99 nuclear reactors, which provide about a fifth of U.S. electricity needs. A recent study from MIT found that if all U.S. nuclear reactors at risk of operating uneconomically were replaced by new gas plants, U.S. emissions would jump by 4.9 percent.

Modular nuclear, like NuScale’s reactors, could offer a new role for the technology. “The whole issue of load-following comes down to economics,” said Ingersoll.

However, modular nuclear reactors aren’t yet available commercially. NuScale, the company closest to commercializing its tech, submitted designs to the Nuclear Regulatory Commission late last year. Regulators accepted the designs for review in March of this year.

If all goes well, NuScale’s first commercial plant could be operating in Idaho by 2026 for Utah Associated Municipal Power Systems and operated by Energy Northwest.

It’s supposed to take the NRC three years to review the design papers, which include 12,000 pages of technical information and work from 800 NuScale staff.

NuScale has been funded by the Department of Energy and by engineering giant Fluor Corporation. The company’s technology is based on light water reactor designs and uses a convection process, eliminating the need for pumps (which can break down) to circulate the cooling liquid.

Each NuScale reactor is 65 feet tall and 9 feet in diameter, and can produce 50 megawatts of power. The modules are meant to be grouped together, with up to 12 monitored by a single control room.

The NuScale reactors can load-follow by turning off individual reactors. They can also react quickly to changes in the grid by bypassing the steam turbine. However, all of these operations are theoretical until the NuScale reactors are up and running and operating in tandem with a wind or solar farm.
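
The module-switching idea can be sketched as a toy dispatch rule, using the 50-megawatt module size and 12-module plant described above. The rule itself is a simplification of my own for illustration, not NuScale’s actual control scheme.

```python
# Toy sketch of module-based load following: switch 50 MW modules
# on or off to cover whatever demand is left after wind generation.
# The dispatch rule is an illustrative simplification, not NuScale's.

MODULE_MW = 50   # output per module, per the article
N_MODULES = 12   # modules per plant, per the article

def modules_online(net_demand_mw):
    """Modules needed to cover net demand, rounded up, capped at 12."""
    if net_demand_mw <= 0:
        return 0
    needed = -(-int(net_demand_mw) // MODULE_MW)  # ceiling division
    return min(needed, N_MODULES)

# As wind output rises, net demand falls and modules switch off.
demand = 500  # MW served by the hypothetical wind + nuclear pair
for wind in (0, 120, 300, 500):
    n = modules_online(demand - wind)
    print(f"wind={wind:3d} MW -> {n:2d} modules ({n * MODULE_MW} MW nuclear)")
```

The granularity is the point: a single large reactor must throttle its one core, while a modular plant can shed output in 50 MW steps by idling whole modules, which is closer to how the paper frames “better and more responsible” load following.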

NuScale isn’t the only company working on modular reactors. Others include Terrestrial Energy, Babcock & Wilcox, Gen4Energy and UPower Technologies. But the nuclear industry, and its U.S. regulatory body, aren’t very friendly to entrepreneurship or change.

Nuclear entrepreneurs have advocated for a new federal approval process to get novel nuclear reactor designs to market much more quickly. Large funding needs and long lead times to commercialization have also proved difficult for nuclear entrepreneurs, particularly those supported by venture capital.

The federal government is presenting new challenges. President Trump’s budget doesn’t renew a grant that NuScale is using to fund its first plant in Idaho. The budget also proposes cutting the Department of Energy’s nuclear office by 31 percent, halves a program that extends the life of existing nuclear reactors, and slashes several advanced nuclear tech efforts. 


Posted by Editor - at 6:30 am


We Couldn’t Monitor Larsen C Without These Satellites

The Larsen C ice shelf is about to calve one of the biggest icebergs on record.

The iceberg-to-be is hanging on by a thread, with just eight miles of solid ice standing in the way of a rift that’s spent years carving through the ice. Scientists can track the growth of the crack with precision during the summer season by flying over it, but even during the dead of Antarctic night, they’re still able to see it clearly thanks to eyes in the sky.

The evolution of the crack across the Larsen C ice shelf (seen in the lower right-hand corner) as it spreads 38 miles from January 2016-January 2017.
Credit: European Space Agency

Two European satellites, known as Sentinel-1, criss-cross over the region every six days like clockwork. Their sensors are able to see through clouds and darkness to provide a real-time image of the most-watched patch of ice on the planet.

“The close monitoring of this rift really is a success story for Sentinel-1,” Adrian Luckman, a glaciologist at Swansea University, said.


Luckman is part of Project MIDAS, a team of researchers intently monitoring the crack. He said without the satellite, researchers would only have access to low-resolution images or have to pay for data from private companies.

To monitor Larsen C, scientists are using what’s known as synthetic aperture radar. It’s particularly useful in polar regions because it can see through clouds and darkness, both of which are plentiful at certain times of the year at high latitudes. The same technology has also been used on space probes to image the surface of cloud-covered Venus.

With Larsen C likely to calve one of the largest icebergs on record, having instruments that can track it no matter the conditions is crucial to improving researchers’ understanding of the polar regions. While the rift on Larsen C is likely due to natural causes, the instability that climate change is fueling in the Antarctic makes these types of observations essential to knowing what comes next.

A composite Sentinel-1 image of northern Canada showing changes in sea ice extent in winter 2016-17.
Credit: European Space Agency

Because it’s polar orbiting, the Sentinel-1 mission also provides information on what’s happening on the other end of the planet. Climate change is taking a toll on the Arctic that’s in some ways even more dramatic. Sea ice is disappearing at an alarming clip, and Greenland’s massive ice sheet and other land ice are also melting.

Monitoring changes now can help improve future predictions, but the satellites also provide important observations that can be used now in an otherwise data-sparse region.

“Their high resolution measurements are of significant value for numerous stakeholders beyond just scientists, e.g., shipping industry during the Arctic summer (and) navigation through sea ice,” Zack Labe, a PhD student studying the Arctic at the University of California, Irvine, said in an email. “I think this is a key point that we often forget. These remote sensing observations (like from the Sentinels) provide services to many industries on both land and water.”

Labe pointed to monitoring oil spills and creating forecasts for the Arctic as just two of the uses for the Sentinel-1 satellite data.

The two satellites are managed by the European Space Agency as part of its Copernicus program, which is designed to create a comprehensive monitoring program for changes around the world. There are three other missions currently in orbit and three more will be launched in the coming years.

Together, they’ll be used to monitor a wide array of planetary vital signs at a time when the world is rapidly changing due to carbon pollution.



Posted by Editor - at 6:25 am


Climate Change Altering Droughts, Impacts Across U.S.

As a major drought devastated the West and Midwest beginning in 2012, farmers racked up billions of dollars in crop losses and water managers grappled with possible water shortages for millions of people as reservoirs dried up in the heat.

That drought is now gone. But scientists have found that the dry spell showed unusual extremes of wetness and warmth — indicators that climate change may be altering the typical characteristics of drought across the U.S., according to a new National Oceanic and Atmospheric Administration study published in the Bulletin of the American Meteorological Society.

Water levels in Lake Success, Calif., dropped to 4 percent of capacity during the state’s devastating drought in 2014.
Credit: David Siebold/flickr

The stakes are high. Extreme drought across the U.S. has contributed to tens of thousands of job losses, unpredictable and often extreme rainfall and devastating wildfires that have left behind many millions of charred acres of land and billions of dollars in property losses.

Study author Richard Heim, Jr., a researcher at the National Centers for Environmental Information at NOAA, compared a nationwide series of dry spells beginning in 1998 to two other devastating droughts in the 1930s and 1950s, including the Dust Bowl.


He said one of the most unusual features of the drought as it spread across the country in 2012 was that some parts of the country were outright soggy while others dried up in the sun. Compared to the two earlier dry spells, the recent series of droughts beginning in 1998 saw the largest area of the U.S. with normal or above-normal precipitation.

“The main thing is the 1998-2014 drought episode is the warmest of the three major drought episodes. It’s also the wettest,” Heim said. “A huge chunk of the country is in drought (during that period) and other chunks are really wet.”

Warmer temperatures over the past 30 years and more frequent regional dry spells since 1998 are changing the water cycle, posing challenges for urban and agricultural areas throughout the country.

“When broken down by region, the 1998-2014 episode had more days with precipitation than the other two episodes for all regions except the southern plains and Lower Mississippi Valley,” the study says.

During that time, the Northeast and Midwest, which saw their own dry spells, received the most rain and snowfall, while much of the West dried out, the study says.

2012 was a critical year for drought in the U.S. Several ongoing regional droughts “merged” to create a massive nationwide drought leaving more than 60 percent of the country suffering severe levels of drought or worse.

Today, about 7 percent of the continental U.S. is in drought considered severe or worse, down from 22 percent in early January and 31 percent last November.

Research shows that when dry spells occur, climate change is likely to make them drier than they would otherwise be because warmer temperatures increase evaporation. It also means that when storms hit, more precipitation is likely to fall as rain than snow, shrinking the snowpack that’s important to storing water for cities and farmers to use during the dry season.

A Climate Central analysis shows that since 1949, 68 percent of weather stations between 2,000 feet and 5,000 feet in elevation in 42 states have seen a lower percentage of winter precipitation falling as snow.

The U.S. Drought Monitor in July 2012 at the height of the nationwide drought.
Credit: U.S. Drought Monitor

Benjamin Cook, an associate research scientist at Columbia University’s Lamont-Doherty Earth Observatory, who is unaffiliated with the study, said that though scientists can’t say definitively that global warming caused the most recent drought, climate change projections clearly show that drought intensity and risk will increase for much of the U.S. Warmer temperatures are already exacerbating recent droughts in California, the Pacific Northwest and the Colorado River Basin, he said.

Cook’s 2015 research shows that rising temperatures and decreasing rainfall driven by climate change are creating an unprecedented risk of severe drought in the Southwest and Central Great Plains, setting up drought conditions that could be worse than at any time in the last millennium, even worse than the Dust Bowl.

Heim’s study shows that one of the unusual characteristics of the most recent drought compared to those in the 20th century is that, on a national scale, the drought’s driest seasons were in the winter and spring. Warm temperatures during those seasons resulted in greater evaporation and transpiration, worsening drought conditions. The driest seasons in the previous droughts were fall and summer.

Cook said that the 2012 “merger” of regional droughts is unusual because different regions of the country typically see their wet and dry seasons at different times of year. California, for example, receives nearly all its precipitation during the winter, while the Central Plains see their wettest months during spring and early summer.

“For multiple regions to be in a period of extended drought simultaneously therefore requires a level of synchronicity across the climate system that is rare,” Cook said.

Heim said drought over the past 20 years was caused by different atmospheric conditions depending on the year.

For example, a more northerly shift in the track of the jet stream and storms since the 1970s has led to shifts in drought location and intensity in the U.S.

La Niña was among the triggers of the devastating California drought, along with a prolonged high pressure ridge over the Pacific Ocean that deflected storms northward, missing much of California during the height of its drought.

Jay Lund, director of the Center for Watershed Sciences at the University of California-Davis, said Heim’s study is “thought provoking,” but he said every drought is unique.

“Most droughts are a bit quirky in their character, and all droughts impact a different society and economy than their predecessors making each drought and its lessons substantially unique,” Lund said. “Still, it is very likely that higher temperatures will worsen the severity of droughts.”



Posted by Editor - at 6:21 am

