Pay-As-You-Go Transactions in Off-Grid Solar Top $41M in Late 2016

People in developing nations used pay-as-you-go services to purchase more than $41 million in small-scale, off-grid solar products in the latter half of 2016.

The true number of PAYGO transactions, as they are known, is likely far higher, according to the latest semi-annual report from the Global Off-Grid Lighting Association.

This is the first report where GOGLA has separated out PAYGO from cash payments, which still make up the bulk of purchases. Globally, consumers spent about $114 million in cash on products such as solar lanterns and residential micro-solar systems in the same period. Single lights with a phone charger make up about half of the sales, no matter the payment method.

At the global level, about 3.77 million products (including verified and non-quality verified products) were sold in the second half of 2016. Sub-Saharan Africa and South Asia account for approximately 1.87 million units (50 percent) and 1.41 million units (38 percent) sold, respectively.

Since sales reporting began in July 2010 through the end of 2016, a cumulative total of 23.72 million quality-verified and 3.48 million non-quality-verified product sales have been reported.

The report authors acknowledge the numbers are low all around. GOGLA only includes data from its 55 member companies and products qualified by the World Bank’s Lighting Global platform.

The data is self-reported, and some companies do not provide revenue information or complete sales information. If information is missing from sales volumes, it is not included. According to GOGLA and Bloomberg New Energy Finance, the data in the report likely represents about half of all of the sales in off-grid solar products in the markets covered.

GOGLA found that by the end of 2016, more than 85 million people had improved energy access due to these products, which range from about 1 watt to 100 watts. The number of consumers with energy access is actually lower than GOGLA reported previously, but that is due to a change in methodology. The off-grid solar market is a relatively young one, especially for companies that are not entirely reliant on nonprofits and grants, and tracking data on these markets is still somewhat scarce.

“The drop in these impact metrics is therefore owed to restructuring our database and not to a decreasing number of products being used actively by households,” explained Dutch consulting company Berenschot, which oversaw the report. “We hope each report will be an improvement on the last in terms of accuracy and quality of data.”

The sweet spot for product revenue, no matter the payment method, is 3 watts to 10 watts, with most of the volume sold in East Africa. That wattage includes multiple lights and phone-charging capabilities. These products define Tier 1 energy access for at least one person and up to a household. Within the 3- to 10-watt category, PAYGO sales are 40 percent higher than cash sales.

In terms of sheer volume, however, simple solar LED lights of fewer than 3 watts, with or without mobile charging, still dominate sales. Most purchases of units smaller than 3 watts are in cash, whereas systems larger than 50 watts are, rather unsurprisingly, purchased through PAYGO systems.

Although the young market still skews toward small solar LED lanterns that bring homes and small businesses the very lowest level of modern energy access, the impact of the growth of this market in the past few years is extraordinary.

GOGLA estimates the products represented in their report help contribute to the incomes of nearly 2 million people. Households have saved about $200 each, with a 164 percent average increase in available hours of light. As Greentech Media previously reported, one of the most significant outcomes of the proliferation of off-grid solar LEDs is that 20 million dangerous kerosene lanterns are no longer in use.

Of course, this is still the tip of the iceberg, with plenty of challenges to scale in the off-grid solar market, not to mention the market for clean cooking options, which solar does not tackle at this point. “With around 1.2 billion people living without access to the grid, spending about USD $27 billion annually on lighting and mobile phone charging,” Koen Peters, executive director of GOGLA, said in the foreword of the report, “the sector still has a lot of work to do.”


Posted by Editor - May 20, 2017 at 6:25 am

The Quest to Define the Locational Value of Energy Efficiency [GTM Squared]


Posted by Editor - at 6:15 am

The Solar Industry Can Learn Discipline From the Airline Industry, Says Deutsche Bank Analyst

The U.S. solar market nearly doubled last year. From 30,000 feet, it’s a spectacular view.

In addition to soaring 95 percent, solar became the largest source of new generating capacity in 2016, beating gas and wind.

But come down closer to earth, and the view changes drastically.

Public companies are watching their valuations tumble, and investors are taking their money elsewhere. With residential companies restructuring and manufacturers under severe pricing pressure, it’s hard to see how the trend will reverse anytime soon.

But if installation volumes are so high in the U.S. and around the world, why are investors so down on the sector?

We’ve chewed on this before. Many of the problems are unique to individual companies. However, they do share a common theme: too much complexity and a lack of discipline.

It’s not about the macroeconomics of solar, says Vishal Shah, a managing director at Deutsche Bank. It’s about simplifying business models, ditching the growth-at-all-costs mentality, and telling investors a clear story.

“If you look at the industry overall, the volume increase has been quite significant. We don’t see that stopping,” said Shah, speaking at GTM’s Solar Summit this week. “What this industry needs is cost discipline.”

Investors see solar business models as overly complicated or too risky in a vicious pricing environment. “Simplify the business model,” exhorted Shah. 

He offered some examples.

Before getting acquired by Tesla, SolarCity’s vertical integration strategy, ballooning customer acquisition costs, and confusing accounting practices turned a lot of investors off. But when the company finally focused on more moderate growth and lower installation guidance, “the Street didn’t like that” either. 

Investors found it difficult to understand SolarCity’s complex story, said Shah.

Now that the company is a part of Tesla, it will be harder to evaluate its performance. (Although it now arguably benefits from becoming part of the Elon Musk narrative, with some of its problems hidden within a much larger company.)

SunEdison’s breathtaking collapse was also caused by too much complexity. Instead of focusing on steady project origination and development, SunEdison turned to financial engineering and chased too much growth. Its planned acquisition of residential installer Vivint further baffled investors, crushing the stock.

“It became harder to analyze,” said Shah. Investors don’t want to spend their time trying to figure out the complexity of a sub-billion-dollar market cap company, so they go elsewhere. “That’s the feedback.”

The inverter and solar optimizer maker SolarEdge — the most recent solar IPO — suffers from a different perception problem. While the company has been disciplined and maintained margins of 30 percent, investors worry that margins will keep falling because of severe price competition. “It’s hard for a hardware company to go public in this type of environment,” said Shah.

So are there other industries similar to solar, where demand is high but companies can’t seem to make money? 

Look to the airline industry, said Shah.

“There are a lot of similarities. Companies went bankrupt and there was no price discipline. But if you look at airlines today, it’s an example of where solar could be.”

From 2001 to 2008, 15 airlines in the U.S. alone went bankrupt. But they turned things around quite dramatically by implementing better pricing structures. Since top airlines restructured, the NYSE Arca Airline Index has far outpaced the S&P 500.

Harvard Business Review has published an instructional look at the turnaround in the industry. “We term the steps that the airlines took to save themselves ‘edge strategy’ — the strategic monetization of the huge value that often lies untapped on the edge of a core business.” (Low fuel prices didn’t hurt, either.)

In such a nascent, fragmented solar market, it’s hard to say what kind of edge strategies would appeal to investors. For Tesla/SolarCity, it could be the premium solar roof product and a customized consumer experience, rather than growth for growth’s sake; for SolarEdge, it could be expanding analytics and distributed resource management.

Meanwhile, public solar companies are stuck in an awkward place. Their addressable market is nearly limitless, but investors still don’t agree on the best way to tap that growth.

“It’s hard to make money. But if you look at just the volume trends, there’s no reason not to be bullish on this sector,” said Shah.

Are you a Squared member? If so, lucky you — you can watch this interview and all our other panel discussions from this week’s Solar Summit. If not, sign up to get access to a ridiculous amount of in-depth content.


Posted by Editor - at 6:05 am

Secretary Perry, We Have Some Questions for You Too

In April, Department of Energy Secretary Rick Perry issued a memorandum to his staff asking pointed questions about the future of the electric grid as coal is retired from the system, including:

  • “Whether wholesale energy and capacity markets are adequately compensating attributes such as on-site fuel supply and other factors that strengthen grid resilience and, if not, the extent to which this could affect grid reliability and resilience in the future; and
  • The extent to which continued regulatory burdens, as well as mandates and tax and subsidy policies, are responsible for forcing the premature retirement of baseload power plants”

Given the rapid change facing America’s electricity system, these questions may seem reasonable, but they reflect an outdated world view. 

DOE’s publication of this memorandum presents an opportunity to uncover many of these outdated assumptions and understand what’s driving the unstoppable transition from coal to other technologies. By taking each premise in turn and providing evidence-based analysis — as others have done in different ways — we can see that the projected demise of coal will result in a cleaner, cheaper and more reliable energy system.

Premise 1: “Baseload power is necessary to a well-functioning grid”

To understand whether this is true, some definitional work is needed. Baseload generation’s purpose is to meet the baseload or demand, which Edison Electric Institute defines as “the minimum load over a given period of time” in its Glossary of Electric Industry Terms. The same glossary defines baseload generation as: “Those generating facilities within a utility system that are operated to the greatest extent possible to maximize system mechanical and thermal efficiency and minimize system operating costs…designed for nearly continuous operation at or near full capacity to provide all or part of the baseload.” 

In other words, baseload plants are those whose efficiency is highest when run at a designed level of power, usually maximum output, and deviations from this level of power reduce efficiency and increase costs. Baseload generation is an economic construct, not a reliability paradigm.

A system with baseload thermal generators as its backbone comes with reliability pros and cons. For example, baseload power usually has heavy generators with spinning inertia, which gives conventional generators time to respond with more power when a large generator or transmission line unexpectedly fails. But we now know how to get such responses much more quickly from customer loads, newer inverter-based resources like wind, storage and solar, and gas-fired resources.

As the Rocky Mountain Institute’s Amory Lovins recently detailed in Forbes, fuel storage may appear to provide protection from a failure of gas supplies or weather events, but stored fuel has its own set of problems and failure modes: 

  • The 2014 polar vortex rendered 8 of 11 gigawatts of gas-fired generators in New England unable to operate.  
  • Coal faces serious supply risks because rail transport is susceptible to disruption; over 40 percent of U.S. coal comes from a narrow rail corridor out of Wyoming’s Powder River Basin.  
  • Extreme cold can also render on-site coal unusable, as happened during the Southwestern blackout of February 2011 that shut off power to tens of millions of customers. 
  • Nuclear power can be impaired by unseasonably hot weather, when cooling water is too warm and plants must be shut down for safety and to prevent mechanical damage. 

So in fact, baseload units, even with fuel stored onsite, are sensitive to weather and many other failure events.

Lovins also points out that coal and nuclear baseload generators are unable to operate continuously, despite perceptions to the contrary. On average, coal-fired stations suffer unexpected “forced outages” 6 percent to 10 percent of the time, and nuclear plants experience forced outages 1 percent to 2 percent of the time, plus 6 percent to 7 percent scheduled downtime for refueling and planned maintenance. 

On the flip side, solar and wind are 98 to 99 percent available when their fuel (the sun and wind) is available, and the ability to predict the weather is improving all the time. The reliability risks from fossil fuels are collectively managed today, mostly by paying to keep reserve generation running to respond when they unexpectedly fail, but this creates the need for redundancies and costs in the grid comparable to those that cover the uncertainty of weather forecasts for wind and solar power.
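The outage figures above imply that "always-on" baseload plants are available less often than is commonly assumed. A minimal Python sketch, using midpoints of the ranges quoted above; treating availability as simply one minus the outage fractions is an illustrative simplification, not a formal capacity-factor calculation:

```python
# Rough availability implied by the outage rates quoted above. This ignores
# dispatch, derating and fuel-supply failures, so it is an illustration only.

def availability(forced: float, scheduled: float = 0.0) -> float:
    """Fraction of time a plant is available, given outage fractions."""
    return 1.0 - forced - scheduled

coal = availability(forced=0.08)                       # midpoint of 6-10% forced outages
nuclear = availability(forced=0.015, scheduled=0.065)  # midpoints of 1-2% and 6-7%
solar_fuel_available = 0.985                           # 98-99% when the sun is shining

print(f"Coal: {coal:.1%}")                             # 92.0%
print(f"Nuclear: {nuclear:.1%}")                       # 92.0%
print(f"Solar (fuel available): {solar_fuel_available:.1%}")  # 98.5%
```

On these rough numbers, coal and nuclear plants sit in the low 90s for availability, which is the context for the comparison to solar and wind in the paragraph above.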

Premise 2: “[The] diminishing diversity of our nation’s electric generation mix”

The U.S. electricity mix is seeing a trend of increasing diversity, rather than decreasing diversity. Until recently, the notion that supporting coal generation would improve diversity was nonsensical; coal was the dominant and largest source of U.S. electricity for decades, so an argument for more diversity would be an argument for reducing the use of coal.

Today, coal and natural gas produce roughly equal shares of U.S. generation, while nuclear and hydropower (hidden in the renewables bucket below) are projected to continue their near-constant supporting roles.  With increasing renewable fuels, driven particularly by the growth of wind, solar and biomass generation, one can see that fuel diversity has actually increased dramatically since 2001.

Source: U.S. Energy Information Administration and AEO 2017

Today, coal is declining, with more than 90 gigawatts of the more than 250-gigawatt fleet projected to retire under business-as-usual conditions by 2030, but it will remain a meaningful player in the marketplace for at least the next decade according to baseline Energy Information Administration projections. In the long term, however, whether reducing coal generation impacts fuel diversity and resilience depends more on what replaces it than whether the coal remains. 

A portfolio of generation options with different characteristics insulates consumers from price risk and availability risk. Keeping some coal-fired generation online would help in that regard, particularly if its environmental costs are not considered. If retiring coal and nuclear plants are replaced mostly by natural gas, we would see a decline in fuel diversity, and that could potentially increase risk due to the characteristics of the natural gas supply. 

The same would be true, for example, if we myopically rely on solar as the only technology to decarbonize the grid. Studies of the optimal mix of resources in California to meet the 50 percent renewable portfolio standard by E3 and NREL each found that geographic and technology diversity of the renewable resources will substantially reduce the cost of compliance compared to the high-solar and in-state-only cases. 

But under current projections out to 2030, we are only going to see greater fuel diversity, not less, as natural gas, demand-side resources and utility-scale renewables take the place of retiring coal. This should increase the resilience and security of the system, particularly if this change is accompanied by more investment in transmission, storage and demand-side management.

Premise 3: “[Renewable] subsidies create acute and chronic problems for maintaining adequate baseload generation and have impacted reliable generators of all types”

Beyond the question of what “adequate baseload generation” actually means, it is undoubtedly true that coal and nuclear baseload units are suffering financially in both vertically integrated and restructured markets.

A recent FERC technical conference offered a forum for generators and wholesale market operators to vent their frustration with what they see as the inadequacy of markets to provide generators with sufficient revenue. But in reality, this financial pain is the effect of oversupply and intense competition. For example, despite low capacity prices in the PJM Interconnection, 5 gigawatts of new natural gas capacity cleared in the most recent auction. Coupled with stagnant demand, something has to give; inefficient coal plants are among the most expensive and least flexible generators, and they are not needed in this competitive landscape.

Competitive pressure from cheap gas, inexpensive renewables and declining demand is undermining the financial viability of baseload plants, but we are far from a crisis of reliability and resilience. Consider the reserve margins and reference levels in the major markets.

Each market is oversupplied. In the case of PJM and SPP, the condition is drastic — they have double the excess capacity they need to meet stringent federal reliability criteria. One panelist at the May FERC technical conference captured the dynamic: “PJM has reserve margins of 22 percent. I think of Yogi Berra, my favorite economist — we’ve got so much capacity, we’re going to run out.”  
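The reserve-margin arithmetic behind that quip can be sketched in a few lines of Python. The 183 GW capacity and 150 GW peak below are hypothetical round numbers chosen to reproduce the 22 percent PJM margin quoted above; the 11 percent target is likewise illustrative of a requirement roughly half the actual margin:

```python
# Reserve margin measures how much installed capacity exceeds expected peak
# demand. Planning criteria are market-specific; this is a generic sketch.

def reserve_margin(capacity_mw: float, peak_demand_mw: float) -> float:
    """Return reserve margin as a fraction of peak demand."""
    return (capacity_mw - peak_demand_mw) / peak_demand_mw

# Hypothetical system: 183,000 MW of capacity against a 150,000 MW peak.
margin = reserve_margin(183_000, 150_000)
print(f"Reserve margin: {margin:.0%}")  # 22%

# If the planning target were 11%, the system would carry twice the
# required excess capacity.
target = 0.11
print(f"Excess relative to target: {margin / target:.1f}x")  # 2.0x
```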

A well-functioning market would allow uncompetitive generators to retire amid steep competition and declining prices, given the oversupply conditions. To blame coal’s suffering on policies supporting clean energy denies the root cause — coal-fired generation is dying on economics alone.

Keep calm and retire on

Several premises that underlie the need for the forthcoming DOE study are false. Chief among them is a singular focus on baseload generation, particularly coal-fired power plants, as necessary for maintaining reliability. Baseload generation is an economic characteristic, not a reliability concern. 

Replacing expensive, environmentally unsustainable coal is not a matter of ensuring adequate baseload — the key will be quantifying the reliability services that are needed and ensuring that the replacement generators can provide them. 

But hitting the panic button, which is what Rick Perry’s memorandum appears to do, misses the big picture; rather than losing diversity, our electricity mix is rapidly diversifying, and our markets are vastly oversupplied with energy sources today. Any attempt to conclude otherwise will simply create unjustified roadblocks for new renewable generation, which is crucial to ensuring a clean, affordable and reliable electricity system.


Michael O’Boyle is a power sector transformation expert for America’s Power Plan.


Posted by Editor - at 6:01 am

GTM Solar Summit Keynote: ‘We Are at a Pivotal Moment in Solar Going Mainstream’

GTM’s tenth annual Solar Summit took place this week in exciting, frustrating and volatile times for the industry. Shayle Kann, senior vice president at GTM Research, kicked off the event by confronting the difficult realities of the solar industry today. He also gave some strong reasons for hope in this still-maturing marketplace.

Here, in Part 1 of the presentation, Kann looks back and ahead at the next year or two. In Part 2, which we’ll cover tomorrow, he forecasts the ways this market can reach thousands of gigawatts installed and trillions of dollars invested. (The video archive of the event is available to GTM Squared members.)

“If we look at just an aggregate of solar and pure-play solar stocks, it’s down about 40 percent since the beginning of 2016,” said Kann. “It improved a little bit in the past few weeks, but not a whole lot. Of course, stock prices are not always the greatest indicator of the whole market, but in this case, many companies have had a tough year, year and a half.

“In particular, it’s been hard for module manufacturers,” he continued. “And this has been true especially in the second half of 2016 and today.”

According to Kann, “The change between the first half and second half of last year was pretty stark, and it went in two different ways. First of all, between the first half and the second half of last year, the global demand for solar fell by 16 percent. This is despite the fact that we are having a banner year for solar overall.

“But China pulled back its entire program at the end of the first half of the year. There was a big contraction in the Chinese market in the second half of the year, and despite the fact that we saw a lot of growth in the U.S. and India, and some other markets, the global market shrank in that six-month period.

“As has been the case many times in solar, and this is a cycle we tend to repeat, module manufacturers did not get the message in time; they continued to expand capacity. What happens when you have a shrinking market and you grow capacity at the same time? You get thrown into oversupply. And that’s what happened in the second half of 2016. And the result of that, of course, as is generally the case in oversupply situations, is that prices crash. They ended up falling almost 40 percent just over the course of 2016, most of it in the second half.

“We remain in an oversupply cycle for panel manufacturing, and manufacturers have suffered as a result and margins have compressed. They’ve had to figure out how to survive through the downturn. Now normally, when you have oversupply in panel manufacturing, that is going to be bad for the manufacturers. It’s good for the downstream. It’s good for anybody who is buying panels, anybody who is developing projects. Indeed, I think that was true for many companies in 2016 through to today. We have a lot of developers that have benefited from that. But it’s not universally true.”

In the U.S. in particular, there’s been a simultaneous challenge specific to the residential solar market, Kann said. 

“The residential market in the U.S. was booming for years,” he said. “From 2012 to 2015, four years in a row, the residential solar market grew by over 50 percent a year. And then in 2016, it started to slow down. We saw 19 percent growth last year. And all indications are that 2017 is basically going to be flat overall for residential solar.

“California is a big part of the issue. Not only is the state going to be transitioning to its second-generation of net metering and switching to time-of-use rates, but the state also had to cope with unprecedented rains in the first quarter of this year. The rain made it basically impossible for many installers to get up on roofs. And while California’s residential solar market saw a downturn over the winter, growth in the rest of the country went flat.

“This…has been hardest on the largest residential solar companies — with one little exception: Sunrun,” said Kann. CEO Lynn Jurich, who also spoke at the Solar Summit yesterday, said her company delivered 20 percent growth in the first quarter of 2017, while new residential solar capacity overall dropped by 11 percent from the fourth quarter of 2016 to the first quarter of this year. 

The combination of an oversupply for upstream companies and slow growth in the U.S. residential market ended up producing “a lot of negative results,” said Kann.

“There have been bankruptcies from SunEdison last year, to Sungevity to OneRoof to SolarWorld. It’s been a rough time in the public eye for solar,” he said. “And yet, I think we are going to have to take a small step backward…to recognize the milestones that the solar market — both in the U.S. and abroad — has simultaneously hit despite some of the turmoil.”

“We are at a really amazing point in the history of the solar market, and when we look back at this time 10, 20, 30 years from now, this is going to be a pivotal moment in solar becoming mainstream.”

On a more positive note, Kann noted that the global solar market grew by 50 percent overall. “That’s huge growth globally — that’s 78 gigawatts of solar, 10 times the amount that we installed in 2009,” he said.

The U.S. installed 14.5 gigawatts last year, up 95 percent over 2015. So the market basically doubled in this country in the last year.

And for the first time ever, solar was the single largest source of new generating capacity in the U.S. “We added more solar to the grid in this country than any other individual resource, more than natural gas, more than coal, more than nuclear, more than had ever happened before,” said Kann.

In addition to this massive growth within the U.S., Kann said another important development is that the sources of demand for solar are diversified. He explains:

“So the way that we got to 14.5 gigawatts installed last year is not by just throwing a bunch of solar at a single type of market, but instead by opening up new avenues for the solar economics to work. So how did we do that?

“We had residential solar that was installed both through a third-party ownership model and directly owned by customers through a cash purchase or lease. We have a commercial, non-residential market that actually had a fair bit of smaller commercial projects, as well as larger commercial projects. A community solar market emerged. We have RPS-driven demand for utility-scale solar in California and other states. In addition to that, we have new PURPA mechanisms, which are performing across a bunch of states, and voluntary utility procurement, just because the economics work. We have the corporates emerging as a new source of procurement.”

Kann’s point: This is a diversifying market. If you looked at the chart below just two years earlier, in 2014, there would have only been a couple of segments.

The core point to make here is that there’s been a dramatic transformation in solar economics, said Kann. He explained: “Back in 2010, if you wanted to build solar, first of all, it was a new market, with a wide range of pricing, even within a single market — projects had far different pricing. And if you wanted to make the economics work, you needed a federal incentive, you needed the Investment Tax Credit. But that really wasn’t sufficient if you wanted to compete.

“And so the way that we built solar, back in the beginning of this decade, was by stacking incentives. You needed the federal incentive and then you needed something that was state-run. That is, generally speaking, no longer true.

“Today, the cost of solar, even just taking the ITC into account, is more competitive throughout most of the country than basically any other resource. The exception to this is places that have really high wind resources.

“Solar is often now the cheapest new source of generation. And in fact that is becoming true even if you don’t take the ITC into account. If all you care about is getting the cheapest kilowatt-hours from any source of generation — solar is going to win.”

He continued:

“There are lots of ways you can look for proof points to prove that solar is the cheapest source. Here’s one example: Dominion. Dominion is a utility based in Virginia and North Carolina. And Dominion submits an Integrated Resource Plan every year to its regulators. In that Integrated Resource Plan, what Dominion and most other utilities are doing is they’re basically running a model to determine, given the resource needs and prospective costs in areas of generation, what the least-cost, best-fit resources are going to be in that territory.

“If you look at the 2016 Integrated Resource Plan, they were already planning to build out a bunch of solar in their territory in the next 15 years or so. But just a couple of weeks ago, they submitted their 2017 IRP, just one year later, running similar models, and the amount of solar more than quadrupled. They now expect in a no-Clean-Power-Plan scenario to build mostly solar, more than any other resource by a long shot — over 4 gigawatts just by 2031. If we extend it out to 2042, which is how far out the IRP ultimately goes, it becomes more like 5 gigawatts.

“And here’s what they said about it in the IRP. Basically, they explained that they added so much more solar into that plan due to the optimal economics, and importantly, the fact that the cost of solar, from their perspective, fell 24 percent between the time that they filed the 2016 IRP and the time that they reviewed the analysis for the 2017 IRP.”

“This is what happens when you get these drastic cost reductions for solar,” said Kann. “It has a real, meaningful impact on solar’s competitiveness with the other sources of generation. And lest you think that Dominion is alone in this, there are a dozen utilities planning in the same vein.”

“In two of the most recent Integrated Resource Plans, they have suggested that they should build at least half a gigawatt of solar in their territory,” he said. “Just these 12 utilities could nearly double the size of the cumulative utility-scale solar market. This is 12 utilities out of 3,000 in the U.S. — so, this is a very real transformation.”

This isn’t just a U.S. phenomenon. It’s a global phenomenon as well. 

Average PPA prices globally are now below 5 cents per kilowatt-hour, according to GTM Research. Plus, basically all of these markets are unsubsidized, so there’s no access fee in most of these places — it’s just pure solar prices. 

“There’s a race for the next record for lowest power-purchase agreement price that keeps happening,” Kann said. “We’ll get probably the lowest one again in the next couple of months. We have a sub-$30 megawatt-hour price in the U.A.E. We’ve got another one at about $30 in Mexico. In India, we’re down to about $40 a megawatt-hour. India historically has been a much more expensive market for solar, so these projects are real. This is a notable change.”

“Again, the point here is not that these individual projects are so cheap; it’s that broadly speaking, this is the pricing for big utility-scale solar projects in most of the world.”

These ongoing cost declines make solar competitive with, and oftentimes better than, any other source of generation from an economic perspective. The result is that countries all over the world are introducing tender processes, auctions or other mechanisms that solicit competitive pricing for solar. “We now have some form of auction or tender scheme in over 16 countries either already in place or planned,” Kann said.

But despite all of this growth, “We have a huge mountain to climb for solar to become a meaningful chunk of electricity both in the U.S. and in the rest of the world,” he said. “Right now solar accounts for about 2 percent of electricity generation in this country, and that’s actually roughly the same globally. So, we’re growing a lot, but from a small base, and we’re bleeding into a gigantic market for power generation.”

He concluded: “There’s a lot to be done.”

Read about Part 2 of Shayle Kann’s State of Solar presentation tomorrow.


Be the first to comment - What do you think?  Posted by Editor - May 19, 2017 at 6:55 am

Categories: General   Tags:

European Mandate Catalyzes $48 Billion in Advanced Meter Spending Over the Next 5 Years

Utilities around the world are upgrading to advanced metering infrastructure (AMI) to access granular energy usage data and two-way communications in order to implement more complex network analytics.

Cumulative global AMI installations are expected to reach 922 million by 2021. Current contracts will drive a steady increase in installations through 2020, but new contracts will be required to avoid a decline in 2021.

FIGURE 1: AMI Meters Installed, 2017E-2021E

Source: GTM Research, AMI Global Forecast 2017-2021

Europe will be a driving force in the global market, even as installs decline in 2021. A European Commission mandate is the primary force behind the world’s second-largest AMI market; the mandate targets 80 percent AMI penetration among EU countries by 2020.

The EU (along with Great Britain) is expected to make strong progress in the coming years, but will still be 37.6 million meters short in 2020 as investment in metering networks and hardware plummets.

FIGURE 2: AMI Meter Spend in Europe, 2017E-2021E

Source: GTM Research, AMI Global Forecast 2017-2021

Altogether, 15 countries have either announced delays or have not yet contracted the volume of meters required to meet their 2020 goal. Despite this shortcoming, GTM Research estimates that 175 million AMI meters will be installed in Europe by 2021. Under the current pipeline, European utilities will spend $18.7 billion to deploy AMI meters and networks, equal to 39 percent of global spend, from 2017 to 2021.
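The European figures here also imply the global total in the headline; a quick check of the arithmetic:

```python
# Back out the implied global AMI spend from the European share cited above.
europe_spend_billion = 18.7   # European utility spend on AMI meters and networks, 2017-2021
europe_share = 0.39           # stated as 39 percent of global spend

global_spend_billion = europe_spend_billion / europe_share
print(f"Implied global spend: ${global_spend_billion:.1f}B")
# ≈ $47.9B — consistent with the roughly $48 billion in the headline.
```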

The Spanish, French and Italian markets are projected to install over 56 percent of these meters by 2021. Spain is expected to finish its rollout in 2018 when utilities Iberdrola, Endesa and Unión Fenosa Distribución complete deployments, totaling 27.3 million meters. France also remains on track to replace 95 percent of the country’s meters with AMI. French utility Enedis kicked off the second phase of its Linky smart meter deployment program last year.

A wave of investment in analytics solutions to capitalize on new AMI data will follow these deployments. Utilities can leverage that data to improve DER integration, customer engagement, distribution O&M and a range of other functions.

The full global AMI forecast covers 1.53 billion electric meters, based on existing AMI project pipelines and mandates. Get the full forecast and database through a Grid Edge Subscription.



Focus on Carbon Removal a ‘High-Stakes Gamble’

The manmade emissions fueling global warming are accumulating so quickly in the atmosphere that climate change could spiral out of control before humanity can take measures drastic enough to cool the earth’s fever, many climate scientists say.

The most important way the earth’s rising temperature can be tempered is to reduce the use of fossil fuels. But scientists say another critical solution is to physically remove greenhouse gases from the atmosphere — something called “negative emissions” — so that carbon dioxide and rising temperatures could peak, and then begin to decline over time.

An electric power plant in India. Carbon emissions may need to be removed from the atmosphere as a solution to global warming.
Credit: Global Landscapes Forum/flickr

Many of the assumptions underlying the landmark Paris Climate Agreement rely on the idea that humans will be actively removing carbon from the atmosphere late this century because reducing emissions won’t be enough to prevent global warming from exceeding levels considered dangerous.

But that assumption relies on technology that hasn’t been proven to work on a global scale. Removing carbon dioxide from the atmosphere on a scale large enough to slow global warming is untested, and the technology is in its infancy. The effect it could have on the earth is largely unknown, and some scientists warn that some of the consequences of using negative emissions technology could be catastrophic.


Because of all those unknowns, it’s critical that humanity doesn’t bet its future on negative emissions, Stanford University Woods Institute for the Environment scientists Katharine Mach and Christopher Field write in a paper published Thursday in the journal Science.

The paper argues that both negative emissions technology and a commitment to quickly cutting carbon dioxide emissions as much as possible are critical to solving the climate crisis.

Carbon concentrations in the atmosphere must not exceed 450 ppm if global warming is to be prevented from exceeding a level considered dangerous by most climate scientists —  2°C (3.6°F), the primary goal of the Paris Climate Agreement. The problem, though, is that humanity is quickly running out of time to limit more warming. The atmosphere blew past the 400 ppm mark last September and it’s on a trajectory to pass 450 ppm within 22 years.
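The two numbers in that trajectory imply an average growth rate; a rough check (treating the 400 ppm crossing as the starting point):

```python
# Implied average CO2 growth rate from the article's figures:
# the 400 ppm mark was passed recently, and 450 ppm is projected within 22 years.
current_ppm = 400.0
threshold_ppm = 450.0
years_remaining = 22.0

implied_growth = (threshold_ppm - current_ppm) / years_remaining
print(f"Implied growth: {implied_growth:.2f} ppm/year")
# ≈ 2.27 ppm/year, roughly in line with recently observed growth rates.
```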

Most of the Intergovernmental Panel on Climate Change models underlying the Paris Climate Agreement assume some level of large-scale carbon removal will be occurring in the coming decades, but nobody knows exactly how that will be accomplished.

Ben Sanderson, a climate scientist at the National Center for Atmospheric Research who is unaffiliated with the paper, said the study shows that carbon removal shouldn’t be treated as a cure-all for climate change because the future of humanity can’t rely on untested technology.

“The major risk is that the planned-for CO2 removal might never come to pass — and this is a very real concern,” Sanderson said.  

The paper warns of dire consequences if the effects of negative emissions technology aren’t fully accounted for before they’re implemented.

For example, one of the negative emissions technologies carbon-removal proponents often cite as the most promising — bioenergy and carbon capture and storage, or BECCS — could create widespread food insecurity because it could take half of the world’s farmland out of production.

BECCS relies on converting agricultural areas and other land to vast new forests, which absorb atmospheric carbon in tree trunks and roots. The trees would be harvested for biomass energy and burned in power plants. The resulting carbon emissions would be captured and stored permanently — a method some scientists believe could be worse for global warming than burning fossil fuels.

“Converting land on this staggering scale would pit climate change responses against food security and biodiversity protection,” the paper says.

Chief among the other negative emissions technologies being developed are expanding forests globally to store more carbon naturally, and building hundreds or thousands of facilities that directly remove carbon from the atmosphere and store it permanently. Those facilities, called “direct air capture” plants, have never been built on a large scale; scientists say they would require enormous amounts of energy to operate, and many thousands of them would have to be constructed to make a dent in global warming.

The paper criticizes the idea of peak and decline — the theory that carbon removal could bring about a peak in global temperatures and then begin to cool the planet. That may be risky because the costs and consequences of global cooling following a temperature peak are not well understood. Some of the effects of climate change such as sea level rise and melting polar ice sheets can’t be reversed as the globe cools.

“These scenarios bet the future on CDR (carbon dioxide removal) technologies operating effectively at vast scales within only a few decades,” the paper says, referring to climate models assuming a peak and decline in atmospheric carbon concentrations. “Estimates of economic costs are crude for such scales and environmental tradeoffs are potentially stark.”

A Syncrude oil sands facility in Canada. Oil sands are among the most polluting forms of fossil fuels.
Credit: Pembina Institute/flickr

Ecosystems that will have begun to adapt to higher global temperatures as the world warms may struggle to adjust to the global cooling that peak and decline envisions. Scientists haven’t done much research on what the effects might be, Field said.

Sanderson said that peak and decline is unlikely within this century, but much more likely in the next century or beyond because changes in the climate system are hard to turn around even as atmospheric carbon concentrations decline.

The paper says that massive deployments of negative emissions technologies might work, but if they don’t, “future generations may be stuck with substantial climate change impacts, large mitigation costs, and unacceptable tradeoffs.”

Field said that when all the unknowns about negative emissions are considered, the best strategy to solve the climate crisis is to both develop carbon removal technology and work as quickly as possible to cut emissions today.

The paper generated a range of responses from negative emissions experts.

Klaus Lackner, director of the Center for Negative Carbon Emissions at Arizona State University and one of the world’s leading experts on carbon removal, said he agrees with most of the paper’s conclusions, but took issue with its implication that climate change can be mitigated if emissions are drawn down quickly enough today.

“Yes, we should reduce emissions and move away from fossil fuels, but we will overshoot acceptable CO2 concentrations — indeed, we may already have overshot,” Lackner said, adding that no matter how much emissions are cut today, failure to develop negative emissions technologies will seriously damage the planet.

Pollution from a manufacturing complex in Toronto.
Credit: United Nations/flickr

He said cutting carbon emissions gets more and more difficult the more they are reduced. Carbon removal is necessary to prevent atmospheric carbon dioxide concentrations from growing further.

Glen Peters, a scientist at the Norway-based climate research organization CICERO whose 2016 paper called focusing on negative emissions technology a “moral hazard,” said he agrees with the paper’s conclusion that countries need to both radically cut greenhouse gas emissions while also upscaling negative emissions technology.

Peters said he isn’t worried about the consequences of peak and decline because paleontological records suggest there are few catastrophic consequences of small declines in the earth’s average temperature over a period of decades.

Sanderson said the questions the paper raises about the degree to which negative emissions technologies should be developed may be best solved by putting a price on carbon.

“A well designed carbon market would find the optimal combination of emissions reductions and carbon removal,” he said. “This rather breaks the deadlock for decision makers today — they don’t have to decide whether to reduce emissions or invest in reduction technologies, all they need to do is put a price on carbon.”

John DeCicco, a professor at the University of Michigan-Ann Arbor Energy Institute, called the paper “excellent and thoughtful,” but said it does not sufficiently recognize that the global carbon cycle already includes carbon removal processes, or factor in ways forests and soils could be managed to store more carbon than they do naturally.

DeCicco said that he agrees with the paper’s conclusion that it’s unwise for countries to automatically assume that technology will be developed to bring atmospheric carbon concentrations and global temperatures down to tolerable levels eventually.

“Hoping that future generations might somehow figure out an atmospheric CO2 decline in a way that undoes climate catastrophe is just foolish,” he said, adding that it’s critical for humanity to pursue both emissions cuts and carbon removal technology as quickly as possible.

“We can’t wait for some sort of technological deus ex machina to save us from ourselves,” DeCicco said. 





Sharp Rise in Flooding Ahead for World’s Poorest

Coastal residents of poor and fast-growing tropical countries could see rapid increases in the number of once-rare floods as seas rise, with a new statistical analysis offering troubling projections for regions where sea level data is sparse.

Stark increases in instances of flooding are projected for Pacific islands, parts of Southeast Asia and coastlines along India, Africa and South America in the years and decades ahead — before spreading to engulf nearly the entire tropical region, according to a study led by Sean Vitousek, a researcher at the University of Illinois, Chicago.

Residents of Kerala in southern India face sharp increases in the number of floods in the years ahead.
Credit: Thejas Panarkandy/Flickr

“Imagine what it might feel like to live on a low-lying island nation in the Pacific, where not only your home, but your entire nation might be drowned,” Vitousek said.

The researchers combined a statistical technique used to analyze extreme events with models simulating waves, storms, tides and the sea level effects of global warming. They created snapshots of the future — flood projections that can be difficult to generate with the limited ocean data available in some places.


“If it’s easy to flood with smaller water levels coming from the ocean side, then gradual sea level rise can have a big impact,” Vitousek said. “For places like the Pacific islands in the middle of nowhere that don’t have any data, we can make an assessment for what’s going to happen.”

The study found that the frequency of formerly once-in-50-year floods could double in some tropical places in the decades ahead. The findings were published Thursday in the Nature journal Scientific Reports.
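The intuition behind that doubling can be illustrated with a toy extreme-value calculation — a sketch under assumed parameters, not the study’s actual model. If annual-maximum water levels follow a Gumbel distribution, the kind commonly used for extreme events, then sea level rise effectively lowers the flood threshold, and where day-to-day water-level variability is small, even 10 centimeters of rise shifts flood frequency sharply:

```python
import math

def new_return_period(T, slr, beta):
    """Return period of the old T-year water level after `slr` of sea level
    rise, assuming annual-maximum water levels follow a Gumbel distribution
    with scale parameter `beta` (all lengths in meters, `beta` assumed)."""
    # Old T-year level, measured relative to the Gumbel location parameter.
    x_T = -beta * math.log(-math.log(1.0 - 1.0 / T))
    # After sea level rise, the same land elevation sits `slr` lower
    # relative to the shifted water-level distribution.
    p_new = 1.0 - math.exp(-math.exp(-(x_T - slr) / beta))
    return 1.0 / p_new

# With the small water-level variability typical of many tropical coasts
# (an assumed scale of ~0.14 m), 10 cm of rise roughly halves the return
# period: the 50-year flood becomes roughly a 25-year flood.
print(new_return_period(50, 0.10, 0.144))
```

The smaller the assumed scale parameter (i.e., the calmer the local waters), the faster the return period collapses, which matches the study’s finding that low-variability tropical coastlines are hit first.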

Red areas in this map represent large projected increases in the frequency of floods following 10 centimeters (4 inches) of additional sea level rise.
Credit: Vitousek et al., “Doubling of coastal flooding frequency within decades due to sea-level rise,” Scientific Reports, 2017.

“This is the first paper I’ve seen that tries to combine all these different elements in the context of sea level rise,” said Richard Smith, a statistics professor at the University of North Carolina, Chapel Hill who studies environmental change. “They’ve done it in a very systematic and well organized way.”

The tropics are home to some of the world’s most vulnerable coastal residents, often living in houses made from flimsy construction materials, under governments that have limited ability to provide food, water and care when disasters strike.

“Many poor developing countries like Bangladesh are going to see greater frequency and magnitude of flood events — even with the best efforts to reduce emissions,” said Saleemul Huq, director of the International Centre for Climate Change and Development in Bangladesh.

Residents of these sweltering regions have released little of the greenhouse gas pollution that’s warming the earth’s surface, melting ice and expanding ocean water and causing seas to rise. Global temperatures have risen nearly 2°F since the 1800s.

“These vulnerable countries and communities need to be supported to improve early warning and safe shelters — followed by economic support to recover afterwards,” Huq said.

All coastal regions face risks from rising seas, though the nature of the hazards varies. Seas are rising at about an inch per decade globally, and the rate is accelerating, with several feet or more of rise likely this century. Detailed information on water levels in many vulnerable places, however, is sparse.

“Understanding of sea level rise in the tropics is challenging because there’s a lack of long-term data,” said Benjamin Horton, a Rutgers professor who wasn’t involved with the study. “Tide gauges were installed for navigation in ports; big trade was between the industrialized nations of Europe and U.S.”

Unlike vulnerable cities and towns along the East Coast of the U.S., where frequent storms and big waves lead to large variations in day-to-day water levels, tropical coastlines tend to be surrounded by waters with depths that vary less. That means many tropical coastlines were not built to withstand the kinds of routine flooding that will be caused by rising seas.




El Niño Again? This Is Why It’s Hard to Tell

The tropical Pacific Ocean is once again carrying on a will-it-or-won’t-it flirtation with an El Niño event, just a year after the demise of one of the strongest El Niños on record.

The odds right now are about even for an El Niño to develop, frustrating forecasters stuck in the middle of what is called the spring predictability barrier. During this time, model forecasts aren’t as good at seeing into the future, in part because of the very nature of the El Niño cycle.

Scientists try to forecast El Niño because of the major, often damaging, shifts in weather it can cause around the world. The last one brought punishing drought to parts of Southeast Asia and Africa and torrential rains to parts of South America.

An El Niño also helps boost global temperatures, as it did in 2016, the hottest year on record, and previously in 1998. Global warming, though, means that 2016 was almost 0.5°F (0.3°C) hotter than 1998, even with comparably strong El Niño events.

If another El Niño does materialize this year, it would be only the second time in the records that the Pacific went from the hot phase of an El Niño to the cold phase of a La Niña and then back to an El Niño again within three years. The relatively limited nature of those records, though, means researchers can’t be certain that such a combination is all that rare.

Even Odds

The tropical Pacific Ocean naturally cycles among three different states about every three to five years. In its neutral state, warm water is pooled up on the western side of the basin, which fuels thunderstorms there. During a La Niña, the east-to-west trade winds that pile up that warm water intensify, amping up the normal conditions across the basin.

But during an El Niño, those trade winds relax or even reverse, allowing the warm water to spill to the eastern side of the basin, displacing storm activity toward the central Pacific.

To have confidence that an El Niño is in the offing, forecasters want to see both the ocean and the atmosphere getting into gear. While the eastern Pacific has shown some signs of warming in recent weeks, the atmosphere has yet to show signs that it is following suit.

Although the major El Niño of 2015-16 was followed by a rather puny La Niña, the atmosphere seemed to retain a lingering influence from that La Niña through the winter.


“That makes it hard for the ocean to warm up because the atmosphere is essentially trying to kick it back into a cooler state,” Michelle L’Heureux, an El Niño forecaster for the National Oceanic and Atmospheric Administration (NOAA), said. The atmosphere has only just started to look more neutral, she said.

Forecasters also aren’t seeing a big pocket of warm water below the ocean surface, which is needed to feed bigger El Niño events, though not necessarily weak ones. The presence of such a warm reservoir would give them more confidence that an El Niño was likely.

“We’re not quite seeing those indicators emerge on our radar yet,” L’Heureux said.

Consequently, the latest forecast from NOAA has equal chances — just shy of 50 percent — of continued neutral conditions or an El Niño developing in late summer or fall.

Spring Barrier

The computer models that simulate the physics of the climate system are more bullish on an El Niño than the forecasters are, because the forecasters know the models just aren’t as good at predicting El Niño development at this time of year — a limitation known as the dreaded spring predictability barrier.

“The spring barrier is this incredibly difficult thing to deal with,” L’Heureux said. It’s also somewhat tricky to define and tease apart.

Some of the lack of predictability comes from the fact that an El Niño tends to peak in late fall or early winter, and predictions become more difficult the further away from an event you are. But even predictions made in the spring for summer conditions aren’t very good, pointing to other factors at work.

Spring is also a time of transition, when any signal from the climate system is more difficult to pick out from the noise of weather because that noise is higher.

That difficulty also comes from the nature of the El Niño cycle itself. El Niño needs both the ocean heat and weak or reversed trade winds to come together to form, and those winds are much harder to predict further in advance. During a decaying El Niño, which happens in late winter and into spring, warm surface water drains north and south away from the equator, allowing colder waters to well up from below, Aaron Levine, a postdoctoral associate at NOAA’s Pacific Marine Environmental Laboratory, explained in a blog post. This tips the momentum of the system toward a La Niña, meaning that what the atmosphere is doing is less important.

People cross a flooded road after a massive landslide and flood in the Huachipa district of Lima, Peru, on March 17, 2017, when a coastal El Niño ramped up rains.
Credit: REUTERS/Guadalupe Pardo

Essentially, during the spring, the ocean is a lot less helpful at cluing forecasters into whether an El Niño might happen in the coming months.

Some research has suggested that the warming of the planet from the buildup of greenhouse gases in the atmosphere might lead to more strong El Niño and La Niña events. But the limitations of computer models in fully capturing the behavior of the El Niño-La Niña cycle mean that much remains unclear about how that cycle might change as the planet’s temperature continues to rise.

Warming could exacerbate the global weather impacts of El Niño and La Niña, though. For example, where conditions are drier, any drought that occurs will be worse because the added heat leads to more evaporation. On the flip side, warming could also make the heavy rains that fall in certain areas during an event even stronger, because a warmer atmosphere has more moisture available for those rains.

Models are partly more bullish than forecasters on the prospects of El Niño this year because of the extremely warm waters that popped up off the coast of Peru during a so-called coastal El Niño during the winter. Those waters reached 1.8°F (1°C) above normal and helped fuel intense rains that caused major flooding in the region.

While there have been instances where a coastal El Niño bled over into a full-blown one, it’s not guaranteed, and temperatures along that coastal area have been fading over the past few weeks.

Current sea surface temperatures have been hovering around the 0.9°F (0.5°C) above-average threshold that defines an El Niño, but those temperatures need to persist for much longer for forecasters to feel comfortable issuing an El Niño watch, meaning conditions are favorable for it to develop over the next six months.
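The persistence requirement behind that threshold can be illustrated with a toy check. The 0.5°C figure is from the article; the streak rule below reflects NOAA’s practice of classifying an event only after the 3-month running-mean anomaly (the Oceanic Niño Index) holds at or above the threshold for five consecutive overlapping seasons, and the anomaly series here are invented for illustration:

```python
def el_nino_conditions(anomalies, threshold=0.5, runs_required=5):
    """Check whether 3-month running-mean SST anomalies (°C) stay at or
    above `threshold` for `runs_required` consecutive overlapping seasons —
    the kind of persistence forecasters look for before declaring an event."""
    streak = 0
    for anomaly in anomalies:
        streak = streak + 1 if anomaly >= threshold else 0
        if streak >= runs_required:
            return True
    return False

# Hovering around the threshold, as in spring 2017, is not enough:
print(el_nino_conditions([0.5, 0.4, 0.5, 0.6, 0.4, 0.5]))  # False
# Sustained warmth would qualify:
print(el_nino_conditions([0.5, 0.6, 0.7, 0.8, 0.9, 1.0]))  # True
```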

“Obviously as a forecaster we would like the system to make a decision,” L’Heureux said.

If an El Niño does materialize this year, it will be only the second time in 65 years of record keeping that such a back-to-back situation occurred. The other instance was in 1963 to 1965. But that record likely isn’t long enough to have captured the full range of how an El Niño can behave.

“Some of these behaviors aren’t incredibly unusual, it’s just that we haven’t had an opportunity” to observe them yet, Levine said. “Every new event gives us new information.”




Argentina strikes deal for two Chinese nuclear reactors

During a meeting between Chinese President Xi Jinping and Argentinean President Mauricio Macri, the two countries inked a deal for a pair of nuclear reactors.


