Solar Radiation

Launching Our Grid-based Solar Forecasting API at #CIDER17

CIDER17

Today, I’m presenting at the #CIDER17 conference in Sydney.

This conference is focused on DER integration with networks in the Asia-Pacific, and nowhere is this more relevant than here in Australia.  In my presentation, I am launching our grid-based solar forecasting API, which has now gone live in beta with a total of 8 Australian distribution networks available (3 more to be added soon...).

This API deployment is part of our ARENA-funded distributed solar forecasting and modelling project, in which we are tasked with integrating solar forecasts into LV network operations.  Here, we have two key outcomes. First, to make accessing solar forecasts easy and scalable, and to directly integrate them with the existing software in use by the DNSPs.  And second, to raise the allowable penetrations of solar PV in these networks through #orchestration and proactive (rather than reactive) LV network management.

Integration via an API Framework

With that first outcome in mind, we have chosen an API framework for the delivery of our solar forecasting service.  This API is straightforward and built to start the integration process. Data from the API can be previewed in an HTML page for ease of use or testing, but can also be downloaded in specific data formats including JSON, XML and CSV via your own scripts.  Over the next year, we'll be focusing on building integrations for your load flow & network modelling software of choice – so be sure to give us your feedback on what development you'd like to see in this space.

Getting to the grid-based API specifics, let's start with how we create the forecasting aggregations exposed in this API service.  Using the installed DER metadata for small-scale solar PV sites, provided by DNSPs, we have been able to:

  1. Generate PV power output forecasts & estimated actuals by matching the radiation estimates from our satellite nowcasting system to the locations and characteristics of the installed PV sites
  2. Aggregate the PV power output forecasts to the upstream distribution transformers and zone substations, providing the total PV power output forecast for those network assets (see the sketch just after this list)
  3. Expose this forecast data via an API framework, which currently makes these aggregations available at the zone substation level
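To make step 2 concrete, here is a minimal sketch of the bottom-up roll-up, assuming a hypothetical table of site-level forecasts with made-up column names and zone substation names (the production pipeline is more involved than this):

    import pandas as pd

    # Hypothetical site-level forecast table: one row per PV site per 30-minute interval.
    site_forecasts = pd.DataFrame({
        "period_end":      ["2017-05-24T10:30Z"] * 4,
        "pv_site_id":      ["site_a", "site_b", "site_c", "site_d"],
        "zone_substation": ["Zone 1", "Zone 1", "Zone 2", "Zone 2"],
        "forecast_kw":     [3.1, 2.4, 4.8, 1.9],
    })

    # Roll the individual PV power forecasts up to their parent zone substation,
    # giving the total downstream PV power forecast for that network asset.
    zone_totals = (site_forecasts
                   .groupby(["zone_substation", "period_end"], as_index=False)["forecast_kw"]
                   .sum()
                   .rename(columns={"forecast_kw": "total_pv_forecast_kw"}))
    print(zone_totals)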

Once these aggregations are completed, participating DNSPs are able to retrieve forecasts (as well as estimated actuals, more on that soon) via a wide variety of methods and file formats (as discussed above).  DNSPs must first register for the Solcast API services, and are then granted access to the aggregations for their network via their unique API key. Instructions on how to retrieve these forecasts & use the API are available on request (just send us an email!).

At this time, we have a total of 3 API endpoints unique to each DNSP:

  1. An endpoint which returns the list of zone substation names available for forecast retrieval. This makes it easy to access the next two endpoints…
  2. An endpoint which provides a 0-7 day forecast, at 30 minute intervals (updated behind the scenes every 10 minutes) for the specified zone substation
  3. An endpoint which allows the DNSP to retrieve the estimated actuals for the downstream PV sites at that asset.  The estimated actuals are a look back at the past 7 days, at 30 minute intervals using the valid satellite scan from that time [no forecasting involved] (Note: this will be expanded to reach back further soon!)
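To give a feel for the workflow, here is a minimal sketch of how a DNSP script might walk these three endpoints. The URL paths, parameter names and response structure shown are hypothetical placeholders for illustration only, not the documented API - the registration instructions contain the real details:

    import requests

    API_KEY = "your-solcast-api-key"            # issued to each DNSP on registration
    BASE = "https://api.example-solcast.com"    # placeholder base URL, not the real one

    # 1. List the zone substation names available to this DNSP.
    zones = requests.get(f"{BASE}/zone_substations",
                         params={"api_key": API_KEY}).json()

    for zone in zones:
        # 2. 0-7 day PV power forecast at 30-minute intervals for this zone substation.
        forecast = requests.get(f"{BASE}/zone_substations/{zone}/forecasts",
                                params={"api_key": API_KEY, "format": "json"}).json()

        # 3. Estimated actuals for the past 7 days (satellite-derived, no forecasting).
        actuals = requests.get(f"{BASE}/zone_substations/{zone}/estimated_actuals",
                               params={"api_key": API_KEY, "format": "json"}).json()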

#CIDER17 Demo of the API:

In the interest of showing off the capabilities of this API system, I am attaching the following screenshots and images of the API in action. Several of these images are from forecasts & estimated actuals retrieved just prior to this talk; the remainder are from other events of interest.

(Follow-up to Solar Analytics talk yesterday - ANU solar PV Simulations from 2014....)

 

Help Us Improve: Our Priority Here is DNSP Testing & Feedback

At this stage, we are asking DNSPs to access, analyse and critically review this API offering.  We want to know how they will use it, what features they’d like to see added and where we can integrate with existing software and operational strategies.

Looking Ahead: LV Network Orchestration

Getting back to outcome #2, raising the allowable penetration levels of solar PV in our LV networks, we turn to what I believe becomes possible with this grid-based solar forecasting API: using this service in the #orchestration of DER services to LV networks.  By mapping solar variability to our LV networks, we can pair it with solutions in demand management, energy storage and smart inverters.  With 10GW of small-scale solar, 4GW of large-scale solar and 500k energy storage systems on the way by 2020 – I believe our API can directly contribute to the successful integration of the incoming high penetrations of DER.

Zooming out, orchestration is really at the forefront of the renewables-powered transition now underway across the Asia-Pacific.  In this effort, we must look to turn intermittency into opportunity (on that topic, watch the video embedded here). Our ANU/Solcast team is working on precisely that! And we plan to make the API open to collaboration in that regard (have you seen our data sharing campaign for researchers?), wherever our DNSP partners determine that to be beneficial.

In Closing, Our Message: Let's Collaborate!

Thanks for taking the time to read this blog post and review the above information. If you are an Australian DNSP project partner, all of the API-related services are open to use across your entire organisation. Please share it widely and freely – we can handle any number of requests that you throw at the API, so have fun.

Himawari 8 domain/imagery

If you are interested in deploying a similar solution (bottom up DER mapping of solar, deployment of a solar forecasting API service for solar farms and small-scale solar sites) elsewhere across the Asia-Pacific, get in touch with Solcast directly.  We’re looking for opportunities to collaborate across the Himawari 8 domain!

It’s all part of teaming up for the solar powered future!

-Dr. Nick Engerer-

PhD Thesis: What I did, What I found, Why it Matters

Several years ago, I decided I wanted to write a Masters Thesis that didn't just sit on a shelf, collecting dust.  Call me idealistic, but I wanted to adopt a philosophy of "science off the shelf".

[April 2016 Update: My PhD Thesis is now finalised! Download it here: Part 1  |  Part 2 ]
 


[download my submitted thesis - part 1]

[download my submitted thesis - part 2]

Back at the University of Oklahoma School of Meteorology, there was a running joke amongst graduate students that they could stick a $20 bill in their thesis, check back in 5-10 years, and find it still nestled between the pages, safe and sound.  While it was humorous, and mostly in jest, it reflected the mentality of postgraduate education/research work - what we were doing was mostly just to tick boxes & get our degree so that we could one day do something that mattered. In reality, our 'research' would sit on a library shelf, acquiring dust.

Yearly kWh production for a simulated 2kW Sanyo array in Oklahoma

But I wasn't content to resign myself to this fate. I felt compelled to produce something that mattered, that would have real-world relevance.  So I brought on an industry partner, a local solar installer in Oklahoma, and put together a solar map for Oklahoma, based not on solar radiation but on kilowatt-hours of electricity generation, which translates directly to payoff times and dollars. [download my M.S. Thesis]

It was a tremendously successful experience, and as I moved forward into my PhD project, I was very determined to not only maintain that approach, but to expand it.  Now, looking back, I can see that approach has paid off in great ways, which I'd like to share with you.  Because what I've discovered, what I've developed and what I am now positioned to do with the technology I created, is all highly relevant, commercially viable and ready to hit the ground running.

What I did and why

Australian PV installations by year, as provided by the Australian PV Institute

There is ~4GW of solar energy installed in Australia, which is dominated by ~1.4 million small-scale photovoltaic (PV) arrays.  These arrays are relatively small (average size 1.5kW in 2011, growing to 4.5kW by the end of 2014), and the vast majority of these installations are un-monitored.  This means that their minute-by-minute performance is not recorded.  In fact, the only information collected for most of these systems is the total quarterly production as reported on electricity bills.  Long story short, this lack of information limits the number of PV systems that can be installed in a given region (like here).

Measured PV power output, divided by that PV system's clear sky power output

So what did I do about it? I developed a method for estimating the power output from many thousands of PV systems using a small selection of monitored PV systems (which report their minute-by-minute or hour-by-hour generation).  This method is called  “the clear-sky index for photovoltaics”, KPV. [read more]
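In essence, the method normalises a monitored system's measured output by its modelled clear-sky output, then transfers that ratio to nearby unmonitored systems. A minimal sketch of the idea, with the clear-sky power model left as a placeholder and made-up numbers:

    # Clear-sky index for photovoltaics: measured PV power / modelled clear-sky PV power.
    def kpv(measured_kw, clear_sky_kw):
        return measured_kw / clear_sky_kw

    # Transfer the index from a nearby monitored system to an unmonitored one by
    # scaling the unmonitored system's own modelled clear-sky power output.
    def estimate_unmonitored_kw(kpv_nearby, clear_sky_kw_target):
        return kpv_nearby * clear_sky_kw_target

    k = kpv(measured_kw=3.2, clear_sky_kw=4.0)                    # 0.8 under broken cloud
    print(estimate_unmonitored_kw(k, clear_sky_kw_target=2.5))    # ~2.0 kW estimate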

Along the way, I had to do significant work in the field of solar radiation modelling, validating clear sky radiation models and developing a new type of solar radiation model fit to Australian radiation data (a "separation model").  I also had to show that my new method (using solar panels as a sensor network) was able to do the job just as well as professional grade radiation equipment.  

Pyranometer based methods versus KPV based methods.  The takeaway? Get the sites close enough (within 5km) and my new method is just as good as using a more sparse network of professional grade solar radiation sensors.  

It was also necessary to develop new quality control routines for the PV system power output data.  This type of data is messy, often provided by non-experts and hasn't been used in the way I've proposed before.  This was accomplished through the development of a new quality control routine called QCPV (now going through the review process in a major journal).
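QCPV itself is detailed in the paper under review; to give a flavour of what the raw data problems look like, here are a few illustrative checks with thresholds I've picked purely for the example (simplified stand-ins, not the published routine):

    import pandas as pd

    def basic_pv_qc(df, capacity_kw):
        # Flag obviously bad PV power readings (illustrative checks only).
        df = df.copy()
        df["flag_negative"]   = df["power_kw"] < 0                    # impossible readings
        df["flag_over_rated"] = df["power_kw"] > 1.1 * capacity_kw    # above plausible maximum
        df["flag_night"]      = (df["sun_elevation_deg"] <= 0) & (df["power_kw"] > 0)
        df["flag_stuck"]      = df["power_kw"].rolling(6).std() == 0  # frozen logger
        df["bad"] = df[["flag_negative", "flag_over_rated",
                        "flag_night", "flag_stuck"]].any(axis=1)
        return df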

~200 solar PV systems installed in Canberra, which are reporting their data in real-time

Once I sorted out the quality control work, it became possible to work with data from many hundreds of solar PV systems, as well as scale my KPV method up to many thousands of systems.  So I forged ahead, using Canberra as a proof-of-concept, creating a city-wide distributed PV simulation of its 12,000+ embedded PV generators (based on December 2012 installation data).  

I then paired this simulation system with weather events that cause broad-scale, rapid changes in the power output of all of the PV systems at the same time [check it out].  It is these types of events which are the most likely to cause future grid stability problems.  The basic idea is that, when you have a wide-spread solar network, the negative effects of partly cloudy days are "smoothed out" by those systems being positioned over a broad region, but during certain weather events, it is not possible to smooth out these impacts, because the cloud features are too widespread, sudden and thick.
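One simple way to picture a collective ramp event in data terms is as a large change in the aggregated PV output over a short window. A minimal detection sketch (the window length and threshold here are illustrative choices, not the definitions used in the thesis):

    import pandas as pd

    def find_collective_ramps(total_pv_kw: pd.Series, window="30min", threshold_kw=500):
        # total_pv_kw: aggregated city-wide PV power output with a DatetimeIndex.
        # Change in aggregate output relative to `window` earlier.
        change = total_pv_kw - total_pv_kw.shift(freq=window)
        # Positive values = sudden clearing; negative = sudden cloud cover arrival.
        return change[change.abs() > threshold_kw]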

"when you have a wide-spread solar network, the effects of partly cloudy days are "smoothed out" by those systems being positioned over large region, but during certain weather events, it is not possible to smooth out these impacts, because the cloud features are too widespread, sudden and thick"

What did I find?

Let me keep this as focused and brief as possible...

KPV estimates (color) versus measurements (black) under a positive ramp event

Firstly, I demonstrated that my newly proposed KPV method was much better than existing methods, showing that it was well-behaved under all cloud cover conditions, and performed well under positive and negative ramp events. [download publication][read blog post]

Next, I found that for clear sky radiation models operating in Australia, global clear sky simulations are best computed by the Solis, Esra and REST2 approaches, while the Iqbal, Esra and REST2 methods are the most proficient clear sky beam models. [download publication][read blog post]

The Engerer 2 separation model at work (blue model estimates, grey observations)

 After that, I found that only the Perez separation model performed satisfactorily for high resolution (one minute) solar radiation data.  In response to this, I developed three new separation models, which gave slight improvements over the Perez model and greatly exceeded the performance of all other existing model techniques. [download publication]

Once that was handled, I compared radiation sensor based methods to my PV data based approaches in a student project.  This study found that the two approaches were equally good for separation distances of 5km or less. Given that PV sensors are "cheap" (someone else pays for them) - this was a great finding. [download publication][read blog post]

Post QC KPV estimates, very tight correlation, great results

Then I dug into the development of the QCPV algorithm (quality control), demonstrating that the method I created can result in a 43% reduction in Mean Absolute Percent Error (MAPE) over the raw data. [pre-print coming soon]

Second to last, with another student project, we categorised the weather events that cause those large scale, collective changes in PV power output discussed earlier.  Positive collective ramp events (sudden clearing) were caused by Australian northwest cloud bands and radiation fog dissipation. Negative collective ramp events (sudden cloud cover arrival) were caused most frequently by the passage of cold fronts and thunderstorms. [download manuscript][read blog post]

Finally, I put it all together with the city-wide PV simulation system, using it to simulate the changes in total power and energy output from these collective ramp events.  I was able, for the first time, to quantify (i.e. put a representative number on) the amount of power that (dis)appears on the electrical grid during these events.  For example, a thunderstorm event on 19 February 2014 removed 20.78 MW of power generation from the local grid over an 85 minute period, which equates to approximately 14.54 MWh of energy generation forgone over that period.  That's probably enough to change prices on the energy market - not very much, but as the solar installation numbers continue to grow, that influence will grow significantly.
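As a back-of-the-envelope check on how those two numbers relate (my rough arithmetic, assuming the lost generation ramps down roughly linearly over the event, rather than the actual profile used in the thesis):

    power_lost_mw = 20.78
    duration_h = 85 / 60
    energy_forgone_mwh = 0.5 * power_lost_mw * duration_h
    print(round(energy_forgone_mwh, 2))   # ~14.7 MWh, in line with the 14.54 MWh figure above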

Here's the thunderstorm ramp event from 19 February 2014, along with some satellite imagery.

What is my overall conclusion?

The overall conclusion is that the regional simulation system developed for distributed solar PV, made possible by upscaling my KPV methodology, represents a significant, unique and promising tool for scientific, engineering and operational purposes.

In the simplest of terms: I built a very handy tool, with cheap inputs that can be run anywhere that solar PV systems are reporting their power output data.

Where next?

I have a full-time lecturer position ("professor" in the American use of the word) at The Australian National University, where I work in the Fenner School of Environment and Society (employed since July 2013).  I am using the freedom and security this position provides to apply for funding to scale this simulation system up, Australia-wide.  I'll join it with the new Himawari 8/9 satellite data, and pair up with the energy market/utilities in Australia, in order to help large amounts of distributed solar be added to the grid.

You could say the future is sunny and bright (#punny).  And with HUGE amounts of solar being installed globally, the solar century is before us.  There are plenty of opportunities for this science to stay off of that dusty shelf.  So I'd say, overall, this whole PhD thing has been a smashing success! Even if it was a bit of a wild ride.  I hope to have more exciting news soon - for now, I'll get back to working on getting this simulator to run in real-time in Canberra...

Workshop with the Japanese Meteorological Agency: a Research Update

This week, I had the pleasure of presenting my current research to Mr. Toshi Kurino of the JMA.

And I thought, why not use this chance to provide an update on my current research efforts? I've already put together a PowerPoint on the topic - which did all the work for me of organising my thoughts.  So here goes.  What am I up to?

I've tried to make this easy. There's a slideshow below with accompanying audio hosted on soundcloud.  If you'd prefer, you can grab the presentation here:

[download presentation]


Solar 2014: Estimating Hourly Energy Generation of Distributed Photovoltaic Arrays

 

If you were tasked with estimating the energy generation from an entire city of PV systems - how would you do it?

A simulation probably jumps into your mind right away.  Scale up a model of PV system performance and that must get you close, right?  Well, that's a step in the right direction, and you could do that very accurately if you knew the amount of radiation arriving at the surface of all those PV arrays.

But that's a bit trickier than it sounds!  First, where are we going to get an estimate of the available solar radiation at a given location in the city?

The most common answer I get to this is: a pyranometer.  And that's a great start - you'd get a measurement of global horizontal irradiance (GHI) at a point location, which is very helpful.  But you're left with two major problems:

1. How representative is that pyranometer of the rest of the city's radiation resource?  Those clouds are tricky!

2. How do you estimate the amount of radiation arriving on all of those various tilted surfaces around the city?

So, OK, now we need multiple pyranometer sites around the city and at each site we need to tilt and orient them in various directions in order to get a representative sample.

Well, the bad news is that pyranometers cost a few thousand dollars each, need regular cleaning/calibration/maintenance and it's actually pretty difficult to find appropriate sites for them.  If you'd like to find out just how difficult it is to install scientific equipment on buildings - be my guest! (Hint: paperwork, approvals, PITAs galore!).
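Side note: even with a well-sited horizontal pyranometer, question 2 still needs a transposition model to map GHI onto each tilted roof. The simple isotropic-sky form below is one common approach (the input values are placeholders, and this is not the exact model chain used in my work):

    import numpy as np

    def poa_isotropic(dni, dhi, ghi, tilt_deg, aoi_deg, albedo=0.2):
        # Plane-of-array irradiance via the isotropic-sky transposition model (W/m2).
        tilt = np.radians(tilt_deg)
        beam    = max(dni * np.cos(np.radians(aoi_deg)), 0.0)   # direct beam on the tilted plane
        diffuse = dhi * (1 + np.cos(tilt)) / 2                  # isotropic sky-diffuse component
        ground  = ghi * albedo * (1 - np.cos(tilt)) / 2         # ground-reflected component
        return beam + diffuse + ground

    # e.g. DNI=700, DHI=150, GHI=600 W/m2 on a 20-degree tilt at 30 degrees angle of incidence
    print(poa_isotropic(700, 150, 600, tilt_deg=20, aoi_deg=30))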

But I think I've got a better idea...

But I think I've got a better idea - what if we used the photovoltaic arrays that are already installed in a given region as our primary input to our city-scale modelling project?

They're pointed in many different directions, there are many of them already reporting data publicly in real-time, someone else has paid for the equipment AND they're representative of the systems we are trying to estimate in the first place. Sounds like a pretty sweet deal to me!

But, they are subject to shading, soiling and wiring inefficiencies, not to mention that they are not really the most scientific form of equipment.  Still - they are inherently a type of radiation sensor.  And we can probably deal with a lot of those things with some fancy machine-learning algorithms.

So, here is where I introduce our paper:

Estimating Hourly Energy Generation of Distributed Photovoltaic Arrays: a Comparison of Two Methods

J. Tan, N. A. Engerer and F. P. Mills

[download it]

In it, we compare two methods for estimating the energy generation of distributed PV arrays.

The first uses pyranometers, radiation models and PV system modelling for the estimation.

Method 1: Based off of my Masters Thesis (see the 'Publications' page)


The second uses a monitored PV system and my KPV methodology to make the estimation.  

Method 2: Based off of my KPV Methodology, Read the Solar Energy journal publication at the 'Publications' page

 

I'll let you read the paper to get the details, as that's not the point of blogging (all the boring stuff is for the papers - Ok, I really do actually think that stuff is fun too, #supernerd).  But I will let you know that we've found a few interesting things:

1. The pyranometer method does tend to do a bit better (RMSE 15-20% versus 15-25%)

2. BUT when we start to leverage the prolific availability of the PV systems (there are many more of them out there!), we find the KPV method actually does best! (for distances less than 5km) 

3. We actually detected a calibration error in one of the pyranometers using the PV systems - so much for pyranometers being the pinnacle of scientific monitoring!

Overall, I find this result very encouraging.  If we can use PV systems as the primary input to our city-wide modelling idea, then we are one step closer to making the estimate we need.  And we can do it on the cheap - which is really good for solar!

Now it's time to scale it up, test it on different time scales and handle all those pesky quality control issues.  But don't worry, you can count on me to bring you the results soon! 

Until then, enjoy my new webpage!

-Nick-

 

 

 
