How Many 9s of Accuracy: The Future of Solar Performance Benchmarking

When Vivint Solar acquired Solmetric Corporation in January 2014, the industry murmured its concern. It was another in a string of “arms race”-level events among the top national solar installers.

But while SolarCity’s purchase of Zep directly affected only a small percentage of the industry, Vivint Solar’s move to acquire Solmetric cast a more philosophical shadow.

Almost everyone in the solar industry depends on a SunEye — released in the mid-2000s to automate and increase the accuracy of data recorded using the Solar Pathfinder, a long-standing industry standard for shade analysis — at some point in the project lifecycle to benchmark system performance and quantify shade impacts. From incentive programs to third-party financiers to performance modeling tools, measurement verification with Solmetric products has been deeply integrated into nearly the entire ecosystem of solar projects.

Now, a year later, Vivint Solar has announced that the SunEye “reached the end of the product lifecycle” and discontinued the hardware along with related software and services (e.g., PV Designer, Solmetric Shade Training, and the SunEye Extension Platform).

According to the press release from Vivint Solar, “the Solmetric team will be fully integrated into the Vivint Solar operating structure and be known as Vivint Solar Labs. The new research and development team will focus on proprietary photovoltaic installation instruments and software.” 

With the end of the SunEye, the industry must now address two critical questions:

  1. What is the future of performance benchmarking?
  2. What is the right level of accuracy needed across all aspects of system design and performance modeling?

Solmetric Defined Measurement

While Solar Pathfinder released its PV Studio and Assistant software, many installers in the rapidly growing grid-tied PV market found the Pathfinder error-prone and time-consuming to analyze afterward. Installers easily lost or misfiled sunpath trace sheets, holding up a customer’s quote or rebate reservation.

When the first iPhone came out in the summer of 2007, many solar professionals commented on how great it would be if Solmetric built software for the iPhone. Don’t forget, the SunEye 100 was built on a Hewlett-Packard iPAQ PDA and retailed for $1,355.

In February 2010, Solmetric released “iPV,” the evolution of the SunTracker app developed by Imeasure Systems and acquired by Solmetric. It was offered at $39.99. So why hasn’t someone built a software replacement for the SunEye?

Even with advances in native panorama image-stitching now available in iOS 8.2, increased onboard processing capabilities, and higher-resolution cameras, the size of the addressable market is still so small that it would be challenging to build a business around such a tool, even at a $100 or $200 price point.

Software-Based Competition 

A number of software-driven shade analysis solutions have come to market in the last few years, including:

  • Bright Harvest, a company that conducts remote system design with per-module production estimates for a given address based on 3D modeling of a property generated by photogrammetry techniques used on aerial imagery. Estimation of shade impacts via sunpath analysis are a subset of the Bright Harvest design process.
  • SolarCensus, a technology company developing automated, LIDAR-based solutions for remote shade analysis. LIDAR is a remote sensing technology that measures distance by illuminating a target with a laser and analyzing the reflected light. Using this method, SolarCensus can develop an understanding of roof heights, slopes, and obstructions, and then perform sunpath analysis on those areas, resulting in an insolation “heat map” of a roof. The company currently holds several patents for its algorithms.
  • Aurora, a cloud-based integrated proposal and design platform that leverages its patented algorithms to generate 3D images of customer sites complete with sunpath-based shade analysis on the resulting model.
  • SunNumber, which uses publicly available LIDAR datasets as part of its process to develop its proprietary SunNumber site suitability score.
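
As a rough illustration of the LIDAR-style approach described above, the sketch below ray-marches across a height raster to decide whether a roof cell is shaded at a given sun position. This is a minimal toy, not any vendor’s patented algorithm; the grid layout, cell size, and axis conventions are all assumptions.

```python
import math

def is_shaded(heights, x0, y0, sun_elev_deg, sun_az_deg, cell_m=1.0):
    """Return True if grid cell (x0, y0) is shaded at the given sun position.

    heights: 2D list of surface heights in meters, indexed [y][x],
             with x increasing east and y increasing north.
    Azimuth is degrees clockwise from north; elevation is degrees
    above the horizon. Each grid cell is cell_m meters on a side.
    """
    az = math.radians(sun_az_deg)
    rise = math.tan(math.radians(sun_elev_deg))  # ray height gain (m) per m traveled
    dx, dy = math.sin(az), math.cos(az)          # unit step toward the sun
    h0 = heights[y0][x0]
    x, y, dist = float(x0), float(y0), 0.0
    while True:
        x, y, dist = x + dx, y + dy, dist + 1.0
        ix, iy = int(round(x)), int(round(y))
        if not (0 <= iy < len(heights) and 0 <= ix < len(heights[0])):
            return False  # ray left the raster without hitting an obstruction
        if heights[iy][ix] > h0 + dist * cell_m * rise:
            return True   # an obstruction blocks the sun at this hour

# A flat 5 m roof with a 12 m tree three cells to the south: shaded when
# the sun is low in the south, clear when it is high.
roof = [[5.0] * 10 for _ in range(10)]
roof[0][5] = 12.0  # tree at the southern edge
print(is_shaded(roof, 5, 3, 20, 180))  # low sun
print(is_shaded(roof, 5, 3, 70, 180))  # high sun
```

Repeating this check for every sun position across the year, cell by cell, is what produces the insolation “heat map” described above.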

A solution like SunNumber is intended for feasibility screening and pre-qualification of customer sites as part of customer acquisition efforts. Aurora, Bright Harvest, or SolarCensus could substantially increase the accuracy of production estimates during the sales process. But could any of these tools eventually eliminate a site visit altogether? Estimating the average site-visit truck roll at around $200, the savings add up for installers operating at increasingly higher volumes and under mounting price pressure.

The trouble with eliminating the site visit is that, at present, only a qualified individual on premises can determine the interconnection method and structural suitability; most other aspects can be estimated remotely. Many top solar companies are grappling with how to balance the number of site visits against the potential savings, or the rework necessary later on.

How Do These Alternatives Measure Up?

In the summer of 2011, Fin Macdonald, a Canadian sustainability professional, conducted a side-by-side comparison of the Solar Pathfinder and Solmetric’s iPV app. Though not a rigorous, well-controlled scientific study, he observed a 2 percent difference between the solar access percentages produced by the two methods. A greater sample size and a series of controls would be required to fully trust this evaluation, however.

NREL, in partnership with SolarCensus and funded by a DOE SunShot Incubator grant, conducted site visits at four homes to determine the accuracy of SolarCensus’s remote online shading tool. In the April 2014 blind study, NREL found the remote shading data to be equivalent to onsite shading measurements within a tolerance interval of ±3.4 solar access values.

Aurora’s software remotely performs shading analysis (irradiance modeling and solar access percentage calculation) by utilizing its 3D modeling capability and an 8760 (hour-by-hour, annual) sun-path simulation. In tests conducted both internally and with a nationwide PV installer, Aurora’s shading analysis was found to be accurate. Aurora will be working with the Department of Energy’s National Renewable Energy Laboratory on a comprehensive comparison between its output and that of the SunEye and similar tools.
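
The core of an 8760-style sweep can be sketched in a few lines, assuming a simple solar position approximation (Cooper’s declination formula, with solar time taken equal to clock time) and a precomputed set of shaded hours from a shade model. A production tool would weight each hour by modeled irradiance rather than simply counting daylight hours, so this is an illustration of the structure, not of any vendor’s method.

```python
import math

def sun_elevation(lat_deg, day_of_year, hour):
    """Approximate solar elevation (degrees) from latitude, day, and hour,
    using Cooper's declination formula and the hour angle."""
    decl = 23.45 * math.sin(math.radians(360.0 * (284 + day_of_year) / 365.0))
    hour_angle = 15.0 * (hour - 12.0)  # degrees from solar noon
    lat, d, h = map(math.radians, (lat_deg, decl, hour_angle))
    sin_el = math.sin(lat) * math.sin(d) + math.cos(lat) * math.cos(d) * math.cos(h)
    return math.degrees(math.asin(sin_el))

def solar_access(lat_deg, shaded_hours):
    """Fraction of daylight hours in a year-long hourly sweep that are unshaded.

    shaded_hours: set of (day_of_year, hour) tuples flagged by shade analysis.
    An irradiance-weighted version would sum modeled W/m^2 instead of hours.
    """
    daylight = unshaded = 0
    for day in range(1, 366):          # 365 days x 24 hours = 8760 samples
        for hour in range(24):
            if sun_elevation(lat_deg, day, hour) > 0:
                daylight += 1
                if (day, hour) not in shaded_hours:
                    unshaded += 1
    return unshaded / daylight
```

Feeding this loop the per-hour shading verdicts from a 3D site model is, in outline, how a remote tool arrives at the same solar access percentage a SunEye session reports from skyline photographs.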

How Many “9s of Accuracy” Do We Need?

In information technology, “9s of availability” is a marketing term that refers to the number of 9s in the desired availability of an IT system; 99.9 percent availability, for example, is three nines. The cost of procuring each additional 9 rises steeply: moving from two nines to three is expensive, and moving from three to four even more so. Is the additional cost worth the promise of another significant digit of availability?
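
The arithmetic behind the nines makes the trade-off plain: each additional 9 cuts the allowed downtime tenfold, which is why each one is progressively harder to buy.

```python
def downtime_per_year(nines):
    """Hours of downtime per year permitted at a given number of 9s,
    taking a year as 8,760 hours (availability = 1 - 10**-nines)."""
    return 8760 * 10 ** (-nines)

for n in (2, 3, 4, 5):
    availability = 100 * (1 - 10 ** (-n))
    print(f"{n} nines ({availability:.3f}%): {downtime_per_year(n):.2f} h/yr")
```

Three nines permits about 8.8 hours of downtime a year; four nines, under an hour.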

If we extend this thought experiment to the design and siting of solar PV systems, we are immediately confronted with differing measurement accuracy standards. For example, solar professionals use precision equipment like the SunEye to measure solar access percentages (sometimes capturing four, six, or even nine images per SunEye session), but then generally use PVWATTS default values and monthly kWh totals to estimate system production and energy offsets.

Furthermore, analysis performed on SunEye sessions can yield different results depending on how the designer uses the data. Given the same set of SunEye data, designers can choose which images are used to produce the solar access percentages, whether and how individual skyline captures are corrected to adjust for the auto-fill of shaded areas, and what type of reports and session data are produced.

Would the additional time, effort, and cost of configuring PVWATTS derate values, acquiring 15-minute electricity interval data, and upgrading to proposal generation and/or system modeling tools to achieve a similar level of accuracy create value for the customer? Would it make the bidding installer more competitive or differentiated?
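
To make that trade-off concrete: the overall DC-to-AC derate in a PVWATTS-style model is simply the product of its component derates, so a crude shading assumption can swamp careful tuning everywhere else. The component values below are illustrative, roughly in line with the historical PVWATTS defaults; consult the current PVWATTS documentation before relying on any of them.

```python
# Illustrative component derates, roughly the historical PVWATTS defaults.
# These are assumptions for the sketch, not authoritative figures.
derates = {
    "module_nameplate": 0.95,
    "inverter":         0.92,
    "mismatch":         0.98,
    "dc_wiring":        0.98,
    "ac_wiring":        0.99,
    "soiling":          0.95,
    "shading":          1.00,   # replace with a measured solar access fraction
    "availability":     0.98,
    "connections":      0.995,
}

# The overall derate is the product of every component derate.
overall = 1.0
for factor in derates.values():
    overall *= factor
print(f"overall derate: {overall:.3f}")
```

With these figures the product lands near the familiar 0.77 overall derate; substituting a SunEye- or software-derived solar access fraction for the 1.00 shading placeholder is where measured shade data enters the estimate.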

As an industry, we need to explore these topics more deeply as the landscape grows increasingly competitive. While many tools and services coming to market aim to make solar quoting easier and faster, others focus on aggregating more granular details to paint a more complete picture of a homeowner’s options.

How will the financial services sector respond to the loss of the SunEye? Will the new software providers fill the need? And what about the other parameters used in performance estimation? The accuracy of those results is only as granular as the least granular data used, whether that is the electrical usage, the utility tariff, or the equipment details. So what level of granularity (how many 9s) should we aim for when we set our estimated performance benchmark, and what processes and tools will help us achieve those goals in the future?
