Subcontractor: Dr. Alex Cronin, University of Arizona

NREL Contract 99043, “Study Degradation Rates of Photovoltaic (PV) Modules Deployed in Arizona” NREL Technical Monitor: Dr. Sarah Kurtz

Description: First Monthly Report

Authors: Steve Pulver and Alex Cronin

Date: August 30, 2009

Our next conference call is scheduled for Sept. 1 at 3 pm Colorado time.

In our last conference call, on August 11, 2009, we discussed the following:

The TEP data from 2003-2009 appear to be of sufficient quality to study relative degradation rates.

How should we deal with gaps in the data? We discussed the following options:

  1. use only continuous data
  2. construct a “guess” function to replace missing data
  3. use as many systems as possible to define the average, so that when one or two systems are off, the average may still be well behaved (a sketch of this approach appears after the list)
  4. fit continuous segments of data and then discuss how to shift chunks up or down to allow for the possibility of replaced inverters or other equipment
  5. eliminate variations that are correlated with temperature
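
As a rough sketch of option 3, assuming the daily energy readings are arranged as a table with one column per system (the actual TEP data layout may differ), the average could be taken over whichever systems report on each day:

    import pandas as pd

    def daily_average(daily_kwh: pd.DataFrame) -> pd.Series:
        # Mean over all systems that report data on each day; missing values
        # (gaps) are simply skipped rather than guessed.
        return daily_kwh.mean(axis=1, skipna=True)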

We agreed to focus on the figures and methods for a paper that will study degradation. We will target this as a 3-page abstract for PVSC. A longer description of the TEP test yard data will also be compiled.

Since then we have analyzed the following …

Figure 1. Binary data indicating when each system was on. “On” (shown as high) indicates daily kWh greater than 1% of typical.
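
A minimal sketch of the “on” test in Figure 1; the report does not specify how “typical” is computed, so using the median of the nonzero days is our assumption:

    import pandas as pd

    def on_flag(daily_kwh: pd.Series) -> pd.Series:
        # A day counts as "on" if output exceeds 1% of typical daily output.
        typical = daily_kwh[daily_kwh > 0].median()  # assumed proxy for "typical"
        return daily_kwh > 0.01 * typical
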
Figure 2. Number of active systems. This shows the total number of systems that were ‘on’ on each date. The Average Performance Ratio (APR) is determined from this many systems each day.
Figure 3. Here is the Average Performance Ratio (APR), defined as the average of all systems’ kWh_AC divided by rated kW_DC. The number of systems used to calculate the APR changed with date, as shown in Figure 2. There are undulations with a period of 6 months.
The average APR is 4.5. Compared with the sunny-day range of approximately 5.0, this quantifies the effect of weather in Tucson.
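
A sketch of the APR calculation described above, assuming the per-system daily kWh_AC, the DC ratings, and the on/off flags are available as a table, a vector, and a table (names are illustrative):

    import pandas as pd

    def average_performance_ratio(daily_kwh_ac: pd.DataFrame,
                                  rated_kw_dc: pd.Series,
                                  on: pd.DataFrame) -> pd.Series:
        # Per-system performance ratio: daily kWh_AC / rated kW_DC.
        pr = daily_kwh_ac.div(rated_kw_dc, axis=1)
        # Average over only the systems that were "on" that day (Figure 2).
        return pr.where(on).mean(axis=1, skipna=True)
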
Figure 4. Performance Ratio Normalized to the Average (PRNA).
Defined as [kWh_AC divided by rated kW_DC] divided by the Average Performance Ratio.
The PRNA undulates with a period of 1-year for several systems.
System 2 is plotted in black dots in “front”. It appears to degrade the least.
System 9 is most erratic. Perhaps it should be omitted from the analysis altogether.
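
A sketch of the PRNA defined above (each system’s performance ratio divided by the same-day APR); variable names are ours:

    import pandas as pd

    def prna(pr: pd.DataFrame, apr: pd.Series) -> pd.DataFrame:
        # Each system's performance ratio divided by the average PR on that day.
        return pr.div(apr, axis=0)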

Next, we removed the 1-year undulations by using a best-fit sine function.

Figure 5. Here is the sine fit to the PRNA for system 4. The range of the fit was restricted to the nearly continuous chunk of data from day 1500 to day 2800. Only days when the system was ‘on’ were included in the fit. The period of the fit was held at 1 year.
The residuals (data minus fit) trend upward, indicating that system 4 is degrading less than the average. Caution: the average may be declining mostly because of the erratic system 9.
Figure 6. Here are the data from system 4 with the oscillations subtracted.
Y-axis: System 4 PRNA minus [k1*sin(2*pi*day/365 + k3)], where the ki are the fit parameters.
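
A sketch of the fixed-period sine fit and subtraction used in Figures 5 and 6. The period is held at 365 days and only ‘on’ days inside the fit window are used; the constant offset k0 and the initial guesses are our additions to keep the fit well behaved, not necessarily part of the actual routine:

    import numpy as np
    from scipy.optimize import curve_fit

    def annual_sine(day, k0, k1, k3):
        # Offset plus a sine whose period is held fixed at one year.
        return k0 + k1 * np.sin(2 * np.pi * day / 365.0 + k3)

    def remove_annual_cycle(day, prna, fit_lo=1500, fit_hi=2800):
        # Fit only the nearly continuous chunk of "on" days (NaNs excluded).
        mask = (day >= fit_lo) & (day <= fit_hi) & np.isfinite(prna)
        (k0, k1, k3), _ = curve_fit(annual_sine, day[mask], prna[mask],
                                    p0=[1.0, 0.05, 0.0])
        # Figure 6 subtracts only the oscillating term from the data.
        return prna - k1 * np.sin(2 * np.pi * day / 365.0 + k3)
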
Figure 7. Here the best-fit sine function for each system was subtracted from its PRNA.
System 2 is again highlighted in black in “front”. We will use this next.

Instead of PRNA, maybe we should look at PR / PR2. This is shown for each system next.

Figure 8. Performance ratio for each system normalized to that of system 2 (PR / PR2).

Using linear fits to the reduced data shown in Figure 8, we can report the degradation of each system RELATIVE to system 2.

Figure 9. Degradation rates relative to system 2. Fits to days 0-2800 are shown as solid black circles; fits to days 1500-2800 are shown as red open circles. Note: this restricted time range was needed for a meaningful study of system 4 because of the jump in PR4.
Error bars from the linear fitting routine are shown. We would not yet take these error bars seriously.
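
A sketch of the linear fit behind Figure 9, with the slope converted to a change in PR/PR2 per year and the slope uncertainty taken from the fit covariance; how the rates and error bars are actually reported may differ:

    import numpy as np

    def relative_degradation(day, pr_ratio, fit_lo=0, fit_hi=2800):
        # Straight-line fit to PR_i / PR_2 over the chosen day range.
        mask = (day >= fit_lo) & (day <= fit_hi) & np.isfinite(pr_ratio)
        coeffs, cov = np.polyfit(day[mask], pr_ratio[mask], 1, cov=True)
        slope, slope_err = coeffs[0], np.sqrt(cov[0, 0])
        # Slope is per day; multiply by 365 to quote a per-year rate.
        return 365.0 * slope, 365.0 * slope_err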

Figure 10. Irradiance data from AZMET. These data start on Jan. 1, 2003. The obvious cycles have one maximum every year.
Figure 11. Temperature data from AZMET. These data start in 2002. The y-axis is in deg. C.