To ensure the accuracy of a satellite monitoring model, validation against ground-measured data is an essential step in the data-processing chain. Three criteria are recommended for judging model accuracy: ① the overall bias; ② the dispersion; ③ the ability to replicate the statistical distribution of the measurements. These three criteria are represented, respectively, by the Mean Bias Error (MBE), the Root Mean Square Error (RMSE), and the Kolmogorov-Smirnov Integral (KSI). Many researchers prefer the Mean Absolute Error (MAE) over RMSE as the measure of dispersion, for two reasons: ① MAE is less sensitive to outliers; ② when expressed in relative (percentage) terms, MAE is easier to interpret. MBE and RMSE provide the expected error range within a given geographical or seasonal context, and only high-frequency (at least hourly), quality-controlled ground measurements should be used to verify the reliability of a satellite model. Such data are normally obtained with well-maintained, high-quality radiometers (secondary-standard, or at least first-class, instruments according to the WMO classification). Figure 1 shows an example of a satellite-ground comparison.
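The four statistics above can be sketched as follows. This is an illustrative implementation only: the exact KSI formulation (binning, normalization) varies between studies, and the version below simply integrates the absolute difference between the two empirical CDFs.

```python
import numpy as np

def validation_metrics(ground, satellite):
    """MBE, RMSE, MAE for paired ground/satellite irradiance samples (W/m^2)."""
    ground = np.asarray(ground, dtype=float)
    satellite = np.asarray(satellite, dtype=float)
    diff = satellite - ground
    mbe = diff.mean()                    # overall bias
    rmse = np.sqrt((diff ** 2).mean())   # dispersion, sensitive to outliers
    mae = np.abs(diff).mean()            # dispersion, more robust to outliers
    return mbe, rmse, mae

def ksi(ground, satellite, n_bins=100):
    """Kolmogorov-Smirnov Integral: area between the two empirical CDFs.
    One common formulation; normalization conventions differ in the literature."""
    ground = np.asarray(ground, dtype=float)
    satellite = np.asarray(satellite, dtype=float)
    lo = min(ground.min(), satellite.min())
    hi = max(ground.max(), satellite.max())
    x = np.linspace(lo, hi, n_bins)
    cdf_g = np.searchsorted(np.sort(ground), x, side="right") / ground.size
    cdf_s = np.searchsorted(np.sort(satellite), x, side="right") / satellite.size
    d = np.abs(cdf_s - cdf_g)
    return np.sum((d[:-1] + d[1:]) / 2 * np.diff(x))  # trapezoidal integral
```

Dividing MBE, RMSE, or MAE by the mean ground irradiance yields the relative (percentage) figures quoted in the remainder of this section.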
For users, the mean bias error (MBE) is the best way to express the systematic error of the annual or monthly mean irradiation at a specific location. Accumulated experience shows that, when the bias is normalized to daytime irradiation, state-of-the-art satellite models can estimate annual GHI with an MBE within ±3.5%.
Whether this figure holds depends on the setting: in complex tropical areas, under severe air pollution, at high latitudes, in mountainous and otherwise complex terrain, and in areas with low sun angles and snow cover, the error is larger (±7%). In general, the MBE of DNI estimates at a specific location is about twice that of the GHI estimates. In other words, in arid and semi-arid regions with low aerosol variability, a homogeneous landscape, and little variation in elevation, the annual DNI MBE of a high-performance model is below 7%. As validation data have become available for more than 100 representative locations on five continents, the credibility of such results has increased accordingly. In regions where the atmosphere and cloud cover are complex and variable, the geography is more intricate, and ground validation data are scarce, the expected MBE range for DNI estimates is ±12%, and sometimes larger.
The root mean square error (RMSE) is a good representation of hourly or sub-hourly deviations. This criterion can serve as a basis for evaluating the performance of both benchmark and operational monitoring models. Increases in RMSE are mainly driven by clouds, and to some extent by changes in snow cover and increases in aerosols. Accordingly, the hourly RMSE of GHI (normalized to the mean hourly irradiation) can reach 7% to 20% during sunny seasons in arid and semi-arid regions. In areas with more clouds, more complex weather, more variable atmospheric composition, or more complex geomorphology, and in mid-latitude areas, the RMSE is expected to range from 15% to 30%. In high mountains and at high latitudes, during seasons with low sun angles and thick snow, the relative RMSE of GHI lies in the range of 25% to 35% or higher. DNI exhibits a similar RMSE pattern, with errors about twice those of GHI. In the arid and semi-arid regions where solar energy technologies are most likely to be deployed, the DNI RMSE ranges from 18% to 30%. In areas with more clouds and stronger aerosol variability, it generally ranges from 25% to 45%. At high latitudes and in high mountains, the RMSE may exceed 45%.
It should be noted that the dispersion measures (i.e., RMSE and MAE) are decreasing functions of the model time step. Figure 2 shows how MAE decreases as a function of time step at a site in the southwestern United States.
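This time-step dependence can be demonstrated with a toy experiment: averaging an hourly series with unbiased random errors over longer windows cancels part of the error, so the relative MAE shrinks as the time step grows. The series below is synthetic and purely illustrative, not the data behind Figure 2.

```python
import numpy as np

def mae_by_timestep(ground, satellite, steps_h=(1, 24, 720)):
    """Relative MAE of an hourly series after averaging to coarser time steps."""
    maes = []
    for h in steps_h:
        m = ground.size // h * h                    # drop the ragged tail
        g = ground[:m].reshape(-1, h).mean(axis=1)  # h-hour mean irradiation
        s = satellite[:m].reshape(-1, h).mean(axis=1)
        maes.append(np.abs(s - g).mean() / g.mean())
    return maes

# Synthetic illustration: an unbiased satellite model with random hourly errors.
rng = np.random.default_rng(42)
ground = rng.uniform(200.0, 800.0, 24 * 365)        # hypothetical hourly GHI, W/m^2
satellite = ground + rng.normal(0.0, 120.0, ground.size)

for h, mae in zip((1, 24, 720), mae_by_timestep(ground, satellite)):
    print(f"time step {h:4d} h: relative MAE = {mae:.1%}")
```

With independent hourly errors the aggregated error scales roughly as the inverse square root of the window length, so the hourly, daily, and monthly MAE values printed here decrease monotonically.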
It should be pointed out that, for short time steps of up to a few hours, most of the observed dispersion (that is, the difference between the satellite estimate and the ground-station observation at a given moment) stems from the fact that the two measurements are of fundamentally different objects: a ground station measures time-integrated data at a specific point, whereas a satellite measures an instantaneous spatial average. Once this measurement mismatch is accounted for, the effective dispersion of the satellite model is nearly half the apparent dispersion. In particular, beyond a distance of roughly 20-25 km from a ground station, satellite estimates become the most accurate measurement option. Moreover, even between two very closely spaced measurement stations some dispersion remains; this residual, known as the nugget effect, quantifies the inherent difference between satellite and ground observation points. In fact, the true satellite dispersion, i.e., the apparent RMSE with the nugget effect removed, is estimated to be close to 10%, and in arid regions even below 10%.
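The "apparent RMSE with the nugget effect removed" can be sketched numerically if one assumes the model error and the point-vs-pixel mismatch are independent, so their variances add and the nugget is removed in quadrature. This quadrature rule is an assumption on my part; the source does not specify how the subtraction is defined, and the example numbers are hypothetical.

```python
import math

def effective_rmse(apparent_rmse, nugget):
    """Apparent satellite-vs-ground RMSE with the station/satellite mismatch
    ('nugget') removed, assuming the two error sources are independent and
    therefore combine in quadrature (an assumption, not the only convention)."""
    if not 0 <= nugget < apparent_rmse:
        raise ValueError("nugget must be non-negative and below the apparent RMSE")
    return math.sqrt(apparent_rmse ** 2 - nugget ** 2)

# Hypothetical numbers: a 20% apparent hourly RMSE with a 17.3% nugget
# leaves an effective model error of roughly 10%.
print(effective_rmse(20.0, 17.3))
```

Under this assumption a large share of the apparent dispersion can indeed be attributed to the measurement mismatch rather than to the satellite model itself, consistent with the roughly halved effective dispersion described above.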