Introduction

Addressing climate change necessitates a rapid transition to renewable energy; one path to a sustainable future lies in solar energy production. Since 2010, advances in solar cell efficiency have decreased the cost of utility-scale photovoltaics (PV) by 82%1, increasing the feasibility of large-scale solar power. As of 2020, solar power constituted only 3% (80 gigawatts) of U.S. energy production, but the U.S. aims to receive 45% of its energy from solar power by 20502. Meeting this target requires rapid, widespread solar energy development, which means identifying locations with suitable solar resources given current and future climate, population, land use, and technological constraints.

Solar resources are assessed by the amount of shortwave direct normal (DNI) and global horizontal (GHI) irradiance a location receives. Concentrating solar-thermal power (CSP) relies on DNI, while PV relies primarily on GHI3,4. Taken together, however, GHI and DNI indicate the amount of diffuse radiation, which affects PV panel technologies differently5. The most prominent commercially employed panels are constructed from monocrystalline silicon (c-Si; ~95% of installations) and polycrystalline cadmium telluride (CdTe; most of the remaining 5% of installations)6,7. CdTe and c-Si are direct and indirect bandgap materials, respectively, meaning that CdTe better absorbs solar radiation. About 1 μm of thin-film CdTe absorbs nearly 99% of incident light8, and CdTe panels continue to function well under diffuse radiation. c-Si requires a thick wafer of over 100 μm, and its performance is reduced under diffuse radiation9. The tradeoff between direct and diffuse conditions, resulting from clouds and aerosols, carries important implications for solar energy production.
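The three components are linked by the standard closure relation GHI = DHI + DNI·cos(SZA), where DHI is diffuse horizontal irradiance and SZA is the solar zenith angle. A minimal sketch of this decomposition (the irradiance values are illustrative, not observations):

```python
import math

def diffuse_horizontal(ghi, dni, sza_deg):
    """Back out diffuse horizontal irradiance (DHI, W m^-2) from the
    closure relation GHI = DHI + DNI * cos(SZA)."""
    return ghi - dni * math.cos(math.radians(sza_deg))

# Clear, smoke-free sky: the direct beam dominates and DHI is small.
dhi_clear = diffuse_horizontal(ghi=900.0, dni=950.0, sza_deg=30.0)  # ~77 W m^-2

# Smoky sky: DNI collapses while GHI declines less, so the diffuse share
# grows -- the regime where CdTe retains more of its performance than c-Si.
dhi_smoky = diffuse_horizontal(ghi=700.0, dni=400.0, sza_deg=30.0)  # ~354 W m^-2
```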

Anthropogenic pollution10,11 and dust12 reduce PV generation by decreasing solar radiation via atmospheric attenuation (absorption/scattering) and deposition on panels (soiling). In northern and eastern China, air pollution decreases annual average (peak) irradiance at PV arrays by 20–25% (34%)10. Air pollution in India reduced GHI by 29% from 2001 to 2018, decreasing PV output by 12–41% depending on panel architecture13. Globally, particulate matter (PM) reduces PV output by >50% in polluted and desert regions, and soiling accounts for over two-thirds of the energy lost11. In some regions, PM’s impact on PV generation can rival that of clouds11. PM encompasses a wide range of aerosol types, including pollution, dust, and smoke, which interact with solar radiation differently based on their optical properties. As such, general examinations of PM’s impact on solar energy may not fully capture the specific effects of increasingly abundant wildfire smoke.

Since the mid-1980s, higher temperatures, earlier snowmelt, and increased fuel aridity have led to extended wildfire seasons, longer-burning fires, and larger burn areas14,15,16, trends projected to continue as the climate changes. Most of this increase in wildfire activity has occurred in the Pacific Northwest and Southwest15, the latter having the highest U.S. solar resource potential. More active wildfire seasons result in more wildfire smoke emissions17,18 that travel far from fires19 to impact air quality across the contiguous U.S. (CONUS). Historical20,21 and projected18,22 air quality trends show that wildfire smoke emissions offset declining anthropogenic PM emissions. Smoke’s increasing impact on air quality is detectable over the Pacific Northwest20,21 and will grow more significant in the future23.

Recent case studies document sizable smoke-driven reductions in PV output during severe local wildfires over short time intervals and at specific locations and aerosol optical depths (AOD). Average PV output declined 34% over two days at a PV plant in Spain due to smoke24, 7% at a solar lab in Australia during a controlled burn when AOD reached 0.25 (27% at peak)25, 8.3% across 53 utility-scale PV plants in the western U.S. during the 2018 wildfire season26, and ~10–50% at southern California solar plants during the 2020 wildfire season when AOD ranged from 0.5 to 4.527. Other models of the California 2020 wildfire season support a 7.7% decline in PV generation and project an additional 0.4% reduction under further enhanced smoke conditions28. Examinations of solar resource and forecast models demonstrate the need to include aerosol parameters that account for smoke. Solar forecasts improved nearly 50% when using aerosol inputs from a 3-h reanalysis product rather than a monthly aerosol climatology29. Similarly, using a satellite-derived AOD product that better detects smoke’s presence and spatial variability improved a solar resource model’s accuracy compared to ground-based observations30.

In this paper, we study wildfire smoke’s impact on baseline (i.e., average) solar resource availability across multiple temporal and spatial resolutions. We leverage daily case studies, monthly aggregations, and multi-year analyses to quantify smoke’s impact on DNI and GHI at state, regional, and national levels. At the finest scale, we examine daily localized smoke impacts in California during the 2020 wildfire season under different aerosol and cloud conditions to demonstrate the range of smoke’s influence on irradiance. We ask how smoke plume optical thickness affects the daily severity of smoke-driven irradiance changes and how cloud-driven irradiance changes compare. At larger scales, we compare regional and national solar resources under high (2020) and low (2019) smoke conditions to bookend potential future and historical smoke impacts, respectively. We ask how much daily smoke-driven irradiance changes affect monthly mean DNI and GHI, how these impacts vary for local and transported smoke, and how these impacts relate to AOD. We use the resulting mean irradiances to model smoke’s impact on PV output at 13 locations across CONUS for c-Si and CdTe panels. Finally, we expand our analysis to a 16-year period (2006–2021) to characterize longer-term, regional smoke-irradiance interactions, recognizing that impacts may grow in the future. While we examine both DNI and GHI, our discussion focuses on GHI because PV panels are the main source of U.S. solar energy. Importantly, our study reaches beyond quantifying the impact of local smoke on areas close to active fires to examine the role of both local and transported smoke in determining average solar resource availability at the regional and national level. We find considerable losses in model-based daily mean DNI (32–42%) and GHI (11–17%) in California due to local smoke. Local smoke can also greatly reduce monthly mean DNI (max: 61%) and GHI (max: 25%), with impacts on DNI persisting downwind of fires. However, the impact of transported smoke on GHI is relatively minimal (<5%) on average over CONUS, even during an extreme wildfire season. The scale of these GHI reductions implies that the average impact of smoke plumes on PV resources is small across the U.S., an encouraging result as grids incorporate more utility-scale battery storage capacity to buffer the potentially large, short-term irradiance shifts caused by local smoke.

Results

For this study, we rely on modeled clear-sky and all-sky DNI and GHI from the National Renewable Energy Laboratory’s National Solar Radiation Database (NSRDB) at 4-km spatial and 30-min temporal resolution across CONUS. All-sky values include cloud and aerosol impacts; clear-sky values include aerosols but exclude clouds. We compute (1) daily means for each variable using times when the solar zenith angle (SZA) is <75°, (2) cloud-driven changes as the difference between all-sky and clear-sky (i.e., cloud-free) values, and (3) smoke-driven changes as the difference between clear-sky values on smoke-impacted days and on reference smoke-free days. We pair daily irradiances with smoke-impacted and smoke-free flags derived from the National Oceanic and Atmospheric Administration’s Hazard Mapping System (HMS) Smoke Product. Although HMS does not identify the source of smoke (e.g., wildfire, prescribed burn, agricultural), our analysis focuses on months when western U.S. and Canadian wildfire activity is greatest and most smoke can be traced to wildfires. We integrate satellite-derived aerosol and cloud parameters from the Multi-Angle Implementation of Atmospheric Correction (MAIAC) product and the Clouds and the Earth’s Radiant Energy System (CERES). Finally, we design a model to assess PV performance given smoke impacts.
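The three daily metrics can be sketched as follows, using synthetic 30-min values for a single grid cell; the arrays and numbers are illustrative stand-ins, not NSRDB data:

```python
import numpy as np

# One day of synthetic 30-min samples for a single 4-km grid cell (W m^-2).
sza           = np.array([80.0, 70.0, 40.0, 20.0, 40.0, 70.0, 80.0])      # deg
clear_ghi     = np.array([50.0, 300.0, 700.0, 850.0, 700.0, 300.0, 50.0])
allsky_ghi    = np.array([40.0, 250.0, 500.0, 600.0, 500.0, 250.0, 40.0])  # with clouds
clear_ghi_ref = np.array([60.0, 350.0, 780.0, 920.0, 780.0, 350.0, 60.0])  # smoke-free day

keep = sza < 75.0                          # (1) daily means use times with SZA < 75 deg
daily_clear  = clear_ghi[keep].mean()      # 570 W m^-2
daily_allsky = allsky_ghi[keep].mean()     # 420 W m^-2
daily_ref    = clear_ghi_ref[keep].mean()  # 636 W m^-2

cloud_delta = daily_allsky - daily_clear   # (2) cloud-driven change: -150 W m^-2
smoke_delta = daily_clear - daily_ref      # (3) smoke-driven change: -66 W m^-2
```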

Case study of localized smoke-irradiance interactions

Driven by increased aridity31, the 2020 wildfire season set records for acres burned in California32, and previous studies and news articles documented large reductions in PV output27. Smoke’s impact on solar resources depends on its optical properties, which vary by fuel type and load, combustion style, age, composition, and density19,33,34. Similarly, clouds introduce variability into solar energy forecasts based on differences in cloud coverage, type, shape, optical depth, and altitude35,36,37,38,39,40. Typically, clouds scatter light more efficiently than smoke and reach far greater optical depths. In California, monthly averaged cloud fractions are typically <50% in August and reach a minimum of 30–45% in September as estimated from 25 years of CERES data. Given that smoke is becoming commonplace and localized plumes are often optically thick and absorbing, we explore whether smoke-driven changes in DNI (ΔDNI) and GHI (ΔGHI) may rival those from clouds.

We introduce a case study of four California days to demonstrate how smoke’s impact on clear-sky DNI and GHI compares to the magnitude of cloud-driven changes. We calculated smoke-driven ΔDNI and ΔGHI across California by comparing two smoke-impacted days in 2020 against analogous smoke-free conditions on the same days in 2019 (Fig. 1). August 20 and 30 were selected because both featured expansive smoke plumes in 2020, different mean AODs at 550 nm (AOD550), no smoke in 2019, and minimal cloud cover in 2019 and 2020. The AOD550 in smoke-impacted locations averaged 0.56 on August 20 (median: 0.40) and 0.43 on August 30 (median: 0.32) according to MAIAC observations. The NSRDB assumed a higher average AOD (August 20: 0.99; August 30: 0.60), and its coarser aerosol inputs spread the highest AODs near fires over larger areas. As such, the NSRDB may compute larger reductions in irradiance further from a fire than finer-resolution models would. We calculated cloud-driven ΔDNI and ΔGHI as the difference between modeled all-sky and clear-sky irradiances on two cloudy days in 2019 that were relatively smoke-free with different cloud optical depths (COD). The average COD observed (all-sky modeled) by CERES (NSRDB) was 2.93 (2.14) on September 7 and 10.76 (19.26) on September 18.

Fig. 1: Comparison of smoke, clouds, and irradiance conditions across California on four case study days.
figure 1

All maps show the region from 124.8°W to 113.9°W and 31.5°N to 43.1°N. Smoke-driven changes in irradiance under optically thinner (j–l; purple) and thicker (g–i; brown) wildfire smoke in 2020 were calculated relative to smoke-free conditions on the same days in 2019. Cloud-driven changes in irradiance under smoke-free conditions with optically thin (d–f; light gray) and thick (a–c; dark gray) clouds in 2019 are calculated as the difference between modeled all-sky and clear-sky values on the dates indicated. Locations included in the analysis are shown in the mean daily cloud optical depth (COD) (b, c, e, f) and aerosol optical depth (AOD) (h, i, k, l) maps with the source of each metric listed as Satellite (b, e, h, k) or National Solar Radiation Database (NSRDB) (c, f, i, l). Satellite CODs are from the Clouds and the Earth’s Radiant Energy System (CERES) Synoptic 1-degree hourly product, and Satellite AODs are from the Multi-Angle Implementation of Atmospheric Correction (MAIAC) product. Corrected reflectance true-color imagery (NASA Worldview) from the Terra satellite’s Moderate Resolution Imaging Spectroradiometer (MODIS) instrument (a, d, g, j) are included for each date with the image border color corresponding to the associated color in the box plots (m, n). The box plots show the distribution of changes in daily average direct normal (ΔDNI) (m) and global horizontal (ΔGHI) (n) irradiance including the median change (black line), mean (white diamonds), first quartile (box bottom), third quartile (box top), and range (whiskers) excluding outliers (data beyond 1.5 times the interquartile range).

Optical depth modulates the impact of both smoke and clouds on DNI and GHI, but clouds can drive substantially larger declines in irradiance. Mean DNI declined 42% (32%) under the optically thicker (thinner) smoke (Fig. 1). Decreases in GHI were smaller under both optically thicker (−17%) and thinner (−11%) smoke conditions. The influence of optical depth was more pronounced for DNI than for GHI: with the increase in AOD, losses grew by 10 percentage points for DNI compared to 6 for GHI. Overall, ΔDNI consistently exceeded ΔGHI due to the heightened sensitivity of DNI to cloud droplets and aerosol particles. Losses were greater for optically thicker smoke and clouds than for optically thinner smoke and clouds, respectively. Daily ΔDNI and ΔGHI values were more variable for clouds than smoke, and the maximum cloud-driven reduction far exceeded the maximum smoke-driven reduction. ΔDNI reached −910 W m−2 for thicker clouds and −797 W m−2 for thinner clouds compared to −756 W m−2 for thicker smoke. ΔGHI reached −547 W m−2 for thicker clouds and −421 W m−2 for thinner clouds compared to −350 W m−2 for thicker smoke. For context, PV panel efficiency is tested using a standard irradiance value of 1000 W m−2. Thinner smoke and clouds resulted in similar irradiance reductions, particularly for GHI, while thicker smoke typically drove equal or greater losses than thinner clouds. Occasionally, thicker smoke achieved reductions similar to thicker clouds. Since localized wildfire smoke can persist longer than clouds (which develop, move, and dissipate throughout the day, limiting their impact), adding more smoke to a relatively cloud-free region functionally increased the number of cloudy (i.e., high optical depth) days in California in 2020.

2020 monthly regional and national smoke impacts as a future analogue

Exceptional fire years like 2020 will become more frequent as the west continues to warm41 and relatively low fire years like 2019 will stand as reminders of past conditions. We contrast 2020 and 2019 as an analogue for future smoke impacts on solar resources relative to more traditional smoke conditions. Since smoke impacts the U.S. every year from wildfires, prescribed burns, and agricultural fires, we focus on low and high smoke conditions rather than smoke-impacted and smoke-free conditions. Smoke-impacted days increased in 2020 across 72% of CONUS in August (Fig. 2a) and 88% in September (Fig. 2b). In 2020, the percentage of CONUS experiencing smoke overhead for at least 15 days was 22% in August (2019: 1%) and 40% in September (2019: 4%). The percentage experiencing smoke for at least eight days was 38% in August (2019: 17%) and 47% in September (2019: 15%). As wildfires grew across California, Oregon, Washington, and Colorado, smoke was transported north and east leading to a widespread increase in smoke-impacted days.

Fig. 2: Comparison of smoke, aerosol, and irradiance conditions between a low (2019) and high (2020) smoke year.
figure 2

Maps showing the change in the number of smoke-impacted days (a, b), the change in monthly mean aerosol optical depth at 550 nm (ΔAOD550) (c, d), and the percent decrease in surface-level mean clear-sky direct normal (DNI) (e, f) and global horizontal (GHI) (g, h) irradiance from 2019 to 2020 across the contiguous U.S. (longitudes: 126°W to 65°W; latitudes: 24.5°N to 46°N) in August (a, c, e, g) and September (b, d, f, h).

AOD provides context for interpreting surface-level irradiance changes since extinction efficiency varies with smoke particle density, plume vertical extent, and composition. Smoke-driven changes in observed MAIAC monthly mean AOD550 across CONUS were minor in August (Fig. 2c), increasing 23% (2019: 0.13; 2020: 0.16), and more substantial in September (Fig. 2d), increasing 127% (2019: 0.11; 2020: 0.25). Few areas surpassed an AOD550 of 0.5 in August 2019 (<0.01%), September 2019 (<0.01%), or August 2020 (0.08%). However, in September 2020, AOD550 exceeded 0.5 across 3% of CONUS, in areas close to active fires. The more substantial increase in smoke frequency in September was associated with a more than doubling of the mean CONUS AOD550. Observed monthly mean AODs align well with those underlying the NSRDB model. A comparison of the two products is available in Supplementary Figs. 1 and 2. Overall, AOD patterns and changes are consistent, but the NSRDB is biased low at high AODs and distributes high AODs over larger distances. Thus, projected changes in irradiance may be lower directly above active fires but extend further than implied by AOD observations.
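DNI's strong sensitivity to such AOD changes can be illustrated with a simplified broadband Beer-Lambert approximation. This is a back-of-the-envelope sketch, not the NSRDB's radiative-transfer model, and it ignores the scattered light that still reaches the surface as diffuse irradiance (which is why GHI falls far less than DNI):

```python
import math

def dni_surviving_fraction(aod, sza_deg):
    """Fraction of the direct beam transmitted through an aerosol layer,
    via a simplified broadband Beer-Lambert law: exp(-AOD * airmass)."""
    airmass = 1.0 / math.cos(math.radians(sza_deg))  # plane-parallel approximation
    return math.exp(-aod * airmass)

# September CONUS-mean AOD550 rose from 0.11 (2019) to 0.25 (2020).
loss_2019 = 1.0 - dni_surviving_fraction(0.11, sza_deg=30.0)  # ~12% of the beam
loss_2020 = 1.0 - dni_surviving_fraction(0.25, sza_deg=30.0)  # ~25% of the beam
```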

As smoke frequency and AOD550 increased, modeled clear-sky DNI (Fig. 2e, f) and GHI (Fig. 2g, h) typically decreased, particularly in areas close to large wildfires in California and Colorado in August and California, Colorado, Oregon, and Washington in September. For clear-sky DNI, monthly means decreased across 63% and 96% of CONUS in August (Fig. 2e) and September, respectively (Fig. 2f). In August, mean clear-sky DNI decreased over 20% in areas of California (max: 30%; 255 W m−2) and Colorado (max: 24%; 203 W m−2) near active wildfires. In September, average losses exceeded 20% across much of California (max: 61%; 483 W m−2) and in parts of Oregon (max: 29%; 238 W m−2), northern Colorado (max: 24%; 225 W m−2), and southern Wyoming (max: 25%; 229 W m−2). Clear-sky DNI losses reached 40–60% in some areas of California’s Central Valley and Northern Coastal Ranges. DNI reductions persisted downwind to affect areas far from emission sources but within the transport pathway for western wildfire smoke. In September, clear-sky DNI decreased 5–10% as far east as Maine and 10–15% in Nebraska and parts of South Dakota, Minnesota, Iowa, Wisconsin, Kansas, and Texas.

Mean clear-sky GHI (Fig. 2g, h) decreased less than DNI, but spatial trends in GHI losses largely mirrored those for DNI. In 2020, clear-sky GHI decreased across 66% and 95% of CONUS in August and September, respectively. On average, reductions were <5% in nearly all areas (≥95%) that experienced GHI losses in August and September. The largest reductions occurred in California. By percent, the greatest decrease was 12% in August and 25% in September, and by value, the greatest decrease was 83 W m−2 in August and 156 W m−2 in September. While substantial reductions in clear-sky GHI occurred across California, Oregon, and Colorado, severe decreases were restricted to areas immediately adjacent to large wildfires (e.g., California Central Valley and Coastal Ranges; Oregon Willamette Valley and Coast Range). Outside these areas, the reduction in monthly mean clear-sky GHI was minimal (<5%), which indicates that monthly mean clear-sky GHI changes little with increased smoke in areas primarily affected by transported plumes.

Figure 3 summarizes how monthly mean irradiance changed alongside smoke frequency at the national and regional scale for the entire wildfire season (April–October). Nationally, clear-sky DNI varied with changes in smoke frequency throughout the fire season, while clear-sky GHI remained relatively stable (Fig. 3a). Early in the season (April–July), smoke frequency across CONUS was the same or slightly lower in 2020 and irradiance was slightly higher (maximum increase of 2.5% for clear-sky DNI and 0.8% for clear-sky GHI). Later in the season (August–October), 2020’s smoke-impacted days far exceeded 2019’s with smoke frequency more than doubling in August (110%) and more than tripling in September (219%). Corresponding clear-sky DNI decreased by 3% and 8% while clear-sky GHI decreased by 1% and 2%, respectively. Decreases in CONUS-averaged clear-sky irradiance were relatively small compared to the large increases in smoke frequency, especially when considering GHI.

Fig. 3: National and regional smoke and irradiance trends during the 2019 and 2020 wildfire seasons.
figure 3

Average monthly number of smoke-impacted days (teal), surface-level mean clear-sky direct normal irradiance (DNI) (purple), and surface-level mean clear-sky global horizontal irradiance (GHI) (orange) in 2019 (solid) and 2020 (dashed) nationally (a) and aggregated by region (c–h). A map of the different regions is presented (b) showing the Southwest (SW), Pacific Northwest (PNW), South Central (SC), North Central (NC), Southeast (SE), and Northeast (NE).

Regionally (Fig. 3b), smoke frequency increased and clear-sky irradiance decreased in the late summer and early fall, particularly in the Southwest (SW), Pacific Northwest (PNW), and North Central (NC) regions (Fig. 3c, d, f). The SW recorded the largest rise in smoke-impacted days of any region, increasing by 16.6 days in August (+882%) and 19.2 days in September (+963%). The largest decrease in clear-sky irradiance during these months also occurred in the SW with DNI losses of 10% in August and 12% in September. GHI losses only reached 3%. In September, smoke-impacted days increased substantially in the PNW (+544%), NC (+180%), and Northeast (NE; +1169%) regions; clear-sky DNI decreased 11%, 8%, and 5%, respectively; and clear-sky GHI decreased 3%, 1%, and 1%, respectively. Weaker trends in smoke frequency and irradiance characterized the South Central (SC) and Southeast (SE) regions (Fig. 3e, g). Trends in the SW, PNW, and NC regions largely drove the variation in CONUS averages (Fig. 3a). Additionally, the limited fluctuation of GHI over a wide range of smoke frequencies indicates the relative stability of the baseline PV resource.

We focus on smoke-driven irradiance shifts, but the main societal concern is how much these changes, particularly in GHI, affect PV power output. We modeled average daily output at an idealized 1 MW fixed-tilt PV installation for September 2019 and 2020 at 13 locations across CONUS using the monthly mean clear-sky DNI and GHI computed in this study (Supplementary Table 1). While proportional changes in output are similar across fixed-tilt and single-axis tracking panels, the absolute change will be greater for tracking panels, which have higher baseline production. Below we present results assuming c-Si panels (Fig. 4), and results for CdTe panels are provided in Supplementary Fig. 3.

Fig. 4: Smoke-driven changes in modeled PV output in 13 U.S. cities.
figure 4

Comparison of modeled average power output in MWh (a) and percent change (b) for September at 13 U.S. locations during a low (blue; 2019) and high (brown; 2020) smoke year assuming a 1 MW photovoltaic (PV) installation of monocrystalline silicon (c-Si) panels. Site regions are indicated via grayscale shading. Power output was modeled using four approaches: two with panel temperatures set to a standard temperature of 25 °C and two with location-based panel temperatures. At 25 °C, power output was modeled using clear-sky direct normal (DNI) (yellow) or global horizontal (GHI) (red) irradiance. For location-based temperatures, power output was modeled using clear-sky GHI, and panel temperature was determined either by ambient air temperature (light purple) or by ambient air temperature, wind speed, and changes in GHI (dark purple). The monthly average clear-sky DNI and GHI for each location are reported in Supplementary Table 1. Results for polycrystalline cadmium telluride (CdTe), the second most common panel material, are provided in Supplementary Fig. 3.

Under standard test conditions (STC; 25 °C, air mass 1.5), power loss is greatest in areas near active wildfires with optically dense, persistent smoke plumes (Fig. 4). For example, given an 11% or 65 W m−2 reduction in mean September GHI in Modesto, CA (2019 baseline: 620 W m−2), the model projects an 11% (−6.6 megawatt hours (MWh)) reduction in average daily power output. PV output reductions are substantially lower (~1 MWh) in areas with optically thinner, transported smoke plumes (e.g., the NC, SC, and NE regions). However, using the STC ignores how site location, wind conditions, and smoke-driven reductions in surface radiation influence panel temperature, which greatly affects PV performance as panels operate more efficiently at lower temperatures42,43. PV output is lower under ambient air temperatures, which exceed the STC temperature in September, but proportional reductions remain the same. Cooling effects from wind and smoke-driven GHI reductions increase PV output relative to that calculated solely using air temperature.
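The temperature sensitivity described above can be sketched with the standard linear power-temperature model; the coefficient gamma and the numbers below are typical illustrative values, not the parameters of our PV model:

```python
def pv_power_kw(ghi, p_stc_kw=1000.0, t_cell_c=25.0, gamma=-0.004):
    """Temperature-corrected PV output for a nominal 1 MW (1000 kW) array:
    P = P_STC * (GHI / 1000 W m^-2) * (1 + gamma * (T_cell - 25 C)).
    gamma ~ -0.4%/C is typical of c-Si; CdTe is closer to -0.3%/C."""
    return p_stc_kw * (ghi / 1000.0) * (1.0 + gamma * (t_cell_c - 25.0))

# Modesto-like September means at STC temperature: an ~11% GHI loss maps
# directly onto an ~11% output loss.
base_kw  = pv_power_kw(620.0)         # 2019-like clear-sky GHI
smoky_kw = pv_power_kw(620.0 - 65.0)  # 2020-like, 65 W m^-2 lower
```

Raising `t_cell_c` above 25 °C (ambient conditions) lowers both outputs proportionally, while wind and smoke-driven cooling pull `t_cell_c` back down, consistent with the location-based model runs in Fig. 4.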

Smoke-irradiance interactions across CONUS from 2006–2021

Although 2020 smoke conditions may exemplify the future, U.S. wildfire activity has been increasing since the mid-1980s14,15,16. As such, we examined regional daily fractional smoke spatial coverage and clear-sky irradiance trends from 2006 to 2021. Mean irradiance refers to the daily average clear-sky DNI and GHI for each region. Long-term analyses better constrain smoke’s impact on baseline resources given spatio-temporal variations in smoke extent and optical properties. At a high level, regional smoke coverage has increased in recent years (Fig. 5), and smoke-irradiance interactions exhibit a clear seasonality (Fig. 6a, b). Overall, correlations were significant (p < 0.05) for 86% of DNI (Fig. 6a) and 60% of GHI analyses (Fig. 6b). Correlations and regression equations are provided in Supplementary Table 2.

Fig. 5: Longitudinal change in irradiance and smoke coverage by region.
figure 5

Time series (af) of the August (solid) and September (dashed) daily mean smoke coverage (teal line), clear-sky direct normal (DNI) (purple line), and clear-sky global horizontal (GHI) (orange line) irradiance for each contiguous U.S. (CONUS) region from 2006 to 2021. Line plot (g) indicating the percent change in irradiance during years with high (i.e., above average) smoke coverage compared to years with low smoke coverage based on the regional average smoke coverage for each month. Smoke coverage refers to the percent of each region that was smoke-impacted on a given day.

Fig. 6: Long-term regional relationships between smoke coverage and irradiance at the daily level.
figure 6

Comparison of daily regional smoke coverage (i.e., the fraction of each region that is smoke-impacted) to daily mean regional irradiance for all April–October days in the contiguous U.S. (CONUS) from 2006 to 2021. Pearson’s correlation coefficients (r) (a, b) are displayed for each month from April–October for clear-sky direct normal (DNI) (a) and clear-sky global horizontal (GHI) (b) irradiance. Simple linear regression results (c, d) are displayed for clear-sky DNI (solid) and clear-sky GHI (dashed) for August (c) and September (d). Shading represents the 95% confidence interval.

Mean clear-sky DNI was more sensitive to plume size (i.e., regional coverage) in areas where plumes are denser, indicating nearby fires and fresh smoke. During years with above-average regional smoke coverage (Fig. 5), DNI reductions were largest in the SW (August: –7%; September: –7%) and PNW (August: –11%; September: –6%) where plumes from local wildfires are fresh and optically thick19,44. Average smoke coverage was greatest in the PNW (August: 51%; September: 27%) and NC regions (August: 47%; September: 27%), with the latter influenced primarily by transported smoke. Despite similar smoke coverage, the decline in clear-sky DNI was smaller in the NC region (August: –5%; September: –4%) due to differences in smoke density and optical properties.

Correlations (Fig. 6a) and regression slopes (Fig. 6c, d) indicate that the connection between smoke coverage and DNI is strongest in the SW and PNW, particularly during the local fire season. August (rsw = –0.81; rpnw = –0.68) and September (rsw = −0.68; rpnw = –0.65) correlations are high, and the regression analysis (Fig. 6c) projects a 17% (156 W m–2) decline in August clear-sky DNI in the PNW and a 16% (144 W m–2) decline in the SW as smoke coverage increases from 0% to 100%. Similar decreases of 16% (147 W m–2) and 14% (125 W m–2) are projected for September (Fig. 6d) in the PNW and SW, respectively. In August, smoke coverage explains 66% of the variance in clear-sky DNI in the SW, but the greatest rate of change in DNI is in the PNW.
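The correlation and regression quantities reported here can be computed as below. The data are synthetic stand-ins for one region-month; we assume a ~150 W m−2 DNI decline across the 0–100% coverage range, roughly matching the SW/PNW slopes above:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic daily values for 500 region-days: clear-sky DNI falls linearly
# with smoke coverage (slope -150 W m^-2 per unit coverage) plus noise.
coverage = rng.uniform(0.0, 1.0, 500)                  # fraction smoke-impacted
dni = 900.0 - 150.0 * coverage + rng.normal(0.0, 30.0, 500)

r = np.corrcoef(coverage, dni)[0, 1]                   # Pearson's correlation
slope, intercept = np.polyfit(coverage, dni, 1)        # simple linear regression
variance_explained = r ** 2                            # r^2, as in the text
```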

Clear-sky DNI was less sensitive to regional smoke coverage in areas with optically thin, laterally extensive, dilute, aged plumes overhead, such as the NC, SC, NE, and SE regions. The strongest correlation in the NC, SC, and NE regions occurred in September (rnc = –0.46; rsc = −0.31; rne = –0.3), and the strongest in the SE occurred in July and August (rse = –0.32). Smoke coverage explained less than 10% of the variance in clear-sky DNI for these regions and months, except in the NC region during July (17%), August (16%), and September (21%).

Regional clear-sky GHI remained remarkably stable across this 16-year timeframe regardless of smoke coverage (Fig. 5). During high smoke years, each region’s mean clear-sky GHI decreased by ≤3% compared to low smoke years. GHI declines peaked at 3% in August (PNW) and 2% in September (PNW and SW). Regions mainly affected by transported smoke experienced decreases in clear-sky GHI of ≤1%.

Correlations between clear-sky GHI and smoke coverage were generally weaker with fewer significant values (Fig. 6b). The strongest correlations occurred in the SW and PNW regions in July (rsw = –0.33, p < 0.001; rpnw = –0.51, p < 0.001) and August (rsw = –0.56, p < 0.001; rpnw = –0.44, p < 0.001). These moderate correlations were associated with decreases in clear-sky GHI of 3% and 5%, respectively (July: –22 to –23 W m–2; August: –30 to –35 W m–2), as regional smoke coverage increased from 0% to 100% (Fig. 6c, d). In the SW, correlations were weaker for September (r = –0.21, p < 0.001) with smoke coverage explaining <5% of the variance in clear-sky GHI. The regression analysis projects a 3% (–16 W m–2) decline in September’s clear-sky GHI in the SW when smoke increases from 0% to 100% coverage. In other regions, correlations were weaker with mixed significance. Ultimately, across these 16 years, smoke failed to drive major shifts in regional mean clear-sky GHI.

While we discuss how increasing smoke coverage from 0% to 100% impacts irradiance, regions rarely experienced 100% smoke coverage (Supplementary Fig. 4). At most, 5% of August days in the PNW and NE regions were entirely smoke-impacted, and fewer days reached 100% smoke coverage in September (2% for the NE, NC, SW, and PNW regions). The PNW and NC regions reached ≥50% smoke coverage on nearly half of August days and a quarter of September days. In the SW, 1% (16%) of August and 2% (11%) of September days reached 100% (50%) smoke coverage.
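The exceedance statistics above reduce to simple threshold counts over each region's daily coverage series; a minimal sketch with synthetic coverage values (the beta distribution is an arbitrary stand-in for a month of daily coverage fractions):

```python
import numpy as np

rng = np.random.default_rng(1)
# 31 August days x 16 years of synthetic daily smoke-coverage fractions.
coverage = rng.beta(1.2, 2.0, size=31 * 16)

frac_full = np.mean(coverage >= 0.999)  # share of days ~fully smoke-impacted
frac_half = np.mean(coverage >= 0.5)    # share of days at least half covered
```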

Discussion

We examine the historical impact of smoke, particularly wildfire smoke, on two irradiance parameters relevant for solar energy production, DNI and GHI, at different spatial and temporal scales. Consistent with prior studies, solar resources decline substantially under local, optically thick smoke. However, transported smoke that extends across CONUS minimally affects baseline monthly mean GHI in the underlying regions, even during extreme wildfire seasons. Thus, while most other studies report that smoke will negatively affect solar PV generation on a day-to-day or minute-to-minute basis24,25,26,27,28,29, we demonstrate through an expanded CONUS-wide analysis of a longer time horizon that, on average, smoke will not greatly affect baseline solar PV resource availability. Although transported smoke increasingly affects larger areas, the long-term stability of GHI resources with regard to smoke supports continued PV development alongside the growth of utility-scale battery storage capacity to address short-term smoke impacts from local fires. Resources for CSP production (i.e., DNI) show greater sensitivity to local and transported smoke, particularly to fresh, dense, local plumes, but CSP represents a relatively small portion of U.S. solar production.

Local smoke can substantially decrease surface-level clear-sky DNI and GHI during severe wildfire events. Using daily case studies with thick smoke (AOD550 0.43–0.56), we show average clear-sky DNI for California declined 32–42% and clear-sky GHI declined 11–17%, which is consistent with reductions in PV output of ~10% modeled at an AOD of 0.5 at ten southern California PV plants in 202027. Such reductions also align with observations from Greece during the 2021 wildfire season where GHI declined 10–20%45; Australia where DNI (GHI) reductions reached 26% (32%) ~1 km from a controlled burn25; and in Colorado where smoke from the 1988 Yellowstone National Park fires decreased daily DNI (GHI) by 37% (9%)46. While average GHI reductions of 11–17% can substantially impact the power grid, the spatial variability of smoke means that some areas may see GHI reduced up to 50%. Such a loss represents a substantial risk to PV production in California, particularly since clouds are rare, irradiance is high, and demand is at its peak during the summer. Since the current grid lacks large-scale electricity storage capacity, short-term declines in solar generation must be offset by alternative generation sources, typically natural gas, with nonzero marginal costs. As such, more wildfire smoke could increase short-term use of dirtier and more expensive (in $ per MWh) natural gas generation or reliance on battery storage. Additionally, localized short-term impacts are not isolated to California. Increased wildfire activity across the western U.S. is exposing other areas rich in solar resources to local, dense smoke plumes (e.g., the 2022 Calf Canyon/Hermits Peak fire in New Mexico and the 2024 Smokehouse Creek fire in Texas).

Lofted smoke from western U.S. and Canadian wildfires frequently travels across CONUS19. As such, understanding how smoke may affect solar energy production requires looking beyond the influence of local fires on adjacent PV installations. Comparing the heavy smoke season of 2020 to the more traditional smoke season of 2019, monthly mean clear-sky DNI and GHI decreased across CONUS as smoke frequency increased in August and September. Clear-sky DNI was more sensitive to smoke than clear-sky GHI. Local smoke was associated with clear-sky DNI reductions of over 20% in the SW and PNW, with losses reaching 50–60% in parts of California. Reductions in clear-sky DNI persisted into areas affected by transported smoke, decreasing by 10–15% in parts of the NC region and 5–10% in parts of the SC and NE regions. In contrast, clear-sky GHI reductions amounted to <5% in areas affected by transported smoke. Ultimately, in a heavily smoke-impacted season indicative of potential future conditions, regional and national declines in monthly mean clear-sky GHI were small. However, locations near large fires remained at risk of substantial resource losses given the large decrease in GHI (>10%) in areas of the SW, where solar potential and production are greatest.

Similar patterns emerge over longer timescales. Again, the relationship between regional smoke coverage and mean clear-sky irradiance is stronger for DNI than GHI. Smoke-irradiance trends are strongest in July, August, and September in the SW and PNW where plumes are typically optically thicker and fresher. Average clear-sky DNI can decrease 14–17% in the SW and PNW in August and September if wildfire smoke fully covers these regions, although such conditions are infrequent. During these months, the correlation between smoke area and irradiance is stronger in the SW, but the rate of change is slightly greater in the PNW. The SW average blends areas with high smoke impacts, like California and Colorado, and areas with low smoke impacts, like Arizona and New Mexico, which have earlier fire seasons and are outside primary smoke transport pathways. In contrast, the PNW region simultaneously experiences local and transported smoke, reducing the areas with little to no smoke impacts. In regions mainly affected by transported smoke (i.e., the SE, NC, NE, and SC regions), we project clear-sky DNI losses of 3–9% in August and 7–10% in September on days when smoke covers the entire region. On the other hand, clear-sky GHI decreases little, if at all, with varying smoke coverage. GHI correlations are moderate for the SW and PNW in July and August with regression analyses predicting a 3–5% change in daily clear-sky GHI with 100% smoke coverage. Unlike DNI, September correlations for both regions are relatively weak or insignificant. Among significant negative correlations in other regions and months, clear-sky GHI decreases 1–5% under entirely smoke-impacted conditions. By examining 16 years of smoke and irradiance data, we find that daily regional clear-sky GHI changes little, typically <5%, with increasing smoke coverage, consistent with our analysis of 2020 and 2019.

The two primary limitations of this study are the spatial and temporal resolution at which we conduct the aggregate analyses. We explore smoke-irradiance trends at the monthly level, which obscures day-to-day and hour-to-hour changes in surface irradiance due to variable smoke conditions. Understanding this variability is essential for producers given the need for accurate generation forecasts for electricity load balancing on finer time scales. Additionally, spatial variability in smoke-irradiance interactions is masked by our focus on trends at the state, regional, and national level. Finer temporal and spatial resolution would better constrain the range of smoke-irradiance conditions that a location experiences. Our national and regional assessments capture the broader impacts of the spatially extensive smoke plumes produced by large western wildfires that are growing larger, more frequent, and longer in duration with climate change.

Most of the existing and planned solar installations in the U.S. use PV panels, which rely on GHI. As such, understanding how smoke affects surface-level GHI is crucial for locating PV development and accurately forecasting PV production. We show that smoke is associated with large reductions in GHI close to major wildfires but minimal to no loss of GHI from transported smoke across most of CONUS. Even during extreme wildfire seasons with heavy smoke, as seen in 2020, we project little change in average PV resource availability, except in areas with optically thick, fresh, local smoke plumes. These findings inform where long-term solar resources are most stable given climate-induced increases in wildfire smoke that drive higher AODs. Combining assessments of long-term smoke-irradiance interactions with projected fire vulnerability and cloud conditions could highlight ideal locations for solar deployment.

Methods

Below we describe the data sets and variables used in this study as well as the data processing and analysis, which were performed using Python version 3.10.

Irradiance data

We assessed solar resources using modeled clear-sky and all-sky DNI and GHI from the National Renewable Energy Laboratory’s National Solar Radiation Database (NSRDB)47 Physical Solar Model (PSM) version 3.2.2. The NSRDB combines atmospheric and land surface properties from satellite and reanalysis data in a two-step physical model. Cloud properties are derived from Geostationary Operational Environmental Satellite (GOES) observations using the Advanced Very High Resolution Radiometer (AVHRR) Pathfinder Atmospheres-Extended (PATMOS-x) algorithms. AODs are determined using a combination of data from the Modern-Era Retrospective analysis for Research and Applications, Version 2 (MERRA-2) reanalysis product and monthly values from the Moderate Resolution Imaging Spectroradiometer (MODIS) instrument aboard the Terra and Aqua satellites. When MODIS monthly data are missing, only the MERRA-2 data are used, as is standard for the eastern and northern U.S. Otherwise, an optimal linear combination of the two data sets is computed. Surface albedo inputs are sourced from MODIS and the Interactive Multisensor Snow and Ice Mapping System. MERRA-2 is used for all other inputs. The NSRDB regrids these data to a 4 × 4-km spatial resolution and interpolates to a 30-min temporal resolution before being input into the Fast All-sky Radiation Model for Solar Applications (FARMS)48. GHI is directly computed using FARMS, as is DNI when the sky is cloud-free. To compute DNI when clouds are present, the Direct Insolation Simulation Code is used to decompose the DNI component of GHI. REST2 is used for clear-sky irradiance computations49. Comparisons to ground observations of radiation indicate a 5% bias for GHI and 10% bias for DNI with the NSRDB47.

Smoke plume locations

We use the HMS to distinguish between smoke-free and smoke-impacted days at each NSRDB grid location across CONUS. The HMS was developed by the National Oceanic and Atmospheric Administration’s (NOAA) National Environmental Satellite, Data, and Information Service (NESDIS) and provides daily U.S. smoke plume locations50. Analysts use visual-band imagery mainly from the National Aeronautics and Space Administration (NASA) GOES to identify smoke from wild, agricultural, and prescribed fires. No attempt is made to determine the smoke’s source. Analysts manually outline plume extents and qualitatively estimate plume concentration. Occasionally, analysts use polar orbiting satellite imagery to identify smoke or infrared imagery to distinguish smoke from clouds51.

The HMS is considered a conservative estimate of smoke plume number and extent19. While error and confidence metrics are not currently available for the HMS smoke product, we anticipate minimal sensitivity to the known limitations. Edge detection issues are the most prominent limitation relevant to this study. The HMS can struggle to distinguish plume edges when (1) the smoke is diffuse and mixed with anthropogenic haze or (2) the plume edge is obscured by clouds. The former issue mainly affects observations over the southeastern U.S., but the fire season in the southeast generally occurs much earlier in the year and is characterized by smaller agricultural fires. By focusing on the western U.S. wildfire season, we work with plumes that the HMS is best suited to characterize. For local smoke, the HMS performs well in detecting the sharp edges of fresh, dense, western plumes. The smoke transported far downwind from these fires mainly affects areas over the Northern Great Plains and Midwest where anthropogenic haze mixing is less common. We anticipate high accuracy for the HMS detections in areas where smoke’s impact on irradiance is greatest (i.e., optically dense plumes) with edge detection issues isolated to locations where plumes are most diffuse and light extinction is lowest.

Cloud and aerosol observations

We use the cloud mask and 550 nm AOD retrieval from the MAIAC daily atmosphere product (MCD19A2). MAIAC processes MODIS observations from both Terra and Aqua overpasses, which capture morning (~10:30 AM LT) and afternoon (~1:30 PM LT) conditions, respectively. MAIAC uses the contrasting spatial and temporal variability of land surfaces and aerosols to improve cloud masking and retrieve 1-km resolution AOD52,53. Unlike other algorithms, MAIAC specifically tests for the presence of smoke, and when detected, retrieves AOD for both clear and possibly cloudy skies. This results in more AOD retrievals during smoke-impacted conditions.

We combine cloud observations from MAIAC and CERES. We use the cloud mask from MAIAC to determine cloud locations on a fine spatial scale (1-km) and the CERES Synoptic 1-degree hourly (SYN1deg-1Hour) Edition 4.1 product to obtain the cloud visual optical depth. The SYN1deg cloud properties are derived from visible and infrared MODIS, Visible Infrared Imaging Radiometer Suite (VIIRS), and geostationary (GEO) satellite imagery54,55,56. Values are spatially averaged and temporally interpolated to the SYN1deg-1Hour resolution.

Data processing

We downloaded the 30-min, 4-km, yearly netCDF files for the NSRDB PSM version 3.2.2 for all years from 2006 to 2021 from the Registry of Open Data on Amazon Web Services (AWS), which is part of the Department of Energy’s Open Energy Data Initiative. We extracted the clear-sky DNI, clear-sky GHI, all-sky DNI, all-sky GHI, and solar zenith angle (SZA) data sets for April–October of each year and for grid locations within the 48 states comprising the CONUS. The SZA values were used to filter the irradiance data to exclude times when the SZA is greater than 75°. This removes times near sunrise and sunset when solar energy generation is low. Using an SZA filter provides consistency in analysis across time zones.
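The SZA screening step can be sketched as follows, assuming the 30-min irradiance and SZA series for a grid cell are held in NumPy arrays (the variable names and sample values here are illustrative, not the study's actual data):

```python
import numpy as np

def filter_by_sza(irradiance, sza, max_sza=75.0):
    """Mask irradiance samples taken at solar zenith angles above max_sza.

    Returns a copy with near-sunrise/sunset samples set to NaN so they are
    excluded from subsequent daily averages.
    """
    out = irradiance.astype(float)
    out[sza > max_sza] = np.nan
    return out

# Hypothetical 30-min GHI samples for one grid cell across one morning
ghi = np.array([0.0, 15.0, 120.0, 430.0, 640.0])
sza = np.array([89.0, 80.0, 74.0, 60.0, 45.0])

filtered = filter_by_sza(ghi, sza)
daily_mean = np.nanmean(filtered)  # averages only the SZA <= 75 samples
```

Masking with NaN rather than dropping elements keeps the 30-min time axis aligned across variables before daily averaging.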

The four data products used in this analysis have different spatial and temporal resolutions. Spatially, we converted the HMS, MAIAC, and CERES data to the NSRDB 4-km grid using two spatial gridding processes. First, the HMS smoke polygons were gridded by determining if the center point (latitude, longitude) of an NSRDB grid cell was located beneath a smoke-plume polygon. If a polygon overlapped an NSRDB point, the location was considered smoke-impacted, and if not, the location was considered smoke-free. Second, we mapped the MAIAC and CERES grids to the NSRDB using a nearest-neighbor approach in which the haversine distance between grid points was calculated and the closest match selected. Each NSRDB location was assigned a MAIAC AOD, MAIAC cloud mask, and CERES COD based on the closest location match. A few NSRDB grid points (0.01%) had more than one matching MAIAC neighbor, and as such, an average AOD was calculated for these pixels. Interpolating the CERES COD in this manner provides a rough estimate of the observed cloud variability on each day but does not explicitly resolve the cloud shape within each grid box. We include the NSRDB COD in Fig. 1 to illustrate the finer variability resolved by the model.
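Both gridding steps can be approximated in pure Python. The snippet below is a minimal sketch: the study used the HMS polygons and the NSRDB/MAIAC grids, whereas the plume vertices and grid points here are hypothetical.

```python
import math

def point_in_polygon(lat, lon, polygon):
    """Ray-casting test: is (lat, lon) inside the polygon (list of (lat, lon))?"""
    inside = False
    n = len(polygon)
    for i in range(n):
        lat1, lon1 = polygon[i]
        lat2, lon2 = polygon[(i + 1) % n]
        # Count edge crossings along the longitude axis
        if (lon1 > lon) != (lon2 > lon):
            t = (lon - lon1) / (lon2 - lon1)
            if lat < lat1 + t * (lat2 - lat1):
                inside = not inside
    return inside

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two (lat, lon) points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * 6371.0 * math.asin(math.sqrt(a))

def nearest_neighbor(target, candidates):
    """Index of the candidate (lat, lon) closest to the target point."""
    return min(range(len(candidates)),
               key=lambda i: haversine_km(*target, *candidates[i]))

# Hypothetical smoke-plume polygon and grid points
plume = [(40.0, -120.0), (40.0, -118.0), (42.0, -118.0), (42.0, -120.0)]
smoky = point_in_polygon(41.0, -119.0, plume)         # grid cell is smoke-impacted
maiac_points = [(41.0, -119.0), (41.5, -118.2)]
idx = nearest_neighbor((41.1, -119.1), maiac_points)  # closest MAIAC pixel
```

In practice a geospatial library would handle the polygon test and a spatial index the neighbor search; the logic above only illustrates the two matching rules.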

Temporally, we averaged the MAIAC, CERES, and NSRDB data to the daily level of the HMS smoke product. AOD values were averaged across the different Terra and Aqua retrievals in MAIAC to determine a daily mean AOD at 550 nm. The MAIAC cloud mask classifies each pixel as cloudy, possibly cloudy, or clear. If a location is identified as cloudy or possibly cloudy in any of the Terra or Aqua retrievals for the day, we consider the location to be cloudy. The daily COD was calculated by averaging the hourly CERES COD values for times when the SZA was less than 75°. Similarly, the NSRDB 30-min irradiance values were averaged for each day to obtain daily clear-sky and all-sky DNI and GHI. Once converted to the same spatial and temporal resolution, we merged all four data products for analysis.
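The daily aggregation logic for the MAIAC fields can be sketched as follows, with the cloud-class codes and sample retrievals chosen purely for illustration:

```python
import numpy as np

# Illustrative class codes for the MAIAC cloud mask
CLEAR, POSSIBLY_CLOUDY, CLOUDY = 0, 1, 2

def daily_cloud_flag(retrievals):
    """A pixel is treated as cloudy if ANY Terra/Aqua retrieval that day
    is cloudy or possibly cloudy."""
    return any(r in (POSSIBLY_CLOUDY, CLOUDY) for r in retrievals)

def daily_mean_aod(aod_retrievals):
    """Average the valid (non-NaN) Terra and Aqua AOD retrievals."""
    vals = np.asarray(aod_retrievals, dtype=float)
    return float(np.nanmean(vals)) if not np.isnan(vals).all() else float("nan")

# Hypothetical one-day record for a pixel: clear at Terra, cloudy at Aqua
flag = daily_cloud_flag([CLEAR, POSSIBLY_CLOUDY])  # flagged cloudy for the day
aod = daily_mean_aod([0.50, 0.62])                 # daily mean AOD550
```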

Analysis of California case study dates

Four dates were selected for this case study to (1) highlight the role that AOD plays in determining smoke’s impact on irradiance and (2) compare the impact of smoke and clouds on solar resources. We focused on August and September when selecting dates for this case study because major fire activity and smoke production in California are generally greatest during these months and were so in 2020. We identified days based on visual examination of the NASA Worldview true-color imagery of California for all August and September days in 2019 and 2020. We selected August 20 and August 30 to illustrate the influence of smoke optical properties, via AOD, on smoke-driven changes in solar resources. We restricted our analysis to locations that were smoke-impacted in 2020 and smoke-free in 2019 based on the HMS polygons. In 2020, the observed (MAIAC) average AOD550 was 0.56 on August 20 and 0.43 on August 30, indicating optically thicker and thinner smoke plumes, respectively. We calculated the smoke-driven change in DNI (ΔDNI) and GHI (ΔGHI) as the difference between clear-sky values in 2020 (smoke-impacted) and 2019 (smoke-free). Comparing the same date across both years better controls for variation in solar position and climate. We then compared these smoke-driven changes in irradiance to those from clouds of different optical depths. We analyzed locations across California that were categorized as cloudy or possibly cloudy by MAIAC on September 7, 2019 (CODmean = 3.26) and September 18, 2019 (CODmean = 10.75). We calculated the difference between the all-sky and clear-sky values on each day to determine the ΔDNI and ΔGHI attributable to clouds. A cloud filter was not applied when assessing smoke-driven changes because MAIAC does not retrieve an AOD when a pixel is categorized as cloudy and only retrieves an AOD under possibly cloudy conditions when smoke is detected.
Boxplots were used to present the distribution of either the clear-sky DNI and GHI values or the ΔDNI and ΔGHI on each of the case study days. Negative values indicate the reduction of irradiance.
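The two difference calculations can be expressed compactly; the irradiance values below are hypothetical stand-ins for the gridded daily means:

```python
import numpy as np

def smoke_delta(clear_smoke_year, clear_smokefree_year):
    """Smoke-driven change: clear-sky irradiance on the smoke-impacted date
    minus the same date in the smoke-free year (negative = reduction)."""
    return clear_smoke_year - clear_smokefree_year

def cloud_delta(all_sky, clear_sky):
    """Cloud-driven change: all-sky minus clear-sky irradiance on one day."""
    return all_sky - clear_sky

# Hypothetical daily mean clear-sky DNI (W m-2) at two smoke-impacted pixels
dni_2020 = np.array([520.0, 480.0])  # smoke-impacted year
dni_2019 = np.array([830.0, 790.0])  # smoke-free year
d_dni = smoke_delta(dni_2020, dni_2019)
pct = 100.0 * d_dni.mean() / dni_2019.mean()  # percent change, negative
```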

Analysis of monthly CONUS trends in 2019 and 2020

We computed monthly smoke frequency by summing the number of HMS-indicated smoke-impacted days at each NSRDB grid point for each month. We then calculated the difference between the number of smoke-impacted days in 2019 and 2020. Similarly, we computed the mean monthly AOD550 for all CONUS pixels by averaging the daily MAIAC AOD550 and calculated the difference between the 2019 and 2020 monthly values. For irradiance, we averaged daily clear-sky DNI and GHI to obtain the mean monthly clear-sky DNI and GHI. We then calculated the average smoke-driven changes in DNI (ΔDNI) and GHI (ΔGHI) by subtracting the monthly clear-sky value in 2019 from that in 2020 for each irradiance measure. National and regional monthly averages were then calculated for smoke frequency and irradiance from the monthly means at each grid location to summarize trends at different spatial scales.

Modeling photovoltaic array performance

We designed a model of a 1 MW solar PV installation to assess the change in PV output under average low smoke (2019) and high smoke (2020) conditions for September (i.e., when smoke was most abundant in 2020). To provide a realistic analysis of PV performance under non-standard (i.e., real-world) conditions, we used the monthly mean irradiance values computed in this study for the low and high smoke years as input. From the computed clear-sky DNI and GHI values, we derived the solar cell temperature, since different panel technologies exhibit different temperature dependencies and thus different temperature coefficients. Below we describe in detail the model design and relevant computations for evaluating the performance of the PV modules under different conditions.

Traditionally, all PV modules are tested, and their performance is rated, under STC57. The STC for terrestrial solar panels specify an irradiance of 1000 W m–2, a cell temperature of 25 °C (77 °F), and an air mass of 1.5. A solar radiation intensity of 1000 W m–2 approximates the average solar radiation level on a clear day at noon. The cell temperature refers to the temperature of the solar cells themselves, not the ambient temperature. Although the STC rating is based on this standardized cell temperature, solar panels often operate at higher temperatures under real-world conditions. The air mass represents the spectral distribution of the sunlight and is a measure of the path length through the Earth’s atmosphere. An air mass of 1.5 approximates the sun’s angle in the sky at mid-latitudes on an average day.

The temperature coefficient of solar panels is a measure of how much a solar panel’s performance is affected by changes in temperature. Specifically, the measure quantifies the change in electrical output (usually power or voltage) for each degree Celsius (°C) rise in temperature above a standard reference point, typically 25 °C (STC), and is expressed as a percentage change per °C. For example, a temperature coefficient of –0.3% per °C means that for each °C increase in temperature above 25 °C, the solar panel’s output power decreases by 0.3%. The temperature coefficient (Tcoefficient) is specified via Eq. (1).

$${{{\rm{T}}}}_{{{\rm{coefficient}}}}=\frac{{{{\rm{Output}}}}\; {{{\rm{at}}}}\; {{{\rm{T}}}}_{{{\rm{operating}}}}\; -\; {{{\rm{Output}}}}\; {{{\rm{at}}}}\; {{{\rm{T}}}}_{{{\rm{STC}}}}}{({{{\rm{T}}}}_{{{\rm{operating}}}}-{{{\rm{T}}}}_{{{\rm{STC}}}})\;\times\; {{{\rm{Output}}}}\; {{{\rm{at}}}}\; {{{\rm{T}}}}_{{{\rm{STC}}}}}$$
(1)
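Rearranging Eq. (1) gives the output expected at a given operating temperature, which is how the coefficient is typically applied. A minimal sketch, using an illustrative –0.3% per °C panel rated at 400 W:

```python
def output_at_temperature(rated_output_w, t_operating_c, t_coefficient_per_c,
                          t_stc_c=25.0):
    """Rearranging Eq. (1): output at an operating temperature, given the
    STC-rated output and the fractional temperature coefficient per degC."""
    return rated_output_w * (1.0 + t_coefficient_per_c * (t_operating_c - t_stc_c))

# 400 W panel with a -0.3 %/degC coefficient, operating 20 degC above STC:
p = output_at_temperature(400.0, 45.0, -0.003)  # 400 * (1 - 0.06) = 376 W
```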

Often, the operating temperatures (Toperating) of PV panels significantly exceed the STC temperature (TSTC), with actual temperatures of over 65 °C in hot desert climates and even reaching 35–45 °C in cold or arctic climates. Therefore, we use two methods to account for the effect of operating temperature when evaluating smoke’s impact on PV panel performance. First, we calculated operating temperature using a simplified approach that relies solely on ambient air temperature to estimate the module operating temperature58,59. Operating temperature (°C) of the PV cell or module (Tcell) is calculated using the relationship in Eq. (2).

$${{{{\rm{T}}}}}_{{{{\rm{cell}}}}}={{{{\rm{T}}}}}_{{{{\rm{air}}}}}+\frac{{{{\rm{NOCT}}}}-20}{800}{{{\rm{S}}}},$$
(2)

where Tair is the ambient air temperature (°C), NOCT is the Nominal Operating Cell Temperature (°C), and S is the solar insolation in W m–2. NOCT is defined as the temperature reached by open-circuited cells in a PV module under incident irradiance of 800 W m–2, air temperature of 20 °C, wind velocity of 1 m s–1, and the assumption that the panel mount exposes the back surface of the panel to air (e.g., as opposed to residential mountings where the roof covers the back surface). In the context of PV panels, S refers to the amount of solar radiation energy received on a given surface area in a given amount of time. The fraction (NOCT−20)/800 represents the rate at which the cell temperature rises above ambient per unit of incident solar irradiance, derived from the NOCT reference conditions of 20 °C and 800 W m–2. When using Eq. (2), the actual module temperature will be lower than calculated when wind velocity is high and higher under still conditions58,60.
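Eq. (2) translates directly into code. As a sanity check, evaluating it at the NOCT reference conditions themselves (20 °C air, 800 W m–2) must return NOCT:

```python
def cell_temperature_simple(t_air_c, noct_c, irradiance_w_m2):
    """Eq. (2): module temperature from ambient temperature, NOCT, and
    incident irradiance (W m-2)."""
    return t_air_c + (noct_c - 20.0) / 800.0 * irradiance_w_m2

# At the NOCT reference conditions, the formula returns NOCT itself:
t = cell_temperature_simple(20.0, 46.0, 800.0)  # 46.0 degC
```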

Second, a more accurate estimate for operating temperature of PV panels in the field can be calculated using the relationship in Eq. (3)61.

$${{{{\rm{T}}}}}_{{{{\rm{cell}}}}}={{{{\rm{T}}}}}_{{{{\rm{air}}}}}+\frac{{{{\rm{NOCT}}}}-20}{800}\times {{{\rm{GHI}}}}\times {{{\rm{Wind}}}}\; {{{\rm{Factor}}}}$$
(3)

where GHI is the Global Horizontal Irradiance, and the Wind Factor is a term that accounts for the cooling effect of the wind. We approximated the Wind Factor via Eq. (4).

$${{{\rm{Wind\; Factor}}}}={{{\rm{Wind\; Coefficient}}}}\times ({{{\rm{Wind\; Speed}}}}-1)$$
(4)

where the wind coefficient parameter depends on the installation specifics and panel type.
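Eqs. (3) and (4) can be combined as below. The wind coefficient and the other input values are hypothetical, since the appropriate coefficient depends on the installation specifics and panel type:

```python
def wind_factor(wind_coefficient, wind_speed_m_s):
    """Eq. (4): adjustment relative to the 1 m/s NOCT reference wind speed."""
    return wind_coefficient * (wind_speed_m_s - 1.0)

def cell_temperature_wind(t_air_c, noct_c, ghi_w_m2, wind_coefficient,
                          wind_speed_m_s):
    """Eq. (3): module temperature including the wind term of Eq. (4)."""
    return t_air_c + (noct_c - 20.0) / 800.0 * ghi_w_m2 * wind_factor(
        wind_coefficient, wind_speed_m_s)

# Hypothetical inputs: 25 degC air, NOCT 46 degC, 800 W m-2 GHI,
# wind coefficient 0.8 (illustrative only), 3 m/s wind
t = cell_temperature_wind(25.0, 46.0, 800.0, 0.8, 3.0)
```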

For our simulations, we evaluated panel performance at 13 locations across CONUS (Fig. 4) for the month of September in 2019 and 2020, when smoke impacts were greatest. We assumed fixed tilt panels oriented at the angle of each location’s latitude, consistent with industry standards. As inputs, we used the monthly mean clear-sky GHI calculated in the current study; normalized NOCT values of 46 °C for c-Si PV and 45 °C for CdTe PV; and temperature coefficients of –0.45% per °C for c-Si and –0.21% per °C for CdTe62. We assumed a 1 MW installation for each solar PV (c-Si and CdTe) system modeled. The highest reported record efficiencies for each panel technology were used (i.e., 24.5% and 20.2% for c-Si and CdTe, respectively). Under STC, the solar panel efficiency equates to the percentage of incident irradiance converted to useful electrical power63. In other words, at 25 °C and 1000 W m–2 incident solar irradiance, c-Si and CdTe PV modules were assumed to produce 245 Wp m–2 and 202 Wp m–2 of module area, respectively. Hence, the total area required for a 1 MW installation was estimated to be 4081.63 m2 for c-Si and 4950.5 m2 for CdTe. For perspective, a solar PV installation of this size can power about 130 average American homes.
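The quoted array areas follow directly from the rated efficiencies; a small sketch of the arithmetic:

```python
def array_area_m2(capacity_w, efficiency, stc_irradiance_w_m2=1000.0):
    """Panel area needed for a given DC capacity: under STC, each square
    meter of a panel with the given efficiency yields efficiency * 1000 Wp."""
    return capacity_w / (efficiency * stc_irradiance_w_m2)

area_csi = array_area_m2(1_000_000, 0.245)   # ~4081.63 m2 for c-Si
area_cdte = array_area_m2(1_000_000, 0.202)  # ~4950.50 m2 for CdTe
```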

The ambient temperatures and wind speeds were drawn from the NSRDB. Considering only daytime hours (i.e., when enough solar irradiance was available to produce measurable PV power output), we calculated the average ambient temperature and wind speed for September across both years. We averaged across 2019 and 2020 to ensure that changes in operating temperature resulted from smoke-driven GHI adjustments rather than independent meteorological conditions in a given year. We then calculated the average PV panel operating temperatures. Ultimately, output was modeled at each location in 2019 and 2020 for c-Si and CdTe solar panels using four model input configurations: two at the STC temperature and two at location-specific temperatures. First, we estimated power output at 25 °C using only clear-sky DNI, which excludes the impact of clouds, operating temperatures, and diffuse radiation but includes the impact of smoke. Second, we estimated power output at 25 °C under more realistic irradiance conditions that used GHI as the input. Third, we estimated power output using GHI as the irradiance input and a location-based cell temperature calculated via Eq. (2), which accounts for ambient air temperature but excludes the effects of wind and GHI on temperature. Fourth, we again estimated power output using GHI but used location-specific cell temperatures calculated via Eq. (3), which includes temperature changes driven by wind speed and actual clear-sky GHI. This fourth configuration is considered the most realistic evaluation of smoke-driven changes in PV performance that we present.
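A plausible sketch of the per-configuration output calculation described above scales the rated capacity by irradiance relative to STC and applies the temperature coefficient; the irradiance and temperature values are illustrative, not the study's modeled values:

```python
def pv_output_w(capacity_w, irradiance_w_m2, cell_temp_c,
                temp_coefficient_per_c, t_stc_c=25.0):
    """Modeled DC output: scale rated capacity by irradiance relative to the
    1000 W m-2 STC level, then derate for cell temperature above 25 degC."""
    irradiance_ratio = irradiance_w_m2 / 1000.0
    temp_derate = 1.0 + temp_coefficient_per_c * (cell_temp_c - t_stc_c)
    return capacity_w * irradiance_ratio * temp_derate

# Hypothetical September mean clear-sky GHI for a 1 MW c-Si array
# (coefficient -0.45 %/degC): first at STC temperature, then at a
# location-based cell temperature such as Eq. (2) would provide.
p_stc = pv_output_w(1_000_000, 650.0, 25.0, -0.0045)
p_hot = pv_output_w(1_000_000, 650.0, 50.0, -0.0045)
```

Running the same function with 2019 versus 2020 GHI isolates the smoke-driven change in output, while swapping the cell temperature input switches between the STC and location-specific configurations.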

Analysis of longitudinal data from 2006–2021

Using the daily mean all-sky and clear-sky DNI and GHI, we calculated the mean DNI and GHI values for all days from 2006 to 2021 for six regions: the Southwest, Pacific Northwest, North Central, South Central, Southeast, and Northeast (Figs. 5 and 6). We computed percent smoke coverage for each region on these days by determining the percentage of pixels flagged as smoke-impacted relative to the number of pixels in the region. We performed a linear regression analysis for each region and month combination for both clear-sky DNI and GHI using the scipy.stats.linregress function in Python. We performed 84 linear regressions and selected an alpha of 0.05 for significance.
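Each per-region, per-month regression reduces to an ordinary least-squares fit of daily mean irradiance against daily percent smoke coverage. The sketch below mirrors the slope, intercept, and Pearson r that scipy.stats.linregress returns (p-values omitted), using hypothetical daily pairs:

```python
import math

def linregress_basic(x, y):
    """Ordinary least-squares slope, intercept, and Pearson r, mirroring
    the corresponding fields of scipy.stats.linregress."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    syy = sum((yi - my) ** 2 for yi in y)
    slope = sxy / sxx
    intercept = my - slope * mx
    r = sxy / math.sqrt(sxx * syy)
    return slope, intercept, r

# Hypothetical daily pairs: regional smoke coverage (%) vs clear-sky GHI (W m-2)
cover = [0.0, 20.0, 40.0, 60.0, 80.0, 100.0]
ghi = [660.0, 655.0, 651.0, 646.0, 642.0, 637.0]
slope, intercept, r = linregress_basic(cover, ghi)
projected_drop = slope * 100.0  # projected GHI change from 0% to 100% coverage
```

Extrapolating the fitted slope to 100% coverage is exactly how the percent-coverage projections in the Results are obtained.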