We identify 182 flares on 158 stars within 100 pc of the Sun in both the near-ultraviolet (NUV; $1750\!-\!2750$ Å) and far-ultraviolet (FUV; $1350\!-\!1750$ Å) using high-cadence light curves from the Galaxy Evolution Explorer (GALEX). Ultraviolet (UV) emission from stellar flares plays a crucial role in determining the habitability of exoplanetary systems; however, whether this emission promotes or threatens life on orbiting planets depends strongly on the energetics of the flares. Most studies assessing the effect of flares on planetary habitability assume a 9,000 K blackbody spectral energy distribution, which produces more NUV flux than FUV flux ($\mathcal{R} \equiv F_{\rm FUV} / F_{\rm NUV} \approx \frac{1}{6}$). Instead, we observe the opposite, with the FUV excess reaching $\mathcal{R} \approx \frac{1}{2}\!-\!2$, roughly $3\!-\!12$ times the expectation for a 9,000 K blackbody. The ratio of FUV to NUV time-integrated flare energies is on average 3.0 times higher than would be predicted by a constant 9,000 K blackbody during the flare. Finally, we find that the FUV/NUV ratio at flare peak tentatively correlates ($\sim\!2\sigma$ significance) both with the total UV flare energy and with the G − RP colour of the host star. On average, we observe higher peak FUV/NUV ratios in flares with $E_\text{UV} > 10^{32}$ erg and in flares on fully convective stars.
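The blackbody expectation quoted above can be reproduced approximately by integrating the Planck function over the two wavelength ranges. Below is a minimal sketch that assumes simple top-hat bandpasses at the quoted NUV and FUV limits and ignores the GALEX detector response curves, which shift the exact value (the function and variable names are illustrative, not from the paper):

```python
import numpy as np

# Physical constants (SI units)
H = 6.62607015e-34   # Planck constant (J s)
C = 2.99792458e8     # speed of light (m/s)
K_B = 1.380649e-23   # Boltzmann constant (J/K)

def planck_lambda(wavelength_m, temp_k):
    """Planck spectral radiance B_lambda (W m^-3 sr^-1)."""
    x = H * C / (wavelength_m * K_B * temp_k)
    return (2.0 * H * C**2 / wavelength_m**5) / np.expm1(x)

def band_flux(lo_angstrom, hi_angstrom, temp_k, n=2000):
    """Blackbody flux integrated over an idealized top-hat bandpass."""
    lam = np.linspace(lo_angstrom, hi_angstrom, n) * 1e-10  # Angstrom -> m
    b = planck_lambda(lam, temp_k)
    # Trapezoidal integration over wavelength
    return np.sum(0.5 * (b[1:] + b[:-1]) * np.diff(lam))

# FUV (1350-1750 A) vs NUV (1750-2750 A) for a 9,000 K blackbody
ratio = band_flux(1350, 1750, 9000) / band_flux(1750, 2750, 9000)
print(f"R = F_FUV / F_NUV ~ {ratio:.3f}")  # well below 1: NUV dominates
```

With these idealized top-hat bands the ratio comes out near ~0.1, of the same order as the $\approx \frac{1}{6}$ figure; folding in the actual GALEX response functions and flare filling factors changes the precise number.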