Using artificial intelligence to provide space weather alerts in real-time

A new study suggests that computers can learn to detect solar flares and other events in vast streams of solar images, helping NOAA forecasters issue space weather alerts closer to real time. The machine-learning technique was developed by scientists at CIRES and NOAA's National Centers for Environmental Information (NCEI).

The technique sifts through huge amounts of satellite data to pick out features significant for space weather.

According to NOAA Space Weather Prediction Center (SWPC) forecaster Rob Steenburgh, it is important to be able to process solar data in real time "because flares erupting on the Sun impact Earth over the course of minutes." Changing conditions on the Sun and in space can affect technologies on Earth, blocking radio communications, damaging power grids, and degrading the accuracy of navigation systems.

"These techniques provide a rapid, continuously updated overview of solar features and can point us to areas requiring more scrutiny," he added.

Twice a day, forecasters summarize current conditions on the Sun to help predict incoming space weather.

Currently, they use hand-drawn maps labeled with various solar features, including active regions, filaments, and coronal hole boundaries.

Solar imagers, on the other hand, produce a new set of observations every few minutes.

[Image: multi-colored maps of solar features. Credit: Seaton, D. and Hughes, J. M./CU Boulder, CIRES, and NCEI]

[Image: SUVI observations of the Sun. Credit: Seaton, D. and Hughes, J. M./CU Boulder, CIRES, and NCEI]

For instance, the Solar Ultraviolet Imager (SUVI) on NOAA's GOES-R Series satellites runs on a four-minute cycle, gathering data in six different wavelengths per cycle. Keeping up with all of that information can consume much of a forecaster's time.
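To get a sense of the data volume, a quick back-of-envelope calculation, using only the cadence figures stated above (one cycle every four minutes, six wavelengths per cycle), shows how many images a single imager produces in a day:

```python
# Rough data-volume estimate for a SUVI-like imager, assuming the
# figures from the article: one cycle every 4 minutes, 6 wavelengths.
minutes_per_day = 24 * 60          # 1440 minutes in a day
cycles_per_day = minutes_per_day // 4
images_per_day = cycles_per_day * 6

print(images_per_day)  # 2160 images per day from one imager
```

At that rate, a forecaster reviewing images by hand would face thousands of new frames daily, which is why automated classification matters.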

"We need tools to process solar data into digestible chunks," said study co-author Dan Seaton, a CIRES scientist working at NCEI.

Lead author J. Marcus Hughes, a computer science graduate student at CU Boulder and CIRES researcher at NCEI, created a computer algorithm that can process all the SUVI images simultaneously and detect patterns in the data. Along with his colleagues, Hughes built a database of expert-labeled maps of the Sun and used the images to teach a computer to recognize solar features significant for forecasting.

"We didn't tell it how to identify those features, but what to look for--things like flares, coronal holes, bright regions, filaments, and prominences. The computer learns the 'how' through the algorithm," Hughes said.

The algorithm distinguishes solar features with a decision-tree approach, following a set of simple rules to discriminate between different features.

It analyzes an image one pixel at a time, deciding whether each pixel is brighter or dimmer than a particular threshold before sending it down a branch of the tree. The algorithm learns hundreds of decision trees and makes hundreds of decisions along each tree to distinguish between solar features, taking a "majority vote" for each pixel.
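The pixel-by-pixel, majority-vote scheme described above is the classic random forest approach. The following sketch, using scikit-learn with entirely synthetic data (the brightness values, thresholds, and theme labels here are illustrative assumptions, not the study's actual training set), shows the idea: each pixel is described by its brightness in six channels, mirroring SUVI's six wavelengths, and the forest's trees vote on a class for each pixel.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Synthetic toy data: each "pixel" is a vector of brightness values in
# six channels (standing in for SUVI's six wavelengths). Labels are
# illustrative theme codes: 0 = quiet Sun, 1 = coronal hole, 2 = bright region.
rng = np.random.default_rng(42)
X = rng.random((3000, 6))

# Made-up labeling rule for the toy example: dim pixels are "coronal hole",
# bright pixels are "bright region", the rest are "quiet Sun".
means = X.mean(axis=1)
y = np.where(means < 0.4, 1, np.where(means > 0.6, 2, 0))

# An ensemble of decision trees: each tree routes a pixel down its branches
# by comparing brightness values to learned thresholds, and the forest
# classifies the pixel by majority vote across all trees.
forest = RandomForestClassifier(n_estimators=100, random_state=0)
forest.fit(X, y)

# Once trained, classifying new pixels is a fast vectorized operation.
new_pixels = rng.random((5, 6))
print(forest.predict(new_pixels))
```

Because prediction is just threshold comparisons down each tree, a trained forest can label large batches of pixels very quickly, consistent with the article's claim that millions of pixels can be classified in seconds.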

The system can classify millions of pixels in seconds once it is trained.

"This technique is really good at using all the data simultaneously," Hughes said. "Because the algorithm learns so rapidly it can help forecasters understand what's happening on the Sun far more quickly than they currently do."

"It can sometimes find features we had difficulty identifying correctly ourselves. So machine learning can direct our scientific inquiry and identify important characteristics of features we didn't know to look for," Seaton said.

The algorithm's ability to find patterns is vital not only for short-term forecasting but also for long-term solar data analysis. "Because the algorithm can look at 20 years' worth of images and find patterns in the data, we'll be able to answer questions and solve long-term problems that have been intractable," Seaton added.

Reference

Hughes, J. M., et al. "Real-time solar image classification: Assessing spectral, pixel-based approaches." Journal of Space Weather and Space Climate. https://doi.org/10.1051/swsc/2019036

Abstract

In order to utilize solar imagery for real-time feature identification and large-scale data science investigations of solar structures, we need maps of the Sun where phenomena, or themes, are labeled. Since solar imagers produce observations every few minutes, it is not feasible to label all images by hand. Here, we compare three machine learning algorithms performing solar image classification using Extreme Ultraviolet (EUV) and Hα images: a maximum likelihood model assuming a single normal probability distribution for each theme from Rigler et al. (2012) [Space Weather 10(8): 1–16], a maximum-likelihood model with an underlying Gaussian mixtures distribution, and a random forest model. We create a small database of expert-labeled maps to train and test these algorithms. Due to the ambiguity between the labels created by different experts, a collaborative labeling is used to include all inputs. We find the random forest algorithm performs the best amongst the three algorithms. The advantages of this algorithm are best highlighted in: comparison of outputs to hand-drawn maps; response to short-term variability; and tracking long-term changes on the Sun. Our work indicates that the next generation of solar image classification algorithms would benefit significantly from using spatial structure recognition, compared to only using spectral, pixel-by-pixel brightness distributions.

Featured image credit: Seaton, D. and Hughes, J. M./CU Boulder, CIRES, and NCEI
