Space-Based AI Shows the Promise of Big Data
Elizabeth Howell, Ph.D., a space writer and journalist based in Ottawa, Canada
July 12, 2022

At a distance of a million miles from Earth, the James Webb Space Telescope is pushing the edge of data transfer capabilities.
The observatory launched Dec. 25, 2021, on a mission to look at the early universe, at exoplanets, and at other objects of celestial interest. But first it must pass a rigorous, months-long commissioning period to make sure that the data will get back to Earth properly.
Mission managers provided an update on Feb. 11, noting that the primary mirror is aligning well and that the instruments are starting to receive data from deep space.
“This is the first time we’re getting data on mirrors that are actually at zero gravity,” said Lee Feinberg, Optical Telescope Element Manager for the James Webb Space Telescope at the NASA Goddard Space Flight Center, during the February press conference.
“So far, our data is matching our models and expectations,” Feinberg added. Webb is continuing the alignment procedure for several more weeks and is expected to start sending back its first operational science data in the summer of 2022.
How to store and analyze data in space
But when the telescope is ready for work, a new problem will arise. Webb’s gimbaled antenna assembly, which includes the telescope’s high-data-rate dish antenna, must transmit about a Blu-ray’s worth of science data (28.6 gigabytes) down from the observatory twice a day. Onboard storage is limited to 65 gigabytes, so data must be downlinked regularly to keep the solid-state recorder from filling up.
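A quick back-of-envelope calculation shows why regular downlinks matter, using only the figures quoted above (two 28.6-gigabyte downlinks per day against 65 gigabytes of onboard storage); the assumption that data accumulates at roughly the downlink rate is illustrative, not an official figure:

```python
# Back-of-envelope downlink budget for Webb, from the numbers in the text.
DOWNLINK_GB = 28.6     # roughly one Blu-ray's worth of data per pass
PASSES_PER_DAY = 2
STORAGE_GB = 65.0      # onboard solid-state recorder capacity

daily_downlink_gb = DOWNLINK_GB * PASSES_PER_DAY

# If science data were produced about as fast as it can be sent down,
# how long until the recorder fills with no downlink at all?
hours_until_full = STORAGE_GB / daily_downlink_gb * 24

print(f"Daily downlink capacity: {daily_downlink_gb:.1f} GB")
print(f"Recorder fills in about {hours_until_full:.0f} hours without a downlink")
```

At that rate the recorder holds only a little over a day of observations, which is why the telescope cannot simply hoard data and send it later.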
The problem is deciding where to look first amid this richness. Luckily, Webb’s analysis tools are largely available in Python, and parts of the data may be shared with institutes around the world for additional help. Still, scientists’ time is limited. Although researchers can recruit “citizen scientists” to help examine images through crowdsourcing ventures such as Zooniverse, astronomy is turning to artificial intelligence (AI) to find the right data as quickly as possible.
AI requires good data and strong training algorithms, such as machine learning, to decide what data to send back to decision-makers. Happily, there is a space industry Webb can borrow from: AI systems are getting more adept by the month at interpreting Earth observations from satellites. Many companies and space agencies already use AI to parse information quickly on fast-moving events such as climate-change-related wildfires or flooding.
The process (in an ideal world) begins up in space, where the satellite decides on board what to send back to Earth. For example, the European Space Agency’s ɸ-sat-1 (“phi-sat-1”) satellite launched in 2020 to test this in-space filtering, discarding images too cloudy to be usable. Previous satellites struggled with cloud cover, and ɸ-sat-1 carried technology designed to address the issue.
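ESA has not published ɸ-sat-1’s onboard code, but the core idea can be sketched with a toy filter that rejects frames whose fraction of cloud-like (very bright) pixels exceeds a threshold. The 0.7 cutoff, the 8-bit brightness scale, and the flat-list frame representation below are all invented for illustration, not values from the actual system:

```python
# Toy sketch of on-board cloud filtering: downlink only frames whose
# fraction of cloud-like pixels stays below a threshold.
CLOUD_BRIGHTNESS = 200    # 8-bit pixel values at or above this count as cloud
MAX_CLOUD_FRACTION = 0.7  # discard frames that are mostly cloud

def cloud_fraction(frame):
    """Fraction of pixels (flat list of 0-255 values) that look like cloud."""
    cloudy = sum(1 for px in frame if px >= CLOUD_BRIGHTNESS)
    return cloudy / len(frame)

def select_for_downlink(frames):
    """Keep only the frames worth spending downlink bandwidth on."""
    return [f for f in frames if cloud_fraction(f) <= MAX_CLOUD_FRACTION]

clear = [50] * 90 + [255] * 10     # 10% cloud-like pixels: keep
overcast = [255] * 80 + [50] * 20  # 80% cloud-like pixels: discard
print(len(select_for_downlink([clear, overcast])))  # prints 1
```

The real system replaces the brightness threshold with a neural network running on an onboard AI chip, but the payoff is the same: bandwidth is spent only on usable frames.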
“To avoid downlinking these less than perfect images back to Earth, the ɸ-sat-1 artificial intelligence chip filters them out so that only usable data is returned,” ESA said in a blog post. “This will make the process of handling all this data more efficient, allowing users access to more timely information, ultimately benefiting society at large.”
This filtering is necessarily limited in space, since only so much hardware will fit on a satellite. The images that make it down to Earth have a more robust set of techniques applied to them on ground computers. It’s a process that some companies call geospatial intelligence (GI).
