<h1>How NASA Is Testing AI to Make Earth-Observing Satellites Smarter</h1>
<p>July 24, 2025</p>
<p>A technology called Dynamic Targeting could enable spacecraft to decide, autonomously and within seconds, where best to make science observations from orbit.</p>
<p>In a recent test, NASA showed how artificial intelligence-based technology could help orbiting spacecraft provide more targeted and valuable science data. The technology enabled an Earth-observing satellite, for the first time, to look ahead along its orbital path, rapidly process and analyze imagery with onboard AI, and determine where to point an instrument. The whole process took less than 90 seconds, without any human involvement.</p>
<p>Called <a href="https://ai.jpl.nasa.gov/public/documents/papers/dt-spaceops-2025.pdf" rel="noopener" target="_blank">Dynamic Targeting</a>, the concept has been in development for more than a decade at NASA's Jet Propulsion Laboratory in Southern California. The first of a series of flight tests took place aboard a commercial satellite in mid-July. The goal: to show that Dynamic Targeting can help orbiters improve ground imaging by avoiding clouds and autonomously hunt for specific, short-lived phenomena like wildfires, volcanic eruptions, and rare storms.</p>
<p>"The idea is to make the spacecraft act more like a human: Instead of just seeing data, it's thinking about what the data shows and how to respond," says Steve Chien, a technical fellow in AI at JPL and principal investigator for the Dynamic Targeting project. 
"When a human sees a picture of trees burning, they understand it may indicate a forest fire, not just a collection of red and orange pixels. We're trying to make the spacecraft have the ability to say, 'That's a fire,' and then focus its sensors on the fire."</p>
<p>This first flight test for Dynamic Targeting wasn't hunting specific phenomena like fires — that will come later. Instead, the point was avoiding an omnipresent phenomenon: clouds.</p>
<p>Most science instruments on orbiting spacecraft look down at whatever is beneath them. However, for Earth-observing satellites with optical sensors, clouds can get in the way as much as two-thirds of the time, blocking views of the surface. To overcome this, Dynamic Targeting looks 300 miles (500 kilometers) ahead and distinguishes between clouds and clear sky. If the scene is clear, the spacecraft images the surface when passing overhead. If it's cloudy, the spacecraft cancels the imaging activity to save data storage for another target.</p>
<p>"If you can be smart about what you're taking pictures of, then you only image the ground and skip the clouds. That way, you're not storing, processing, and downloading all this imagery researchers really can't use," said Ben Smith of JPL, an associate with NASA's Earth Science Technology Office, which funds the Dynamic Targeting work. "This technology will help scientists get a much higher proportion of usable data."</p>
<p>The testing is taking place on CogniSAT-6, a briefcase-size <a href="https://www.nasa.gov/what-are-smallsats-and-cubesats/" target="_blank" rel="noopener">CubeSat</a> that launched in March 2024. The satellite — designed, built, and operated by Open Cosmos — hosts a payload designed and developed by Ubotica featuring a commercially available AI processor. 
While working with Ubotica in 2022, Chien's team conducted tests <a href="https://esto.nasa.gov/new-ai-algorithms-streamline-data-processing-for-space-based-instruments/" rel="noopener" target="_blank">aboard the International Space Station</a>, running algorithms similar to those in Dynamic Targeting on the same type of processor. The results showed the combination could work for space-based remote sensing.</p>
<p>Since CogniSAT-6 lacks an imager dedicated to looking ahead, the spacecraft tilts forward 40 to 50 degrees to point its optical sensor, a camera that sees both visible and <a href="https://science.nasa.gov/ems/08_nearinfraredwaves/" rel="noopener" target="_blank">near-infrared</a> light. Once the look-ahead imagery has been acquired, Dynamic Targeting's algorithm, trained to identify clouds, analyzes it. Based on that analysis, the Dynamic Targeting planning software determines where to point the sensor for cloud-free views. The satellite then tilts back toward nadir (looking directly below the spacecraft) and snaps the planned imagery, capturing only the ground.</p>
<p>This all takes place in 60 to 90 seconds, depending on the original look-ahead angle, as the spacecraft speeds along in low Earth orbit at nearly 17,000 mph (7.5 kilometers per second).</p>
<p>With the cloud-avoidance capability now proven, the next test will hunt for storms and severe weather — essentially targeting clouds instead of avoiding them. Another test will search for thermal anomalies like wildfires and volcanic eruptions. The JPL team developed unique algorithms for each application.</p>
<p>"This initial deployment of Dynamic Targeting is a hugely important step," Chien said. 
"The end goal is operational use on a science mission, making for a very agile instrument taking novel measurements."</p>
<p>There are multiple visions for how that could happen — possibly even on spacecraft exploring the solar system. In fact, Chien and his JPL colleagues drew some inspiration for their Dynamic Targeting work from another project they had also worked on: using data from ESA's (the European Space Agency's) <a href="https://ai.jpl.nasa.gov/public/projects/rosetta/" rel="noopener" target="_blank">Rosetta</a> orbiter to demonstrate the feasibility of <a href="https://ai.jpl.nasa.gov/public/documents/papers/brown-ijcai2017-plumes.pdf" rel="noopener" target="_blank">autonomously detecting and imaging plumes</a> emitted by comet 67P/Churyumov-Gerasimenko.</p>
<p>On Earth, adapting Dynamic Targeting for use with radar could allow scientists to study dangerous extreme winter weather events called deep convective ice storms, which are too rare and short-lived to observe closely with existing technologies. <a href="https://ai.jpl.nasa.gov/public/projects/smices/" rel="noopener" target="_blank">Specialized algorithms</a> would identify these dense storm formations with a satellite's look-ahead instrument. A powerful, focused radar would then pivot to keep the ice clouds in view, "staring" at them as the spacecraft speeds by overhead, gathering a bounty of data over six to eight minutes.</p>
<p>Some ideas involve using Dynamic Targeting on <a href="https://ai.jpl.nasa.gov/public/documents/papers/fame-spaceops-2025.pdf" rel="noopener" target="_blank">multiple spacecraft</a>: The results of onboard image analysis from a leading satellite could be rapidly communicated to a trailing satellite, which could be tasked with targeting specific phenomena. The data could even be fed to a constellation of dozens of orbiting spacecraft. 
Chien is leading a test of that concept, called Federated Autonomous MEasurement (FAME), beginning later this year.</p>
<p>Melissa Pamer<br />Jet Propulsion Laboratory, Pasadena, Calif.<br />626-314-4928<br /><a href="mailto:melissa.pamer@jpl.nasa.gov" target="_blank" rel="noopener">melissa.pamer@jpl.nasa.gov</a></p>
<p>2025-094</p>
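The cloud-avoidance loop described above (preview the scene roughly 500 km ahead, classify it onboard, then either image at nadir or skip) can be sketched in a few lines. This is a minimal illustration, not the actual JPL flight software: the function names, the 0.3 cloud-fraction threshold, and the idea of the classifier reporting a single cloud fraction are all assumptions made for the example.

```python
# Illustrative sketch of the Dynamic Targeting cloud-avoidance decision.
# Figures marked "from the article" are reported values; everything else
# (names, threshold, classifier interface) is a hypothetical stand-in.

ORBITAL_SPEED_KM_S = 7.5    # low Earth orbit, ~17,000 mph (from the article)
LOOKAHEAD_KM = 500          # look-ahead distance (from the article)
CLEAR_SKY_THRESHOLD = 0.3   # assumed max cloud fraction worth imaging


def seconds_until_overhead(lookahead_km: float = LOOKAHEAD_KM) -> float:
    """Time between the tilted look-ahead shot and passing over the scene."""
    return lookahead_km / ORBITAL_SPEED_KM_S


def plan_observation(cloud_fraction: float) -> str:
    """Decide whether to capture the upcoming scene at nadir.

    `cloud_fraction` stands in for the output of the onboard AI cloud
    classifier run on the look-ahead image.
    """
    if cloud_fraction <= CLEAR_SKY_THRESHOLD:
        return "image"   # clear enough: tilt back to nadir and capture
    return "skip"        # cloudy: cancel and save storage for another target


print(f"{seconds_until_overhead():.0f} s to decide")  # prints "67 s to decide"
print(plan_observation(0.1))  # prints "image"
print(plan_observation(0.8))  # prints "skip"
```

Note how the article's 60-to-90-second figure falls out of the geometry: at 7.5 km/s, a 500 km look-ahead leaves about 67 seconds between preview and overpass, which is the entire budget for onboard analysis, planning, and re-pointing.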