{"id":45626,"date":"2025-09-05T16:40:14","date_gmt":"2025-09-05T16:40:14","guid":{"rendered":"https:\/\/www.europesays.com\/ie\/45626\/"},"modified":"2025-09-05T16:40:14","modified_gmt":"2025-09-05T16:40:14","slug":"earth-observation-firms-are-trying-to-solve-a-latency-problem-with-dynamic-targeting","status":"publish","type":"post","link":"https:\/\/www.europesays.com\/ie\/45626\/","title":{"rendered":"Earth observation firms are trying to solve a latency problem with &#8216;Dynamic Targeting&#8217;"},"content":{"rendered":"<p>For years, satellite sensors have transmitted vast quantities of imagery to the ground, where analysts then discarded views obscured by clouds, identified natural features or objects and shared pictures with customers. The entire process took days.<\/p>\n<p>Recently, companies have cut the timeline to hours \u2014 or even less \u2014 through frequent contact with ground stations, intersatellite communications links and automated analysis. And now NASA experts are eyeing a reality in which satellites could spot an erupting volcano or the convective core of a powerful storm system and share its location in minutes.<\/p>\n<p>NASA\u2019s Jet Propulsion Laboratory took an important step toward that goal in mid-July by demonstrating Dynamic Targeting, a research experiment in automated Earth observation. For the first time, a commercial cubesat looked ahead along its orbital path, analyzed the imagery and determined where to point its instrument without human intervention.<\/p>\n<p>\u201cIt\u2019s a big deal because a lot of things have to come together to make it work,\u201d Steve Chien, JPL principal investigator for Dynamic Targeting, said in an interview. \u201cThe difference between 10 years ago and now is that we now can do all of the pieces reliably and fast.\u201d<\/p>\n<p>Speedy insights<\/p>\n<p>Researchers at JPL have worked on Dynamic Targeting for more than a decade because it promises smarter satellites. 
Instead of having satellites gather pictures of everything they pass over or carry out a series of assigned tasks, researchers want them to respond to their surroundings like a person.<\/p>\n<p>\u201cIf you\u2019re sitting in a car looking out the window and you see something interesting, you stare at it,\u201d Chien said. \u201cThat\u2019s exactly what we\u2019re trying to do with Dynamic Targeting.\u201d<br \/>Making satellites operate that way isn\u2019t easy.<\/p>\n<p>The Dynamic Targeting experiment pairs a briefcase-size cubesat from U.K. startup Open Cosmos with JPL\u2019s machine-learning algorithm and a commercial AI processor from Irish startup Ubotica Technologies.<\/p>\n<p>\u201cThe focus is very much on latency, using AI to extract value from images directly onboard and get that insight down to the end user as quickly as possible,\u201d Aubrey Dunne, Ubotica co-founder and chief technology officer, said at the SmallSat 2025 Conference in Salt Lake City in August.<\/p>\n<p>In ongoing tests, Open Cosmos\u2019 CogniSat-6 turns its hyperspectral sensor forward to scan the horizon for clouds. The onboard processor analyzes the imagery and then tells the satellite how to turn the sensor to acquire the best cloud-free images. The whole operation takes less than 90 seconds.<\/p>\n<p>Seconds count<\/p>\n<p>Mars rovers perform similar feats, surveying their surroundings and relying on JPL algorithms to pick the best targets. That\u2019s harder to do in a 500-kilometer low Earth orbit where a satellite races over the ground at about 7.5 kilometers a second. 
The Dynamic Targeting algorithm, trained to spot clouds, must also take into account Earth\u2019s rotation and curvature.<\/p>\n<p><a href=\"https:\/\/i0.wp.com\/spacenews.com\/wp-content\/uploads\/2025\/09\/JPL_Dynamic_Targeting_Experiment_Credit_JPL-1.png?ssl=1\" rel=\"nofollow noopener\" target=\"_blank\"><img loading=\"lazy\" data-recalc-dims=\"1\" decoding=\"async\" width=\"780\" height=\"480\" src=\"https:\/\/www.europesays.com\/ie\/wp-content\/uploads\/2025\/09\/JPL_Dynamic_Targeting_Experiment_Credit_JPL-1.png\" alt=\"\" class=\"wp-image-549266\"  \/><\/a><\/p>\n<p>\u201cThen you have about 50 seconds to interpret the data and figure out if you\u2019re going to take an image, how you\u2019re going to point your instrument,\u201d Chien said.<\/p>\n<p>JPL\u2019s initial concept for Dynamic Targeting called for a satellite with two sensors: one to look ahead and another pointing down. But researchers couldn\u2019t find a satellite with a dedicated look-ahead sensor. So, the CogniSat-6 cubesat points its camera ahead to look for clouds then turns it down and left or right to get the best view.<\/p>\n<p>\u201cThat\u2019s not ideal, but we\u2019re just trying to demonstrate the technology,\u201d Chien said.<\/p>\n<p>In the future, two or more satellites could share the location of targets of interest. For instance, a satellite with a wide-field-of-view sensor that detects smoke could send instructions to another spacecraft to gather imagery with a higher-resolution instrument.<\/p>\n<p>Beyond clouds<\/p>\n<p>Clouds were the initial target for JPL\u2019s Dynamic Targeting algorithm because they obscure roughly two-thirds of the surface at Earth\u2019s mid-latitudes, making them a significant problem for optical sensors. 
Human analysis conducted after the recent tests confirmed that the CogniSat-6 sensor succeeded in gathering cloud-free imagery.<\/p>\n<p>\u201cCogniSat-6 is the first of a whole bunch of missions that we hope would use this technology,\u201d Chien said. \u201cCommercial entities are very interested in this. And a number of NASA science mission concepts could benefit from an agile instrument that is actively choosing targets.\u201d<\/p>\n<p>Dunne calls Dynamic Targeting \u201ca paradigm shift in efficiency.\u201d With traditional Earth-observation sensors, only a small fraction of the pixels acquired and downlinked prove useful.<\/p>\n<p>\u201cThis is about trying to get more valuable data down,\u201d he said in an interview.<\/p>\n<p>Look-ahead sensors could also detect severe storms and volcanic eruptions. The next challenge is rapidly sharing that information with people on the ground.<br \/>It\u2019s a problem industry is racing to solve.<\/p>\n<p>\u201cThe beautiful thing about people wanting internet access from space is that companies are providing more downlink capabilities, both in terms of data volume and reducing the latency,\u201d Chien said. \u201cYou just have to get your data to their network.\u201d<\/p>\n<p>At the SmallSat 2025 Conference, satellite autonomy meant different things to different people.<br \/>Still, there was widespread consensus that satellites should be equipped to handle routine maintenance and communications without help from ground controllers.<\/p>\n<p>\u201cThere\u2019s no reason to do a normal care and feeding contact in a nonautomatic fashion,\u201d said Col. Owen Stephens, contracting director at the U.S. Space Force\u2019s Space Rapid Capabilities Office. \u201cYou shouldn\u2019t need a human to do that. 
In fact, the machine will do it better than a human could, because humans will screw up their command entry.\u201d<\/p>\n<p>Many speakers also called for sensors paired with machine-learning algorithms running on edge processors to handle complex tasks like rendezvous and proximity operations or automated collision avoidance.<\/p>\n<p>\u201cIn a situation like that, autonomy could be very valuable,\u201d said Benjamin Bahney, Lawrence Livermore National Laboratory space program leader.<\/p>\n<p>Similarly, remote-sensing operations near the moon or Mars will require extensive autonomy because of limited communications bandwidth.<\/p>\n<p>\u201cCommunicating in those regimes is very challenging so there\u2019s tremendous benefit if you can operate autonomously,\u201d Bahney said.<\/p>\n<p>This article first appeared in the September 2025 issue of SpaceNews Magazine with the title \u201cWhen images are available in minutes.\u201d<\/p>\n","protected":false},"excerpt":{"rendered":"For years, satellite sensors have transmitted vast quantities of imagery to the ground, where analysts then discarded 
views&hellip;\n","protected":false},"author":2,"featured_media":45627,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[270],"tags":[34211,3022,18,16890,8420,19,17,1024,133,3977,451],"class_list":{"0":"post-45626","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-space","8":"tag-dynamic-targeting","9":"tag-earth-observation","10":"tag-eire","11":"tag-feature","12":"tag-from-the-magazine","13":"tag-ie","14":"tag-ireland","15":"tag-nasa","16":"tag-science","17":"tag-sn","18":"tag-space"},"share_on_mastodon":{"url":"","error":""},"_links":{"self":[{"href":"https:\/\/www.europesays.com\/ie\/wp-json\/wp\/v2\/posts\/45626","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.europesays.com\/ie\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.europesays.com\/ie\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/ie\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/ie\/wp-json\/wp\/v2\/comments?post=45626"}],"version-history":[{"count":0,"href":"https:\/\/www.europesays.com\/ie\/wp-json\/wp\/v2\/posts\/45626\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/ie\/wp-json\/wp\/v2\/media\/45627"}],"wp:attachment":[{"href":"https:\/\/www.europesays.com\/ie\/wp-json\/wp\/v2\/media?parent=45626"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.europesays.com\/ie\/wp-json\/wp\/v2\/categories?post=45626"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.europesays.com\/ie\/wp-json\/wp\/v2\/tags?post=45626"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}