Many Ukrainian drones have been disabled by Russian jamming: Their latest models navigate by sight alone

by lost_library_book

7 comments
  1. Copy of article text:

    As Ukraine’s stocks of artillery shells have dwindled, its army’s reliance on drones has grown. These are able to deliver ammunition with great precision over long distances—provided they can maintain connections with GPS satellites (so they know where they are) and their operators (so they know what to do). Such communication signals can be jammed, however, and Russia’s electronic warfare, as signals scrambling is known, is fearsomely effective. With large numbers of its drones in effect blinded, Ukraine’s drone technologists have been forced to get creative.

    Enter Eagle Eyes, a remarkable software package for drones. Developed by Ukraine’s special forces, it allows drones to navigate by machine sight alone, with no need for outside input. Using artificial-intelligence (AI) algorithms, the software compares live video of the terrain below with an on-board map stitched together from photographs and video previously collected by reconnaissance aircraft. This allows drones to continue with their missions even after being jammed.

    Eagle Eyes has also been trained to recognise specific ground-based targets, including tanks, troop carriers, missile launchers and attack helicopters. The software can then release bombs, or crash-dive, without a human operator’s command. “Bingo for us,” says a captain in White Eagle, a special-forces corps that is using and further developing the technology. The software has been programmed to target jamming stations as a priority, says the captain, who requested anonymity. Russia’s vaunted S-400 air-defence batteries are priority number two.

    Optical navigation, as this approach to guidance is known, has a long history. An early version was incorporated in America’s Tomahawk cruise missiles, for example, first fired in anger during Operation Desert Storm in 1991. But lightweight, inexpensive optical navigation for small drones is new. In the spring of last year Eagle Eyes was being tested in combat by just three special-forces teams, each with two or three drone handlers. Today Eagle Eyes is cheap enough for kamikaze drones and is in wide use, says Valeriy Borovyk, commander of a White Eagle unit fighting in Ukraine’s south. With a range of about 60km, the system also guides fixed-wing drones that have struck energy infrastructure in Russia, he says.

    Last autumn the number of Ukrainian drones with optical navigation probably numbered in the hundreds. Today the figure is closer to 10,000, says an industry hand in Odessa whose design bureau builds prototype systems for two Ukrainian manufacturers. Anton Varavin, chief technologist at a competing design bureau, Midgard Dynamics in Ternopil in western Ukraine, says optical navigation is increasingly seen as a “must have”, especially for drones with a range above 20km.

    Optical navigation works best near distinctive features such as crossroads, power lines, isolated trees, big buildings and nearby bodies of water. For small drones with inexpensive optical navigation, the ideal cruising altitude is about 500 metres, says Andy Bosyi, a co-founder of [MindCraft.ai](http://MindCraft.ai), a developer of optical-navigation prototypes with workplaces at undisclosed locations in and near Lviv. That altitude is low enough for the software to work out terrain details, and yet high enough for a sufficient field of view. The height is also beyond the range of small-arms fire.

    MindCraft.ai shipped its first models, appropriately dubbed NOGPS, to manufacturers in December. While cruising, the system needs to fix on at least one object per minute to avoid drifting more than 50 metres off course. That’s good enough for reconnaissance, if not precision bombing. To improve accuracy and allow night flights, MindCraft.ai is incorporating a heat-sensing infrared camera. The upgrade should be ready by the end of this year.

    1/
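The map-matching approach the article describes can be sketched with a toy example: slide the live camera frame over a stored reference map and score every offset with zero-mean normalised cross-correlation, keeping the best match. This is an illustrative sketch under stated assumptions, not Eagle Eyes’ actual algorithm; the function name and the brute-force search are invented for clarity (a real system would use feature matching, image pyramids and far faster search).

```python
import numpy as np

def locate_frame(ref_map: np.ndarray, frame: np.ndarray) -> tuple[int, int]:
    """Exhaustively search ref_map for the patch that best matches frame,
    scored by zero-mean normalised cross-correlation (NCC), which is
    insensitive to overall brightness and contrast changes.
    Returns the (row, col) of the best-matching top-left corner."""
    fh, fw = frame.shape
    f = frame - frame.mean()
    f = f / (np.linalg.norm(f) + 1e-9)          # unit-norm template
    best_score, best_rc = -np.inf, (0, 0)
    for r in range(ref_map.shape[0] - fh + 1):
        for c in range(ref_map.shape[1] - fw + 1):
            patch = ref_map[r:r + fh, c:c + fw]
            p = patch - patch.mean()
            p = p / (np.linalg.norm(p) + 1e-9)  # unit-norm candidate
            score = float(np.sum(f * p))        # NCC in [-1, 1]
            if score > best_score:
                best_score, best_rc = score, (r, c)
    return best_rc

# A camera frame cropped from the map, with different brightness and
# contrast, is still located at the correct offset:
rng = np.random.default_rng(0)
ref = rng.random((40, 60))
frame = ref[12:28, 30:50] * 1.3 + 10.0
print(locate_frame(ref, frame))  # (12, 30)
```

Because the score is normalised, lighting differences between the live video and the stored reconnaissance imagery do not, by themselves, break the match.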

  2. A) very clever B) I’ve seen The Terminator and therefore it’s very scary C) Presumably this means they can’t confirm kills, so it has tactical downsides while still being obviously better than a dead drone

  3. _Book, thank you for the article and posting the text.

  4. This was inevitable. This sort of tech is going to be extended to attacking targets at long range and loitering over the battlefield, waiting for Russian targets to appear. They can also report back on what they are seeing before attacking.

    Is it a game-changer? It’s certainly significant. It opens up options like tracking vehicles so Ukraine can find the supply depots, for example, and then destroy them with bigger weapons.

  5. Don’t understand why AI needs to be highlighted, when terrain imaging and matching is old tech in cruise missiles. Something like a Taurus/Storm Shadow can also fly with jammed GPS by identifying specific terrain details (not just contours, but also e.g. a specific building), carries images of its targets, and on the final approach searches for its primary target (and, depending on how it is pre-programmed, can fall back to a secondary target or a designated crash site if it can’t find the primary).

    Because that is basically what this is. Still an impressive achievement, but you don’t need to dress up old tech adapted to new purposes as something new.

  6. I’ve been describing this method for months now. It’s good to see validation of the concept. It does not need advanced processors; it can use old mobile-phone electronics.

    Knowing altitude and camera perspective also allows you to do object and terrain recognition with drastically reduced computational requirements. Maps can be constantly updated by drone footage and include topographical data.

    You would run the view through filters to simplify the image, then run comparisons against simplified reference images. These can be synthetically modified and loaded so that a given target can be recognised from a variety of angles, but matching against the expected image for a known altitude and perspective greatly reduces the computational requirement. Using altitude and perspective, you can also calculate object size, modify the reference image so you can check whether a candidate is roughly the right size, and cross-check it against similarly sized objects in your reference database.

    The processors in older mobile phones are potentially powerful enough to do this.
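The size cross-check described above can be made concrete with a pinhole-camera calculation: knowing altitude and camera tilt gives the slant range, which fixes how many pixels an object of known real-world size should span. The lens and sensor parameters below (8 mm focal length, 3 µm pixels) and the ~7 m tank length are illustrative assumptions; only the 500 m cruising altitude comes from the article.

```python
import math

def expected_pixel_span(object_m: float, altitude_m: float,
                        tilt_deg: float, focal_mm: float,
                        pixel_um: float) -> float:
    """Expected span, in pixels, of an object of known size, using a
    simple pinhole-camera model. Slant range grows as the camera tilts
    away from straight down (nadir), shrinking the apparent size."""
    slant_m = altitude_m / math.cos(math.radians(tilt_deg))
    focal_px = focal_mm * 1000.0 / pixel_um  # focal length in pixel units
    return object_m * focal_px / slant_m

# Illustrative numbers: a ~7 m tank from 500 m up, assumed 8 mm lens,
# 3 um pixels. Looking straight down vs. tilted 60 degrees off nadir:
print(expected_pixel_span(7, 500, 0, 8, 3))   # ~37.3 px at nadir
print(expected_pixel_span(7, 500, 60, 8, 3))  # ~18.7 px (slant range doubles)
```

A candidate detection whose pixel span is far from this expected value can be rejected cheaply, before any heavier matching runs, which is one way the known-altitude constraint cuts computation.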
