<?xml version="1.0" encoding="utf-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:media="http://search.yahoo.com/mrss/"><channel><title>IEEE Spectrum</title><link>https://spectrum.ieee.org/</link><description>IEEE Spectrum</description><atom:link href="https://spectrum.ieee.org/feeds/topic/robotics.rss" rel="self"></atom:link><language>en-us</language><lastBuildDate>Tue, 14 Apr 2026 19:46:32 -0000</lastBuildDate><image><url>https://spectrum.ieee.org/media-library/eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpbWFnZSI6Imh0dHBzOi8vYXNzZXRzLnJibC5tcy8yNjg4NDUyMC9vcmlnaW4ucG5nIiwiZXhwaXJlc19hdCI6MTgyNjE0MzQzOX0.N7fHdky-KEYicEarB5Y-YGrry7baoW61oxUszI23GV4/image.png?width=210</url><link>https://spectrum.ieee.org/</link><title>IEEE Spectrum</title></image><item><title>​Boston Dynamics and Google DeepMind Teach Spot to Reason​</title><link>https://spectrum.ieee.org/boston-dynamics-spot-google-deepmind</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/photo-of-yellow-boston-dynamics-robot-dog-using-its-arm-to-load-laundry-into-a-white-basket.png?id=65521323&width=2000&height=1500&coordinates=240%2C0%2C240%2C0"/><br/><br/><p><span><strong></strong><strong></strong>The amazing and frustrating thing about robots is that they can do almost anything you want them to do, as long as you know how to ask properly. In the not-so-distant past, asking properly meant writing code, and while we’ve thankfully moved beyond that brittle constraint, there’s still an irritatingly inverse correlation between ease of use and complexity of task. </span></p><p><span>AI has promised to change that. The idea is that when AI is embodied within robots—giving AI software a physical presence in the world—those robots will be imbued with with reasoning and understanding. This is cutting-edge stuff, though, and while we’ve seen plenty of examples of embodied AI in a research context, finding applications where reasoning robots can provide reliable commercial value has not been easy. <a href="https://bostondynamics.com/" target="_blank">Boston Dynamics</a> is one of the few companies to commercially deploy legged robots at any appreciable scale; there are now several thousand hard at work. Today the company is <a href="https://bostondynamics.com/blog/tools-for-your-to-do-list-with-spot-and-gemini-robotics/" target="_blank">announcing</a> that its quadruped robot <a href="https://spectrum.ieee.org/tag/spot-robot" target="_self">Spot</a> is now equipped with <a href="https://deepmind.google/blog/gemini-robotics-er-1-6/">Google DeepMind’s Gemini Robotics-ER 1.6</a>, a <a href="https://spectrum.ieee.org/gemini-robotics" target="_blank">high-level embodied reasoning model</a> that brings usability and intelligence to complex tasks.</span></p><p class="shortcode-media shortcode-media-youtube"> <span class="rm-shortcode" data-rm-shortcode-id="155eddc016bd1bedcfb5b83c4b4a54c3" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/LP4-c5AK30g?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">YouTube.com</small></p><p><span>Although this video shows Spot in a home context, the focus of this partnership is on one of the very few applications where legged robots have proven themselves to be commercially viable: inspection. That is, wandering around industrial facilities, checking to make sure that nothing is imminently exploding. With the new AI onboard, Spot is now able to autonomously look for dangerous debris or spills, read complex gauges and sight glasses, and call on tools like vision-language-action models when it needs help understanding what’s going on in the environment around it.</span></p><p>“Advances like Gemini Robotics ER 1.6 mark an important step toward robots that can better understand and operate in the physical world,” <a href="https://www.linkedin.com/in/marco-da-silva-447b72/" target="_blank">Marco da Silva</a>, Vice President and General Manager of Spot at Boston Dynamics, says <a href="https://bostondynamics.com/blog/aivi-learning-now-powered-google-gemini-robotics/" target="_blank">in a press release</a>. 
“Capabilities like instrument reading and more reliable task reasoning will enable Spot to see, understand, and react to real-world challenges completely autonomously.”</p><h2>Understanding Robot Understanding</h2><p>The words “reasoning” and “understanding” are increasingly being applied to AI and robotics, but as <a href="https://spectrum.ieee.org/humanoid-robots-gill-pratt-darpa" target="_self">Toyota Research Institute’s Gill Pratt recently pointed out</a>, what those words actually <em>mean</em> for robots in practice isn’t always clear. “The benchmark we measure ourselves against when it comes to understanding is that the system should answer the way a human would,” <a href="https://www.linkedin.com/in/carolinaparada/" target="_blank">Carolina Parada</a>, Head of Robotics at Google DeepMind, explained in an interview. For robots to reliably and safely perform tasks, this connection between how robots understand the world and how humans do is critical. Otherwise, there may be a disconnect between the instructions that a human gives a robot and how the robot decides to carry out that task.</p><p>Boston Dynamics’ video above is a potentially messy example of this. One of the instructions to Spot was to “recycle any cans in the living room.” Spot has no problem completing the task, as the video shows, but in doing so it grips the can sideways, which is not going to end well for cans that have leftover liquid in them. We humans would avoid this because we can draw on a lifetime of experience to know how cans should be held, but robots don’t (yet) have that kind of world knowledge.</p><p>Parada says that Gemini Robotics-ER 1.6 approaches situations like this from a safety perspective. “If you ask the robot to bring you a cup of water, it will reason not to place it on the edge of a table where it could fall. We track this using our <a href="https://asimov-benchmark.github.io/v1/" target="_blank">ASIMOV benchmark</a>, which includes a whole lot of natural language examples of things the robot should not do.” The current version of Spot doesn’t use these semantic safety models for manipulation, but the plan is to make future versions reason about holding objects in ways that are safe.</p><p class="shortcode-media shortcode-media-youtube" style="background-color: rgb(255, 255, 255);"><span class="rm-shortcode" data-rm-shortcode-id="5934a9a019325c2e996f3f0dab47b3c4" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/kBwxmlI2yHQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">YouTube.com</small></p><p><span>There does still seem to be a disconnect between Gemini Robotics-ER 1.6 as a high-level reasoning model for a robot and the robot itself as an interface with the physical world. One of the new features of 1.6 is </span><em>success detection</em><span>, which combines multiple camera angles to tell more reliably when Spot has successfully grasped an object. This is great if you’re relying entirely on vision for your object interaction, but robots have all kinds of other well-established ways to detect a successful grasp, including touch sensors and force sensors, that 1.6 is not using. 
The reason speaks to a fundamental problem that the robotics field is still trying to figure out: how to train models when you need physical data.</span></p><p><span>“At the moment, these models are strictly vision only,” Parada explains. “There is lots of [visual] information on the web about how to pick up a pen. If we had enough data with touch information, we could easily learn it, but there is not a lot of data with touch sensing on the internet.” Some of that data will come from Spot itself: customers who use these new inspection capabilities will be required to share their data with Boston Dynamics.</span></p><h2>Real-World Robots That Are Useful</h2><p>The fact that Boston Dynamics <em>has</em> customers makes it something of an anomaly when it comes to legged robots that rely on AI in commercial deployments. And those customers will have to be able to trust the robot—<a href="https://spectrum.ieee.org/ai-hallucination" target="_self">always a problem when AI is involved</a>. “We take this very seriously,” da Silva said in an interview. “We roll out new DeepMind capabilities through beta programs to a smaller set of customers to understand what to anticipate, and we only actively advertise features we are confident will work.” There’s a threshold of usefulness that robots like Spot need to reach, and fortunately, the real world doesn’t demand perfection. “Most critical infrastructure in a facility will be instrumented to tell you whether something is wrong,” da Silva says. “But there is a lot of stuff that is not instrumented that can still cause a problem if you aren’t paying attention to it. We’ve found that somewhere north of 80 percent is the threshold where it’s not annoying. Below that, basically the robot is crying wolf, and the operators will start ignoring it.”</p><p><span>Both da Silva and Parada agree that there’s still plenty of room for improvement in robotic inspection. As Parada points out, Spot’s rarefied status as a scalable commercial platform provides a valuable opportunity to learn how models like Gemini Robotics-ER 1.6 can be the most useful, and then apply that knowledge to other embodied AI platforms, including </span><a href="https://spectrum.ieee.org/boston-dynamics-atlas-scott-kuindersma" target="_self">Boston Dynamics’ Atlas</a><span>. Does that mean that Atlas is going to be the next industrial inspection robot? Probably not. But if this real-world experience can get us closer to safe and reliable robots that can pick up laundry, take a dog for a walk, and clear away soda cans without making a mess, that’s something we can all get excited about.</span></p>]]></description><pubDate>Tue, 14 Apr 2026 19:45:01 +0000</pubDate><guid>https://spectrum.ieee.org/boston-dynamics-spot-google-deepmind</guid><category>Boston-dynamics</category><category>Spot-robot</category><category>Google-deepmind</category><category>Inspection-robots</category><category>Quadruped-robots</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/photo-of-yellow-boston-dynamics-robot-dog-using-its-arm-to-load-laundry-into-a-white-basket.png?id=65521323&amp;width=980"></media:content></item><item><title>Video Friday: This Floor Lamp Will Do Your Chores</title><link>https://spectrum.ieee.org/video-friday-robot-lamp</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/robot-arms-hold-up-a-white-t-shirt-in-a-warm-wood-paneled-bedroom.png?id=65502238&width=2000&height=1500&coordinates=250%2C0%2C250%2C0"/><br/><br/><p><span>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at </span><em>IEEE Spectrum</em><span> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please </span><a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a><span> for inclusion.</span></p><h5><a href="https://2026.ieee-icra.org/">ICRA 2026</a>: 1–5 June 2026, VIENNA</h5><h5><a href="https://roboticsconference.org/">RSS 2026</a>: 13–17 July 2026, SYDNEY</h5><h5><a href="https://mrs.fel.cvut.cz/summer-school-2026/">Summer School on Multi-Robot Systems</a>: 29 July–4 August 2026, PRAGUE</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote class="rm-anchors" id="ahbf2xka9no"><em>Lume is a sculptural floor lamp designed to feel at home the moment you place it. It’s crafted from anodized aluminum and high-gloss finishes, shaped into a slender, balanced form that quietly conceals its complexity. Every surface is refined to feel smooth, precise, and enduring. When it moves, it’s quiet and deliberate. When it’s still, it holds its place with ease.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d40d3980d838fa0341599d8d391d1516" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/ahBF2XkA9No?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>Apparently, and let me stress that “apparently,” Lume can make the bed, <a data-linked-post="2674304335" href="https://spectrum.ieee.org/robots-folding-clothes" target="_blank">fold laundry</a>, and do other chores involving soft materials. I’m intensely skeptical because it feels like that video has more footage of people staring out of windows and dancing for no reason beyond the robot actually doing anything. And when you do see the robot working at a task, it’s cut up into lots of different pieces of footage in a way that is typically used to distract from either plodding speed, frequent failures, or both. So, yeah. There may be a lot to like about the philosophy here, but even at a suspiciously cheap US $2,500 for a pair of these robots, more detail is certainly called for before they’ve earned your preorder.</p><p>[ <a href="https://syncere.com/">Syncere</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="_p6qoe8zgw0"><em>In Science Robotics, researchers from MIT Media Lab and collaborators from Politecnico di Bari present Electrofluidic Fiber Muscles, a new class of artificial muscle fibers for robots and wearables. Unlike the rigid servo motors used in most robots, these fiber-shaped muscles are soft and flexible. They combine electrohydrodynamic (EHD) fiber pumps—slender tubes that move liquid using electric fields to generate pressure with no moving parts—with fluidic fiber actuators. 
The muscles are driven by electric fields and operate silently, with no external pumps or reservoirs.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="6a2a7924462095e48590bf2423837ee1" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/_P6QoE8zGw0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://news.mit.edu/2026/new-type-electrically-driven-artificial-muscle-fiber-0409">MIT</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="akehvalnlb4">We first saw this thing at <a data-linked-post="2669267948" href="https://spectrum.ieee.org/epfl-lasa" target="_blank">ICRA@40</a> a few years ago, but the paper is out now.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="c4b27b74c847c4b2739bc9d300669b05" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/AKEHvalnLb4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.nature.com/articles/s41467-025-67675-8">Nature Communications</a> ] via [ <a href="https://www.epfl.ch/labs/lasa/">LASA</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="z-p9qizwezu">I do like tea, and I suppose there could be worse applications for a robot than this one, since it leverages both payload and complex terrain mobility.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="126462e5a4e20dca2ce29e218ab9a574" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/z-P9qiZwEzU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.deeprobotics.cn/en">DEEP Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="2-p6yzprhg0"><em>We’ve created GEN-1, our latest milestone in scaling robot learning. We believe it to be the first general-purpose AI model that crosses a new performance threshold: mastery of simple physical tasks. It improves average success rates to 99 percent on tasks where previous models achieve 64 percent, completes tasks roughly 3x faster than state-of-the-art, and requires only one hour of robot data for each of these results. 
GEN-1 unlocks commercial viability across a broad range of applications—and while it cannot solve all tasks today, it is a significant step toward our mission of creating generalist intelligence for the physical world.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="7196b1368642643983789adf118058e9" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/2-P6YZPrHg0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://generalistai.com/blog/apr-02-2026-GEN-1">Generalist</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="g3uo2vawg64"><em>Legged manipulators offer high mobility and versatile manipulation. However, robust interaction with heterogeneous articulated objects, such as doors, drawers, and cabinets, remains challenging because of the diverse articulation types of the objects and the complex dynamics of the legged robot. In this paper, we propose a robust and sample-efficient framework for opening heterogeneous articulated objects with a legged manipulator.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="4c8fd46f3a6762d3ce21fe34b5cad8dd" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/g3Uo2vAWG64?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://openheart-icra.github.io/OpenHEART/">OpenHEART</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="obr6dncstbm"><em>By deeply coupling real-time depth perception with reinforcement learning motion control, Adam achieves natural humanlike stair-stepping gait, showing outstanding dynamic stability and environmental adaptability.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="298b7a30f4935bda98c7c8902b731b97" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/OBR6DncstbM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.pndbotics.com/">PNDbotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="kswyn7ihhbu">The way these robots deliver packages will never not be amusing to me.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="f77e1a0210ec4d02f17ed729e7abb860" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/kSwYN7IhHbU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.deeprobotics.cn/en">DEEP Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="spw8rm0a6gw"><em>Tether performs autonomous real-world functional play involving structured, task-directed interactions. 
We introduce a policy that performs trajectory warping anchored by keypoint correspondences, which is extremely data-efficient and robust to significant spatial and semantic environment variation. Running the policy within a VLM-guided multitask loop, we generate a stream of play data that consistently improves downstream policy learning over time.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="1e1d2556d5297447f840842f6e9920d6" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/SPW8RM0a6gw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://tether-research.github.io/">Tether</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="xvurnpqryok"><em>What happens when your walls begin to move? This paper explores the design of human-robot interaction for architectural-scale, shape-changing environments.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d1689bf5fae640e1c1ded34adde8daa7" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/xvUrNpQRYok?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://interactive-structures.org/publications/2026-04-fluent-interaction-cyber-physical-architecture/">Interactive Structures Lab</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="j_suvhilbx4">I will admit to being somewhat disappointed about the reality of the Unreal Robotics Lab.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="46f9283ce1c88214401e69b13fcb1d4a" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/J_suVHiLBX4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://github.com/URLab-Sim/UnrealRoboticsLab">URLab</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="9ecw7io-md4"><em>We’re not done yet! Illinois is back in the Final Four for the first time since 2005, and we’re cheering all the way to the championship. This video features teleoperated G1 and AI Worker robots.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d425261c79a348a97240984e9d86ada8" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/9eCw7io-MD4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://publish.illinois.edu/kimlab2020/">KIMLAB</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="iy2rzubvhhq"><em>Fighting robots are cool. Destroying expensive electronics while fighting robots is not cool. 
We make robots out of plastic so our electronics survive.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="aa6096b1552bcaaffab7ae289d7974cb" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/iy2RzuBVHhQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://weaponizedplastic.com/">Weaponized Plastic Fighting League</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 10 Apr 2026 17:00:01 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-robot-lamp</guid><category>Home-robots</category><category>Video-friday</category><category>Artificial-muscle</category><category>Agricultural-robots</category><category>Robot-ai</category><category>Quadruped-robots</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/robot-arms-hold-up-a-white-t-shirt-in-a-warm-wood-paneled-bedroom.png?id=65502238&amp;width=980"></media:content></item><item><title>GoZTASP: A Zero-Trust Platform for Governing Autonomous Systems at Mission Scale</title><link>https://content.knowledgehub.wiley.com/goztasp-a-zero-trust-platform-for-governing-autonomous-systems-at-mission-scale/</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/technology-innovation-institute-logo-with-stylized-tii-and-curved-line.png?id=65498963&width=980"/><br/><br/><p>ZTASP is a mission-scale assurance and governance platform designed for autonomous systems operating in real-world environments. It integrates heterogeneous systems—including drones, robots, sensors, and human operators—into a unified zero-trust architecture. Through Secure Runtime Assurance (SRTA) and Secure Spatio-Temporal Reasoning (SSTR), ZTASP continuously verifies system integrity, enforces safety constraints, and enables resilient operation even under degraded conditions.</p><p>ZTASP has progressed beyond conceptual design, with operational validation at Technology Readiness Level (TRL) 7 in mission critical environments. Core components, including Saluki secure flight controllers, have reached TRL8 and are deployed in customer systems. While initially developed for high-consequence mission environments, the same assurance challenges are increasingly present across domains such as healthcare, transportation, and critical infrastructure.</p><p><span><a href="https://content.knowledgehub.wiley.com/goztasp-a-zero-trust-platform-for-governing-autonomous-systems-at-mission-scale/" target="_blank">Download this free whitepaper now!</a></span></p>]]></description><pubDate>Thu, 09 Apr 2026 15:06:39 +0000</pubDate><guid>https://content.knowledgehub.wiley.com/goztasp-a-zero-trust-platform-for-governing-autonomous-systems-at-mission-scale/</guid><category>Autonomous-systems</category><category>Drones</category><category>Sensors</category><category>Transportation</category><category>Type-whitepaper</category><dc:creator>Technology Innovation Institute</dc:creator><media:content medium="image" type="image/png" url="https://assets.rbl.ms/65498963/origin.png"></media:content></item><item><title>What Happened When We Set Up a Robotics Lab in a Mall</title><link>https://spectrum.ieee.org/boston-dynamics-spot-interaction</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/families-watch-a-colorful-robotic-dog-demo-at-a-robotics-and-ai-institute-exhibit.jpg?id=65453180&width=2000&height=1500&coordinates=166%2C0%2C167%2C0"/><br/><br/><p>Building the next generation of robots for successful integration into our homes, offices, and factories is more than just solving the hardware and software problems—we also need to understand how they will be perceived and how they can work effectively with people in those spaces.</p> <p class="shortcode-media shortcode-media-rebelmouse-image rm-float-left rm-resized-container rm-resized-container-25" data-rm-resized-container="25%" style="float: left;"> <a href="https://rai-inst.com/"></a><a class="shortcode-media-lightbox__toggle shortcode-media-controls__button material-icons" title="Select for lightbox">aspect_ratio</a><a href="https://rai-inst.com/" target="_blank"><img alt="Robotics and AI Institute logo with text about post originally appearing there" class="rm-shortcode" data-rm-shortcode-id="09961581414b810cff45f77932185cb3" data-rm-shortcode-name="rebelmouse-image" id="89ff0" loading="lazy" src="https://spectrum.ieee.org/media-library/robotics-and-ai-institute-logo-with-text-about-post-originally-appearing-there.png?id=65453513&width=980"/></a> </p><p>In summer 2025, <a href="https://spectrum.ieee.org/boston-dynamics-ai-institute-hyundai" target="_blank">RAI Institute</a> set up a free pop-up robot experience in the CambridgeSide mall, designed to let people experience state-of-the-art robotics first hand. While news stories about robots and AI are common, with some being overly critical and some overly optimistic, most people have not encountered robots in the flesh (or metal) as it were. With no direct experience, their opinions are largely shaped by pop culture and social media, both of which are more focused on sensational stories instead of accurate information about how the robots might be used effectively and where the technology still falls short. Our goal with the pop-up was twofold: first, to give people an opportunity to see robots that they would otherwise not have a chance to experience; and second, to better understand how the public feels about interacting with these robots.</p><h2>Designing a Robot Experience for the General Public</h2><p class="shortcode-media shortcode-media-rebelmouse-image rm-float-left rm-resized-container rm-resized-container-25" data-rm-resized-container="25%" style="float: left;"> <img alt="Three experimental robotic prototypes displayed behind barriers in a bright gallery." class="rm-shortcode" data-rm-shortcode-id="a1fe59976ca74226f29b65137649c4d4" data-rm-shortcode-name="rebelmouse-image" id="c9163" loading="lazy" src="https://spectrum.ieee.org/media-library/three-experimental-robotic-prototypes-displayed-behind-barriers-in-a-bright-gallery.webp?id=65453673&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">Some earlier version legged robots, built by the RAI Institute’s Executive Director, Marc Raibert</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">RAI Institute</small></p><p class="shortcode-media shortcode-media-rebelmouse-image rm-float-left rm-resized-container rm-resized-container-25" data-rm-resized-container="25%" style="float: left;"> <img alt="Red robot dog and electric bike displayed in glass cases inside a modern mall." 
class="rm-shortcode" data-rm-shortcode-id="f0c655444535aac7e11e20510c8bbbae" data-rm-shortcode-name="rebelmouse-image" id="6b96a" loading="lazy" src="https://spectrum.ieee.org/media-library/red-robot-dog-and-electric-bike-displayed-in-glass-cases-inside-a-modern-mall.webp?id=65453671&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">The ANYmal by ANYrobotics (left) and a previous model of the RAI Institute’s UMV (right)</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">RAI Institute</small></p><p><span>The pop-up space had two areas: a museum area where people could see historical and modern robots, including some <a href="https://spectrum.ieee.org/marc-raibert-boston-dynamics-instutute" target="_blank">RAI Institute</a> builds like the </span><a href="https://rai-inst.com/resources/blog/designing-wheeled-robotic-systems/" target="_blank">UMV</a>,<span> and an interactive experience called “Drive-a-Spot.” This area was a driving arena where anyone who came by could take the controls of a Spot quadruped, one of the more recognizable, commercially available robots today.</span></p><p>The guest robot drivers used a custom controller built on an adaptive video game controller that was designed so that anyone of any age could use it. It featured basic controls: move forward, back, left, right, adjust height, sit, stand, and tilt. The buttons were large so that tiny or elderly hands could use the controller, and the people who drove Spot ranged in age from 2 to over 90.<br/></p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Adaptive gaming controller with large programmable buttons on a black table." class="rm-shortcode" data-rm-shortcode-id="d191483045e332282c7d73dac0962f80" data-rm-shortcode-name="rebelmouse-image" id="2545f" loading="lazy" src="https://spectrum.ieee.org/media-library/adaptive-gaming-controller-with-large-programmable-buttons-on-a-black-table.jpg?id=65453210&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">The guest robot drivers used a custom controller built on an adaptive video game controller that was designed so that anyone of any age could use it.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">RAI Institute</small></p><p>The demo area was designed to be a bit challenging for the Spot robot to maneuver in—it contained tight passages, low obstacles to step over, a barrier to crouch under, and taller objects the robot had to avoid. Much to the surprise of many of our guests, Spot is able to autonomously adjust itself to traverse and avoid those obstacles when being supervised by the joystick.<br/></p><p class="shortcode-media shortcode-media-youtube"> <span class="rm-shortcode" data-rm-shortcode-id="1c2dcee3b7a437fc3f967b9095f81e91" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/dPjUkJGC5Xg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span> <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">RAI Institute</small> </p><p><span>The driving arena’s theme rotated every few weeks across four scenarios: a factory, a home, a hospital, and an outdoor/disaster environment. 
These were chosen to contrast settings where robots are broadly accepted (industrial, emergency response) with settings where public ambivalence is well documented (domestic, healthcare).</span></p><p>The visitors who chose to drive the Spot robot could also participate in a short survey before and after their driving experience. The survey focused on two core dimensions:</p><ul><li><strong>Comfort: How comfortable would you feel if you encountered a robot in a factory, home, hospital, office, or outdoor/disaster scenario?</strong></li><li><strong>Suitability: How well would this robot work in each of those contexts?</strong> </li></ul><p>The survey also recorded emotional reactions immediately after driving, likelihood to recommend the experience, and open-ended responses about what participants found memorable or surprising. The researchers were careful to separate the environment participants drove through from the scenarios they were asked to evaluate in the survey. This distinction is important for interpreting the results given below.</p><h2>Did Interacting With the Robot Change People’s Feelings About Robots?</h2><p><span>Of the approximately 10,000 guests who visited the Robot Lab, 10 percent drove Spot and opted in to our surveys. Of those surveyed, more than 65 percent had seen images or videos of Spot robots online, but most had never seen one of the robots in person.</span></p><h3>Increased Comfort Through Experience</h3><p>Across all five contexts presented in the survey (factory, home, hospital, office, and outdoor/disaster scenarios), comfort scores increased significantly after the driving session. The effects were small to moderate in magnitude, but they were consistent and statistically robust after correcting for multiple comparisons, across all participants from children to older adults.</p><p>The largest gain appeared in the outdoor/disaster context, which started with low comfort despite high perceived suitability. People already thought Spot would be useful in search-and-rescue scenarios; they just weren’t comfortable with it in that role. This discomfort may stem from media portrayals of quadruped robots in military contexts. A few minutes of hands-on control appears to partially dissolve that apprehension.</p><p>Participants who drove through the factory-themed arena showed no significant increase in comfort, but this scenario already had the highest rating of any context at baseline, leaving little room for improvement.</p><p>No matter their previous experience, most people were neutral about having a Spot robot in their home before their driving experience. However, after the experience of controlling the Spot robot, people had a statistically significant increase in their comfort with having a Spot in their home and also felt that a Spot robot was more suitable for work in any environment, not just the one they had driven it in.</p><h3>Better Understanding of Where Robots Can Fit Into Daily Life</h3><p>Perceived suitability for Spot to operate in each context also increased. However, the pattern in the data is different. The largest gains weren’t in the high-baseline industrial and outdoor contexts. 
They were in home, office, and hospital—the very environments where people started out most skeptical.</p><p>Participants who drove the Spot robot in a home-themed environment didn’t just consider homes more suitable for robots; they also rated hospitals and offices as more suitable. This result suggests that hands-on control alters something more fundamental than just context-specific familiarity. It may change a person’s underlying understanding of a robot’s capabilities and, consequently, where they believe robots are appropriate.</p><h3>Results by Demographic</h3><p>The hands-on experience seems to be similarly effective across genders, although it does not completely eliminate existing disparities. For example, men reported higher baseline comfort than women across all five contexts. However, all genders improved at similar rates after interaction. The gap didn’t significantly widen or close in most contexts, though it did narrow in factory and office settings.</p><p>Age effects were more context dependent. Children (aged 8–17) rated factory environments as less comfortable and less suitable before the study. However, this could be because most children do not have experience with factory settings or industrial environments. After interaction, this gap largely persisted. By contrast, children showed stronger gains in office comfort than older adults and entered the study rating home contexts more favorably than adults did.</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Stacked bar chart of survey participants by age group and gender categories." class="rm-shortcode" data-rm-shortcode-id="91a6e3f855ba0152f034182d4710df9d" data-rm-shortcode-name="rebelmouse-image" id="313e6" loading="lazy" src="https://spectrum.ieee.org/media-library/stacked-bar-chart-of-survey-participants-by-age-group-and-gender-categories.jpg?id=65453246&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">Participants ranged from age 8 to over age 75.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">RAI Institute</small></p><p><span>Participants who had previously driven Spot (mainly robotics professionals) began with higher comfort across the board. But after the hands-on session, people with no prior exposure caught up to experienced drivers. This level of familiarity would be difficult to replicate with images and videos alone.</span></p><h3>Post-Interaction Results</h3><p>Post-interaction emotional data was overwhelmingly positive. “Excitement” was reported by 74 percent of participants, “happiness” by 50 percent, and only 12 percent reported “nervousness.” Over 55 percent rated the experience as “brilliant,” and 62 percent said they were very likely to recommend it to a friend.</p><p>The open-ended responses added a lot more color. The most commonly mentioned moments were locomotion and terrain adaptation (22 percent): the way Spot navigated steps, tight spaces, and uneven ground. Expressive tilt movements were mentioned just as often (22 percent), and people found them surprisingly doglike or dancelike. A smaller set of responses (3 percent) described anthropomorphic reactions: worrying about “hurting” the robot or finding its behavior “silly” in a way that prompted genuine emotional response.</p><p>When asked what tasks they’d want a robot to perform, responses shifted meaningfully. Before driving, answers clustered around domestic assistance and heavy or hazardous labor. 
After driving, domestic help remained prominent, but entertainment and play jumped from 7.5 percent to 19.4 percent. Companionship also appeared at 5 percent. References to hazardous or industrial tasks declined as people who had operated the robot began imagining it as a companion and playmate, not just a labor-replacement tool.</p><h2>Key Takeaways from the Robot Lab</h2><p>In the not-so-distant future, robots will become more common in public and private spaces. But whether that integration into daily life will be accepted by the general public remains to be seen. The standard approach to building acceptance has been passive exposure such as videos, exhibits, and articles. This study suggests that giving people agency and letting them actually operate a robot is a qualitatively different intervention.</p><p>Short, well-designed, hands-on encounters can raise comfort in precisely the social domains where ambivalence is highest and where future robotics deployment will likely take place. This hands-on experience shouldn’t be limited to tech conferences and museums; it may be more than just entertainment.</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Children control a robot car at a tech booth with staff and jungle-themed backdrop" class="rm-shortcode" data-rm-shortcode-id="561f653ae87e1468c7ac31ac92d0fe00" data-rm-shortcode-name="rebelmouse-image" id="a32d5" loading="lazy" src="https://spectrum.ieee.org/media-library/children-control-a-robot-car-at-a-tech-booth-with-staff-and-jungle-themed-backdrop.jpg?id=65453264&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">Fun for all ages!</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">RAI Institute</small></p><p><span>We consider the pop-up a success, but as with all experiments, we also learned a lot along the way. Beyond the increased comfort with robots, we found that visitors to our space really enjoyed talking to the robotics experts who staffed the location. For many people, the opportunity to talk to a roboticist was as unique as the opportunity to drive a robot, and in the future, we are excited to continue to share our technical work as well as the experiences of our humans, in addition to our humanoids.</span></p><p>Does building a space where folks can experience robots firsthand have the potential to create meaningful, long-term attitude shifts? That remains an open question. 
But the effect’s direction and consistency across different situations, ages, and genders are hard to ignore.</p><div class="horizontal-rule"></div><p><a href="https://rai-inst.com/wp-content/uploads/2026/03/HRI26-Pop-Up_Encounters_with_Spot.pdf" target="_blank">Pop-Up Encounters With Spot: Shaping Public Perceptions of Robots Through Hands-On Experience</a>, by Hae Won Park, Georgia Van de Zande, Xiajie Zhang, Dawn Wendell, and Jessica Hodgins from the RAI Institute and the MIT Media Lab, was presented last month at the <a href="https://humanrobotinteraction.org/2026/" target="_blank">2026 ACM/IEEE International Conference on Human-Robot Interaction</a> in Edinburgh, Scotland.</p>]]></description><pubDate>Sun, 05 Apr 2026 13:00:01 +0000</pubDate><guid>https://spectrum.ieee.org/boston-dynamics-spot-interaction</guid><category>Boston-dynamics</category><category>Legged-robots</category><category>Spot-robot</category><dc:creator>Dawn Wendell</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/families-watch-a-colorful-robotic-dog-demo-at-a-robotics-and-ai-institute-exhibit.jpg?id=65453180&amp;width=980"></media:content></item><item><title>Video Friday: Digit Learns to Dance—Virtually Overnight</title><link>https://spectrum.ieee.org/video-humanoid-dancing</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/bipedal-teal-robot-practices-side-to-side-dance-move-with-arm-movement.gif?id=65460048&width=2000&height=1500&coordinates=67%2C0%2C68%2C0"/><br/><br/><p><span>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at </span><em>IEEE Spectrum</em><span> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please </span><a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a><span> for inclusion.</span></p><h5><a href="https://2026.ieee-icra.org/">ICRA 2026</a>: 1–5 June 2026, VIENNA</h5><h5><a href="https://roboticsconference.org/">RSS 2026</a>: 13–17 July 2026, SYDNEY</h5><h5><a href="https://mrs.fel.cvut.cz/summer-school-2026/">Summer School on Multi-Robot Systems</a>: 29 July–4 August 2026, PRAGUE</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote class="rm-anchors" id="pc-n6aciusu"><em>Getting Digit to dance takes more than putting on some fancy shoes—our AI Team can teach Digit new whole-body control capabilities overnight. Using raw motion data from mocap, animation, and teleop methods, Digit gets new skills through sim-to-real reinforcement training.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="4477bcbaf1f5072afe88c2c0015eebd1" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Pc-n6ACIuSU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.agilityrobotics.com/">Agility</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="sy2xyrmv44y"><em>We’ve created GEN-1, our latest milestone in scaling robot learning. We believe it to be the first general-purpose AI model that crosses a new performance threshold: mastery of simple physical tasks. It improves average success rates to 99% on tasks where previous models achieve 64%, completes tasks roughly 3x faster than state of the art, and requires only 1 hour of robot data for each of these results. GEN-1 unlocks commercial viability across a broad range of applications—and while it cannot solve all tasks today, it is a significant step towards our mission of creating generalist intelligence for the physical world.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="bbbeecb0e15f3b78f50b3ebf230ecf33" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/SY2xyrmV44Y?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://generalistai.com/blog/apr-02-2026-GEN-1">Generalist</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="pn_bj5-qyw8"><em>Unitree open-sources UnifoLM-WBT-Dataset—high-quality real-world humanoid robot <a data-linked-post="2650273084" href="https://spectrum.ieee.org/mit-humanoid-robot-teleoperation-dynamic-tasks" target="_blank">whole-body teleoperation</a> (WBT) dataset for open environments. 
Publicly available since March 5, 2026, the dataset will continue to receive high-frequency rolling updates. It aims to establish the most comprehensive real-world humanoid robot dataset in terms of scenario coverage, task complexity, and manipulation diversity.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="bd19da6e3dfeb2ede20007b534d1b9a6" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/pN_bj5-QyW8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://huggingface.co/collections/unitreerobotics/unifolm-wbt-dataset">Hugging Face</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="79mr-_-a9js"><em>Autonomous mobile robots operating in human-shared indoor environments often require paths that reflect human spatial intentions, such as avoiding interference with pedestrian flow or maintaining comfortable clearance. This paper presents MRReP, a Mixed Reality-based interface that enables users to draw a Hand-drawn Reference Path (HRP) directly on the physical floor using hand gestures.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="783457e452248043a5ec6e2898ae5289" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/79mR-_-a9js?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://mertcookimg.github.io/mrrep/">MRReP</a> ]</p><p>Thanks, Masato!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="97qialc5hnm"><em>Eye contact, even momentarily between strangers, plays a pivotal role in fostering human connection, promoting happiness, and enhancing belonging. Through autonomous navigation and adaptive mirror control, Mirrorbot facilitates serendipitous, nonverbal interactions by dynamically transitioning reflections from self-focused to mutual recognition, sparking eye contact, shared awareness, and playful engagement.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="232f93e3a45a2e11d81366bb7ed95286" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/97qIaLC5hNM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://arl.human.cornell.edu/research-MirrorBot.html">ARL</a> ] via [ <a href="https://news.cornell.edu/stories/2026/04/mirrorbot-fostering-human-connection">Cornell University</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="jya06ffonyg"><em>Experience PAL Robotics’ new teleoperation system for TIAGo Pro, the AI-ready mobile manipulator designed for advanced research. 
This real-time VR teleoperation setup allows precise control of TIAGo Pro’s dual arms in Cartesian space, ideal for remote manipulation, AI data collection, and robot learning.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="86699af54f2bfd064590b0cd59aa3f8c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/jya06FFONyg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://pal-robotics.com/robot/tiago-pro/">PAL Robotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="t52sq8gk5ks">Utter brilliance from Robust AI. No notes.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="71e7d47e220a5b61b914c1491f1df3dc" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/T52SQ8Gk5Ks?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.robust.ai/">Robust AI</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="w8lqu8dkvp4"><em>Come along with our Senior Test Engineer, Nick L., as he takes us on a tour of the <a data-linked-post="2650277831" href="https://spectrum.ieee.org/qa-irobot-roomba-i7" target="_blank">Home Test Labs</a> inside the iRobot HQ.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="56a753f2b7e0640f199e35246a22843f" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/W8lQU8dKvP4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.irobot.com/en_US/our-story.html">iRobot</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="gjukjrwjpxg"><em>By automating the final “magic 5%” of production—the precise trimming of swim goggles’ silicone gaskets based on individual face scans—UR cobots allow THEMAGIC5 to deliver affordable, custom-fit goggles, enabling the company to scale from a Kickstarter sensation to selling over 400,000 goggles worldwide.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="76ebeda03bf930b9cd576a8e870f8dad" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/GJukJRWjPxg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.universal-robots.com/case-stories/non-stop-robot-precision-for-7-years-cobots-deliver-the-last-magic-5-in-swim-goggle-production/">Universal Robots</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="x16ht1erjhk"><em>Sanctuary AI has once again demonstrated its industry-leading approach to training dexterous manipulation policies for its advanced hydraulic hands. 
In this video, their proprietary hydraulic hand autonomously manipulates a lettered cube, continuously reorienting it to match a specified goal (displayed in the bottom-left corner of the video).</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ad1d77f7ce4f331c7e74b0b779ff6cae" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/X16Ht1ERjHk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.sanctuary.ai/">Sanctuary AI</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="r3toz2pgppy"><em>China’s Yuxing 3-06 commercial experimental satellite, the first of its kind to be equipped with a flexible robotic arm, has recently completed an in-orbit refueling test and verification of key technologies. The test paves the way for Yuxing 3-06, dubbed a “space refueling station,” to refuel other satellites in orbit, manage space debris, and provide other in-orbit services.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="eaf9d2765bb1e0ebff60f038ccba42fd" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/R3TOZ2PgPPY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://mp.weixin.qq.com/s/1c-9aNwuXv_p-VhojMkwwA">Sanyuan Aerospace</a> ] via [ <a href="https://spacenews.com/chinese-startup-tests-flexible-robotic-arm-in-space-for-on-orbit-servicing/">Space News</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="z4poalprrhe"><em>This is a demonstration of natural walking, whole-body teleoperation, and motion tracking with our custom-built humanoid robot. The control policies are trained using large-scale parallel reinforcement learning (RL). By deploying robust policies learned in a physics simulator onto the real hardware, we achieve dynamic and stable whole-body motions.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="703bacdcb0167fb3aa9bfe36e1da07ac" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/z4POaLPRRhE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://robotics.tokyo/">Tokyo Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="5olcwku7l9u"><em>Faced with aging railway infrastructure, a shrinking workforce and rising construction costs, Japan Railway West asked construction innovator Serendix to replace an old wooden building at its Hatsushima railway station using its 3D printing technology. 
An ABB robot enabled the company to assemble the new building in a single night ready for the first train service the next day.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="031eec5b200f86cdad72129d9a002cfc" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/5olcWkU7l9U?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.abb.com/global/en/news/134689">ABB</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="1k1phiqcfty"><em>Humanoid, SAP, and Martur Fompak team up to test humanoid robots in automotive manufacturing logistics. This joint proof of concept explores how robots can streamline operations, improve efficiency, and shape the future of smart factories.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="cc54aa14687108db3bc231b8cc456fea" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/1K1phiQCftY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://thehumanoid.ai/">Humanoid</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="oqglmefwbt8">This MIT Robotics Seminar is from Dario Floreano at EPFL, on “Avian Inspired Drones.”</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="7013e7fe97df52eb328681b647c9fddc" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/oqglMEFWBt8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://robotics.mit.edu/robotics-seminar/">MIT</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="etk5es0jvm4">This MIT Robotics Seminar is from Ken Goldberg at UC Berkeley: “Good Old-Fashioned Engineering Can Close the 100,000 Year ‘Data Gap’ in Robotics.”</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="710bc514cbab6092dc5f439cf03127c6" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/EtK5es0jVM4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://robotics.mit.edu/robotics-seminar/">MIT</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 03 Apr 2026 16:30:01 +0000</pubDate><guid>https://spectrum.ieee.org/video-humanoid-dancing</guid><category>Humanoid-robots</category><category>Video-friday</category><category>Robot-ai</category><category>Human-robot-interaction</category><category>Teleoperation</category><category>Industrial-robots</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/gif" url="https://spectrum.ieee.org/media-library/bipedal-teal-robot-practices-side-to-side-dance-move-with-arm-movement.gif?id=65460048&amp;width=980"></media:content></item><item><title>Gill Pratt Says Humanoid Robots’ Moment Is Finally 
Here</title><link>https://spectrum.ieee.org/humanoid-robots-gill-pratt-darpa</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/a-smiling-bespectacled-bearded-man-kneels-posed-behind-a-robotic-torso.jpg?id=65446567&width=2000&height=1500&coordinates=0%2C0%2C0%2C0"/><br/><br/><p>In 2012, the U.S. Defense Advanced Research Projects Agency announced the <a href="https://spectrum.ieee.org/darpa-robotics-challenge-here-are-the-official-details" target="_self">DARPA Robotics Challenge</a> (DRC). The multiyear, multimillion-dollar competition for disaster robotics resulted in <a href="https://spectrum.ieee.org/darpa-unveils-atlas-drc-robot" target="_self">Boston Dynamics’ Atlas</a>, some <a href="https://spectrum.ieee.org/darpa-robotics-challenge-amazing-moments-lessons-learned-whats-next" target="_self">absolutely incredible moments</a> from one of the very first generations of useful humanoid robots, and <a href="https://www.youtube.com/watch?v=g0TaYhjpOfo" rel="noopener noreferrer" target="_blank">a blooper video</a> that will live on forever.</p><p><a href="https://www.tri.global/about-us/dr-gill-pratt" rel="noopener noreferrer" target="_blank">Gill Pratt</a>, the architect of the competition, had a very clear understanding of what the DRC was going to do for robotics. “The reason [for the DARPA Robotics Challenge] is actually to push the field forward and make this capability a reality,” <a href="https://spectrum.ieee.org/darpa-robotics-challenge-qa-with-gill-pratt" target="_self">Pratt told <em><em>IEEE Spectrum</em></em> in 2012</a>. At the time, he pointed out that before the <a href="https://spectrum.ieee.org/sand-trap" target="_self">DARPA Grand Challenge</a> in 2004 and the <a href="https://spectrum.ieee.org/autonomous-vehicles-complete-darpa-urban-challenge" target="_self">DARPA Urban Challenge</a> in 2007, driverless cars for complex environments essentially did not exist. He saw the DRC doing the same thing for robotics.</p><p>It’s been about a decade since <a href="https://spectrum.ieee.org/darpa-robotics-challenge-finals-winner" target="_self">the conclusion of the DARPA Robotics Challenge</a>, and many in the industry believe humanoid robots are about to have the transformative moment that Pratt predicted. But as is common in robotics, things tend to be far more difficult than it seems like they should be. <em><em>Spectrum</em></em> checked in with Pratt, now the <a href="https://www.linkedin.com/in/gillpratt/" rel="noopener noreferrer" target="_blank">CEO of the Toyota Research Institute</a> (TRI), to find out what’s holding humanoid robotics back, what he thinks these robots should be doing (or not doing), and how to navigate the humanoid hype bubble. </p><p><strong>What do you think about this robotics moment that we’re in?</strong></p><p><strong>Gill Pratt:</strong> What has changed is actually not about humanoids. Many people have been building research robots in the humanoid form for a long time. What’s different now isn’t the body, but the brain. We have always had this disparity in the robotics field where the mechanisms we were building were incredibly capable, but we didn’t really have the means for making the utility of the robot match that potential. Now we actually do, and that’s because of the AI revolution that has happened over the last few years.</p><p><strong>It’s very tempting to look back 10 years and directly credit the DRC with a lot of what is now happening with commercial humanoids. 
Is there any reason </strong><strong><em>not </em></strong><strong>to do that?</strong></p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="A smiling man poses with his arm around two humanoid robots, one with a shell on, and one with electronics exposed." class="rm-shortcode" data-rm-shortcode-id="7958768fc634cfa9e3071e39840d118e" data-rm-shortcode-name="rebelmouse-image" id="47f73" loading="lazy" src="https://spectrum.ieee.org/media-library/a-smiling-man-poses-with-his-arm-around-two-humanoid-robots-one-with-a-shell-on-and-one-with-electronics-exposed.jpg?id=65446571&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">Gill Pratt poses with an early version of NASA’s Valkyrie DRC robot.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Gill Pratt</small></p><p><strong>Pratt:</strong> No, but I want to be humble about it. The DRC was focused on half autonomy and half teleoperation in real time. There was remote supervision, and then semiautonomy to amplify that supervision to handle tasks in real time while the remote person was telling the robot what to do. That was all before the breakthroughs that have happened in AI recently.</p><p>What has changed now is that we have a way to essentially teach robots what to do, and make them competent in a way that doesn’t require writing code; you can just demonstrate the task to the robot instead. With a sufficient amount of that data and new AI methods, robots can be far more performant than ever before.</p><p><strong>But that data is a bottleneck, right? How do we know what it should consist of, and what a sufficient amount is to get a robot to do something reliably?</strong></p><p><strong>Pratt:</strong> This mirrors exactly the debate going on in large language models [LLMs]. You have certain people who believe that if you take LLMs—which are autoregressive predictors that guess what the next word should be based on past words—and patch them up with a variety of methods to solve their hallucinations, we’ll eventually get to a point where we can trust the AI system. And then there are other people, and I think Yann LeCun is the most well-known of them, who say that’s nonsense, and we need something else. His view, and I agree, is that we need world models. We need some way for the AI system to imagine, try things out, and truly reason.</p><p>And I know that we’re applying words like ‘reason’ to what are essentially pattern-matching systems. Saying that there’s ‘reasoning’ is just a sticker we put on whatever we’ve built; it’s not true reasoning.</p><h2>Data Bottlenecks in Robot Learning</h2><p><strong>This is an example of </strong><a href="https://en.wikipedia.org/wiki/Thinking,_Fast_and_Slow" target="_blank"><strong>“system one” versus “system two”</strong></a><strong> thinking, right?</strong></p><p><strong>Pratt:</strong> Yes. System one is the fast, reflexive thinking we have, which is the kind of pattern matching that current LLMs do. System two is the slow reasoning that involves imagination and world models. That’s what we have not done yet. Progress on system one has been extraordinary, but we still don’t have system two. These attempts to patch system one to make it system two are like trying to squeeze a balloon filled with water; you squeeze it on one side and the water bulges out on the other side. 
You keep getting surprised that you fix one thing and something else breaks, and the performance overall doesn’t really get that much better.</p><p><strong>How have you been approaching this problem at TRI?</strong></p><p><strong>Pratt:</strong> Two years ago, <a href="https://medium.com/toyotaresearch/tris-robots-learn-new-skills-in-an-afternoon-here-s-how-2c30b1a8c573" target="_blank">we came up with diffusion policy</a>, and then we came up with what I call <a href="https://spectrum.ieee.org/boston-dynamics-toyota-research" target="_self">large behavior models</a> (LBMs). That involves having one model trained on many tasks, and showing that as you add each task, it actually helps with the other tasks and cuts down on the amount of training data needed to reach a given level of performance. These have been incredible system-one advances.</p><p>The breakthrough happened when we realized that diffusion could be applied to robot behavior. We discovered that operating in the behavior space, from vision in, to action out, worked incredibly well. That kicked off the whole field, and since then, I think every robotics demonstration that we’ve seen is using some form of diffusion policy to do what it’s doing. But again, this is system-one pattern matching: ‘If I see the world like this, I act on the world like that.’ The robot’s not imagining, thinking, and planning the way traditional robotics with hand coding used to do. It’s just reacting.</p><p><strong>System one’s pattern matching often breaks down in the real world, though, as we’ve seen with autonomous driving’s struggles.</strong></p><p><strong>Pratt:</strong> Ten years ago, when TRI first started, <a href="https://spectrum.ieee.org/toyota-gill-pratt-on-the-reality-of-full-autonomy" target="_self">almost everybody </a>was saying that automated driving was right around the corner. </p><p>Ten years later, I do think we are now there, and the remaining questions are business ones: How much does the hardware cost, the insurance, the support, does it economically make sense? We haven’t necessarily <em>solved</em> automated driving, but our solutions are good enough, because we use humans for backup. When an automated vehicle gets stuck at a double-parked car, it calls home and asks a person for a system-two decision. I think other robots could do that also. Most of the time they do their work on their own, and every once in a while, they raise their hand for help.</p><p><strong>If we’ve just barely managed to get autonomous cars right, why are we devoting so much attention to the legged humanoid form factor?</strong></p><p><strong>Pratt:</strong> We’ve built the world with physical affordances for our bodies. If the robot is to do well in that world, it should have something that takes advantage of those affordances. It’s also easier for imitation learning to work because we have the same form. And legs are good for certain environments; you can step over obstacles, and you can regain balance faster by stepping than by rolling to a new point of support on wheels. Having said all that, legs are not always the most practical thing. It’s very weird to see so much focus on legged robots in factories, which are flat environments perfectly suited for wheels.</p><h2>Managing the Humanoid Robotics Hype</h2><p><strong>Do you think that the amount of money being poured into legged humanoids is a good thing for robotics?</strong></p><p><strong>Pratt:</strong> It has both advantages and dangers. 
It’s wonderful seeing so many resources flowing into the robotics field, and I do think that something special has occurred. Things are not the way they were before, and there are so many possibilities when you think about people teaching robots how to do things.</p><p class="shortcode-media shortcode-media-rebelmouse-image rm-float-left rm-resized-container rm-resized-container-25" data-rm-resized-container="25%" style="float: left;"> <img alt="A smiling man gazes up at a humanoid robotic structure that is many times larger than him." class="rm-shortcode" data-rm-shortcode-id="c0e4ba89ea81f79fe4fce5cddd5edb27" data-rm-shortcode-name="rebelmouse-image" id="528a3" loading="lazy" src="https://spectrum.ieee.org/media-library/a-smiling-man-gazes-up-at-a-humanoid-robotic-structure-that-is-many-times-larger-than-him.jpg?id=65446572&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">Gill Pratt admires a robot on the roof of the Ghibli Museum in Tokyo.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Gill Pratt</small></p><p><strong>What kinds of things should humans be teaching robots to do?</strong></p><p><strong>Pratt: </strong>For 10 years at TRI, <a href="https://spectrum.ieee.org/gill-pratt-toyota-elder-care-robots" target="_self">we’ve been thinking about society and aging</a>. It’s not just about physical disability; it’s about loneliness and loss of purpose, which are far more prevalent (and far worse) problems. And so the question is, what can we do technologically to help people feel that they’re younger?</p><p>At TRI, we’re exploring “care-receiving robots”—robots that receive teaching from a human. We have evolved to be creatures that love giving and love helping. When you program a machine by demonstration, and that machine goes on to help someone else, you feel a sense of purpose. We think robots can be bidirectional things to improve quality of life psychologically, not only physically.</p><p><strong>When you started TRI 10 years ago, I asked you what you would be focusing on, and your answer really stuck with me: You said elder care, because “we don’t have a choice.”</strong></p><p><strong>Pratt:</strong> Yes. The statistics in Japan and the U.S. are only getting worse, and we <em>don’t</em> have a choice. It’s important to remember that an aging society has a huge impact on young people. This is because of the dependency ratio, which is how many young people in the workforce are supporting both people who are too young to work and people who are too old to work. Those numbers keep getting worse and worse.</p><p><strong>How do we solve this?</strong></p><p><strong>Pratt:</strong> We’ve had some incredible breakthroughs with system one, but it doesn’t mean the robots are going to be doing all that much, unless somebody makes a system-two breakthrough also. Or we could have a system where humans provide some level of system-two supervisory control.</p><p><strong>That kind of human supervisory control takes us right back to the DRC, doesn’t it?</strong></p><p><strong>Pratt: </strong>[Laughs] That’s exactly right! 
Look, I’m not going to tell you not to praise the DRC… There was someone who called it the “<a href="https://www.youtube.com/watch?v=w222KFAiMQc" target="_blank">Woodstock of Robots</a>,” which just warmed my heart, that was so cool!</p><p><strong>So, 10 years later, how do you feel about the amount of hype in humanoid robotics right now?</strong></p><p><strong>Pratt: </strong>We are approaching what (I hope!) is a peak of inflated expectations for humanoids. And that’s because nobody’s thinking deeply enough about the system-one versus system-two thing.</p><p>Right now, our physical AI systems are just pattern matching. They’re incredibly capable, and it’s astonishing how good these things are—we are so proud of it. And we do believe that aggregating learning from many tasks through large behavior models will be incredibly effective. But it’s still not system two. There’s a lot of overpromising going on, and it’s very sad because it’s setting us up for a fall. What I’m worried about is the trough of disillusionment that will follow.</p><p><strong>How do we avoid that crash in robotics when the humanoid hype bubble bursts?</strong></p><p><strong>Pratt: </strong>For now, we need damping. In control systems, you stabilize an unstable system by adding damping. The press and the academic world can add lead compensation by reminding everyone that what we’re seeing in humanoids now isn’t really reasoning.</p><p>We should also remember that the automated driving field went through a bubble burst also, and just a few companies survived that, by keeping the hype down and being persistent. I think we should do that here, too.</p>]]></description><pubDate>Thu, 02 Apr 2026 15:00:02 +0000</pubDate><guid>https://spectrum.ieee.org/humanoid-robots-gill-pratt-darpa</guid><category>Humanoid-robots</category><category>Darpa</category><category>Artificial-intelligence</category><category>Drc</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/a-smiling-bespectacled-bearded-man-kneels-posed-behind-a-robotic-torso.jpg?id=65446567&amp;width=980"></media:content></item><item><title>Wi-Fi That Can Withstand a Nuclear Reactor</title><link>https://spectrum.ieee.org/robotics-in-nuclear-industry</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/close-up-of-a-receiver-chip.jpg?id=65428613&width=2000&height=1500&coordinates=0%2C0%2C0%2C0"/><br/><br/><p>Researchers have made a Wi-Fi receiver that’s tough enough to work inside a nuclear reactor. They hope the receiver might be part of a wireless communications system for robotics used to <a href="https://www.iaea.org/topics/decommissioning" rel="noopener noreferrer" target="_blank">decommission</a> reactors.</p><p>Yasuto Narukiyo, a graduate student at the Institute of Science Tokyo, <a href="https://ieeexplore.ieee.org/document/11408968" rel="noopener noreferrer" target="_blank">presented</a> the wireless receiver at the <a href="https://www.isscc.org/" rel="noopener noreferrer" target="_blank">IEEE International Solid-State Circuits Conference</a> (<a href="https://spectrum.ieee.org/tag/isscc" target="_blank">ISSCC</a>), in San Francisco in February. The receiver endured a total radiation dose of 500 kilograys, orders of magnitude higher than the doses typically tolerated by electronics in outer space.</p><p>After the 2011 nuclear disaster at the <a href="https://spectrum.ieee.org/special-reports/fukushima-and-the-future-of-nuclear-power/" target="_self">Fukushima Daiichi</a> plant, engineers began using robots to help characterize and clean up the site. Most of these require local area network (LAN) cables that can get tangled, says Narukiyo. His team, which includes his advisor <a href="https://strdb.s.isct.ac.jp/html/100002402_en.html" rel="noopener noreferrer" target="_blank">Atsushi Shirane</a> and <a href="https://www2.kek.jp/qup/member/miyahara.html" rel="noopener noreferrer" target="_blank">Masaya Miyahara</a> of Japan’s High Energy Accelerator Research Organization (KEK), is aiming to develop a wireless system for controlling robots in this harsh environment.</p><p>Even under less dramatic circumstances, nuclear plants don’t last forever, and they need to be safely dismantled and decontaminated so the sites can be reused, a process called decommissioning. The process is lengthy, and risks exposing people to radiation, which is why engineers hope robots can come to the rescue. </p><p>The need for such robots is only growing. According to a <a href="https://www.sciencedirect.com/science/article/pii/S1364032124003472" rel="noopener noreferrer" target="_blank">2024 study</a>, of 204 reactors that have been closed, only 11 plants with a capacity over 100 megawatts have been fully decommissioned, and 200 more reactors will reach the end of their lifetimes in the next 20 years.</p><p>While electronics for space exploration are typically required to endure radiation doses of 100 to 300 grays over three years, a robot operating in a nuclear reactor needs to endure more than 500 kGy over the course of six months, says Narukiyo—at least 1,000 times the dosage. A robotic arm made by KUKA was able to <a href="https://www.frontiersin.org/journals/robotics-and-ai/articles/10.3389/frobt.2020.00006/full" rel="noopener noreferrer" target="_blank">withstand</a> just 164.55 Gy of damage before failing. 
For comparison, the lens of the eye absorbs just <a href="https://www.epa.gov/radiation/radiation-terms-and-units" rel="noopener noreferrer" target="_blank">60 milligrays</a> during a CT scan of the brain.</p><h2>Radiation Hardening</h2><p>To “<a href="https://spectrum.ieee.org/self-healing-electronics-jupiter" target="_blank">harden</a>” the 2.4-gigahertz Wi-Fi receiver against intense levels of radiation, Narukiyo and his team changed its mix of components, minimized the total number of transistors, and tinkered with the geometry of the transistors that were left. </p><p>The transistors, silicon MOSFETs (metal-oxide semiconductor field-effect transistors), contain an oxide layer that’s particularly vulnerable to radiation damage. Blasts of gamma rays can trap positive charges in the oxide, degrading the device’s performance and causing errors. First, the researchers changed the design of the transistors themselves. The device’s gate controls the flow of current through the transistor, and the smaller the gate, the more the transistor’s performance will be degraded by a given dose of radiation. So they made the gates longer and wider.</p><p class="shortcode-media shortcode-media-rebelmouse-image rm-float-left rm-resized-container rm-resized-container-25" data-rm-resized-container="25%" style="float: left;"> <img alt="A tabletop metal cylinder with a circuit board connected to power plugs on top of it." class="rm-shortcode" data-rm-shortcode-id="f6dd940d1127aa3f80e4b75a102fc43c" data-rm-shortcode-name="rebelmouse-image" id="49944" loading="lazy" src="https://spectrum.ieee.org/media-library/a-tabletop-metal-cylinder-with-a-circuit-board-connected-to-power-plugs-on-top-of-it.jpg?id=65428642&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">Researchers tested the Wi-Fi receiver by placing it on top of a radiation source.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Yasuto Narukiyo, Sena Kato, et al.</small></p><p>Second, they considered the differences in how radiation affects PMOS transistors, in which current is carried primarily by positive charges, and NMOS, where electrons flow. PMOS transistors are more vulnerable to radiation damage because positive charge gets trapped in both the oxide and at the interface between the oxide and the rest of the semiconductor. These add up and shift the transistor towards the off state, says Narukiyo. To compensate, the new receiver design minimizes the use of PMOS, replacing these transistors with other elements such as inductors that don’t have an oxide layer. NMOS transistors are more resilient, says Narukiyo, because positive charges trapped in the oxide are to some extent canceled out by negative charges that get trapped at the interface.</p><p>Narukiyo and his team measured the performance of the receiver before exposure to radiation, and again after blasting it with a total dose of 300 kGy and then 500 kGy. Before being irradiated, it showed comparable performance to typical Wi-Fi receivers. After reaching the highest radiation dose, the gain of the receiver had decreased by about 1.5 decibels.</p><p>Narukiyo says the receiver is hardened enough, and now he hopes to improve its performance. He’s also working on a transmitter, which would allow for two-way communications. This is more challenging due to the need to produce high levels of current to generate the Wi-Fi signal. He says an earlier version he tried was broken by a 300 kGy dose. 
The group is exploring using other semiconductors, such as <a href="https://spectrum.ieee.org/diamond-electronics" target="_blank">diamond</a>, to toughen the transmitter.</p>]]></description><pubDate>Thu, 02 Apr 2026 14:00:02 +0000</pubDate><guid>https://spectrum.ieee.org/robotics-in-nuclear-industry</guid><category>Wi-fi</category><category>Nuclear-reactors</category><category>Isscc</category><category>Decommissioning</category><category>Industrial-robots</category><category>Radiation-hardening</category><dc:creator>Katherine Bourzac</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/close-up-of-a-receiver-chip.jpg?id=65428613&amp;width=980"></media:content></item><item><title>Scientists Build Living Robots With Nervous Systems</title><link>https://spectrum.ieee.org/neurobot-living-robot-nervous-system</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/close-up-of-a-neuro-robot-that-has-been-stained-to-highlight-multi-ciliated-cells-around-its-periphery.jpg?id=65444408&width=2000&height=1500&coordinates=0%2C0%2C0%2C0"/><br/><br/><p>Engineers have long tried to mimic life. They’ve built machine learning algorithms <a href="https://spectrum.ieee.org/topographic-neural-network" target="_self"><span><span>modeled after the human brain</span></span></a>, designed machines that <a href="https://spectrum.ieee.org/boston-dynamics-research-spot" target="_self"><span><span>walk like dogs</span></span></a> or <a href="https://spectrum.ieee.org/flying-robot-bug" target="_self"><span><span>fly like insects</span></span></a>, and taught robots to adapt, <a href="https://spectrum.ieee.org/video-friday-morphing-robots" target="_self"><span><span>however clumsily</span></span></a>, to the world around them.</p><p>Now they are skipping imitation altogether.</p><p>Instead of taking inspiration from biology, they are building robots out of it: fashioning tiny, <a href="https://spectrum.ieee.org/aidesigned-living-robots-crawl-heal-themselves" target="_self">free-swimming assemblages of living cells</a> that organize into self-directed systems, complete with neurons that wire themselves into functional circuits.</p><p>The result, <a href="https://advanced.onlinelibrary.wiley.com/doi/10.1002/advs.202508967" target="_blank">reported last month in <em>Advanced Science</em></a>, is what the researchers call a “neurobot.”</p><p>These living machines could help scientists better understand how simple neural networks give rise to complex behaviors, a foundational step toward building cyborg systems that integrate biological tissue with engineered control. And with further refinement, they could be put to use in applications ranging from precision tissue repair to environmental cleanup.</p><p>“My general reaction is, ‘Wow, this is amazing!’ ” says <a href="https://cbs.umn.edu/directory/kate-adamala" target="_blank"><span>Kate Adamala</span></a>, a synthetic biologist at the University of Minnesota Twin Cities, who was not involved in the research. “This truly puts the engineering component into bioengineering.”</p><h2>Toward Internal Control</h2><p>Neurobots mark the latest advance in a <a href="https://journals.sagepub.com/doi/10.1089/soro.2022.0142" target="_blank">series of increasingly sophisticated biological machines</a> developed by Tufts University biologist <a href="https://allencenter.tufts.edu/our-team/michael-levin/" target="_blank">Michael Levin</a> and his collaborators.</p><p><a href="https://www.pnas.org/doi/10.1073/pnas.1910837117" target="_blank">First described in 2020</a>, these clusters of living cells, when removed from their normal developmental context and cultured in simple saline conditions, spontaneously self-organize in such a manner that they move and act in novel ways. 
Under the microscope, they look like irregular, translucent blobs of tissue, but their coordinated motion reveals an emergent order that is unlike anything found in the natural world.</p><p>“These things don’t occur naturally,” says <a href="https://www.binghamton.edu/ssie/people/profile.html?id=cgg" target="_blank">Carlos Gershenson</a>, a computer scientist at Binghamton University, State University of New York, who <a href="https://direct.mit.edu/artl/article/29/2/153/114834/Emergence-in-Artificial-Life?guestAccessKey=" target="_blank">studies artificial life</a> and complex systems but was not involved in the neurobot research. “They’re made with natural cells, but we’re the ones arranging them.”</p><p>The <a href="https://www.science.org/doi/full/10.1126/scirobotics.abf1571" target="_blank">earliest examples of this technology</a>, called xenobots, were built from frog-derived tissues and mainly from a single type of structural cell. Despite the simplicity of their construction, however, they could propel themselves through water using beating hair-like projections called cilia. They survived for days without added nutrients. And they could repair minor damage, all without any scaffolding materials or genetic manipulation. <a href="https://www.pnas.org/doi/10.1073/pnas.2112672118" target="_blank">Some could even self-replicate</a> by spontaneously sweeping up loose stem cells.</p><p>Still, for all the novelty of these biological machines, their behavior was essentially mechanical. Their movements were driven by anatomy and physics, not by anything resembling internal control. They could sense chemical cues, change direction accordingly, and even retain traces of past experiences, as <a href="https://www.biorxiv.org/content/10.64898/2026.03.17.712168v1" target="_blank">detailed in a preprint posted 17 March on <em>bioRxiv</em></a>.</p><p>But many other simple organisms—fungi, protists, and bacteria included—can do much the same. To achieve more flexible, coordinated behavior, the biological machines would need a way to integrate information across the body and dynamically direct their actions. Neurobots begin to provide that missing layer of control.</p><p class="shortcode-media shortcode-media-youtube"> <span class="rm-shortcode" data-rm-shortcode-id="f86434d62c5577170353478e6aeab577" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/wrIpHUmYKBE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span> <small class="image-media media-caption" placeholder="Add Photo Caption...">Small tufts of hairlike cilia, combined with the neurobot’s nervous system, allow it to move on its own.</small> <small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Haleh Fotowat</small> </p><h2>Linking Neural Activity to Action</h2><p>Like earlier xenobots, neurobots are still built from frog cells, but they are now endowed with neurons that mature from partially differentiated stem cells. These nerve cells develop alongside structural tissues, forming branching connections throughout the autonomous beings. 
This means they can relay electrochemical signals from cell to cell.</p><p>And unlike other laboratory models of the nervous system—<a href="https://spectrum.ieee.org/organoid-intelligence-computing-on-brain" target="_self">brain organoids</a>, say, or <a href="https://spectrum.ieee.org/biological-computer-for-sale" target="_self">lab-on-a-chip</a> technologies—neurobots move. They swim, explore, and respond to their surroundings in ways that tie electrical signaling to observable movement, producing patterns of physical activity distinct from those of their non-neural counterparts.</p><p>Neurobots spend less time idling and more time exploring. They also trace looping and spiraling paths rather than repeating simple trajectories. And they respond differently to neuroactive drugs.</p><p>If the organizing principles that enable these internally guided motions and reflexes can now be deciphered, they could then be harnessed to produce more predictable biological functions, says <a href="https://wyss.harvard.edu/team/advanced-technology-team/haleh-fotowat/" target="_blank">Haleh Fotowat</a>, a neuroengineer from Harvard’s Wyss Institute for Biologically Inspired Engineering, who collaborated with Levin’s team on the study.</p><p>“We’re still very early in terms of understanding the system and its capabilities.” But once the scientists understand how the neurobots self-organize, she says, “then we can begin to engineer on top of that.”</p><p>Beyond the practical, neurobots also raise deeper epistemological questions about the nature of biological organization, notes Levin. “Where does form and function come from in the first place?” he asks. “When it’s not evolved and it’s not engineered, where do these patterns come from?”</p><p>“This is a model system for asking those kinds of questions,” Levin says—in frog and human constructs alike.</p><h2>From Discovery to Deployment</h2><p>Among the many variations on the biobot theme are “<a href="https://advanced.onlinelibrary.wiley.com/doi/10.1002/advs.202303575" target="_blank">anthrobots</a>,” built from clusters of human lung cells instead of frog tissue.</p><p>Levin’s team now plans to add human neural cells to their anthrobots, extending the neurobot framework into a fully human context. Then, through further conditioning and guided learning, these living machines—like <a href="https://spectrum.ieee.org/using-a-twopronged-approach-to-detect-explosive-substances-from-bombs" target="_self">dogs trained to sniff for bombs</a>—may become capable of adapting their behavior in predictable ways.</p><p>“The hope would be that you could teach them or train them to do what you want them to do,” says <a href="https://www.uvm.edu/cems/cs/profile/josh-bongard" target="_blank">Josh Bongard</a>, a computer scientist and roboticist at the University of Vermont.</p><p>Bongard was not involved in the neurobot study but is a frequent collaborator of Levin’s. 
Together, they cofounded the nonprofit <a href="https://icdorgs.org/" target="_blank">Institute for Computationally Designed Organisms</a> and a commercial startup, <a href="https://www.faunasystems.com/" target="_blank">Fauna Systems</a>, to advance biobot-related technologies.</p><p>According to Fauna CEO <a href="https://www.linkedin.com/in/naimish-patel-925a84" target="_blank">Naimish Patel</a>, the company is initially targeting environmental sensing applications, aiming to deploy xenobots in settings such as aquaculture, wastewater monitoring, and pollutant detection, where the technology’s ability to integrate multiple signals could provide an early readout of ecosystem health.</p><p> If the xenobots encounter a mixture of stressors—say, elevated heavy metals, shifts in pH, and traces of agricultural runoff—their collective changes in movement or activity could provide a sensitive, real-time signal that something in the environment is amiss. </p><p>Precedent for this idea comes from Poland, where many cities already use <a href="https://www.atlasobscura.com/articles/wild-life-excerpt-water-quality-mussels" target="_blank">freshwater mussels as living sentinels of water quality</a>, wired with sensors that register when the animals clamp their shells shut in response to pollutants. Xenobots could extend this concept further, Patel says, potentially offering greater sensitivity and specificity by integrating multiple environmental cues into a single, measurable behavioral response. And neurobots could eventually push this fusion of sensing and computation into ever more sophisticated territory, he adds.</p><p>But the technical hurdles remain substantial—and the practical opportunities with simpler, non-neural versions are already compelling—so the first-gen xenobots, for the time being,  remain the focus of Fauna’s initial product-development efforts, Patel says. “Right now, we’re looking for the intersection between unmet commercial need and emerging capability.” </p>]]></description><pubDate>Thu, 02 Apr 2026 13:00:01 +0000</pubDate><guid>https://spectrum.ieee.org/neurobot-living-robot-nervous-system</guid><category>Bioengineering</category><category>Frog</category><category>Living-cells</category><category>Biomimetics</category><category>Bioinspired-robots</category><dc:creator>Elie Dolgin</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/close-up-of-a-neuro-robot-that-has-been-stained-to-highlight-multi-ciliated-cells-around-its-periphery.jpg?id=65444408&amp;width=980"></media:content></item><item><title>Video Friday: Beep! Beep! Roadrunner Bipedal Bot Breaks the Mold</title><link>https://spectrum.ieee.org/roadrunner-bipedal-robot</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/two-wheeled-balancing-robot-leans-while-rolling-in-an-indoor-testing-lab.png?id=65415603&width=2000&height=1500&coordinates=252%2C0%2C253%2C0"/><br/><br/><p><span>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at </span><em>IEEE Spectrum</em><span> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please </span><a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a><span> for inclusion.</span></p><h5><a href="https://2026.ieee-icra.org/">ICRA 2026</a>: 1–5 June 2026, VIENNA</h5><h5><a href="https://roboticsconference.org/">RSS 2026</a>: 13–17 July 2026, SYDNEY</h5><h5><a href="https://mrs.fel.cvut.cz/summer-school-2026/">Summer School on Multi-Robot Systems</a>: 29 July–4 August 2026, PRAGUE</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote class="rm-anchors" id="9kae-uame1u"><em>“Roadrunner” is a new bipedal wheeled robot prototype designed for multimodal locomotion. It weighs around 15 kg (33 lb) and can seamlessly switch between its side-by-side and in-line wheel modes and stepping configurations depending on what is required for navigating its environment. The robot’s legs are entirely symmetric, allowing it to point its knees forward or backward, which can be used to avoid obstacles or manage specific movements. A single control policy was trained to handle both side-by-side and in-line driving. Several behaviors, including standing up from various ground configurations and balancing on one wheel, were successfully deployed zero-shot on the hardware.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="76bd6c7edd7ff24700dad004edd086aa" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/9kae-UAME1U?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://rai-inst.com/">Robotics and AI Institute</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="tyasuwrkv4e">Incredibly (INCREDIBLY!) <a data-linked-post="2657767692" href="https://spectrum.ieee.org/nasa-mars-sample-return" target="_blank">NASA</a> says that this is actually happening.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="bc72d2ac20028faf8c32287c722f0ce9" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/TYasUWRkv4E?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><blockquote><em>NASA’s SkyFall mission will build on the success of the Ingenuity Mars helicopter, which achieved the first powered, controlled flight on another planet. 
Using a daring midair deployment, SkyFall will deliver a team of next-gen Mars helicopters to scout human landing sites and map subsurface water ice.</em></blockquote><p>[ <a href="https://www.nasa.gov/news-release/nasa-unveils-initiatives-to-achieve-americas-national-space-policy/">NASA</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="jsk-ff2mycg"><em>NASA’s MoonFall mission will blaze a path for future <a data-linked-post="2662067231" href="https://spectrum.ieee.org/video-friday-training-artemis" target="_blank">Artemis</a> missions by sending four highly mobile drones to survey the lunar surface around the Moon’s South Pole ahead of astronauts’ arrival there. MoonFall is built on the legacy of NASA’s Ingenuity Mars Helicopter. The drones will be launched together and released during descent to the surface. They will land and operate independently over the course of a lunar day (14 Earth days) and will be able to explore hard-to-reach areas, including permanently shadowed regions (PSRs), surveying terrain with high-definition optical cameras and other potential instruments.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="24cd6ef18a5608c71e3afdc55a0d2507" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/JsK-ff2Mycg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>For what it’s worth, <a data-linked-post="2671177906" href="https://spectrum.ieee.org/moon-landing-2025" target="_blank">Moon landings</a> have a success rate well under 50%. So let’s send some robots there to land over and over!</p><p>[ <a href="https://www.nasa.gov/news-release/nasa-unveils-initiatives-to-achieve-americas-national-space-policy/">NASA</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="hdjiukrfvca"><em>In Science Robotics, researchers from the Tangible Media group led by Professor Hiroshi Ishii, together with colleagues from Politecnico di Bari, present Electrofluidic Fiber Muscles: a new class of artificial muscle fibers for robots and wearables. Unlike the rigid servo motors used in most robots, these fiber-shaped muscles are soft and flexible. They combine electrohydrodynamic (EHD) fiber pumps—slender tubes that move liquid using electric fields to generate pressure silently, with no moving parts—with fluid-filled fiber actuators. These artificial muscles could enable more agile untethered robots, as well as wearable assistive systems with compact actuation integrated directly into textiles.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="401e33c5be7f9feea5a4219dd786d2ab" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/HdjIukrfvcA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.media.mit.edu/projects/electrofluidicmuscle/overview/">MIT Media Lab</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="xzfzkmq2rrq"><em>In this study, we developed MEVIUS2, an open-source quadruped robot. It is comparable in size to the Boston Dynamics Spot, equipped with two lidars and a C1 camera, and can freely climb stairs and steep slopes! 
All hardware, software, and learning environments are released as open source.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ef4dd2071d09d4ac4c97d9e6993be2ea" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/xzfZkmQ2rrQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://github.com/haraduka/mevius2">MEVIUS2</a> ]</p><p>Thanks, Kento!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="zj07hhjnrto"><em>What goes into preparing for a live performance? Arun highlights the reliability testing that goes into trying a new behavior for Spot.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="075596c69914e064444994a7d74fe2dc" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/zj07hHJnrto?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://bostondynamics.com/">Boston Dynamics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="41kpw6jwxty"><em>In this work, a multirobot planning and control framework is presented and demonstrated with a team of 40 indoor robots, including both ground and aerial robots.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="e8811d7981e9be82f23859aafea31249" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/41kPW6JwXtY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>That soundtrack, though.</p><p>[ <a href="https://proroklab.github.io/agile-mapf/">GitHub</a> ]</p><p>Thanks, Keisuke!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="img5a_ykjms"><em>Quadrupedal robots can navigate cluttered environments like their animal counterparts, but their floating-base configuration makes them vulnerable to real-world uncertainties. Controllers that rely only on proprioception (body sensing) must physically collide with obstacles to detect them. Those that add exteroception (vision) need precisely modeled terrain maps that are hard to maintain in the wild. DreamWaQ++ bridges this gap by fusing both modalities through a resilient multimodal reinforcement learning framework. 
The result: a single controller that handles rough terrains, steep slopes, and high-rise stairs—while gracefully recovering from sensor failures and situations it has never seen before.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="61dd08501e1c8f10d63a43acb5bb2911" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Img5a_yKjMs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>That cliff behavior is slightly uncanny.</p><p>[ <a href="https://dreamwaqpp.github.io/">DreamWaQ++</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="toh8pd4o34u">I take issue with this from iRobot:</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="1d86fae43d52011c45db0102b9fdc86b" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/tOH8pD4O34U?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>While the <a data-linked-post="2650276443" href="https://spectrum.ieee.org/robotic-blimp-could-explore-hidden-chambers-of-great-pyramid" target="_blank">pyramid exploration</a> that iRobot did was very cool, they did it with a custom-made robot designed for a very specific environment. Cleaning your floors is way, way harder. Here’s a bit more detail on the pyramids thing:</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="1b4538cb0137311b0b433425e56096f0" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Pts3w2Pw8F4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.youtube.com/watch?v=Pts3w2Pw8F4">iRobot</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="t1vub0knci4">More robots in the circus, please!</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="89aa286bf5c7d16563d9223df6cc3d2b" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/T1VUb0kncI4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://danielsimu.com/acrobot/">Daniel Simu</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="f2hasoladgm"><em>MIT engineers have designed a wristband that lets wearers control a robotic hand with their own movements. 
By moving their hands and fingers, users can direct a robot to perform specific tasks, or they can manipulate objects in a virtual environment with high-dexterity control.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="88281d6e7db31cc58ef4b327756809b2" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/F2HaSoladgM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://news.mit.edu/2026/wristband-enables-wearers-control-robotic-hand-with-own-movements-0325">MIT</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="0ozaw6rryie"><em>At <a data-linked-post="2676218078" href="https://spectrum.ieee.org/nvidia-groq-3" target="_blank">Nvidia GTC 2026</a>, we showcased how AI is moving into the physical world. Visitors interacted with robots using voice commands, watching them interpret intent and act in real time—powered by our KinetIQ AI brain.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="95460eeec4fec87fd729fe5aa4314531" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/0oZAw6rryIE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://thehumanoid.ai/">Humanoid</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="7sl93jl8_o8">Props to Sony for its continued support and updates for <a data-linked-post="2670284977" href="https://spectrum.ieee.org/aibo" target="_blank">Aibo</a>!</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="f05e5074c48cd251f832782efa434226" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/7sL93Jl8_O8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://us.aibo.com/myaibo/">Aibo</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="yd7enmgniei">This robot looks like it could be a little curvier than normal?</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="3be1fe9e24c6ee745f0f1fa7a2a1b201" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Yd7eNmGNIeI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.limxdynamics.com/en">LimX Dynamics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="dncww0qmkce"><em>Developed by Zhejiang Humanoid Robot Innovation Center Co., Ltd., the Naviai Robot is an intelligent cooking device. It can autonomously process ingredients, perform cooking tasks with high accuracy, adjust smart kitchen equipment in real time, and complete postcooking cleaning. 
Equipped with multimodal perception technology, it adapts to daily kitchen environments and ensures safe and stable operation.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="f58863823d5082a3e5e104c47b9e68f6" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/dNcWW0qMkcE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>That 7x is doing some heavy lifting.</p><p>[ <a href="https://en.zhejianglab.com/institutescenters/researchunits/interdisciplinaryresearchcenters/researchcenterforintelligentrobot/">Zhejiang Lab</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="gthxsfhdt8q">This CMU RI Seminar is by Hadas Kress-Gazit from Cornell, on “Formal Methods for Robotics in the Age of Big Data.”</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="a0150919b813daa034367d7a41c9d68e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/gthXSFhDt8Q?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><blockquote><em>Formal methods—mathematical techniques for describing systems, capturing requirements, and providing guarantees—have been used to synthesize robot control from high-level specification, and to verify robot behavior. Given the recent advances in robot learning and data-driven models, what role can, and should, formal methods play in advancing robotics? In this talk I will give a few examples for what we can do with formal methods, discuss their promise and challenges, and describe the synergies I see with data-driven approaches.</em></blockquote><p>[ <a href="https://www.ri.cmu.edu/event/formal-methods-for-robotics-in-the-age-of-big-data/">Carnegie Mellon University Robotics Institute</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 27 Mar 2026 16:30:03 +0000</pubDate><guid>https://spectrum.ieee.org/roadrunner-bipedal-robot</guid><category>Video-friday</category><category>Nasa</category><category>Bipedal-robots</category><category>Quadruped-robots</category><category>Artificial-muscles</category><category>Humanoid-robots</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/two-wheeled-balancing-robot-leans-while-rolling-in-an-indoor-testing-lab.png?id=65415603&amp;width=980"></media:content></item><item><title>30 Years Ago, Robots Learned to Walk Without Falling</title><link>https://spectrum.ieee.org/honda-p2-robot-ieee-milestone</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/collage-of-hondas-p2-humanoid-robot-from-1996-against-a-background-of-figures-related-to-its-technical-features.jpg?id=65402169&width=2000&height=1500&coordinates=0%2C0%2C0%2C0"/><br/><br/><p>When you hear the term <a href="https://spectrum.ieee.org/search/?q=humanoid+robot" target="_self"><em><em>humanoid robot</em></em></a>, you may think of <a href="https://starwars.fandom.com/wiki/C-3PO" rel="noopener noreferrer" target="_blank">C-3PO</a>, the human-cyborg-relations android from <a href="https://www.starwars.com/" rel="noopener noreferrer" target="_blank"><em><em>Star Wars</em></em></a><em><em>.</em></em> C-3PO was designed to assist humans in communicating with robots and alien species. The droid, which first appeared on screen in 1977, joined the characters on their adventures, walking, talking, and interacting with the environment like a human. It was ahead of its time.</p><p>Before the release of <em><em>Star Wars</em></em>, a few androids did exist and could move and interact with their environment, but none could do so without losing its balance.</p><p>It wasn’t until 1996 that the first autonomous robot capable of walking without falling was developed in Japan. <a href="https://www.google.com/aclk?sa=L&ai=DChsSEwjfwP2lmviSAxUdF60GHVa5APUYACICCAEQARoCcHY&ae=2&co=1&ase=2&gclid=CjwKCAiA-__MBhAKEiwASBmsBMZ3C7eg4qpf1gS-s4hmogZZL-Tr0YQ7T1h4mn0IoFztQ7NVCqHCjhoCXqoQAvD_BwE&cid=CAASZuRo0CEpRkUaLKjRvxVglDhyNNQqb9IGBGToAJFwXbXIyMx3bZTVg0T8ishwxc5PTKrYMjYnaSzvAx3ewj0dizuR563LtzuoBcRH9l0T-TNDiYKEN25LZQWjdGD6NduB7UgbPw6wRg&cce=2&category=acrcp_v1_71&sig=AOD64_0hkFjU2fo-VGEWLhz4zejdBOhDxw&q&nis=4&adurl&ved=2ahUKEwif9vWlmviSAxU-ITQIHZAXPSIQ0Qx6BAg8EAE" rel="noopener noreferrer" target="_blank">Honda</a>’s <a href="https://hondanews.com/en-US/photos/p2-robot" rel="noopener noreferrer" target="_blank">Prototype 2</a> (P2) was nearly 183 centimeters tall and weighed 210 kilograms. It could control its posture to maintain balance, and it could move multiple joints simultaneously.</p><p>In recognition of that decades-old feat, P2 has been honored as an <a href="https://ieeemilestones.ethw.org/Main_Page" rel="noopener noreferrer" target="_blank">IEEE Milestone</a>. The dedication ceremony is scheduled for 28 April at the <a href="https://www.mr-motegi.jp/eng/collection-hall/?from=navi_header_drawer_global_en" rel="noopener noreferrer" target="_blank">Honda Collection Hall</a>, located on the grounds of the <a href="https://en.wikipedia.org/wiki/Mobility_Resort_Motegi" rel="noopener noreferrer" target="_blank">Mobility Resort Motegi</a>, in Japan. 
The machine is on display in the hall’s robotics exhibit, which showcases the evolution of Honda’s humanoid technology.</p><p>In support of the Milestone nomination, members of the <a href="https://ieee-jp.org/section/nagoya/" rel="noopener noreferrer" target="_blank">IEEE Nagoya (Japan) Section</a> wrote: “This milestone demonstrated the feasibility of humanlike locomotion in machines, setting a new standard in robotics.” The <a href="https://ieeemilestones.ethw.org/Milestone-Proposal:Honda%27s_P2,_First_Bipedal_Robot,_1996" rel="noopener noreferrer" target="_blank">Milestone proposal</a> is available on the <a href="https://ethw.org/Main_Page" rel="noopener noreferrer" target="_blank">Engineering Technology and History Wiki</a>.</p><h2>Developing a domestic android</h2><p>In 1986 Honda researchers Kazuo Hirai, Masato Hirose, Yuji Haikawa, and <a href="https://research.com/u/toru-takenaka" rel="noopener noreferrer" target="_blank">Toru Takenaka</a> set out to develop what they called a “domestic robot” to collaborate with humans. It would be able to climb stairs, remove impediments in its path, and tighten a nut with a wrench, according to their <a href="https://www.cs.cmu.edu/~cga/humanoids/honda.pdf" rel="noopener noreferrer" target="_blank">research paper on the project</a>.</p><p>“We believe that a robot working within a household is the type of robot that consumers may find useful,” the authors wrote.</p><p>But a machine that could do household chores had to be able to move around obstacles such as furniture, stairs, and doorways. It needed to walk autonomously and read its environment like a human, according to the researchers.</p><p>No robot could do that at the time. The closest technologists had come was the <a href="https://www.humanoid.waseda.ac.jp/booklet/kato_2.html" rel="noopener noreferrer" target="_blank">WABOT-1</a>. Built in 1973 at <a href="https://www.waseda.jp/top/en" rel="noopener noreferrer" target="_blank">Waseda University</a>, in Tokyo, the WABOT had eyes and ears, could speak Japanese, and used tactile sensors embedded in its hands as it gripped and moved objects. Although the WABOT could walk, albeit unsteadily, it couldn’t maneuver around obstacles or maintain its balance. It was powered by an external battery and computer.</p><p>To build an android, the Honda team began by analyzing how people move, using themselves as models.</p><p>That led to specifications for the robot that gave it humanlike dimensions, including the location of the leg joints and how far the legs could rotate.</p><p>Once they began building the machine, though, the engineers found it difficult to satisfy every specification. Adjustments were made to the number of joints in the robot’s hips, knees, and ankles, according to the research paper. Humans have four hip, two knee, and three ankle joints; P2’s predecessor had three hip, one knee, and two ankle joints. The arms were treated similarly. A human’s four shoulder and three elbow joints became three shoulder joints and one elbow joint in the robot.</p><p>The researchers installed existing Honda motors and hydraulics in the hips, knees, and ankles to enable the robot to walk. Each joint was operated by a DC motor with a harmonic-drive reduction gear system, which was compact and offered high torque capacity.</p><p>To test their ideas, the engineers built what they called E0. The robot, which was just a pair of connected legs, successfully walked.
It took about 15 seconds to take each step, however, and it moved in a straight line using static walking, according to <a href="https://global.honda/en/ASIMO/history/" rel="noopener noreferrer" target="_blank">a post about the project</a> on Honda’s website. (Static walking keeps the projection of the body’s center of mass, a point just below the navel in humans, within the sole of the supporting foot at all times.)</p><p>The researchers created several algorithms to enable the robot to walk like a human, according to the Honda website. The algorithms allowed the robot to use a different locomotion mechanism, dynamic walking, whereby the robot stays upright by constantly moving and adjusting its balance rather than by keeping its center of mass over its feet, according to a video on the YouTube channel <a href="https://www.youtube.com/watch?v=BCAZkjXgBE4" rel="noopener noreferrer" target="_blank">Everything About Robotics Explained</a>.</p>
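<p>To make the distinction concrete, here is a minimal, purely illustrative sketch in Python of the static-stability test that E0’s slow gait had to satisfy; it is emphatically not Honda’s control code, and the foot dimensions are invented round numbers.</p><pre><code># Illustrative sketch only -- not Honda's controller.
# A pose is statically stable when the ground projection of the center of
# mass (CoM) lies inside the support polygon, here simplified to the
# rectangular sole of the stance foot (dimensions are invented).

def statically_stable(com_xy, foot_center_xy, foot_size_xy=(0.24, 0.10)):
    """True if the CoM's ground projection is over the stance foot's sole."""
    (cx, cy), (fx, fy) = com_xy, foot_center_xy
    half_len, half_wid = foot_size_xy[0] / 2, foot_size_xy[1] / 2
    return abs(cx - fx) &lt;= half_len and abs(cy - fy) &lt;= half_wid

# Static walking (E0) waits until this test passes before each step.
# Dynamic walking (P2) lets the CoM leave the polygon and catches the
# resulting "fall" with the next footfall.
print(statically_stable((0.02, 0.01), (0.0, 0.0)))  # True: balanced in place
print(statically_stable((0.15, 0.01), (0.0, 0.0)))  # False: must step or topple
</code></pre>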
<p class="pull-quote">“P2 was not just a technical achievement; it was a catalyst that propelled the field of humanoid robotics forward, demonstrating the potential for robots to interact with and assist humans in meaningful ways.” <strong>—IEEE Nagoya Section</strong></p><p>The Honda team installed rubber bushings on the bottom of the machine’s feet to reduce vibrations from the landing impacts (the force experienced when its feet touch the ground), which had made the robot lose its balance.</p><p>Between 1987 and 1991, three more prototypes (E1, E2, and E3) were built, each testing a new algorithm. E3 was a success.</p><p>With the dynamic walking mechanism complete, the researchers continued their quest to make the robot stable. The team added 6-axis sensors to detect the force with which the ground pushed back against the robot’s feet and the movements of each foot and ankle, allowing the robot to adjust its gait in real time for stability.</p><p>The team also developed a posture-stabilizing control system to help the robot stay upright. A local controller directed the electric motor actuators so that the legs followed the desired joint angles while walking, according to the research paper.</p><p>During the next three years, the team tested the systems and built three more prototypes (E4, E5, and E6), which had boxlike torsos atop the legs.</p><p>In 1993 the team was finally ready to build an android with arms and a head that looked more like C-3PO, dubbed <em>Prototype 1</em> (P1). Because the machine was meant to help people at home, the researchers determined its height and limb proportions based on the typical measurements of doorways and stairs. The arm length was based on the ability of the robot to pick up an object when squatting.</p><p>When they finished building P1, it was 191.5 cm tall, weighed 175 kg, and used an external power source and computer. It could turn a switch on and off, grab a doorknob, and carry a 70 kg object.</p><p>P1 was not launched publicly but instead used to conduct research on how to further improve the design. The engineers looked at how to install an internal power source and computer, for example, as well as how to coordinate the movement of the arms and legs, according to Honda.</p><p>For P2, four video cameras were installed in its head—two for vision processing and the other two for remote operation. The head was 60 cm wide and connected to the torso, which was 75.6 cm deep.</p><p>A computer with four <a href="https://en.wikipedia.org/wiki/MicroSPARC" target="_blank">microSPARC II</a> processors running a real-time operating system was installed in the robot’s torso. The processors were used to control the arms, legs, joints, and vision-processing cameras.</p><p>Also within the body were DC servo amplifiers, a 20-kg nickel-zinc battery, and a wireless Ethernet modem, according to the research paper. The battery lasted for about 15 minutes; the machine could also be charged by an external power supply.</p><p>The hardware was enclosed in white-and-gray casing.</p><p>P2, which was launched publicly in 1996, could walk freely, climb up and down stairs, push carts, and perform some actions wirelessly.</p><p class="shortcode-media shortcode-media-youtube"> <span class="rm-shortcode" data-rm-shortcode-id="4c1ac513d31347c699292e05c673df46" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/FEXSqsW6rMM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span> <small class="image-media media-caption" placeholder="Add Photo Caption...">P2, which was launched publicly in 1996, could walk freely, climb up and down stairs, push carts, and perform some actions wirelessly.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">King Rose Archives</small></p><p><span>The following year, Honda’s engineers released the smaller and lighter </span><a href="https://www.youtube.com/watch?v=hS82TL73V3E" target="_blank">P3</a><span>. It was 160 cm tall and weighed 130 kg.</span></p><p>In 2000 the popular <a href="https://spectrum.ieee.org/honda-asimo" target="_self">ASIMO robot</a> was introduced. Although shorter than its predecessors at 130 cm, it could walk, run, climb stairs, and recognize voices and faces. The <a href="https://spectrum.ieee.org/honda-robotics-unveils-next-generation-asimo-robot" target="_self">most recent version</a> was released in 2011. Honda has since retired the robot.</p><h2>Honda P2’s influence</h2><p>Thanks to P2, today’s androids are not just ideas in a laboratory. Robots have been deployed to work in factories and, increasingly, at <a href="https://spectrum.ieee.org/home-humanoid-robots-survey" target="_self">home</a>.</p><p>The machines are even being used for entertainment. During this year’s <a href="https://www.cgtn.com/specials/2026/spring-festival.html" target="_blank">Spring Festival</a> gala in Beijing, machines developed by Chinese startups <a href="https://www.unitree.com/" target="_blank">Unitree Robotics</a>, <a href="https://www.galbot.com/" rel="noopener noreferrer" target="_blank">Galbot</a>, <a href="https://en.noetixrobotics.com/" rel="noopener noreferrer" target="_blank">Noetix</a>, and <a href="https://www.magiclab.top/en" rel="noopener noreferrer" target="_blank">MagicLab</a><a href="https://spectrum.ieee.org/robot-martial-arts" target="_self"> performed synchronized dances, martial arts, and backflips</a> alongside human performers.</p><p>“P2’s development shifted the focus of robotics from industrial applications to human-centric designs,” the Milestone sponsors explained in the wiki entry.
“It inspired subsequent advancements in humanoid robots and influenced research in fields like biomechanics and artificial intelligence.</p><p>“It was not just a technical achievement; it was a catalyst that propelled the field of humanoid robotics forward, demonstrating the potential for robots to interact with and assist humans in meaningful ways.”</p><p>To learn more about robots, check out <a href="https://spectrum.ieee.org/" target="_self"><em>IEEE Spectrum</em></a>’s <a href="https://robotsguide.com/about" rel="noopener noreferrer" target="_blank">guide</a>.</p><h2>Recognition as an IEEE Milestone</h2><p>A plaque recognizing Honda’s P2 robot as an IEEE Milestone is to be installed at the <a href="https://www.mr-motegi.jp/eng/collection-hall/?from=navi_header_drawer_global_en" rel="noopener noreferrer" target="_blank">Honda Collection Hall</a>. The plaque is to read:</p><p><em>In 1996 Prototype 2 (P2), a self-contained autonomous bipedal humanoid robot capable of stable dynamic walking and stair-climbing, was introduced by Honda. Its legged robotics incorporated real-time posture control, dynamic balance, gait generation, and multijoint coordination. Honda’s mechatronics and control algorithms set technical benchmarks in mobility, autonomy, and human-robot interaction. P2 inspired new research in humanoid robot development, leading to increasingly sophisticated successors.</em></p><p>Administered by the <a href="https://www.ieee.org/about/history-center" rel="noopener noreferrer" target="_blank">IEEE History Center</a> and supported by <a href="https://secure.ieeefoundation.org/site/Donation2?df_id=1680&mfc_pref=T&1680.donation=form1" rel="noopener noreferrer" target="_blank">donors</a>, the Milestone program recognizes outstanding technical developments around the world.</p>]]></description><pubDate>Wed, 25 Mar 2026 18:00:05 +0000</pubDate><guid>https://spectrum.ieee.org/honda-p2-robot-ieee-milestone</guid><category>Ieee-history</category><category>Ieee-milestone</category><category>Honda</category><category>Robotics</category><category>Asimo</category><category>Type-ti</category><dc:creator>Joanna Goodrich</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/collage-of-hondas-p2-humanoid-robot-from-1996-against-a-background-of-figures-related-to-its-technical-features.jpg?id=65402169&amp;width=980"></media:content></item><item><title>The Coming Drone-War Inflection in Ukraine</title><link>https://spectrum.ieee.org/autonomous-drone-warfare</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/person-holding-a-large-drone-outdoors-under-a-sunny-partly-cloudy-sky.jpg?id=65327386&width=2000&height=1500&coordinates=111%2C0%2C111%2C0"/><br/><br/><p><strong>WHEN</strong><strong> </strong><strong>KYIV-BORN</strong><strong> </strong><strong>ENGINEER </strong><a href="https://www.instagram.com/yaroslavazhnyuk/?hl=en" rel="noopener noreferrer" target="_blank">Yaroslav Azhnyuk</a> thinks about the future, his mind conjures up dystopian images. He talks about “swarms of autonomous drones carrying other autonomous drones to protect them against autonomous drones, which are trying to intercept them, controlled by <a href="https://spectrum.ieee.org/ai-agents" target="_self">AI</a> <a href="https://spectrum.ieee.org/ai-agents" target="_self">agents</a> overseen by a human general somewhere.” He also imagines flotillas of autonomous submarines, each carrying hundreds of drones, suddenly emerging off the coast of California or Great Britain and discharging their cargoes en masse to the sky.</p><p>“How do you protect from that?” he asks as we speak in late December 2025; me at my quiet home office in London, he in Kyiv, which is bracing for another wave of <a href="https://spectrum.ieee.org/ukraine-air-defense" target="_self">missile attacks</a>.</p><p>Azhnyuk is not an alarmist. He cofounded and was formerly CEO of <a href="https://petcube.com/" rel="noopener noreferrer" target="_blank">Petcube</a>, a California-based company that uses smart cameras and an app to let pet owners keep an eye on their beloved creatures left alone at home. A self-described “liberal guy who didn’t even receive military training,” Azhnyuk changed his mind about developing military tech in the months following the <a href="https://commonslibrary.parliament.uk/research-briefings/cbp-9847/" rel="noopener noreferrer" target="_blank">Russian invasion of</a> <a href="https://commonslibrary.parliament.uk/research-briefings/cbp-9847/" rel="noopener noreferrer" target="_blank">Ukraine</a> in February 2022. By 2023, he had relinquished his CEO role at Petcube to do what many Ukrainian technologists have done—to help defend his country against a mightier aggressor.</p><p>It took a while for him to figure out what, exactly, he should be doing. He didn’t join the military, but through friends on the front line, he witnessed how, out of desperation, Ukrainian troops turned to off-the-shelf consumer drones to make up for their country’s lack of artillery.</p><p>Ukrainian troops first began using drones for battlefield surveillance, but within a few months they figured out how to strap explosives onto them and turn them into effective, <a href="https://spectrum.ieee.org/ukraine-hackers-war" target="_self">low-cost killing</a> <a href="https://spectrum.ieee.org/ukraine-hackers-war" target="_self">machines</a>. Little did they know they were fomenting a revolution in warfare.</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Group observes a drone demonstration indoors, with a presenter explaining features." 
class="rm-shortcode" data-rm-shortcode-id="bfc4f902e7ae9ffa663bf3bcc8ff144c" data-rm-shortcode-name="rebelmouse-image" id="cc3bb" loading="lazy" src="https://spectrum.ieee.org/media-library/group-observes-a-drone-demonstration-indoors-with-a-presenter-explaining-features.jpg?id=65341730&width=980"/></p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Compact black camera module with textured surface and orange ribbon cable on white background." class="rm-shortcode" data-rm-shortcode-id="e904e39e8ac7797c354a205ed343d150" data-rm-shortcode-name="rebelmouse-image" id="4d58e" loading="lazy" src="https://spectrum.ieee.org/media-library/compact-black-camera-module-with-textured-surface-and-orange-ribbon-cable-on-white-background.jpg?id=65341726&width=980"/><small class="image-media media-caption" placeholder="Add Photo Caption...">The Ukrainian robotics company The Fourth Law produces an autonomy module [above] that uses optics and AI to guide a drone to its target. Yaroslav Azhnyuk [top, in light shirt], founder and CEO of The Fourth Law, describes a developmental drone with autonomous capabilities to Ukrainian President Volodymyr Zelenskyy and German Chancellor Olaf Scholz.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Top: THE PRESIDENTIAL OFFICE OF UKRAINE; Bottom: THE FOURTH LAW</small></p><p>That revolution was on display last month, as the U.S. and Israel went to war with Iran. It soon became clear that attack drones are being extensively used by both sides. Iran, for example, is relying heavily on the Shahed drones that the country invented and that are now also being manufactured in Russia and launched by the thousands every month against Ukraine.</p><p>A thorough analysis of the Middle East conflict <span>will take some time to emerge. And so to understand the direction of this new way of war, look to Ukraine, where its next phase—autonomy—is already starting to come into view. Outnumbered by the Russians and facing increasingly sophisticated jamming and spoofing aimed at causing the drones to veer off course or fall out of the sky, Ukrainian technologists realized as early as 2023 that what could really win the war was autonomy. Autonomous operation means a drone isn’t being flown by a remote pilot, and therefore there’s no communications link to that pilot that can be severed or spoofed, rendering the drone useless.</span></p><p>By late 2023, <a href="https://www.linkedin.com/in/yaroslavazhnyuk/?locale=uk" target="_blank">Azhnyuk</a> set out to help make that vision a reality. He founded two companies, <a href="https://thefourthlaw.ai/blog/funding-products-video" target="_blank">The</a> <a href="https://thefourthlaw.ai/blog/funding-products-video" target="_blank">Fourth Law</a> and <a href="https://oddsystems.io/en/" target="_blank">Odd Systems</a>, the first to develop AI algorithms to help drones overcome jamming during final approach, the second to build thermal cameras to help those drones better sense their <span>surroundings.</span></p><p>“I moved from making devices that throw treats to dogs to making devices that throw explosives on Russian occupants,” Azhnyuk quips.</p><p>Since then, The Fourth Law has dispatched “more than thousands” of <a href="https://thefourthlaw.ai/#section3" target="_blank">autonomy modules</a> to troops in eastern Ukraine (it declines to give a more specific figure), which can be retrofitted on existing drones to take over navigation during the final <span>approach to the target. 
Azhnyuk says the modules, worth around US $50 each, raise the drone-strike success rate to as much as four times that of purely operator-controlled drones.</p><p>And that is just the beginning. Azhnyuk is one of thousands of developers, including some who relocated from Western countries, who are applying their skills and other resources to advancing the drone technology that is the defining characteristic of the war in Ukraine. This eclectic group of startups and founders includes <a href="https://en.wikipedia.org/wiki/Eric_Schmidt" target="_blank">Eric Schmidt</a>, the former <a href="https://about.google/company-info/" target="_blank">Google</a> CEO, whose company <a href="https://epravda.com.ua/oborona/milyarder-ta-ekskerivnik-google-robit-droni-dlya-ukrajini-shcho-nim-ruhaye-809495/" target="_blank">Swift Beat</a> is churning out autonomous <a href="https://www.nytimes.com/2025/12/31/magazine/ukraine-ai-drones-war-russia.html" target="_blank">drones and modules for Ukrainian forces</a>. The frenetic pace of tech development is helping a scrappy, innovative underdog hold at bay a much larger and better-equipped foe.</p><p>All of this development is careening toward AI-based systems that enable drones to navigate by recognizing features in the terrain, lock on to and chase targets without an operator’s guidance, and eventually exchange information with each other through mesh networks, forming self-organizing robotic kamikaze swarms. Such an attack swarm would be commanded by a single operator from a safe distance.</p><p>According to some reports, autonomous swarming technology is also being developed <a href="https://www.usni.org/magazines/proceedings/2025/may/step-step-ukraine-built-technological-navy" target="_blank">for sea drones</a>. Ukraine has had some notable successes with sea drones, which have reportedly destroyed or damaged <a href="https://en.usm.media/sbu-naval-drones-hit-11-russian-ships-and-vessels-details/" target="_blank">around a dozen</a> Russian vessels.</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Hand holding a drone with six rotors, outdoors against a blue sky." class="rm-shortcode" data-rm-shortcode-id="90f30978c5ba0e77e9b1873c155131d2" data-rm-shortcode-name="rebelmouse-image" id="7cf11" loading="lazy" src="https://spectrum.ieee.org/media-library/hand-holding-a-drone-with-six-rotors-outdoors-against-a-blue-sky.jpg?id=65341722&width=980"/><small class="image-media media-caption" placeholder="Add Photo Caption...">The Skynode X system, from Auterion, provides a degree of autonomy to a drone.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">AUTERION</small></p><p>For Ukraine, swarming can solve a major problem that puts the nation at a disadvantage against Russia—the lack of personnel. Autonomy is “the single most impactful defense technology of this century,” says Azhnyuk. “The moment this happens, you shift from a manpower challenge to a production challenge, which is much more manageable,” he adds.</p><p>The autonomous warfare future envisioned by Azhnyuk and others is not yet a reality.
But <a href="https://www.linkedin.com/in/marcclange/?skipRedirect=true" target="_blank">Marc Lange</a>, a German defense analyst and business strategist, believes that “an inflection point” is already in view. Beyond it, “things will be so dramatically different,” he says.</p><p>“Ukraine pretty rapidly realized that if the operator-to-drone ratio can be shifted from one-to-one to one-to-many, that creates great economies of scale and an amazing cost exchange ratio,” Lange adds. “The moment one operator can launch 100, 50, or even just 20 drones at once, this completely changes the economics of the war.”</p><h2>Drones With a View </h2><p>For a while, jammers that sever the radio links between drones and <span>operators or that spoof GPS receivers were able to provide fairly reliable defense against human-controlled first-person-view attack drones (FPVs). But as autonomous navigation progressed, those electronic shields have gradually become less effective. Defenders must now contend with unjammable drones—ones that are attached to hair-thin optical fibers or that are capable of </span><a href="https://spectrum.ieee.org/ukraine-killer-drones" target="_self">finding</a> <a href="https://spectrum.ieee.org/ukraine-killer-drones" target="_self">their way to their targets</a> without external guidance. In this emerging struggle, the defenders’ track records aren’t very encouraging: The typical countermeasure is to try to shoot down the attacking drone with a service weapon. It’s rarely successful.</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Truck on rural road covered with camouflage netting, trees and fields in the background." class="rm-shortcode" data-rm-shortcode-id="7c7af1e395cf35752b367f8dd54130fc" data-rm-shortcode-name="rebelmouse-image" id="58155" loading="lazy" src="https://spectrum.ieee.org/media-library/truck-on-rural-road-covered-with-camouflage-netting-trees-and-fields-in-the-background.jpg?id=65341708&width=980"/><small class="image-media media-caption" placeholder="Add Photo Caption...">A truck outfitted with signal-jamming gear drives under antidrone nets near Oleksandriya, in eastern Ukraine, on 2 October 2025.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">ED JONES/AFP/GETTY IMAGES</small></p><p>“The attackers gain an immense advantage from unmanned systems,” says Lange. “You can have a drone pop up from anywhere and it can wreak havoc. But from autonomy, they gain even more.”</p><p>The self-navigating drones rely on image-recognition algorithms that have been around for over a decade, says Lange. And the mass deployments of drones on Ukrainian battlefields are enabling both Russian and Ukrainian technologists to create <a href="https://www.reuters.com/technology/ukraine-collects-vast-war-data-trove-train-ai-models-2024-12-20/" target="_blank">huge datasets</a> that improve the training and precision of those AI algorithms.</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Six-wheeled robotic vehicle with mounted equipment in a grassy field." 
class="rm-shortcode" data-rm-shortcode-id="caa0a697b2d5752603687ac7f0278581" data-rm-shortcode-name="rebelmouse-image" id="1c591" loading="lazy" src="https://spectrum.ieee.org/media-library/six-wheeled-robotic-vehicle-with-mounted-equipment-in-a-grassy-field.jpg?id=65341706&width=980"/><small class="image-media media-caption" placeholder="Add Photo Caption...">A Ukrainian land robot, the Ravlyk, can be outfitted with a machine gun.</small></p><p>While uncrewed aerial vehicles (UAVs) have received the most attention, the Ukrainian military is also deploying dozens of different kinds of drones on land and sea. Ukraine, struggling with the shortage of infantry personnel, began working on replacing a portion of human soldiers with wheeled ground robots in 2024. As of early 2026, thousands of ground robots are crawling across the gray zone along the front line in Eastern Ukraine. Most are used to deliver supplies to the front line or to help evacuate the wounded, but some “killer” ground robots fitted with turrets and remotely controlled machine guns have also been tested.</p><p>In mid-February, Ukrainian authorities released a video of a Ukrainian ground robot using its thermal camera to detect a Russian soldier in the dark of the night and then kill the invader with a round from a heavy machine gun. So far these robots are mostly controlled <span>by a human operator, but the makers of these uncrewed ground vehicles say their systems are capable of basic autonomous operations, such as returning to base when radio connection is lost. The goal is to enable them to swarm so that one operator controls not one, but a whole herd of mesh-connected killer robots.</span></p><p>But <a href="https://www.hudson.org/experts/1303-bryan-clark" target="_blank">Bryan <span>Clark</span></a>, senior fellow and <span>director of the Center for Defense Concepts and Technology at the </span><a href="https://www.hudson.org/" target="_blank">Hudson Institute</a>, questions how quickly ground robots’ abilities can progress. “Ground environments are very difficult to navigate in because of the terrain you have to address,” he says. “The line of sight for the sensors on the ground vehicles is really constrained because of terrain, whereas an air vehicle can see everything around it.”</p><p>To achieve autonomy, <a href="https://spectrum.ieee.org/sea-drone" target="_self">maritime drones</a>, too, will require <span>naviga</span><span>tional approaches beyond AI-based image recognition, possibly based on star positions or electronic signals from radios and cell towers that are within reach, says Clark. Such technologies are still being developed or are in a relatively early operational stage.</span></p><h2>How the Shaheds Got Better</h2><p>Russia is not lagging behind. In fact, some analysts believe its autonomous systems may be slightly ahead of Ukraine’s. For a good example of the Russian military’s rapid <span>evolu</span><span>tion, they say, consider the long-range Iranian-designed Shahed drones. Since 2022, Russia has been using them to attack Ukrainian cities and other targets hundreds of kilometers from the front line. 
“At the beginning, Shaheds just had a frame, a motor, and an inertial navigation system,” <a href="https://www.linkedin.com/in/oleksii-solntsev-aa0b72189?originalSubdomain=ua" target="_blank">Oleksii Solntsev</a>, CEO of Ukrainian defense tech startup MaXon Systems, tells me. “They used to be imprecise and pretty stupid. But they are becoming more and more autonomous.” Solntsev founded MaXon Systems in late 2024 to help protect Ukrainian civilians from the growing threat of Shahed raids.</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Silhouette of a triangular drone flying in the sky." class="rm-shortcode" data-rm-shortcode-id="a9c89e21028ccf85e20a49ecead8309f" data-rm-shortcode-name="rebelmouse-image" id="72159" loading="lazy" src="https://spectrum.ieee.org/media-library/silhouette-of-a-triangular-drone-flying-in-the-sky.jpg?id=65341701&width=980"/><small class="image-media media-caption" placeholder="Add Photo Caption...">A Russian Geran-2 drone, based on the Iranian Shahed-136, flies over Kyiv during an attack on 27 December 2025.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">SERGEI SUPINSKY/AFP/GETTY IMAGES</small></p><p>First produced <a href="https://www.adaptinstitute.org/from-tehran-to-alabuga-the-evolution-of-shahed-drones-into-russias-strategic-asset/26/09/2025/" target="_blank">in Iran in the 2010s</a>, Shaheds can carry 90-kilogram warheads <a href="https://isis-online.org/isis-reports/alabugas-shahed-136-geran-2-warheads-a-dangerous-escalation" target="_blank">up to 650 km</a> (50-kg warheads can go twice as far). <a href="https://www.csis.org/analysis/calculating-cost-effectiveness-russias-drone-strikes" target="_blank">They cost around $35,000 per unit</a>, compared with a couple of million dollars, at least, for a ballistic missile. The low cost allows Russia to manufacture Shaheds in high quantities, unleashing entire fleets onto <a href="https://isis-online.org/isis-reports/a-comprehensive-analytical-review-of-russian-shahed-type-uavs-deployment-against-ukraine-in-2025" target="_blank">Ukrainian cities and infrastructure almost every night</a>.</p><p>The early Shaheds were able to reach a preprogrammed location based on satellite-navigation coordinates. Even these early models could frequently overcome the jamming of satellite-navigation signals with the help of an onboard inertial navigation unit. This was essentially a dead-reckoning system of accelerometers and gyroscopes that estimates the drone’s position from continual measurements of its motions.</p>
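<p>The principle is simple enough to sketch in a few lines of Python. The toy loop below is illustrative only, not the guidance code of any actual drone: a real inertial unit works in three dimensions and models gravity, attitude, and sensor bias. It shows why dead reckoning needs no external signal at all, and also why its error grows with time, since position comes purely from integrating the unit’s own motion measurements.</p><pre><code># Toy 2D dead-reckoning loop (illustrative only -- real inertial navigation
# units are far more elaborate). Position is estimated by integrating onboard
# motion measurements, so there is no external link to jam or spoof.
import math

def dead_reckon(samples, dt, heading=0.0, speed=0.0, x=0.0, y=0.0):
    """samples: (forward accel in m/s^2, yaw rate in rad/s) sensor pairs."""
    for accel, yaw_rate in samples:
        heading += yaw_rate * dt              # gyroscope: integrate turn rate
        speed += accel * dt                   # accelerometer: integrate accel
        x += speed * math.cos(heading) * dt   # advance the position estimate
        y += speed * math.sin(heading) * dt
    return x, y  # any small sensor bias accumulates into growing drift

# One minute of flight: a 10-second burst of acceleration, then a gentle turn.
imu = [(2.0, 0.0)] * 1000 + [(0.0, 0.002)] * 5000  # invented 10-ms samples
print(dead_reckon(imu, dt=0.01))
</code></pre>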
class="rm-shortcode" data-rm-shortcode-id="37186ec06b71203ba4f30db497507797" data-rm-shortcode-name="rebelmouse-image" id="1aca7" loading="lazy" src="https://spectrum.ieee.org/media-library/silhouette-of-person-with-large-equipment-under-a-starry-night-sky.jpg?id=65341699&width=980"/><small class="image-media media-caption" placeholder="Add Photo Caption...">In the Donetsk Region, on 15 August 2025, a Ukrainian soldier hunts for Shaheds and other drones with a thermalimaging system attached to a ZU23 23-millimeter antiaircraft gun.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">KOSTYANTYN LIBEROV/LIBKOS/GETTY IMAGES</small></p><p>Ukrainian defense forces learned to down Shaheds with heavy machine guns, but as Russia continued to innovate, the daily onslaughts started to become <a href="https://euromaidanpress.com/2025/06/29/why-cant-ukraine-stop-russias-shahed-drones-anymore/" target="_blank">increasingly effective.</a></p><p>Today’s Shaheds fly faster and higher, and therefore are more difficult to detect and take down. Between January 2024 and August 2025, the number of Shaheds and Shahed-type attack drones launched by Russia into Ukraine per month <a href="https://united24media.com/war-in-ukraine/why-russias-shahed-drones-are-now-deadlier-and-harder-than-ever-to-stop-11693" target="_blank">increased more than tenfold</a>, from 334 to more than 4,000. In 2025, Ukraine found <a href="https://www.unmannedairspace.info/counter-uas-systems-and-policies/recently-downed-russian-shahed-demonstrates-new-levels-of-autonomous-capability/" target="_blank">AI-enabling</a> <a href="https://www.unmannedairspace.info/counter-uas-systems-and-policies/recently-downed-russian-shahed-demonstrates-new-levels-of-autonomous-capability/" target="_blank">N</a><a href="https://www.unmannedairspace.info/counter-uas-systems-and-policies/recently-downed-russian-shahed-demonstrates-new-levels-of-autonomous-capability/" target="_blank">vidia</a> <a href="https://www.unmannedairspace.info/counter-uas-systems-and-policies/recently-downed-russian-shahed-demonstrates-new-levels-of-autonomous-capability/" target="_blank">chipsets in wreckages of Shaheds</a>, as well as thermal-vision modules capable of locking onto targets at night.</p><p>“Now, they are interconnected, which allows them to exchange information with each other,” Solntsev says. “They also have cameras that allow them to autonomously navigate to objects. Soon they will be able to tell each other to avoid a <span>jammed</span> <span>region or an area where one of them got </span><span>intercepted.”</span></p><p>These Russian-manufactured Shaheds, which Russian forces call Geran-2s, are thought to be more capable than the garden variety Shahed-136s that Iran has lately been launching against targets throughout the Middle East. Even the relatively primitive Shahed-136s have done considerable damage, according to <a href="https://www.theguardian.com/world/2026/mar/02/iran-unleashes-hundreds-of-drones-aimed-at-targets-across-middle-east" target="_blank">press accounts</a>.</p><p>Those Shahed successes may accrue, at least in part, from the fact that the United States and Israel <span>lack Ukraine’s long experience with fending them off. In just two days in early March, upward of a thousand drones, mostly Shaheds, were launched against U.S. 
and Israeli targets, with </span><a href="https://www.theguardian.com/world/2026/mar/02/iran-unleashes-hundreds-of-drones-aimed-at-targets-across-middle-east" target="_blank">hundreds of</a> <a href="https://www.theguardian.com/world/2026/mar/02/iran-unleashes-hundreds-of-drones-aimed-at-targets-across-middle-east" target="_blank">them reportedly finding their marks</a>.</p><p>One attack, caught on videotape, shows a Shahed destroying a radar dome at the U.S. navy base in <span>Manama, Bahrain. U.S. forces were understood to be </span><a href="https://carnegieendowment.org/emissary/2026/03/iran-drones-shahed-us-lessons" target="_blank">attempting to fend off the drones</a> by striking launch platforms, dispatching fighter aircraft to shoot them down, and by using some extremely costly air-defense interceptors, including ones meant to down ballistic missiles. On 4 March, <a href="https://www.cnn.com/2026/03/04/politics/us-air-defenses-iran-attack-drones-challenge" target="_blank">CNN</a> <a href="https://www.cnn.com/2026/03/04/politics/us-air-defenses-iran-attack-drones-challenge" target="_blank">reported</a> that in a congressional briefing the day before, top U.S. defense officials, including Secretary of Defense Pete Hegseth, acknowledged that U.S. air defenses weren’t keeping up with the onslaught of Shahed drones.</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Broken drone on soil, cylindrical container nearby." class="rm-shortcode" data-rm-shortcode-id="769830682ff53a401780108ca11db2b6" data-rm-shortcode-name="rebelmouse-image" id="c9d58" loading="lazy" src="https://spectrum.ieee.org/media-library/broken-drone-on-soil-cylindrical-container-nearby.jpg?id=65341692&width=980"/><small class="image-media media-caption" placeholder="Add Photo Caption...">Russian V2U attack drones are outfitted with Nvidia processors and run computer-vision software and AI algorithms to enable the drones to navigate autonomously.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">GUR OF THE MINISTRY OF DEFENSE OF UKRAINE</small></p><p>Russia is also starting to field a newer generation of attack drones. One of these, the V2U, has been used to strike targets in the Sumy region of northeastern Ukraine. <a href="https://euromaidanpress.com/2025/06/09/russias-v2u-drone-uses-ai-for-autonomous-strikes-in-ukraines-sumy-oblast/" target="_blank"><span>The V2U drones</span></a> are outfitted with Nvidia Jetson Orin processors and run <span>computer</span>-­<span>vision software and AI algorithms that allow the drones to navigate even where satellite navigation is jammed.</span></p><p>The sale of Nvidia chips to Russia is banned under U.S. sanctions against the country. However, press reports suggest that the chips are getting to Russia <a href="https://www.pravda.com.ua/eng/news/2024/10/28/7481703/" target="_blank">via intermediaries in India</a>.</p><h2>Antidrone Systems Step Up</h2><p>MaXon Systems is one of several companies working to fend off the nightly drone onslaught. Within one year, the company developed and battle-tested a Shahed interception system that hints at the sci-fi future envisioned by Azhnyuk. 
For a system to be capable of reliably defending against autonomous weaponry, it, too, needs to be autonomous.</p><p>MaXon’s solution consists of ground turrets that scan the sky with infrared sensors, with additional input from a network of radars that detects approaching Shahed drones at distances of, typically, <a href="https://en.defence-ua.com/weapon_and_tech/2025_systems_to_shield_kyiv_from_shaheds_new_air_defense_details_from_maxon_where_balloons_carry_interceptor_drones-15499.html" target="_blank">12 to 16</a> km. The turrets fire autonomous fixed-wing interceptor drones, fitted with explosive warheads, toward the approaching Shaheds at speeds of nearly 300 km/h. To boost the chances of successful interception, MaXon <a href="https://en.defence-ua.com/weapon_and_tech/2025_systems_to_shield_kyiv_from_shaheds_new_air_defense_details_from_maxon_where_balloons_carry_interceptor_drones-15499.html" target="_blank">is also fielding</a> an airborne anti-Shahed fortification system consisting of helium-filled <a href="https://spectrum.ieee.org/airships-drones-ukraine" target="_self">aerostats</a> hovering above the city that dispatch the interceptors from a higher altitude.</p><p>“We are trying to increase the level of automation of the system compared to existing solutions,” says Solntsev. “We need automatic detection, automatic takeoff, and automatic mid-track guidance so that we can guide the interceptor before it can itself lock onto the target.”</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Gray drone on display stand, surrounded by military personnel in camouflage uniforms." class="rm-shortcode" data-rm-shortcode-id="592b19dbfc4fe9a54033067c6169aeec" data-rm-shortcode-name="rebelmouse-image" id="ab79b" loading="lazy" src="https://spectrum.ieee.org/media-library/gray-drone-on-display-stand-surrounded-by-military-personnel-in-camouflage-uniforms.jpg?id=65341687&width=980"/><small class="image-media media-caption" placeholder="Add Photo Caption...">An interceptor drone, part of the U.S. MEROPS defensive system, is tested in Poland on 18 November 2025.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">WOJTEK RADWANSKI/AFP/GETTY IMAGES</small></p><p>In November 2025, the Ukrainian military announced it had been conducting successful trials of the <a href="https://www.forcesnews.com/nato/bang-your-buck-200m-worth-russian-drones-taken-out-15m-merops-uavs" target="_blank">Merops Shahed drone interceptor</a> system developed by the U.S. startup <a href="https://themerge.co/p/project-eagle" target="_blank">Project Eagle</a>, another of former Google CEO Eric Schmidt’s Ukraine defense ventures. Like the MaXon gear, the system can operate largely autonomously and has so far downed over 1,000 Shaheds.</p><h2>What Works in the Lab Doesn’t Necessarily Fly on the Battlefield</h2><p>Despite the progress on both sides, analysts say that the kind of robotic warfare imagined by Azhnyuk won’t be a reality for years.</p><p>“The software for drone collaboration is there,” says <a href="https://www.csis.org/people/kateryna-bondar" target="_blank">Kate Bondar</a>, a former policy advisor for the Ukrainian government and currently a research fellow at the U.S.
</span><a href="https://www.csis.org/" target="_blank">Center for Stra</a><a href="https://www.csis.org/" target="_blank">tegic and International Studies</a><span>. “Drones can fly in labs, but in real life, [the forces] are afraid to deploy them because the risk of a mistake is too high,” she adds.</span></p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Two people launching a drone in an open field using a catapult system." class="rm-shortcode" data-rm-shortcode-id="894baf9e936bef6f8c45a0363afac141" data-rm-shortcode-name="rebelmouse-image" id="7c4e9" loading="lazy" src="https://spectrum.ieee.org/media-library/two-people-launching-a-drone-in-an-open-field-using-a-catapult-system.jpg?id=65341682&width=980"/><small class="image-media media-caption" placeholder="Add Photo Caption...">Ukrainian soldiers watch a GOR reconnaissance drone take to the sky near Pokrovsk in the Donetsk region, on 10 March 2025.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">ANDRIY DUBCHAK/FRONTLINER/GETTY IMAGES</small></p>In Bondar’s view, powerful AI-equipped drones won’t be deployed in large numbers given the current prices for high-end processors and <span>other advanced components. And, she adds, the more autonomous the system needs to be, the more expensive are the processors and sensors it must have. “For these cheap attack drones that fly only once, you don’t install a high-resolution camera that [has] the resolution for AI to see properly,” she says. “[You install] the cheapest camera. You don’t </span><span>want expensive chips that can run AI algorithms either. Until we can achieve this balance of technological sophistication, when a system can conduct a mission but at the lowest price possible, it won’t be deployed en masse.”</span><p>While existing AI systems are doing a good job recognizing and following large objects like Shaheds or tanks, experts question their ability to reliably distinguish and pursue smaller and more nimble or inconspicuous targets. “When we’re getting into more specific questions, like can it distinguish a Russian soldier from a Ukrainian soldier or at least a soldier from a civilian? The answer is no,” says Bondar. “Also, it’s one thing to track a tank, and it’s another to track infantrymen riding buggies and motorcycles that are moving very fast. That’s really challenging for AI to track and strike precisely.”</p><p>Clark, at the Hudson Institute, says that although the AI algorithms used to guide the Russian and <span>Ukrainian drones are “pretty good,” they rely on information provided bysensors that “aren’t good enough.” “You need multiphenomenology sensors that are able to look at infrared and visual and, in some cases, different parts of the infrared spectrum to be able to figure out if something is a decoy or real target,” </span><span>he </span><span>says.</span></p><p><span>German defense analyst Lange agrees that right now, battlefield AI image-recognition systems are too easily fooled. “If you compress reality into a </span><span>2D</span><span> image, a lot of things can be easily camouflaged—like what Russia did recently, when they started drawing birds on the back of their drones,” he <span>says.</span></span></p><h2>Autonomy Remains Elusive on the Ground and at Sea, Too</h2><p>To make Ukraine’s <span>emerging uncrewed ground vehicles (UGVs) equally self-sufficient will be an even greater task, in Clark’s view. 
Still, </span><span>Bondar expects major advances to materialize within the next several years, even if humans are still going to be part of the decision-making loop.</span></p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Military radar equipment in a grassy field." class="rm-shortcode" data-rm-shortcode-id="0b36a03b7582535b3d3319d7d9b74c33" data-rm-shortcode-name="rebelmouse-image" id="d65ea" loading="lazy" src="https://spectrum.ieee.org/media-library/military-radar-equipment-in-a-grassy-field.jpg?id=65341671&width=980"/><small class="image-media media-caption" placeholder="Add Photo Caption...">A mobile electronic-warfare system built by PiranhaTech is demonstrated near Kyiv on 21 October 2025.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">DANYLO ANTONIUK/ANADOLU/GETTY IMAGES</small></p><p>“I think in two or three years, we will have pretty good full autonomy, at least in good weather conditions,” she says, referring to aerial drones in partic<span>ular. “Humans will still be in the loop for some years, simply because there are so many unpredictable situations when you need an intervention. We won’t be able to fully rely on the machine for at least another 10 or 15 years.”</span></p><p>Ukrainian defenders are apprehensive about that autonomous future. The boom of drone inno<span>vation has come hand in hand with the development of sophisticated jamming and radio-frequency detection systems. But a lot of that innovation will become obsolete once the pendulum swings away from human control. Ukrainians got their first taste of dealing with unjammable drones in mid-2024, when Russia began rolling out fiber-optic tethered drones. Now they have to brace for a threat on a much larger scale.</span></p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Quadcopter drone flying with a fire extinguisher attached in a cloudy sky." class="rm-shortcode" data-rm-shortcode-id="70f326221988cb6004338272d1d8dd4d" data-rm-shortcode-name="rebelmouse-image" id="aa25d" loading="lazy" src="https://spectrum.ieee.org/media-library/quadcopter-drone-flying-with-a-fire-extinguisher-attached-in-a-cloudy-sky.jpg?id=65341673&width=980"/><small class="image-media media-caption" placeholder="Add Photo Caption...">An experimental drone is demonstrated at the Brave1 defense-tech incubator in Kyiv.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">DANYLO DUBCHAK/FRONTLINER/GETTY IMAGES</small></p><p>“Today, we have a situation where we have lots of signals on the battlefield, but in the near future, <span>in maybe two to five years, UAVs are not going to be sending any signals,” says Oleksandr Barabash, CTO of </span><a href="https://www.falcons.com.ua/en" target="_blank">Falcons</a>, a Ukrainian startup that has developed a smart radio-frequency detection system capable <span>of revealing precise locations of enemy radio sources such as drones, control stations, and jammers.</span></p><p>Last September, Falcons secured funding from the U.S.-based dual-use tech fund <a href="https://www.greenflag.vc/" target="_blank">Green Flag Ven</a><a href="https://www.greenflag.vc/" target="_blank">tures</a> to scale production of its technology and work toward NATO certification. But Barabash admits that its system, like all technologies fielded in <span>Ukrainian war zones, has an expiration date. 
Instead of radio-frequency detectors, Barabash thinks, the next R&D push needs to focus on passive radar systems capable of identifying small, fast-moving targets by using signals from ambient sources such as TV towers and radio transmitters, signals that propagate through the environment and are reflected by those moving targets. Passive radars have a significant advantage in the war zone, according to Barabash: Because they don’t emit a signal of their own, they can’t be easily discovered by the enemy.</p><p>“Active radar is emitting signals, so if you are using active radars, you are target No. 1 on the front line,” Barabash says.</p>
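<p>The core signal processing behind a passive radar can be illustrated with a toy cross-correlation, sketched below in Python (using NumPy). This is not Falcons’ system or any fielded design, and every number in it is invented: the idea is simply that a receiver compares a reference copy of a broadcaster’s signal with what a second antenna hears, and a reflection off a target shows up as a correlation peak at the echo’s extra propagation delay.</p><pre><code># Toy passive-radar sketch (illustrative only -- not any fielded system).
# A broadcast signal reaches the receiver directly and, slightly later, as a
# faint reflection off a target. Cross-correlating the surveillance channel
# against the direct reference reveals the echo's delay, and the defender
# never transmits anything of its own.
import numpy as np

rng = np.random.default_rng(0)
fs = 1_000_000                         # 1 MHz sample rate (invented)
ref = rng.standard_normal(fs // 100)   # 10 ms of noise-like broadcast signal

delay = 333                            # echo delay in samples; at the speed
echo = np.zeros_like(ref)              # of light, ~100 km of extra path
echo[delay:] = 0.05 * ref[:-delay]     # weak, delayed copy off the target
surv = ref + echo + 0.1 * rng.standard_normal(ref.size)

# Scan candidate delays; the correlation peak exposes the target's echo.
lags = np.arange(1, 1000)
corr = [abs(np.dot(surv[lag:], ref[:-lag])) for lag in lags]
print("estimated echo delay:", lags[int(np.argmax(corr))], "samples")
</code></pre>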
<p>Bondar, on the other hand, thinks that the increased onboard compute power needed for AI-controlled drones will, by itself, generate enough electromagnetic radiation to prevent autonomous drones from ever operating completely undetectably.</p><p>“You can have full autonomy, but you will still have systems onboard that emit electromagnetic radiation or heat that can be detected,” says Bondar. “Batteries emit electromagnetic radiation, motors emit heat, and [that heat can be] visible in infrared from far away. You just need to have the right sensors to be able to identify it in advance.” She adds that the takeaway is “how capable contemporary detection systems have become and how technically challenging it is to design drones that can reliably operate in the Ukrainian battlefield environment.”</p><h2>There Will Be Nowhere to Hide from Autonomous Drones</h2><p>When autonomous drones become a standard weapon of war, their threat will extend far beyond the battlefields of Ukraine. Autonomous turrets and drone-interceptor fortifications might soon dot the perimeters of European cities, particularly in the eastern part of the continent.</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="Person holding gray drone against a blue sky, preparing to launch it." class="rm-shortcode" data-rm-shortcode-id="c480e8fb2bdf2e560c142729e35c7320" data-rm-shortcode-name="rebelmouse-image" id="f9032" loading="lazy" src="https://spectrum.ieee.org/media-library/person-holding-gray-drone-against-a-blue-sky-preparing-to-launch-it.jpg?id=65327903&width=980"/><small class="image-media media-caption" placeholder="Add Photo Caption...">A fixed-wing drone is tested in Ukraine in April 2025.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">ANDREW KRAVCHENKO/BLOOMBERG/GETTY IMAGES</small></p><p>Nefarious actors from all over the world have closely watched Ukraine and taken notes, warns Lange. Today, FPV drones are being used by <a href="https://gnet-research.org/2025/07/30/weaponised-skies-the-expansion-of-terrorist-drone-use-across-africa/" target="_blank">Islamic terrorists in Africa</a> and <a href="https://www.atlanticcouncil.org/blogs/new-atlanticist/drug-cartels-are-adopting-cutting-edge-drone-technology-heres-how-the-us-must-adapt/#%3A~%3Atext%3DIf%20confirmed%2C%20this%20would%20suggest%2CUS%20homeland%20security%E2%80%94are%20profound" target="_blank">Mexican drug cartels</a> to fight against local authorities.</p><p>When autonomous killing machines become widely available, it’s likely that no city will be safe. “We might see nets above city centers, protecting civilian streets,” Lange says. “In every case, the West needs to start performing similar kinetic-defense development that we see in Ukraine. Very rapid iteration and testing cycles to find solutions.”</p><p>Azhnyuk is concerned that the historic defenders of Europe—the United States and the European countries themselves—are falling behind. “We are in danger,” he says. While Russia and Ukraine made major strides in their drones and countermeasures over the past year, “Europe and the United States have progressed, in the best-case scenario, from the winter-of-2022 technology to the summer-of-2022 technology.</p><p>“The gap is getting wider,” he warns. “I think the next few years are very dangerous for the security of Europe.” <span class="ieee-end-mark"></span></p><p><em>This article appears in the April 2026 print issue as “Rise of the <span>AUTONOMOUS </span>Attack Drones.”</em></p>]]></description><pubDate>Tue, 24 Mar 2026 13:00:05 +0000</pubDate><guid>https://spectrum.ieee.org/autonomous-drone-warfare</guid><category>Military-robots</category><category>Military-drones</category><category>Drone-war</category><category>Shahed-drones</category><category>Ai-agents</category><dc:creator>Tereza Pultarova</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/person-holding-a-large-drone-outdoors-under-a-sunny-partly-cloudy-sky.jpg?id=65327386&amp;width=980"></media:content></item><item><title>Video Friday: Humanoid Learns Tennis Skills Playing Humans</title><link>https://spectrum.ieee.org/tennis-playing-robot</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/robot-playing-tennis-holding-racket-on-green-court-inset-shows-human-opponent-hitting-ball.png?id=65325604&width=2000&height=1500&coordinates=264%2C0%2C265%2C0"/><br/><br/><p>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at <em>IEEE Spectrum</em> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please <a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a> for inclusion.</p><h5><a href="https://2026.ieee-icra.org/">ICRA 2026</a>: 1–5 June 2026, VIENNA</h5><h5><a href="https://mrs.fel.cvut.cz/summer-school-2026/">Summer School on Multi-Robot Systems</a>: 29 July–4 August 2026, PRAGUE</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote class="rm-anchors" id="23zsarayx6o"><em>Human athletes demonstrate versatile and highly dynamic tennis skills to successfully conduct competitive rallies with a high-speed tennis ball. However, reproducing such behaviors on humanoid robots is difficult, partially due to the lack of perfect humanoid action data or human kinematic motion data in tennis scenarios as reference. In this work, we propose LATENT, a system that Learns Athletic humanoid TEnnis skills from imperfect human motioN daTa.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="b359b1966adb83fc68515b1a4514b8ca" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/23ZsaraYX6o?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://zzk273.github.io/LATENT/">LATENT</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="cwithpe4hna">A beautifully designed robot inspired by Strandbeests.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="1c60a43596b696ace279c9366e02ecd4" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/CwItHPe4HnA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.cranfield.ac.uk/press/news-2026/wind-powered-robot-could-enable-long-term-exploration-of-hostile-environments">Cranfield University</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="uvqdqf8ppuw"><em>We believe we’re the first robotics company to demonstrate a robot peeling an apple with dual dexterous humanlike hands. This breakthrough closes a key gap in robotics, achieving bimanual, contact-rich manipulation and moving far beyond the limits of simple grippers.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="2c50d7039587c10b8f33da57970bff7f" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/UVQdqf8ppuw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><blockquote><em>Today’s AI models (VLMs) are excellent at perception but struggle with action. 
Controlling high-degree-of-freedom hands for tasks like this is incredibly complex, and precise finger-level teleoperation is nearly impossible for humans.  Our first step was a shared-autonomy system: rather than controlling every finger, the operator triggers prelearned skills like a “rotate apple or tennis ball” primitive via a keyboard press or pedal. This makes scalable data collection and RL training possible.</em><br/><em>How does the AI manage this? We created “<a data-linked-post="2674040994" href="https://spectrum.ieee.org/video-friday-google-gemini-robotics" target="_blank">MoDE-VLA</a>” (Mixture of Dexterous Experts). It fuses vision, language, force, and touch data by using a team of specialist “experts,” making control in high-dimensional spaces stable and effective.  The combination of these two innovations allows for seamless, contact-rich manipulation. The human provides high-level guidance, and the robot executes the complex in-hand coordination required.</em></blockquote><p>[ <a href="https://www.sharpa.com/">Sharpa</a> ]</p><p>Thanks, Alex!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="pczsnnwxvia"><em>It was great to see our name amongst the other “AI Native” companies during the <a data-linked-post="2676218078" href="https://spectrum.ieee.org/nvidia-groq-3" target="_blank">NVIDIA GTC</a> keynote. NVIDIA Isaac Lab helps us train reinforcement learning policies that enable the UMV to drive, jump, flip, and hop like a pro.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="7b935f0fe975b31f175c1f1fb07566e0" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/pcZSNNWXviA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://rai-inst.com/">Robotics and AI Institute</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="iojvnq-zhww"><em>This Finger-Tip Changer technology was jointly researched and developed through a collaboration between Tesollo and RoCogMan LaB at Hanyang University ERICA. The project integrates Tesollo’s practical robotic hand development experience with the lab’s expertise in robotic manipulation and gripper design.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="02d553395b82e93112b8f1739a601bd4" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/iojvNQ-Zhww?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>I don’t know why more robots don’t do this. Also, those pointy fingertips are terrifying.</p><p>[ <a href="http://bmr.hanyang.ac.kr/">RoCogMan LaB</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="z55m_um_7fq">Here’s an upcoming ICRA paper from the Fluent Robotics Lab at the University of Michigan along with the <a href="https://www.laas.fr/en/teams/ris/" target="_blank">Robotics and Interactions Team at LAAS-CNRS</a> featuring an operational <a data-linked-post="2650254910" href="https://spectrum.ieee.org/this-is-what-pr2s-do-for-fun" target="_blank">PR2</a>! 
With functional batteries!!!</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="df662d906aa6b4c85644b271ad7a281c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/z55M_um_7fQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://fluentrobotics.com/">Fluent Robotics Lab</a> ] and [ <a href="https://www.laas.fr/en/teams/ris/" target="_blank">RIS</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="9qzctmarvpk"><em>This video showcases the field tests and interaction capabilities of KAIST Humanoid v0.7, developed at the DRCD Lab featuring in-house actuators. The control policy was trained through deep reinforcement learning leveraging human demonstrations.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="6868cb35447265d5d8ab10642b15acd5" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/9qZcTMARvpk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://dynamicrobot.kaist.ac.kr/">KAIST DRCD Lab</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="_wnckaf2gb8">This needs to come in adult size.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="489e194d2beb7942474b8da6039ec082" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/_WNckAf2GB8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.deeprobotics.cn/en">Deep Robotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="k5wgpmenpcq">I did not know this, but apparently shoeboxes are really annoying to manipulate because if you grab them by the lid, they just open, so specialized hardware is required.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="b2f884b6d81248335c4efbff6414e328" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/k5WGpMENPCQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://nomagic.ai/news/zalando-to-install-up-to-50-ai-powered-nomagic-robots/">Nomagic</a> ]</p><p>Thanks, Gilmarie!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="clfpxcpza14"><em>This paper presents a method to recover quadrotor Unmanned Air Vehicles (UAVs) from a throw, when no control parameters are known before the throw.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="da02ec67edcf7a40100d406b105b468a" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/CLFPXcpzA14?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a 
href="https://ieeexplore.ieee.org/document/10801514">MAVLab</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="pmetcxgumhm">Uh-oh, robots can see glass doors now. We’re in trouble.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="31ecd9975c0baef1553d7e3372c79b98" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/pMeTCxGumhM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.limxdynamics.com/en/products/oli">LimX Dynamics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="pshyocgoc5u">This drone hugs trees <3</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="12d5d406c1777d91e696c722d9f0fba1" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/pSHYocGOC5U?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://slap-perching.github.io/">Stanford BDML</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="afviggntkm8"><em>Electronic waste is one of the fastest-growing environmental problems in the world. As robotics and electronic systems become more widespread, their environmental footprint continues to increase. In this research, scientists developed a fully biodegradable soft robotic system that integrates electronic devices, sensors, and actuators yet completely decomposes after use.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="75a180ef9157983647255f5588abe215" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/AFVIGgntKm8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.nature.com/articles/s41893-026-01780-4">Nature</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="yhyvrk9wce8"><em>We developed a distributed algorithm that enables multiple aerial robots to flock together safely in complex environments, without explicit communication or prior knowledge of the surroundings, using only onboard sensors and computation. Our approach ensures collision avoidance, maintains proximity between robots, and handles uncertainties (tracking errors and sensor noise). 
Tested in simulations and real-world experiments with up to four drones in a dense forest, it proved robust and reliable.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="0ff57e0c9dc071bc6306ca0c3798c944" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/yHyvrk9WCE8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://mrs.fel.cvut.cz/rbl">RBL</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="b3v-ylwcaee"><em>The University of Pennsylvania’s 2025 President’s Sustainability Prize winner Piotr Lazarek has developed a system that uses satellite data to pinpoint inefficiencies in farmers’ fields, conducts real-time soil analysis with autonomous drones to understand why they occur, and generates precise fertilizer application maps. His startup Nirby aims to increase productivity in farm areas that are underperforming and reduce fertilizer in high-performing ones.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="796256fd6880d5e76310d5685661fa67" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/b3v-yLwcAEE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://penntoday.upenn.edu/news/2025-penn-presidents-sustainability-prize-recipient-nirby">University of Pennsylvania</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="wl0-pu_8f0u"><em>The production version of Atlas is a departure from the typical humanoid form factor, favoring industrial utility over human likeness. Intended for purposeful work in an industrial setting, Atlas has a form factor that signals its role as a machine rather than a companion or friendly assistant. Join two lead hardware engineers and our head of industrial design for a technical discussion of how key product requirements, ranging from passive thermal management to a modular architecture, dictated a bold new vision for a humanoid.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="cce82a9b133af0d383e29e75c54cb937" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/wL0-Pu_8F0U?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://bostondynamics.com/blog/atlas-evolution-from-research-robot-to-industrial-humanoid/">Boston Dynamics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="cmbbkd46z48"><em>Dr. 
Christian Hubicki gives a talk exploring the common themes of modern robotics research and his time on the reality competition show, “Survivor.”</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="04cf1a709c7b176620b8d56b2629431a" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/CmBbkd46Z48?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.optimalroboticslab.com/">Optimal Robotics Lab</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Sat, 21 Mar 2026 16:30:04 +0000</pubDate><guid>https://spectrum.ieee.org/tennis-playing-robot</guid><category>Humanoid-robots</category><category>Video-friday</category><category>Robot-locomotion</category><category>Nvidia</category><category>Robot-manipulation</category><category>Quadruped-robots</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/robot-playing-tennis-holding-racket-on-green-court-inset-shows-human-opponent-hitting-ball.png?id=65325604&amp;width=980"></media:content></item><item><title>Overcoming Core Engineering Barriers in Humanoid Robotics Development</title><link>https://content.knowledgehub.wiley.com/engineering-challenges-and-component-strategies-in-humanoid-robotics-from-prototype-to-production/</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/logo-of-murata-in-red-with-text-innovator-in-electronics-below.png?id=65106483&width=980"/><br/><br/><p><span>A technical examination of the sensing, motion control, power, and thermal challenges facing humanoid robotics engineers — with component-level design strategies for real-world deployment.</span></p><p><span>What Attendees will Learn</span></p><ol><li><span>Why motion control remains the hardest unsolved problem — Explore the modelling complexity, real-time feedback requirements, and sensor fusion demands of maintaining stable bipedal locomotion across dynamic environments.</span></li><li><span>How sensing architectures enable perception and safety — Understand the role of inertial measurement units, force/torque feedback, and tactile sensing in achieving reliable human-robot interaction and collision avoidance.</span></li><li><span>What power and thermal constraints mean for system design — Examine the trade-offs in battery chemistry selection (LFP vs. NCA), DC/DC converter topologies, and thermal protection strategies that determine operational endurance.</span></li><li><span>How the industry is transitioning from prototype to mass production — Learn about the shift toward modular architectures, cost-driven component selection, and supply chain readiness projected for the late 2020s.</span></li></ol><p><a href="https://content.knowledgehub.wiley.com/engineering-challenges-and-component-strategies-in-humanoid-robotics-from-prototype-to-production/" target="_blank">Download this free whitepaper now!</a></p>]]></description><pubDate>Thu, 19 Mar 2026 10:00:05 +0000</pubDate><guid>https://content.knowledgehub.wiley.com/engineering-challenges-and-component-strategies-in-humanoid-robotics-from-prototype-to-production/</guid><category>Sensor-fusion</category><category>Type-whitepaper</category><category>Motion-control</category><category>Humanoid-robots</category><dc:creator>Murata Manufacturing Co.</dc:creator><media:content medium="image" type="image/png" url="https://assets.rbl.ms/65106483/origin.png"></media:content></item><item><title>Video Friday: These Robots Were Born to Run</title><link>https://spectrum.ieee.org/legged-modular-robot</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/rolling-cannon-distant-cityscape-trees-and-water.gif?id=65282014&width=2000&height=1500&coordinates=160%2C0%2C160%2C0"/><br/><br/><p><span>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at </span><em>IEEE Spectrum</em><span> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please </span><a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a><span> for inclusion.</span></p><h5><a href="https://2026.ieee-icra.org/">ICRA 2026</a>: 1–5 June 2026, VIENNA</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote class="rm-anchors" id="8vksx1zsg7q"><em>All legged robots deployed “in the wild” to date were given a body plan that was predefined by human designers and could not be redefined in situ. The manual and permanent nature of this process has resulted in very few species of agile terrestrial robots beyond familiar four-limbed forms. Here, we introduce highly athletic modular building blocks and show how they enable the automatic design and rapid assembly of novel agile robots that can “hit the ground running” in unstructured outdoor environments.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="508a07a4b7d915c6cfd07081bdc63e86" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/8VKSx1zSg7Q?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://robotics.northwestern.edu/news-events/index.html" target="_blank">Northwestern University Center for Robotics and Biosystems</a> ] [ <a href="https://www.pnas.org/doi/10.1073/pnas.2519129123">Paper</a> ] via [ <a href="https://gizmodo.com/these-self-configuring-modular-robots-may-one-day-rule-the-world-2000731381">Gizmodo</a> ] </p><div class="horizontal-rule"></div><p class="rm-anchors" id="l2q3kpl4mjq">If you were going to develop the ideal urban delivery robot more or less from scratch, it would be this.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ba83a841b32a7807384eeb10bc2c6b03" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/l2q3kPl4mJQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.rivr.ai/rivr-two">RIVR</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="cadtjepdbfc">Don’t get me wrong, there are some clever things going on here, but I’m still having a lot of trouble seeing where the unique, sustainable value is for a <a data-linked-post="2666662286" href="https://spectrum.ieee.org/humanoid-robots" target="_blank">humanoid robot</a> performing these sorts of tasks.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="b6313fcff2b0315bed664e00897cf53a" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/CAdTjePDBfc?rel=0" 
style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.figure.ai/news/helix-02-living-room-tidy">Figure</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="xyhob9__qk0">One of those things that you don’t really think about as a human, but which is actually pretty important.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="53ef3877dae03acc90a17fd9dcba1e6b" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/xYhOb9__Qk0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://arxiv.org/abs/2602.05760">Paper</a> ] via [ <a href="https://rsl.ethz.ch/" target="_blank">ETH Zurich</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="wi6u8bvofvc"><em>We propose TRIP-Bag (Teleoperation, Recording, Intelligence in a Portable Bag), a portable, puppeteer-style teleoperation system fully contained within a commercial suitcase, as a practical solution for collecting high-fidelity manipulation data across varied settings.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="4d39bbac7f62958700b13bfd53bc8bfd" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Wi6U8bvoFvc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://uiuckimlab.github.io/TRIP-Bag-pages/">KIMLAB</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="nuouwhuzpwq"><em>We propose an open-vocabulary semantic exploration system that enables robots to maintain consistent maps and efficiently locate (unseen) objects in semi-static real-world environments using LLM-guided reasoning.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="1724903fcd3e1d57df45824508205a87" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/nUouwHUZPWQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.tum.de/en/news-and-events/all-news/press-releases/details/search-robot-thinks-for-itself">TUM</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="vrxamllkjko">That’s it, folks. We have no need for real pandas anymore—if we ever did in the first place. 
Be honest, what has a <a data-linked-post="2675288239" href="https://spectrum.ieee.org/robot-martial-arts" target="_blank">panda</a> done for you lately?</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="af51d3f513d68d80617dd0b62738a1bb" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/VRxAMLlkjko?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.magiclab.top/en/">MagicLab</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="uhd6o6dem_o"><em>RoboGuard is a general-purpose guardrail for ensuring the safety of LLM-enabled robots. RoboGuard is configured offline with high-level safety rules and a robot description, reasons about how these safety rules are best applied in robot’s context, then synthesizes a plan that maximally follows user preferences while ensuring safety.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="bfc2abc33b815af7c16c37617a485a87" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/uhd6O6DEM_o?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://robo-guard.github.io/">RoboGuard</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="5ekki51q1sk"><em>In this demonstration, a small team responds to a (simulated) radiation contamination leak at a real nuclear reactor facility. The team deploys their reconfigurable robot to accompany them through the facility. As the station is suddenly plunged into darkness, the robot’s camera is hot-swapped to thermal so that it can continue on. Upon reaching the approximate location of the contamination, the team installs a Compton gamma-ray camera and pan-tilt illuminating device. The robot autonomously steps forward, locates the radiation source, and points it out with the illuminator.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="7928c582f10167b05ca04c694c729b67" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/5ekKI51q1Sk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://ieeexplore.ieee.org/document/11078050">Paper</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="zcnmhsg5bpw"><em>On March 6, 2025, the Robomechanics Lab at CMU was flooded with 4 feet of black water (i.e., mixed with sewage). We lost most of the robots in the lab, and as a tribute, my students put together this “In Memoriam” video. 
It includes some previously unreleased robots and video clips.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="e1739161841c3f7f5fb2ae563d8b15bc" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/zcnMHsg5Bpw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.cmu.edu/me/robomechanicslab/">Carnegie Mellon University Robomechanics Lab</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="i3goczr4ya0">There haven’t been a lot of successful <a data-linked-post="2650267089" href="https://spectrum.ieee.org/your-kid-wants-a-thymio-ii-education-robot" target="_blank">education robots</a>, but here’s one of them.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="971e1ccf67faed9fa1a9a5292d6b5b49" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/i3goCzr4YA0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://sphero.com/collections/all/products/rvr?variant=42004659142701">Sphero</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="35i9m-jc0oc">The opening keynote from the 2025 Silicon Valley Humanoids Summit: “Insights Into Disney’s Robotic Character Platform,” by Moritz Baecher, Director, Zurich Lab, Disney Research.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="a7fc3671608ce481554dac55c022d319" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/35i9M-jc0Oc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://humanoidssummit.com/">Humanoids Summit</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 13 Mar 2026 16:00:04 +0000</pubDate><guid>https://spectrum.ieee.org/legged-modular-robot</guid><category>Robotics</category><category>Humanoid-robots</category><category>Video-friday</category><category>Modular-robots</category><category>Robot-videos</category><category>Quadruped-robots</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/gif" url="https://spectrum.ieee.org/media-library/rolling-cannon-distant-cityscape-trees-and-water.gif?id=65282014&amp;width=980"></media:content></item><item><title>Video Friday: A Robot Hand With Artificial Muscles and Tendons</title><link>https://spectrum.ieee.org/video-friday-robot-hand-artificial-muscles</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/robotic-hand-grasping-a-red-bull-can-against-a-dark-background.png?id=65162441&width=2000&height=1500&coordinates=113%2C0%2C113%2C0"/><br/><br/><p><span>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at </span><em>IEEE Spectrum</em><span> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please </span><a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a><span> for inclusion.</span></p><h5><a href="https://2026.ieee-icra.org/">ICRA 2026</a>: 1–5 June 2026, VIENNA</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote class="rm-anchors" id="hd1hdfw1bhy"><em>The functional replication and actuation of complex structures inspired by nature is a longstanding goal for humanity. Creating such complex structures combining soft and rigid features and actuating them with artificial muscles would further our understanding of natural kinematic structures. We printed a biomimetic hand in a single print process composed of a rigid skeleton, soft joint capsules, tendons, and printed touch sensors.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d1520429687b7c6ef41cd204b2161ddc" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/hD1HDFw1BhY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://ieeexplore.ieee.org/abstract/document/10522043">Paper</a> ] via [ <a href="https://srl.ethz.ch/">SRL</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="u18ehtnvfd4">Two <a href="https://spectrum.ieee.org/tag/boston-dynamics" target="_blank">Boston Dynamics</a> product managers talk about their favorite classic BD robots, and then I talk about mine.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ee14e75b8b4fac354bdb72fef9eb1549" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/U18EHTnvFd4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>And this is Boston Dynamics’ LittleDog, doing legged locomotion research 16 or so years ago in what I’m pretty sure is Katie Byl’s lab at UCSB.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="27626ccc6010288122cf616a0f35aa3d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/AdWpo43b2FI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://bostondynamics.com/about/history/">Boston Dynamics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="gocorcrlgb4"><em>This is our latest work on the trajectory planning method for floating-based articulated robots, enabling the global path for searching in complex and cluttered environments.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span 
class="rm-shortcode" data-rm-shortcode-id="f2d2afeed034c4c40136e41360360951" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/GOcorcrLGb4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.dragon.t.u-tokyo.ac.jp/">DRAGON Lab</a> ]</p><p>Thanks, Moju!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="i2jmf_z9ts8"><em>OmniPlanner is a unified solution for exploration and inspection-path planning (as well as target reach) across aerial, ground, and underwater robots. It has been verified through extensive simulations and a multitude of field tests, including in underground mines, ballast water tanks, forests, university buildings, and submarine bunkers.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="fcaa6b98fc3995a5010528eb89bb8f14" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/I2JMF_Z9tS8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.autonomousrobotslab.com/">NTNU</a> ]</p><p>Thanks, Kostas!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="a_hwcpqbbly"><em>In the ARISE project, the <a href="https://www.fzi.de/en/" target="_blank">FZI Research Center for Information Technology</a> and its international partners ETH Zurich, University of Zurich, University of Bern, and University of Basel took a major step toward future lunar missions by testing cooperative autonomous multirobot teams under outdoor conditions.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="17b9634bb780c7e02ba8230822684990" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/a_hwCPQbBlY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.fzi.de/en/2025/02/26/one-step-closer-to-the-moon-through-international-cooperation/">FZI</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="dmbjbwhwyeu">Welcome to the future, where there are no other humans.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="05f12866fdd4c32a9372563b0d407f5d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/DmbJbwhWYEU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.zj-humanoid.com/">Zhejiang Humanoid</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="8oot8cnpai0"><em>This is our latest work on robotic fish, and it’s also the first underwater robot from DRAGON Lab. 
</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="2e719c55aa3bd82ab9f1c1123ecfe88f" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/8oot8CnpAi0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.dragon.t.u-tokyo.ac.jp/">DRAGON Lab</a> ]</p><p>Thanks, Moju!</p><div class="horizontal-rule"></div><p class="rm-anchors" id="awrnl8rcbmk">Watch this one simple trick to make <a href="https://spectrum.ieee.org/topic/robotics/humanoid-robots/" target="_blank">humanoid robots</a> cheaper and safer!</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="e69112d5d83fdcd0226a652b2b7cb898" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/AwRnL8rcBmk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.zj-humanoid.com/">Zhejiang Humanoid</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="90twy79yffo"><em>Gugusse and the Automaton</em> is a 1897 French film by <a href="https://en.wikipedia.org/wiki/Georges_M%C3%A9li%C3%A8s" target="_blank">Georges Méliès</a> featuring a humanoid robot in a depiction that’s nearly as realistic as some of the humanoid promo videos we’ve seen lately.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="4d2d9a469b74b0b57aa6d34c9859e471" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/90tWY79YfFo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.loc.gov/item/2026125501/?loclr=blogloc">Library of Congress</a> ] via [ <a href="https://gizmodo.com/first-film-to-depict-a-robot-discovered-in-michigan-2000727995">Gizmodo</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="lm3htxushva"><em>At Agility, we create automated solutions for the hardest work. 
We’re incredibly proud of how far we’ve come, and can’t wait to show you what’s next.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="007204bb742016f199f77925109d19ef" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/LM3hTXUShvA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.agilityrobotics.com/">Agility</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="si-jhnqcjt0"><a href="https://www.nist.gov/people/kamel-s-saidi" target="_blank">Kamel Saidi</a>, robotics program manager at the <a href="https://www.nist.gov/" target="_blank">National Institute of Standards and Technology (NIST)</a>, on how performance standards can pave the way for humanoid adoption.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="4484b448f54ec06f10f3985953b03c9b" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/sI-jhnqcJt0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://humanoidssummit.com/">Humanoids Summit</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="gwsl1oh1i4w"><em><a href="https://people.eecs.berkeley.edu/~anca/" target="_blank">Anca Dragan</a> is no stranger to Waymo. She worked with us for six years while also at UC Berkeley and now at Google DeepMind. Her focus on making AI safer helped Waymo as it launched commercially. In this final episode of our season, Anca describes how her work enables AI agents to work fluently with people, based on human goals and values.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="bafb48d4c6dec0871809a152ad842b8e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/GwSl1OH1i4w?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.youtube.com/playlist?list=PLCkt0hth826G9AtnOrQsPbKKD5JmdaMXb">Waymo Podcast</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="r9ugdinfhbm">This <a href="https://www.grasp.upenn.edu/" target="_blank">UPenn GRASP</a> SFI Seminar is by Junyao Shi: “Unlocking Generalist Robots with Human Data and Foundation Models.”</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="c1e2f8d6dc1171693ed8ee0180f30e9d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/r9UGdInfhBM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><blockquote><em>Building general-purpose robots remains fundamentally constrained by data scarcity and labor-intensive engineering. Unlike vision and language, robotics lacks large, diverse datasets that span tasks, environments, and embodiments, thus limiting both scalability and generalization. 
This talk explores how human data and foundation models trained at scale can help overcome these bottlenecks.</em></blockquote><p>[ <a href="https://www.grasp.upenn.edu/events/spring-2026-grasp-sfi-junyao-shi/">UPenn</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 06 Mar 2026 16:00:05 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-robot-hand-artificial-muscles</guid><category>Humanoid-robots</category><category>Video-friday</category><category>Underwater-robots</category><category>Bipedal-robots</category><category>Robot-videos</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/robotic-hand-grasping-a-red-bull-can-against-a-dark-background.png?id=65162441&amp;width=980"></media:content></item><item><title>What Military Drones Can Teach Self-Driving Cars</title><link>https://spectrum.ieee.org/military-drones-self-driving-cars</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/silhouette-from-the-back-of-an-adults-head-as-they-look-at-two-monitors-one-screen-displays-a-drone-and-the-other-shows-self-d.jpg?id=65098234&width=2000&height=1500&coordinates=416%2C0%2C417%2C0"/><br/><br/><p><a href="https://spectrum.ieee.org/self-driving-cars/missy-cummings" target="_blank">Self-driving cars often struggle</a> with situations that are commonplace for human drivers. When confronted with construction zones, school buses, power outages, or misbehaving pedestrians, these vehicles often behave unpredictably, leading to crashes or freezing events, causing significant disruption to local traffic and possibly blocking first responders from doing their jobs. Because self-driving cars cannot successfully handle such routine problems, self-driving companies use human babysitters to remotely supervise them and intervene when necessary.</p><p>This idea—humans supervising autonomous vehicles from a distance—is not new. The U.S. military has been doing it since the 1980s with unmanned aerial vehicles (UAVs). In those early years, the military experienced numerous accidents due to poorly designed control stations, lack of training, and communication delays.</p><p>As a Navy fighter pilot in the 1990s, I was one of the first researchers to examine how to improve the UAV remote supervision interfaces. The thousands of hours I and others have spent working on and observing these systems generated a deep body of knowledge about how to safely manage remote operations. With recent revelations that U.S. commercial self-driving car remote operations are handled by <a href="https://www.c-span.org/program/senate-committee/tesla-and-waymo-executives-others-testify-about-self-driving-cars/672835" rel="noopener noreferrer" target="_blank">operators in the Philippines</a>, it is clear that self-driving companies have not learned the hard-earned military lessons that would promote safer use of self-driving cars today.</p><p>While stationed in the Western Pacific during the Gulf War, I spent a significant amount of time in air operations centers, learning how military strikes were planned, implemented and then replanned when the original plan inevitably fell apart. After obtaining my PhD, I leveraged this experience to begin research on the remote control of UAVs for all three branches of the U.S. military. Sitting shoulder-to-shoulder in tiny trailers with operators flying UAVs in local exercises or from 4000 miles away, my job was to learn about the pain points for the remote operators as well as identify possible improvements as they executed supervisory control over UAVs that might be flying halfway around the world.</p><p>Supervisory control refers to situations where humans monitor and support autonomous systems, stepping in when needed. For self-driving cars, this oversight can take several forms. The first is teleoperation, where<strong> </strong>a human remotely controls the car’s speed and steering from afar. Operators sit at a console with a steering wheel and pedals, similar to a racing simulator. Because this method relies on real-time control, it is extremely sensitive to communication delays.</p><p>The second form of supervisory control is remote assistance. Instead of driving the car in real time, a human gives higher-level guidance. 
<h2>Five Lessons From Military Drone Operations</h2><p>Over 35 years of UAV operations, the military consistently encountered five major challenges that offer valuable lessons for self-driving cars.</p><h3>Latency</h3><p>Latency—delays in sending and receiving information due to distance or poor network quality—is the single most important challenge for remote vehicle control. Humans also have their own built-in delay: neuromuscular lag. Even under perfect conditions, people cannot reliably respond to new information in less than 200–500 milliseconds. In remote operations, where communication lag already exists, this makes real-time control even more difficult.</p><p>In early drone operations, U.S. Air Force pilots in Las Vegas (the primary U.S. UAV operations center) attempted to take off and land drones in the Middle East using teleoperation. With at least a two-second delay between command and response, the accident rate was <a href="https://dsiac.dtic.mil/articles/reliability-of-uavs-and-drones/" rel="noopener noreferrer" target="_blank">16 times that of fighter jets conducting the same missions</a>. The military switched to local line-of-sight operators and eventually to fully automated takeoffs and landings. When I interviewed the pilots of these UAVs, they all stressed how difficult it was to control the aircraft with significant time lag.</p><p>Self-driving car companies typically rely on cellphone networks to deliver commands. These networks are unreliable in cities and prone to delays. This is one reason many companies prefer remote assistance instead of full teleoperation. But even remote assistance can go wrong. In <a href="https://www.forbes.com/sites/bradtempleton/2024/03/26/waymo-runs-a-red-light-and-the-difference-between-humans-and-robots/" rel="noopener noreferrer" target="_blank">one incident</a>, a Waymo operator instructed a car to turn left when a traffic light appeared yellow in the remote video feed—but the network latency meant that the light had already turned red in the real world. After moving its remote operations center from the U.S. to the Philippines, Waymo’s latency increased even further. It is imperative that control not be so remote, both to reduce latency and to strengthen oversight of security vulnerabilities.</p><h3>Workstation Design</h3><p>Poor interface design has caused many drone accidents. The military learned the hard way that confusing controls, difficult-to-read displays, and unclear autonomy modes can have disastrous consequences. Depending on the specific UAV platform, the FAA attributed between 20% and 100% of Army and Air Force UAV <a href="https://apps.dtic.mil/sti/pdfs/ADA460102.pdf" rel="noopener noreferrer" target="_blank">crashes caused by human error through 2004</a> to poor interface design.</p><h3>UAV crashes (1986–2004) caused by human factors problems, including poor interface and procedure design. 
These two categories do not sum to 100% because both factors could be present in an accident.</h3><br/><table border="0" style="white-space: unset;" width="100%"><tbody><tr><th align="left" scope="col" style="color: #ffffff; background-color: #000000;"></th><th align="left" scope="col" style="color: #ffffff; background-color: #000000;" width="25%">Human Factors</th><th align="left" scope="col" style="color: #ffffff; background-color: #000000;" width="25%">Interface Design</th><th align="left" scope="col" style="color: #ffffff; background-color: #000000;" width="25%">Procedure Design</th></tr><tr><th align="left" scope="col" style="color: #ffffff; background-color: #000000;">Army Hunter</th><td align="left" style="background-color: #DFD5C1;">47%</td><td align="left" style="background-color: #DFD5C1;">20%</td><td align="left" style="background-color: #DFD5C1;">20%</td></tr><tr><th align="left" scope="col" style="color: #ffffff; background-color: #000000;">Army Shadow</th><td align="left" style="background-color: #E9E3D6;">21%</td><td align="left" style="background-color: #E9E3D6;">80%</td><td align="left" style="background-color: #E9E3D6;">40%</td></tr><tr><th align="left" scope="col" style="color: #ffffff; background-color: #000000;">Air Force Predator</th><td align="left" style="background-color: #DFD5C1;">67%</td><td align="left" style="background-color: #DFD5C1;">38%</td><td align="left" style="background-color: #DFD5C1;">75%</td></tr><tr><th align="left" scope="col" style="color: #ffffff; background-color: #000000;" width="25%">Air Force Global Hawk</th><td align="left" style="background-color: #E9E3D6;">33%</td><td align="left" style="background-color: #E9E3D6;">100%</td><td align="left" style="background-color: #E9E3D6;">0%</td></tr></tbody></table><p>Many UAV crashes have been caused by poorly designed human control systems. In one case, buttons were placed on the controller such that it was relatively easy to <a href="https://spectrum.ieee.org/review-djis-new-fpv-drone-is-effortless-exhilarating-fun" target="_self">accidentally shut off the engine</a> instead of firing a missile. In several accidents, remote operators <a href="https://dspace.mit.edu/handle/1721.1/84129" rel="noopener noreferrer" target="_blank">inadvertently did just that</a>.</p><p>The self-driving industry shows hints of comparable issues. Some autonomous shuttles use off-the-shelf gaming controllers, which—while inexpensive—were never designed for vehicle control. The off-label use of such controllers can lead to mode confusion, which was a factor in a <a href="https://www.govtech.com/transportation/after-crash-orlandos-self-driving-bus-back-on-the-road" rel="noopener noreferrer" target="_blank">recent shuttle crash</a>. Significant human-in-the-loop testing is needed to avoid such problems, not only prior to system deployment but also after major software upgrades.</p><h3>Operator Workload</h3><p>Drone missions typically include long periods of surveillance and information gathering, occasionally ending with a missile strike. These missions can sometimes last for days, as when the military waits for a person of interest to emerge from a building. As a result, remote operators experience extreme swings in workload: sometimes overwhelming intensity, sometimes crushing boredom. Both conditions can lead to errors.</p><p>When operators teleoperate drones, workload is high and fatigue can quickly set in. 
But when onboard autonomy handles most of the work, operators can become bored, complacent, and less alert. This pattern is <a href="https://www.airuniversity.af.edu/Wild-Blue-Yonder/Articles/Article-Display/Article/2144225/airmen-and-unmanned-aerial-vehicles-the-danger-of-generalization/" rel="noopener noreferrer" target="_blank">well documented in UAV research</a>.</p><p>Self-driving car operators likely experience similar issues in tasks ranging from interpreting confusing signs to helping cars escape dead ends. In simple scenarios, operators may be bored; in emergencies—like driving into a flood zone or responding during a citywide power outage—they can quickly become overwhelmed.</p><p>The military has tried for years to have one person supervise many drones at once, because it is far more cost-effective. However, cognitive switching costs (regaining awareness of a situation after switching control between drones) result in workload spikes and high stress. That, coupled with increasingly complex interfaces and communication delays, has made this extremely difficult.</p><p>Self-driving car companies likely face the same roadblocks. They will need to model operator workloads and reliably predict staffing levels and how many vehicles a single person can effectively supervise, especially during emergency operations. If every self-driving car turns out to need a dedicated human paying close attention, such operations would no longer be cost-effective.</p>
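<p>Human-robot interaction researchers sometimes frame this question as “fan-out”: how long a vehicle can safely run unattended, divided by how long an operator must attend to it when it needs help. The Python sketch below is a toy version of that calculation; all of the timing numbers are invented for illustration, not drawn from any deployment.</p><pre><code># Toy "fan-out" estimate of supervision capacity. Every number here
# is an assumption chosen to illustrate the workload swing, not data
# from any real fleet.
def fan_out(neglect_time_s: float, interaction_time_s: float) -> float:
    """Vehicles one operator can nominally cover: each runs unattended
    for neglect_time_s, then needs interaction_time_s of attention."""
    return (neglect_time_s + interaction_time_s) / interaction_time_s

# Quiet day: each car needs about 1 minute of help every 30 minutes.
print(fan_out(neglect_time_s=1800, interaction_time_s=60))  # ~31 cars

# Citywide emergency: every car needs 1 minute of help every 2 minutes.
print(fan_out(neglect_time_s=120, interaction_time_s=60))   # 3 cars
</code></pre><p>Even this crude model shows why staffing that looks comfortable on a normal day collapses in an emergency, when neglect time drops for every vehicle at once.</p>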
<div class="horizontal-rule"></div><p><span>The history of military drone operations offers crucial lessons for the self-driving car industry. Decades of experience show that remote supervision demands extremely low latency, carefully designed control stations, manageable operator workload, rigorous training programs, and strong contingency planning.</span></p><p>Self-driving companies appear to be repeating many of the early mistakes made in drone programs. Remote operations are treated as a support feature rather than a mission-critical safety system. But as long as AI struggles with uncertainty, which will be the case for the foreseeable future, remote human supervision will remain essential. The military learned these lessons through painful trial and error, yet the self-driving community is ignoring them. The self-driving industry has the chance—and the responsibility—to learn from our mistakes in combat settings before it harms road users everywhere.</p><p>A full paper on this topic will be presented at the <a href="https://www.ieeesmc.org/ichms-2026/" target="_blank">2026 IEEE International Conference on Human-Machine Systems (ICHMS)</a> in Singapore in July.</p>]]></description><pubDate>Mon, 02 Mar 2026 12:00:02 +0000</pubDate><guid>https://spectrum.ieee.org/military-drones-self-driving-cars</guid><category>Drones</category><category>Military-robots</category><category>Self-driving-cars</category><dc:creator>Missy Cummings</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/silhouette-from-the-back-of-an-adults-head-as-they-look-at-two-monitors-one-screen-displays-a-drone-and-the-other-shows-self-d.jpg?id=65098234&amp;width=980"></media:content></item><item><title>Video Friday: Robot Dogs Haul Produce From the Field</title><link>https://spectrum.ieee.org/quadruped-farming-robots</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/quadruped-robots-with-crates-on-their-backs-carry-produce-on-a-path-amidst-lush-leafy-green-crops.png?id=65095903&width=2000&height=1500&coordinates=144%2C0%2C144%2C0"/><br/><br/><p><span>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at </span><em>IEEE Spectrum</em><span> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please </span><a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a><span> for inclusion.</span></p><h5><a href="https://2026.ieee-icra.org/">ICRA 2026</a>: 1–5 June 2026, VIENNA</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote class="rm-anchors" id="vzjcvzyi2wq"><em>Our robots Lynx M20 help transport harvested crops in mountainous farmland—tackling the rural “last mile” logistics challenge.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ba185ed737063e503a9255c4a1cfd96d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/VzjcvzYi2WQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.deeprobotics.cn/en">Deep Robotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="eqpyvr-b7hc">Once again, I would point out that now that we are reaching peak humanoid robots doing humanoid things, we are inevitably about to see humanoid robots doing nonhumanoid things.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="07f6309039fd04258b0e4abdcfae0617" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/eQpyvR-B7hc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.unitree.com/">Unitree</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="g9oyrplrig8"><em>In a study, a team of researchers from the Max Planck Institute for Intelligent Systems, the University of Michigan, and Cornell University show that groups of <a data-linked-post="2650267091" href="https://spectrum.ieee.org/magnetic-microbots-to-fight-cancer" target="_blank">magnetic microrobots</a> can generate fluidic forces strong enough to rotate objects in different directions without touching them. 
These microrobot swarms can turn gear systems, rotate objects much larger than the robots themselves, assemble structures on their own, and even pull in or push away many small objects.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="316646fb51cfe112cec7b3c3839dec72" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/G9oYrPLRIG8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.science.org/doi/10.1126/sciadv.aea9947">Science</a> ] via [ <a href="https://is.mpg.de/en/news/magnetic-microrobot-swarms-enable-contactless-manipulation-of-objects-through-fluidic-torque">Max Planck Institute</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="klhx6qfrzes"><em>Bipedal—or two-legged—autonomous robots can be quite agile. This makes them useful for performing tasks on uneven terrain, such as carrying equipment through outdoor environments or performing maintenance on an oceangoing ship. However, unstable or unpredictable conditions also increase the possibility of a robot wipeout. Until now, there’s been a significant lack of research into how a robot recovers when its direction shifts—for example, a robot losing balance when a truck makes a quick turn. The team aims to fix this research gap.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="de95bdf7f06151477581b83f9cd1d146" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/klhX6qFRZEs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://news.gatech.edu/news/2026/02/18/humanoid-robots-make-confident-strides-toward-walking-stability">Georgia Tech</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="u7tsdb4nuge"><em>Robotics is about controlling energy, motion, and uncertainty in the real world.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d6071f0de516f1571a9e1cc180c0f753" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/U7TSDb4NugE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.cs.cmu.edu/~16311/current/labs/lab01/index.html">Carnegie Mellon University</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="fftskxohrxm"><em>Delicious dinner cooked by our robot Robody. 
We’ve asked our investors to speak about why they’re along for the ride.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="e7e0e84ad77fb254d7071021f9b5677e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/FfTSKxOhrxM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.devanthro.com/">Devanthro</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="7oc55almc4u"><em>Tilt-rotor aerial robots enable omnidirectional maneuvering through thrust vectoring, but introduce significant control challenges due to the strong coupling between joint and rotor dynamics. This work investigates reinforcement learning for omnidirectional aerial motion control on overactuated tiltable quadrotors that prioritizes robustness and agility.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="4141e9b6c30d50877b06af77023c3bab" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/7oc55aLMC4U?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://zwt006.github.io/posts/BeetleOmni/">Dragon Lab</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="3fzhnaoajae"><em>At the [Carnegie Mellon University] Robotic Innovation Center’s 75,000-gallon water tank, members of the TartanAUV student group worked to further develop their autonomous underwater vehicle (AUV) called Osprey. The team, which takes part in the annual <a data-linked-post="2650255401" href="https://spectrum.ieee.org/build-your-own-undersea-robot" target="_blank">RoboSub</a> competition sponsored by the U.S. Office of Naval Research, is comprised primarily of undergraduate engineering and robotics students.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d138043667221480117b9978bd790ef3" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/3fzhNAoAjaE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.cmu.edu/news/stories/archives/2026/february/cmus-robotics-innovation-center-propels-research-from-deep-sea-to-space">Carnegie Mellon University</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="1one4l_pghw">Sure seems like the only person who would want a robot dog is a person who does not in fact want a dog.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="c384213c6605abee52a4b285faed7d20" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/1ONE4l_pgHw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><blockquote><em>Compact size, industrial capability. Maximum torque of 90N·m, over 4 hours of no-load runtime, IP54 rainproof design. With a 15-kg payload, range exceeds 13 km. 
Open secondary development, empowering industry applications.</em></blockquote><p>[ <a href="https://www.unitree.com/">Unitree</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="flmtgad-snu">If your robot video includes tasty baked goods, it <strong><em>will</em></strong> be included in Video Friday.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="bc37f01ece7f72aae9e972b63ed5f39d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/fLMTgAD-SNU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://qbrobotics.com/">QB Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="hdktpcuzwli"><em>Astorino is a 6-axis educational robot created for practical and affordable teaching of robotics in schools and beyond. It has been created with 3D printing, so it allows for experimentation and the possible addition of parts. With its design and programming, it replicates the actions of industrial robots, giving students the necessary skills for future work.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="9175161ed153e8deaaa1697322641517" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/HDKtpcUzwLI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://astorino.com.pl/en/">Astorino by Kawasaki</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="zdqvhaoagcu">We need more <a data-linked-post="2668536834" href="https://spectrum.ieee.org/autonomous-vehicles-great-at-straights" target="_blank">autonomous driving datasets</a> that accurately reflect how sucky driving can be a lot of the time.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="32bf318792e72e34dbc1d1ef25b3d572" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/zDQVhAOagcU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://asrl.utias.utoronto.ca/">ASRL</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="g-g0wl_tqw4">This Carnegie Mellon University Robotics Institute Seminar is by CMU’s own Victoria Webster-Wood, on “Robots as Models for Biology and Biology and Materials for Robots.”</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="5ad1f7c8448c962c5dc5ae76dffc81e9" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/G-G0wL_TqW4?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><blockquote><em>In the last century, it was common to envision robots as shining metal structures with rigid and halting motion. This imagery is in contrast to the fluid and organic motion of living organisms that inhabit our natural world. 
The adaptability, complex control, and advanced learning capabilities observed in animals are not yet fully understood, and therefore have not been fully captured by current robotic systems. Furthermore, many of the mechanical properties and control capabilities seen in animals have yet to be achieved in robotic platforms. In this talk, I will share an interdisciplinary research vision for robots as models for neuroscience and biology as materials for robots.</em></blockquote><p>[ <a href="https://www.ri.cmu.edu/event/robots-as-models-for-biology-and-biology-and-materials-for-robots/">CMU RI</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 27 Feb 2026 18:00:55 +0000</pubDate><guid>https://spectrum.ieee.org/quadruped-farming-robots</guid><category>Humanoid-robots</category><category>Video-friday</category><category>Swarm-robotics</category><category>Quadruped-robots</category><category>Farm-robots</category><category>Bipedal-robots</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/quadruped-robots-with-crates-on-their-backs-carry-produce-on-a-path-amidst-lush-leafy-green-crops.png?id=65095903&amp;width=980"></media:content></item><item><title>Perseverance Smashes Autonomous Driving Record on Mars</title><link>https://spectrum.ieee.org/perseverance-mars-rover-autonomous-driving</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/a-self-portrait-captured-by-nasa-s-perseverance-rover-while-traversing-mars-rocky-surface.jpg?id=65007226&width=2000&height=1500&coordinates=416%2C0%2C417%2C0"/><br/><br/><p><em>This article is part of our exclusive <a href="https://spectrum.ieee.org/collections/journal-watch/" target="_self">IEEE Journal Watch series</a> in partnership with <a href="https://spectrum.ieee.org/tag/ieee-xplore" target="_self">IEEE Xplore</a>.</em></p><p>In past missions to Mars, like with the <a href="https://spectrum.ieee.org/nasa-mars-curiosity-rover-autonomous-driving-mode" target="_blank"><em>Curiosity</em></a> and <em>Opportunity</em> rovers, the robots relied mostly on human instructions from millions of miles away in order to safely navigate the Martian landscape. The <em>Perseverance</em> rover, on the other hand, has zipped across the alien, boulder-ridden land almost completely autonomously, smashing previous records for autonomous driving on Mars. </p><p>Whereas the <em>Curiosity</em> rover completed about 6.2 percent of its travels autonomously,<strong> </strong><em>Perseverance</em> had completed about 90 percent of its travels autonomously, as of its 1,312th Martian day since landing (28 October 2024). <em>Perseverance</em> was able to accomplish such a feat<span>—</span>using remarkably little computing power<span>—</span>thanks to its specially designed autonomous driving algorithm, Enhanced Autonomous Navigation, or ENav. </p><p>The full details on ENav’s inner workings and how well it has performed on Mars are described in a <a href="https://ieeexplore.ieee.org/document/11265757" target="_blank">study</a> published in <a href="https://ieeexplore.ieee.org/xpl/RecentIssue.jsp?punumber=10495159" target="_blank"><em>IEEE Transactions on Field Robotics</em></a> in November 2025. </p><p>There are some advantages, but some serious challenges when it comes to autonomous navigation on Mars. On the plus side, almost nothing on the planet moves. Rocks and gravel slopes—while formidable obstacles—remain stationary, offering rovers consistency and predictability in their calculations and pathfinding. On the other hand, Mars is in large part uncharted terrain. </p><p>“This enormous uncertainty is the major challenge,” says <a href="https://www-robotics.jpl.nasa.gov/who-we-are/people/masahiro_ono/" target="_blank">Masahiro “Hiro” Ono</a>, supervisor of the Robotic Surface Mobility Group at NASA’s Jet Propulsion Laboratory, who helped develop ENav.</p><h2>Creating a Highly Autonomous Rover </h2><p>While some images from the space-borne Mars Reconnaissance Orbiter exist, these are usually not high enough resolution for ground-based navigation by a rover. In December, NASA engineers performed the first <a href="https://spectrum.ieee.org/perseverance-rover-nasa-anthropic-ai" target="_self">test of a navigation technique</a> that uses a model based on Anthropic’s AI to analyze MRO images and generate waypoints—the coordinates used to guide the rover’s path—for more complete automation. </p><p class="ieee-inbody-related">RELATED: <a href="https://spectrum.ieee.org/perseverance-rover-nasa-anthropic-ai" target="_blank">NASA Let AI Drive the Perseverance Rover</a></p><p><span>But in the majority of today’s navigation, <em>Perseverance</em> must rely on images the rover itself takes, analyze these to assess thousands of different paths, and choose the right route that won’t end in its own demise. The kicker? 
<p><span>The kicker? It must do so with the equivalent computing capacity of an </span><a href="https://en.wikipedia.org/wiki/IMac_G3" target="_blank">iMac G3</a><span>, an Apple computer sold in the late 1990s.</span></p><p>The rover’s processor must undergo <a href="https://spectrum.ieee.org/europa-clipper" target="_blank">radiation hardening</a>, a process that makes it resilient to the extreme levels of solar radiation and cosmic rays experienced on Mars. Although other radiation-hardened CPUs with more computing power were available at the time of <em>Perseverance</em><span>’s development, the one used has proven reliable in the harsh conditions of outer space. By reusing hardware from previous missions—the same CPU was used in <em>Curiosity</em>—NASA can reduce costs while minimizing risk.</span></p><p>Given its limited computing resources, the ENav algorithm was strategically designed to do the heaviest computing only when driving on challenging terrains. It works by analyzing images of the rover’s surroundings and assessing about 1,700 possible paths forward, typically within 6 meters of the rover’s current position. Weighing factors such as travel time and terrain roughness, it ranks the candidate paths. Finally, it runs a computationally heavy collision-checking algorithm, called ACE (approximate clearance estimation), on only a handful of top-ranked paths (the rank-then-check pattern sketched above). </p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="The driving path of NASA's Perseverance rover across Mars' surface, spanning 18.65 miles." class="rm-shortcode" data-rm-shortcode-id="e6b5e0be83268493f2e1836c913c6174" data-rm-shortcode-name="rebelmouse-image" id="e6dbd" loading="lazy" src="https://spectrum.ieee.org/media-library/the-driving-path-of-nasa-s-perseverance-rover-across-mars-surface-spanning-18-65-miles.jpg?id=65007264&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">As of October 2024, Perseverance has driven more than 30 kilometers (18.65 miles) and collected 24 samples of rock and regolith. </small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Source: JPL-Caltech/ASU/MSSS/NASA</small></p><h2>Exploring the Red Planet with ENav</h2><p><em>Perseverance</em> landed on Mars on 18 February 2021. In their study, Ono and his colleagues describe how the rover was initially deployed with strong human navigation oversight during its first 64 Martian days on the Red Planet, but then went on to predominantly use ENav to travel to one of the major exploration targets: the delta formed by an ancient river that once flowed into Jezero Crater billions of years ago. Scientists believe it could be a prime spot for finding evidence of past alien life, if life ever existed on Mars.</p><p>After a brief exploration of an area southwest of its landing site, <em>Perseverance</em> jetted counterclockwise around sand dunes toward the ancient river delta at a crisp pace, averaging 201 meters per Martian day. (It’s too cold for the rover to travel at night.) Over the course of just 24 Martian days of driving, the rover traveled about 5 kilometers into the foothills of the delta. Ninety-five percent of all driving that month was performed in the autonomous driving mode, an unprecedented amount of autonomous driving on Mars.</p><p>Past rovers, such as <em>Curiosity</em>, had to stop and “think” about their paths before moving forward. 
“That was the main speed bump for <em>Curiosity</em>, why it was so slow to drive autonomously,” Ono explains. </p><p>In contrast, <em>Perseverance</em> is able to think and drive at the same time. “Sometimes [<em>Perseverance</em>] has to stop and think, particularly when it cannot figure out a safe path quickly. But most of the time, particularly on easy terrains, it can keep driving without stopping,” Ono says. “That made its autonomous driving an order of magnitude faster.”</p><p><em>Opportunity</em> held the previous record for autonomous driving on Mars, traveling 109 meters in a single Martian day. But on 3 April 2023, <em>Perseverance</em> set a new record by driving 331.74 meters autonomously (and 347.69 meters in total) in a single Martian day. </p><p>Ono says that fine-tuning the ENav algorithm took a lot of work, but he is happy with its performance. He also emphasizes that efforts to continue advancing autonomous navigation are critical if humans want to continue exploring even deeper into space, where Earthly communication with rovers and other spacecraft will become increasingly difficult.</p><p>“The automation of the space systems is unstoppable direction that we have to go if we want to explore deeper in space,” Ono says. “This is the direction that we must go to push the boundaries and frontiers of space exploration.”</p><p><em>This article was updated on 27 February to clarify NASA’s reasoning for selecting the CPU used in the </em>Perseverance<em> rover.</em></p>]]></description><pubDate>Wed, 25 Feb 2026 15:00:02 +0000</pubDate><guid>https://spectrum.ieee.org/perseverance-mars-rover-autonomous-driving</guid><category>Mars</category><category>Perseverance-rover</category><category>Autonomous-robots</category><category>Journal-watch</category><dc:creator>Michelle Hampson</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/a-self-portrait-captured-by-nasa-s-perseverance-rover-while-traversing-mars-rocky-surface.jpg?id=65007226&amp;width=980"></media:content></item><item><title>Video Friday: Humanoid Robots Celebrate Spring</title><link>https://spectrum.ieee.org/robot-martial-arts</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/five-humanoid-robots-in-red-vests-perform-synchronized-movements-on-a-shiny-stage.png?id=64966934&width=2000&height=1500&coordinates=315%2C0%2C315%2C0"/><br/><br/><p><span>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at </span><em>IEEE Spectrum</em><span> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please </span><a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a><span> for inclusion.</span></p><h5><a href="https://2026.ieee-icra.org/">ICRA 2026</a>: 1–5 June 2026, VIENNA</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><p class="rm-anchors" id="mumlv814ajo">So humanoid robots are nearing peak human performance. I would point out, though, that this is likely very far from <a data-linked-post="2666662286" href="https://spectrum.ieee.org/humanoid-robots" target="_blank">peak robot performance</a>, which has yet to be effectively exploited, because it requires more than just copying humans.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="024bb0c442a4f9a923cb9d2287d65cd3" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/mUmlv814aJo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.unitree.com/">Unitree</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="tvy0henpoto"><em>“The Street Dance of China”: Turning lightness into gravity, and rhythm into impact.This is a head-on collision between metal and beats. This Chinese New Year, watch PNDbotics Adam bring the heat with a difference.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="9d2fff7bb612a25b39211999f00444b3" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Tvy0HenPoTo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.pndbotics.com/">PNDbotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="zg286rri750">You had me at robot pandas.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="fedfa68ba524b95536171d927a20a6af" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/zG286rRI750?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.magiclab.top/en/">MagicLab</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="koftfrgo4zs"><em>NASA’s <a data-linked-post="2675264402" href="https://spectrum.ieee.org/perseverance-rover-nasa-anthropic-ai" target="_blank">Perseverance rover</a> can now precisely determine its own location on Mars without waiting for human help from Earth. This is possible thanks to a new technology called Mars global localization. 
This technology rapidly compares panoramic images from the rover’s navigation cameras with onboard orbital terrain maps. It’s done with an algorithm that runs on the rover’s helicopter base station processor, which was originally used to communicate with the Ingenuity Mars helicopter. In a few minutes, the algorithm can pinpoint Perseverance’s position to within about 10 inches (25 centimeters). The technology will help the rover drive farther autonomously and keep exploring. </em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="401bcf4da451bde0e05bb8995d292bd6" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/KofTfRGO4Zs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.jpl.nasa.gov/news/nasas-perseverance-now-autonomously-pinpoints-its-location-on-mars/">NASA Jet Propulsion Laboratory</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="yjmfc4p-u60">Legs? Where we’re going, we don’t need legs!</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d5d2299bc6edac5ee7dff1d6e24eef8d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/yJmFc4p-U60?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://advanced.onlinelibrary.wiley.com/doi/10.1002/aisy.202500270">Paper</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="ojioedp1mke">This is a bit of a tangent from robotics, but it gets a pass because of the cute jumping spider footage.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ffa1eda2a0ffbdd35216c4d08e332cee" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/OjIoEDP1mKE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://als.lbl.gov/">Berkeley Lab</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="7vuvvqap5vq"><em>Corvus One for Cold Chain is engineered to live and operate in freezer environments permanently, down to<br/> –20 °F, while maintaining full-flight and barcode-scanning performance.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="72c26bad197919021cd10399958457a7" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/7vuVVQAP5VQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>I am sure there is an excellent reason for putting a cold-storage facility in the Mojave Desert.</p><p>[ <a href="https://www.corvus-robotics.com/">Corvus Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="plielbq6bzc"><em>The video documents the current progress made in the picking rate of the Shiva robot when picking strawberries. 
It first shows the previous status, then the further development, and finally the field test.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="0cf52bc202c0d15d39ffa28de952a72d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/PLIELBq6bZc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://robotik.dfki-bremen.de/de/forschung/projekte/roland">DFKI</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="mh4nlgi64mm"><em>Data powers an organization’s digital transformation, and ST Engineering MRAS is leveraging Spot to get a full view of critical equipment and a facility. Working autonomously, Spot collects information about machine health—and now, thanks to an integration of the Leica BLK ARC for reality capture, detailed and accurate point-cloud data for their digital twin.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="10813bef7dc9bee7221fe4245caf64c2" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/mH4NLGI64MM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://bostondynamics.com/case-studies/spot-at-st-engineering-mras/">Boston Dynamics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="fhk28r9j-dq">The title of this video is “Get out and have fun!” Is that mostly what humanoid robots are good for right now, pretty much...?</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="98b64102d3af8082637898f2bb44507d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/fhK28R9J-DQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://en.engineai.com.cn/">Engine AI</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="mey8uw65yw8"><em>Astorino is a modern six-axis <a data-linked-post="2666268821" href="https://spectrum.ieee.org/3d-printed-robot-hand" target="_blank">robot based on 3D-printing</a> technology. 
Programmable in AS language, the robot facilitates the preparation of classes with ready-made teaching materials, is easy both to use and to repair, and gives the opportunity to learn and make mistakes without fear of breaking it.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="2343d89a61472a8c3764df56a794d75f" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/MeY8Uw65YW8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://astorino.com.pl/en/">Kawasaki</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="clw5-nkw1xs">Can I get this in my living room?</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="412804fd295fdadb519e7277b046f10a" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/clw5-NkW1xs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.yaskawa-global.com/centenary/robot">Yaskawa</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="-g8q-nx5fce"><em>What does it mean to build a humanoid robot in seven months, and the next one in just five? This documentary takes you behind the scenes at Humanoid, a U.K.-based AI and robotics company building reliable, safe, and  helpful humanoid robots. You’ll hear directly from our engineering, hardware, product, and other teams as they share their perspectives on the journey of turning physical AI into reality.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="a984d29bd7b9e36f3603259780ba8687" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/-G8q-NX5FCE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://thehumanoid.ai/">Humanoid</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="ckr6h6vsuz8">This IROS 2025 keynote is from <a data-linked-post="2650279622" href="https://spectrum.ieee.org/darpa-tim-chung-on-subt-challenge-urban-circuit" target="_blank">Tim Chung</a>—now at Microsoft—on catalyzing the future of human, robot, and AI agent teams in the physical world.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="71d714793cd8568276d695e1a470b03e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Ckr6h6vSuz8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><blockquote><em>The convergence of technologies—from foundation AI models to diverse sensors and actuators to ubiquitous connectivity—is transforming the nature of interactions in the physical and digital world. People have accelerated their collaborative connections and productivity through digital and immersive technologies, no longer limited by geography or language or access. 
Humans have also leveraged and interacted with AI in many different forms, with the advent of hyperscale AI models (that is, large language models) forever changing (and at an ever-astonishing pace) the nature of human–AI teams, realized in this era of the AI “copilot.” Similarly, robotics and automation technologies now afford greater opportunities to work with and/or near humans, allowing for increasingly collaborative physical robots to dramatically impact real-world activities. It is the compounding effect of enabling all three capabilities, each complementary to one another in valuable ways, and we envision the triad formed by human–robot–AI teams as revolutionizing the future of society, the economy, and technology.</em></blockquote><p>[ <a href="https://www.iros25.org/KeynoteSessions.html">IROS 2025</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="3dgvyq66yhm">This GRASP SFI talk is by Chris Paxton at Agility Robotics: “How Close Are We to Generalist Humanoid Robots?”</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d4595fc1a5a1e20ee73a48779ac78576" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/3DgvYq66YhM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><blockquote><em>With billions of dollars of funding pouring into robotics, general-purpose humanoid robots seem closer than ever. And certainly it feels like the pace of robotics is faster than ever, with multiple companies beginning large-scale deployments of humanoid robots. In this talk, I’ll go over the challenges still facing scaling robot learning, looking at insights from a year of discussions with researchers all over the world.</em></blockquote><p>[ <a href="https://www.grasp.upenn.edu/events/spring-2026-grasp-sfi-chris-paxton/">University of Pennsylvania GRASP Laboratory</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="ry8itipzbfe">This week’s Carnegie Mellon University Robotics Institute Seminar is from Jitendra Malik at University of California, Berkeley: “Robot Learning, With Inspiration From Child Development.”</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="8f92598b30d5a78a3c0f4c212b77230a" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/ry8itipzBFE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><blockquote><em>For intelligent robots to become ubiquitous, we need to “solve” locomotion, navigation, and manipulation at sufficient reliability in widely varying environments. In locomotion, we now have demonstrations of humanoid walking in a variety of challenging environments. In navigation, we pursued the task of “Go to Any Thing”: A robot, on entering  a newly rented Airbnb, should be able to find objects such as TV sets or potted plants. RL in simulation and sim-to-real have been workhorse technologies for us, assisted by a few technical innovations. 
I will sketch promising directions for future work.</em></blockquote><p>[ <a href="https://www.ri.cmu.edu/event/robot-learning-with-inspiration-from-child-development/">Carnegie Mellon University Robotics Institute</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 20 Feb 2026 18:00:02 +0000</pubDate><guid>https://spectrum.ieee.org/robot-martial-arts</guid><category>Humanoid-robots</category><category>Video-friday</category><category>Agility-robotics</category><category>Perseverance-rover</category><category>Insect-robots</category><category>Industrial-robotics</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/five-humanoid-robots-in-red-vests-perform-synchronized-movements-on-a-shiny-stage.png?id=64966934&amp;width=980"></media:content></item><item><title>Tech Is Taking Over Olympic Curling</title><link>https://spectrum.ieee.org/olympics-curling-robot-ai</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/curling-players-sweeping-a-red-stone-on-ice-motion-blur-emphasizes-speed-and-action.jpg?id=64953312&width=2000&height=1500&coordinates=166%2C0%2C167%2C0"/><br/><br/><p>At this year’s <a href="https://spectrum.ieee.org/winter-olympics-2026-tech" target="_blank">Winter Olympics in Italy</a>, the controversy began with a fingertip.</p><p>A disputed double-touch—whether a curler had <a href="https://www.nytimes.com/athletic/7045743/2026/02/13/curling-canada-sweden-marc-kennedy-cheating/" target="_blank">brushed a moving stone twice</a>—sparked protests, profanity-laced exchanges, and heated debate about sportsmanship. In a game that prides itself on mutual trust and the idea of competition as a shared test of skill, even the suggestion of impropriety can ripple far beyond a single end.</p><p>But if a double-touch can shake the sport, what happens when the controversy isn’t about a fingertip but an algorithm?</p><p>That’s the question shadowing the rise of analytics driven by machine learning and a new breed of AI-powered robots that can throw stones, read the ice, and calculate strategy with machine precision.</p><p class="ieee-inbody-related">RELATED: <a href="https://spectrum.ieee.org/winter-olympics-2026-tech" target="_blank">Milan-Cortina Winter Olympics Debut Next-Generation Sport Smarts</a></p><p>Some of these robots, such a “<a href="https://www.smithsonianmag.com/smart-news/curly-curling-robot-can-beat-pros-their-own-game-180975951/" target="_blank">Curly</a>,” have already toppled elite human opponents in head-to-head competitions. Others, engineered either to replicate the biomechanics of human shot delivery or to fire stones consistently with repeatable speed and rotation, are transforming the sport by dissecting technique and strategy with a level of rigor no coach with a stopwatch could match.</p><p class="shortcode-media shortcode-media-youtube"> <span class="rm-shortcode" data-rm-shortcode-id="7fb2199a88bd9f3a0f97671101939e8c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/uj3ur1uW-7Q?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span> <small class="image-media media-caption" placeholder="Add Photo Caption...">Seen here in action, the two-part robot system named Curly made its debut in 2018 ahead of that year’s Paralympic Winter Games in Pyeongchang.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit..."><a href="https://www.youtube.com/@TUBerlinTV" rel="noopener noreferrer" target="_blank">TUBerlinTV/YouTube</a></small></p><p>“The amount of innovation I’m seeing is just tremendous,” says <a href="https://glennpaulley.ca" rel="noopener noreferrer" target="_blank">Glenn Paulley</a>, a retired computer scientist who now runs Throwing Rocks Consulting Services, where he coaches curlers and advises teams on analytics.</p><p><a href="https://www.tabletmag.com/sections/sports/articles/israeli-curling" target="_blank">Fueled by investments from governments</a> and sporting bodies around the world, the pursuit of a competitive edge has escalated into a data-driven push for marginal gains ahead of each Olympic cycle. 
“They’re trying like crazy to elevate their national team programs,” Paulley says, “and they’re doing it in every way possible.” <span>By the time medals are handed out in Cortina d’Ampezzo this weekend, the imprint of this full-throttle tech offensive could be etched into every sheet of ice.</span></p><p>Yet, as algorithms begin suggesting shots, the contours of fair play blur. Regulators and coaches alike are grappling with where to draw the line. And as top curlers lean more into AI and robotic systems, some fear the loss of something fundamental: the quiet, hard-earned feel for ice that separates veterans from novices.</p><p>“It’s a big debate!” says <a href="https://en.wikipedia.org/wiki/Emily_Zacharias" target="_blank">Emily Zacharias</a>, a former elite curler from Manitoba who captured gold representing Canada at the 2020 World Junior Curling Championships.</p><p>Three decades ago, Garry Kasparov sat across from IBM’s Deep Blue and discovered that even the most cerebral of games could be <a href="https://spectrum.ieee.org/how-ibms-deep-blue-beat-world-champion-chess-player-garry-kasparov" target="_blank">unsettled by silicon</a>. Curling, long called “<a href="https://chessonice.ca/" target="_blank">chess on ice</a>,” may now be entering its own version of that reckoning.</p><h2>Can New Tech Comply With the “Spirit of Curling”?</h2><p>Curling has been at this kind of crossroads before. A decade back, the sweeping-fabric controversy known as “<a href="https://www.cbc.ca/listen/cbc-podcasts/1427-broomgate-a-curling-scandal" target="_blank"><span>Broomgate</span></a>” triggered accusations of <a href="https://spectrum.ieee.org/motor-doping-cycling" target="_blank">technological doping</a>, a dispute that tore at the heart of the sport’s ethos of trust and bonhomie.</p><p>The World Curling Federation responded by clamping down on brush materials, <span>but AI now poses a broader challenge. 
It is not just a better broom but a decision engine, capable of shifting authority from a player’s judgment in the “<a href="https://www.curlingbasics.com/en/images/z_house_03.png" target="_blank">house</a>” to a model running in the cloud.</span></p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="A hexapod robot stands on a demonstration curling course " class="rm-shortcode" data-rm-shortcode-id="58f9a3f0bebfe5952e925b87a6f208f8" data-rm-shortcode-name="rebelmouse-image" id="44f61" loading="lazy" src="https://spectrum.ieee.org/media-library/a-hexapod-robot-stands-on-a-demonstration-curling-course.jpg?id=64957715&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">The six-legged “hexapod” curling robot is displayed at the World Robot Conference 2022 in Beijing, where that year’s Olympic Games were also held.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Anna Ratkoglo/Sputnik/AP</small></p><p>It’s a prospect that unsettles some athletes and ethicists, who worry about what gets lost as optimization tightens its grip on a sport long governed by the so-called <a href="https://www.olympics.com/en/video/the-spirit-of-curling" target="_blank">Spirit of Curling</a>, an unwritten code of integrity, fairness, and respect.</p><p>“We’re at a point now where just about everything that we used to hold up as uniquely human is now being eroded by technology—and we feel a loss,” says <a href="https://www.craiedl.ca/team/jason-millar" target="_blank">Jason Millar</a>, who runs the Canadian Robotics and AI Ethical Design Lab at the University of Ottawa.</p><p>“The AI doesn’t care,” he adds. “There’s no ‘spirit’ there.”</p><h2>Building Rock-Solid Curling Robots</h2><p>The Curly robot first made waves in 2018 when, ahead of that year’s Paralympic Winter Games in Pyeongchang, engineers at Korea University, in Seoul, <a href="https://www.youtube.com/watch?v=uj3ur1uW-7Q" target="_blank">unveiled the AI-powered device</a>—or, rather, two coordinated devices, a pair of “skip” and “thrower” units, designed to read the ice and deliver stones.</p><p>Driven by a physics-based simulator and an adaptive deep-reinforcement-learning framework, the robot didn’t simply replay preprogrammed shots. It learned from its own misses, updated its aim based on the distance gaps between intended and actual stone positions, and factored in the cumulative wear of pebbled ice as a match unfolded. (A toy version of this aim-correcting update appears below.)</p><p>That capacity was put to the test in a series of mini-games against top-ranked Korean athletes. As reported in the journal <em><a href="https://www.science.org/doi/10.1126/scirobotics.abb9764" target="_blank">Science Robotics</a>,</em> Curly started slow, dropping the opening match as it calibrated to the live ice.</p>
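<p>That calibration is easy to caricature in a few lines of code. The toy sketch below (hypothetical names and a made-up correction gain, not the Korea University system) shows the basic idea of steering against the average gap between intended and actual stone positions:</p><pre><code># Toy sketch, not Curly's actual code: nudge the aim point against the
# average error between intended and observed stone positions.
def corrected_aim(intended, observed_landings, gain=0.5):
    # gain is a hypothetical step size; too large a value overcorrects.
    if not observed_landings:
        return intended
    n = len(observed_landings)
    avg_error = sum(pos - intended for pos in observed_landings) / n
    return intended - gain * avg_error  # aim upstream of the drift

# Example: suppose the ice drags every throw about 0.3 meters short.
print(corrected_aim(10.0, [9.7, 9.65, 9.75]))  # aims 10.15 to compensate
</code></pre>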
<p>But it then went on to win the next three contests, demonstrating what its creators called “human-level performance” under real-world conditions.</p><p>The next Winter Olympics—the <a href="https://spectrum.ieee.org/carbon-neutral-winter-olympics" target="_blank">Beijing 2022 Games</a>—then brought a more agile machine: <a href="https://doi.org/10.1016/j.eng.2023.10.018" target="_blank">a “hexapod” curling robot</a> built to walk, align, and throw like a human curler.</p><p class="shortcode-media shortcode-media-youtube"> <span class="rm-shortcode" data-rm-shortcode-id="4313dd37d8c0249258f89615b9114690" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/IXQ7MjwdZ3A?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span> <small class="image-media media-caption" placeholder="Add Photo Caption...">With six legs, the hexapod robot can act more like a human curler when launching the stone, putting a new spin on curling-robot tech.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit..."><a href="https://www.youtube.com/watch?v=IXQ7MjwdZ3A" target="_blank">FlyingDumplings/YouTube</a></small></p><p>With its six-legged gait for stable traction and flexibility on the ice, the robot could pivot at the “hack,” the rubber foothold curlers use to launch their delivery. From there, the hexapod set its angle, kicked off, and glided on a skateboard-like undercarriage before releasing the stone, imparting competition-level spin.</p><p>Equipped with lidar and cameras, the robot scanned the sheet to map stone positions and fed those data into software that <a href="https://doi.org/10.1007/s11465-025-0835-5" target="_blank">calculated collision paths</a> and solved for the precise release parameters needed to execute a chosen strategy.</p><h2>Curling Bots Leave Broom for Improvement</h2><p>For all the technical prowess of Curly and the hexapod, one stubborn constraint remains: No robot can sweep—at least not yet.</p><p>There are no <a href="https://spectrum.ieee.org/irobot-roomba-history" target="_self">Roomba</a>-like machines flanking the stone, frantically brushing to extend its travel or hold its line. 
Once released, the robot’s shot is left to fate, untouched by the vigorous, broom-flailing choreography that so often determines whether a stone bites the button or drifts wide.</p><p>“These robots are leaving out a huge chunk of potential that humans are bringing to the game,” says <a href="https://umanitoba.ca/kinesiology-recreation-management/faculty-staff/steven-passmore-phd" target="_blank">Steven Passmore</a>, a human-movement scientist at the University of Manitoba in Winnipeg who, together with Zacharias, coauthored a <a href="https://doi.org/10.3389/fspor.2024.1291241" target="_blank">comprehensive review of the scientific literature</a> on curling.</p><p>At the time of their data cutoff, in 2021, they found nearly two dozen published studies about robotics, AI, and emerging tech in the sport. But as Zacharias points out, the most sophisticated tools shaping elite play often never appear in academic journals; they are developed behind closed doors and closely guarded as competitive secrets.</p><p>For her part, Zacharias—who <a href="https://stats.curling.io/players/zacharias-emily" target="_blank">competed at four Canadian women’s curling championships</a> between 2021 and 2024—says she never once practiced against a robot. But she has trained with a rock launcher, a mechanized delivery system that fires stones at precisely calibrated speeds and rotations, over and over.</p><p>By standardizing the throw, the device allows athletes to isolate how different sweeping techniques, brush-head fabrics, or ice temperatures alter a stone’s path, explains Paulley. “It means you can run repeated experiments in order to test the impact of different variables,” he says. “And in curling, there are <em>a lot</em> of variables.”</p><h2>Cutting-Edge Tech Helps Athletes Train</h2><p>In Japan, all these technologies and more are being explored in a government-backed initiative called <a href="https://xfuture.info/en/curling/" target="_blank">Curling of the Future</a>.</p><p>The program brings together university engineers, sporting agencies, and elite athletes to prototype delivery robots and sweep-assist machines, along with AI strategy engines, instrumented “smart stones,” and rock-launcher systems for controlled training. </p><p>“The core objective is elite performance: improving decision-making and the quality of training so that Japan can strengthen its competitiveness in international competition,” says <a href="https://www.fun.ac.jp/en/faculty/takegawa-yoshinari" target="_blank"><span>Yoshinari Takegawa</span></a>, an information scientist at Future University Hakodate who is co-leading the project.</p><p class="shortcode-media shortcode-media-rebelmouse-image"> <img alt="A set of photos shows a blonde woman in a wheelchair wearing a VR headset, and images of a VR curling environment." class="rm-shortcode" data-rm-shortcode-id="641c890960e19b2fb95aa6b42a14a6a0" data-rm-shortcode-name="rebelmouse-image" id="7c584" loading="lazy" src="https://spectrum.ieee.org/media-library/a-set-of-photos-shows-a-blonde-woman-in-a-wheelchair-wearing-a-vr-headset-and-images-of-a-vr-curling-environment.jpg?id=64953348&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">Dylan Rusnak, a kinesiology student at Red Deer Polytechnic, contributed to the project by developing a VR system for curling. Rusnak wears a Meta Quest headset [left] while demonstrating the system, which shows athletes immersive views of the rink [right]. 
</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Red Deer Polytechnic</small></p><p>The technology push isn’t confined to Olympic play either. At the Paralympics next month, the Canadian national wheelchair curling squad will be coming primed with training sessions inside a full virtual replica of the Cortina Curling Olympic Stadium, courtesy of a <a href="https://spectrum.ieee.org/tag/virtual-reality" target="_blank">VR system</a> developed by mechanical engineer <a href="https://rdpolytech.ca/about-us/faculty/jennifer-dornstauder" target="_blank">Jennifer Dornstauder</a> and her students at Red Deer Polytechnic in Alberta. </p><p><del></del>The setup drops athletes into an immersive curling rink via a <a href="https://spectrum.ieee.org/tag/meta" target="_self">Meta</a> Quest headset, where they can look down and see virtual renderings of their legs, wheelchair, throwing stick, stones, and the ice surface beneath them.<del></del></p><p>According to <a href="https://paralympic.ca/news/coach-spotlight-mick-lizmore-reconnects-para-sport-wheelchair-curling-coach/" target="_blank">Mick Lizmore</a>, head coach of Canada’s National Wheelchair Curling Program, his team has used the VR to help visualize the venue where they will be competing and for group tactical training<strong></strong>, even when they can’t meet together in person. Beyond sharpening elite preparation, Dornstauder says, the same tool should help expand access to wheelchair curling for <a href="https://spectrum.ieee.org/tag/disability" target="_blank">people with disabilities</a> who face mobility challenges or limited ice availability.</p><p>“VR is just this amazing tool that is almost designed for getting around these barriers,” she says.</p><h2>Will Tech Change Curling?</h2><p>Many of the technologies entering curling are, in many ways, benign—tools for analysis, accessibility, and incremental refinement rather than wholesale disruption. A rock launcher standardizes practice. A VR headset extends rehearsal beyond the rink. A strategy engine offers probabilities, not ultimatums.<del></del><span><br/></span></p><p><span>Taken together, however, they reveal how thoroughly digital systems are seeping into every layer of the sport.</span></p><p>AI-powered sparring machines tuned to mimic a rival team’s tendencies, and thus capable of playing out fully simulated preparatory matches, remain a fantasy. National curling programs operate on tight budgets, limiting how far and how fast innovation can go. And even well-funded federations must balance software and robotics against coaching, travel, and ice time. </p><p class="shortcode-media shortcode-media-rebelmouse-image rm-float-left rm-resized-container rm-resized-container-25" data-rm-resized-container="25%" style="float: left;"> <img alt="A device shaped long a stretched letter H sits on a curling rink next to equipment." 
class="rm-shortcode" data-rm-shortcode-id="d1ad93daab6f9bd8fa1655e3c6b33c50" data-rm-shortcode-name="rebelmouse-image" id="2de6f" loading="lazy" src="https://spectrum.ieee.org/media-library/a-device-shaped-long-a-stretched-letter-h-sits-on-a-curling-rink-next-to-equipment.png?id=64953356&width=980"/> <small class="image-media media-caption" placeholder="Add Photo Caption...">Rock launchers provide a consistent throw to help athletes practice sweeping.</small><small class="image-media media-photo-credit" placeholder="Add Photo Credit...">Sean Maw/University of Saskatchewan</small></p><p>Yet as money continues to flow into high-performance curling, those possibilities draw closer<em><em>.</em></em></p><p>“It’s probably just a matter of time,” says <a href="https://engineering.usask.ca/people/sopd/maw,sean.php" target="_blank">Sean Maw</a>, a sports engineer at the University of Saskatchewan who has <a href="https://harvest.usask.ca/items/7b2c095a-5f96-416e-86af-8ca6f073dc12" target="_blank">built rock launchers</a> and studies the complexities of curling<em><em></em></em>. </p><p>For now, the stones still leave human hands—hands capable of brilliance, instinct, and the occasional double-touch—and the final call still rests with the skip in the house. But the algorithms are edging closer to the button.</p>]]></description><pubDate>Wed, 18 Feb 2026 15:00:02 +0000</pubDate><guid>https://spectrum.ieee.org/olympics-curling-robot-ai</guid><category>Robotics</category><category>Artificial-intelligence</category><category>Virtual-reality</category><category>Sports</category><category>Canada</category><category>Olympic-games</category><dc:creator>Elie Dolgin</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/curling-players-sweeping-a-red-stone-on-ice-motion-blur-emphasizes-speed-and-action.jpg?id=64953312&amp;width=980"></media:content></item><item><title>Video Friday: Robot Collective Stays Alive Even When Parts Die</title><link>https://spectrum.ieee.org/video-friday-robot-collective</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/robot-collective-crawls-under-a-bridge-of-rocks-with-glowing-lights-video-speed-increased-10x.gif?id=64423332&width=2000&height=1500&coordinates=47%2C0%2C47%2C0"/><br/><br/><p><span>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at </span><em>IEEE Spectrum</em><span> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please </span><a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a><span> for inclusion.</span></p><h5><a href="https://2026.ieee-icra.org/">ICRA 2026</a>: 1–5 June 2026, VIENNA</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote class="rm-anchors" id="boebmm8mlea"><em>No system is immune to failure. The compromise between reducing failures and improving adaptability is a recurring problem in robotics. Modular robots exemplify this trade-off, because the number of modules dictates both the possible functions and the odds of failure. We reverse this trend, improving reliability with an increased number of modules by exploiting redundant resources and sharing them locally.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d4b443d6937b9d0cc92c03a7e8d6617b" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/bOeBmm8mleA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.science.org/doi/10.1126/scirobotics.ady6304">Science</a> ] via [ <a href="https://www.epfl.ch/labs/rrl/">RRL</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="unorxwlzlfk"><em>Now that the <a href="https://robotsguide.com/robots/atlas" target="_blank">Atlas</a> enterprise platform is getting to work, the research version gets one last run in the sun. 
Our engineers made one final push to test the limits of full-body control and mobility, with help from the RAI Institute.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="dabc1539dcc8e868a47dee3a32fd7b46" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/UNorxwlZlFk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://rai-inst.com/">RAI</a> ] via [ <a href="https://bostondynamics.com/">Boston Dynamics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="khimsr8guce"><em>Announcing Isaac 0: the laundry-folding robot we’re shipping to homes, starting in February 2026 in the Bay Area.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="3e2ca93296a7724dd71ad07d146372c8" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/KhImSR8GuCE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.weaverobotics.com/">Weave Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="md7auy7lh34"><em>In a paper published in Science, researchers at the Max Planck Institute for Intelligent Systems, the Humboldt University of Berlin, and the University of Stuttgart have discovered that the secret to the elephant’s amazing sense of touch is in its unusual whiskers. The interdisciplinary team analyzed elephant-trunk whiskers using advanced microscopy methods that revealed a form of material intelligence more sophisticated than the well-studied whiskers of rats and mice. This research has the potential to inspire new physically intelligent robotic sensing approaches that resemble the unusual whiskers that cover the elephant trunk.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="1dfc845660633f2fd1566c3989dd428d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/MD7Auy7lH34?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.mpg.de/26113474/elephant-trunk-whiskers-exhibit-material-intelligence?c=2249">MPI</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="rcpuqmvs37q">Got an interest in autonomous mobile robots, <a href="https://spectrum.ieee.org/tag/robot-operating-system" target="_blank">ROS2</a>, and a mere US $150 lying around? 
Try this.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="648c0f35b6cbb775c227c778b61e3aea" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/RCPUQmvS37Q?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://makerspet.com/store#!/Arduino-ROS2-Self-Driving-Robot-120mm-Build-Pack/p/725772983">Maker's Pet</a> ]</p><p>Thanks, Ilia!</p><div class="horizontal-rule"></div><p class="rm-anchors" id="9ti9mi8rbiq">We’re giving <a href="https://spectrum.ieee.org/topic/robotics/humanoid-robots/" target="_blank">humanoid robots</a> swords now.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="4bb44dd7b81db8ae1c0a107f312c9998" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/9Ti9Mi8rbIQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.robotera.com/en/">Robotera</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="ew3az19rlqa"><em>A system developed by researchers at the University of Waterloo lets people collaborate with groups of robots to create works of art inspired by music.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="16928fc1dec0c295881676004533743c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/ew3az19rlqA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://uwaterloo.ca/news/media/translating-music-light-and-motion-robots">Waterloo</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="exp3fsnqxqw"><em>FastUMI Pro is a multimodal, model-agnostic data acquisition system designed to power a truly end-to-end closed loop for embodied intelligence, transforming real-world data into genuine robotic capability.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="4c6a332d7e9b76af05f00dc9990e971e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/EXP3fsnQXqw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.lumosbot.tech/">Lumos Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="4fbygialjyu"><em>We usually take fingernails for granted, but they’re vital for fine-motor control and feeling textures. 
Our students have been doing some great work looking into the mechanics behind this.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="b12a33801a528b063c865db11a550308" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/4FByGIALjyU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://arxiv.org/html/2602.05156v1">Paper</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="uxuvmz3nwto"><em>This is a 550-lb. all-electric coaxial unmanned rotorcraft developed by Texas A&M University’s Advanced Vertical Flight Laboratory and Harmony Aeronautics as a technology demonstrator for our quiet-rotor technology. The payload capacity is 200 lb. (gross weight = 750 lb). The noise level measured was around 74 dBA in hover mode at 50 feet, making this probably the quietest rotorcraft at this scale.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="5112475428ad208d8d71a9ba961c0fbb" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/uxuvMz3nwto?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://harmonyaeronautics.com/">Harmony Aeronautics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="bk9k_mjjlxe"><em>Harvard scientists have created an advanced 3D-printing method for developing soft robotics. This technique, called rotational multimaterial 3D printing, enables the fabrication of complex shapes and tubular structures with dissolvable internal channels. This innovation could someday accelerate the production of components for surgical robotics and assistive devices, advancing medical technology.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d8440b53651f8443a6733570f8f342a6" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/BK9K_mJjlxE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://seas.harvard.edu/news/3d-printing-soft-robots">Harvard</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="a48dohk0u7i"><em>The Lynx M20 wheel-legged robot steps onto the ice and snow, taking on challenges inspired by four winter sports scenarios. 
Who says robots can’t enjoy winter sports?</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="cb096d509d326a8b7fc63c514373044b" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/a48DoHK0U7I?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.deeprobotics.cn/en">Deep Robotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="ec712qh3t6g">NGL right now I find this more satisfying to watch than a humanoid doing just about anything.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="29de1f6d610f3a555db97ef444bf7f06" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Ec712qh3T6g?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.linkedin.com/posts/fanuc-america-corporation_robotic-case-packing-and-palletizing-system-activity-7426656807932203009-y4hR/">Fanuc</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="xqfk1bd5bkg"><em>At Mentee Robotics, we design and build humanoid robots from the ground up with one goal: reliable, scalable deployment in real-world industrial environments. Our robots are powered by deep vertical integration across hardware, embedded software, and AI, all developed in-house to close the Sim2Real gap and enable continuous, around-the-clock operation.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="65329eda591818e176b81a494fe5b599" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/XqFk1Bd5BKg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.menteebot.com/">Mentee Robotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="3y0rawjlaxs">You don’t need to watch this whole video, but the idea of little submarines that hitch rides on bigger boats and recharge themselves is kind of cool.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="a15f69e364382ed4d168f7fdbc178597" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/3y0RAwJlAxs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.lockheedmartin.com/en-us/products/mmauv.html">Lockheed Martin</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="u6zp38xurgs"><em>Learn about the work of Dr. Roland Siegwart, Dr. Anibal Ollero, Dr. Dario Floreano, and Dr. 
Margarita Chli on flying robots and some of the challenges they are still trying to tackle in this video based on their presentations at ICRA@40, the 40th-anniversary celebration of the IEEE International Conference on Robotics and Automation.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="facbfd3b29494abfbcc4306c7c81ecca" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/U6ZP38XUrGs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://icra40.ieee.org/">ICRA@40</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 13 Feb 2026 16:30:03 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-robot-collective</guid><category>Modular-robots</category><category>Video-friday</category><category>Autonomous-robots</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/gif" url="https://spectrum.ieee.org/media-library/robot-collective-crawls-under-a-bridge-of-rocks-with-glowing-lights-video-speed-increased-10x.gif?id=64423332&amp;width=980"></media:content></item><item><title>Video Friday: Autonomous Robots Learn By Doing in This Factory</title><link>https://spectrum.ieee.org/autonomous-warehouse-robots</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/robotic-arms-on-mobile-bases-sort-crates-on-a-conveyor-belt-in-a-warehouse.png?id=63907821&width=2000&height=1500&coordinates=146%2C0%2C146%2C0"/><br/><br/><p><span>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at </span><em>IEEE Spectrum</em><span> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please </span><a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a><span> for inclusion.</span></p><h5><a href="https://2026.ieee-icra.org/">ICRA 2026</a>: 1–5 June 2026, VIENNA</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote class="rm-anchors" id="qdwi4cn3oi0"><em>To train the next generation of <a data-linked-post="2650273449" href="https://spectrum.ieee.org/toyota-to-invest-1-billion-in-ai-and-robotics-rd" target="_blank">autonomous robots</a>, scientists at Toyota Research Institute are working with Toyota Manufacturing to deploy them on the factory floor.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="197161b054a2a3e4ef30ccd9b27cb5b5" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/QDwi4CN3OI0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.linkedin.com/posts/toyota-research-institute_whats-next-for-tri-robotics-max-bajracharya-activity-7424198589196685313-4e92?utm_source=share&utm_medium=member_desktop&rcm=ACoAAAM4nT0BW_DvaXaoyr7IuJL-to9SJ5MlYT4">Toyota Research Institute</a> ]</p><p>Thanks, Erin!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="sh0chr6usao"><em>This is just one story (of many) about how we tried, failed, and learned how to improve our ‪<a data-linked-post="2650278428" href="https://spectrum.ieee.org/in-the-air-with-ziplines-medical-delivery-drones" target="_blank">drone delivery</a> system.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="be361c20896c763261fbae4500176505" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/sH0cHr6USao?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>Okay, but like you didn’t show the really cool bit...?</p><p>[ <a href="https://www.zipline.com/">Zipline</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="y2dhzlpgdwy"><em>We’re introducing KinetIQ, an AI framework developed by Humanoid, for end-to-end orchestration of humanoid robot fleets. KinetIQ coordinates wheeled and bipedal robots within a single system, managing both fleet-level operations and individual robot behavior across multiple environments. 
The framework operates across four cognitive layers, from task allocation and workflow optimization to task execution based on Vision-Language-Action models and whole-body control taught by reinforcement learning, and is shown here running across our wheeled industrial robots and bipedal R&D platform.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="e2c953a1800d81c0e0b0040c519e3979" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Y2DhzLPGdwY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://thehumanoid.ai/">Humanoid</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="bp7esfyyv4g"><em>What if a robot gets damaged during operation? Can it still perform its mission without immediate repair? Inspired by the self-embodied resilience strategies of stick insects, we developed a decentralized adaptive resilient neural control system (DARCON). This system allows legged robots to autonomously adapt to limb loss, ensuring mission success despite mechanical failure. This innovative approach leads to a future of truly resilient, self-recovering robotics.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="453cb36391a89fdaf9b4559aec7fe3c9" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Bp7esFyYV4g?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://advanced.onlinelibrary.wiley.com/doi/10.1002/aisy.202500270">VISTEC</a> ]</p><p>Thanks, Poramate!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="lo2gluku4c8"><em>This animation shows Perseverance’s point of view during a drive of 807 feet (246 meters) along the rim of Jezero Crater on 10 December 2025, the 1,709th Martian day, or sol, of the mission. Captured over 2 hours and 35 minutes, 53 navigation-camera (Navcam) image pairs were combined with rover data on orientation, wheel speed, and steering angle, as well as data from Perseverance’s inertial measurement unit, and placed into a 3D virtual environment. 
The result is this reconstruction with virtual frames inserted about every 4 inches (0.1 meters) of drive progress.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="87c1f3217435e9b3f4931fa59fa64683" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/LO2GluKu4C8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://science.nasa.gov/mission/mars-2020-perseverance/">NASA Jet Propulsion Lab</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="sx4wkuhap4e"><em>−47.4 °C, 130,000 steps, 89.75°E, 47.21°N… On the extremely cold snowfields of Altay, the birthplace of human skiing, Unitree’s humanoid robot G1 left behind a unique set of marks.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="c7485e5fd02cbfca05ec835a4a3e766c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/SX4WKUHAP4E?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.unitree.com/">Unitree</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="as_ouaft2he"><em>Representing and understanding 3D environments in a structured manner is crucial for autonomous agents to navigate and reason about their surroundings. In this work, we propose an enhanced hierarchical 3D scene graph that integrates open-vocabulary features across multiple abstraction levels and supports object-relational reasoning. Our approach leverages a vision language model (VLM) to infer semantic relationships. Notably, we introduce a task-reasoning module that combines large language models and a VLM to interpret the scene graph’s semantic and relational information, enabling agents to reason about tasks and interact with their environment more intelligently. We validate our method by deploying it on a quadruped robot in multiple environments and tasks, highlighting its ability to reason about them.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="768f3e1223995648c873b29ea0bac5b3" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/as_oUaFT2hE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://ntnu-arl.github.io/reasoning_graph/">Norwegian University of Science & Technology, Autonomous Robots Lab</a> ]</p><p>Thanks, Kostas!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="hmfgfp9xohq"><em>We present HoLoArm, a quadrotor with compliant arms inspired by the nodus structure of dragonfly wings. 
This design provides natural flexibility and resilience while preserving flight stability, which is further reinforced by the integration of a reinforcement-learning control policy that enhances both recovery and hovering performance.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="5096413bf8582da71f43303183472528" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/hmfgFP9XoHQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://ieeexplore.ieee.org/abstract/document/11361075">HO Lab via IEEE Robotics and Automation Letters</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="giijs_mmmrg"><em>In this work, we present SkyDreamer, to the best of our knowledge the first end-to-end vision-based autonomous-drone racing policy that maps directly from pixel-level representations to motor commands.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="90c32886c61319893e1ba59f4bcf677f" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/GiIjs_MmMrg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://arxiv.org/pdf/2510.14783">MAVLab</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="gr867dgh5tk"><em>This video showcases AI Worker, equipped with five-finger hands, performing dexterous object manipulation across diverse environments. Through teleoperation, the robot demonstrates precise, humanlike hand control in a variety of manipulation tasks.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="126ba42a9ad07785c5c33eae1738c225" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/Gr867DGH5tk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://ai.robotis.com/hands/introduction_hands.html">Robotis</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="phomadn-qze"><em>Autonomous following, 45-degree slope climbing, and reliable payload transport in extreme winter conditions, built to support operations where environments push the limits.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="90a3ca6f001914b8f44f854ed188eb0a" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/pHOmadN-qzE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.deeprobotics.cn/en">DEEP Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="80qqrfmvir0"><em>Living architectures, from plants to beehives, adapt continuously to their environments through self-organization. In this work, we introduce the concept of architectural swarms: systems that integrate swarm robotics into modular architectural façades. 
The Swarm Garden exemplifies how architectural swarms can transform the built environment, enabling “living-like” architecture for functional and creative applications.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="3324e87c4f923abd04c61086c8018795" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/80QqrFmvIr0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.science.org/doi/10.1126/scirobotics.ady7233">SSR Lab via Science Robotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="wxtmqieul0s">Here are a couple of IROS 2025 keynotes, featuring Bram Vanderborght and Kyu-Jin Cho.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="1589c01bc2c7e94f846550cf18fa08c3" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/WXtMQIeUl0s?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p class="shortcode-media shortcode-media-youtube"> <span class="rm-shortcode" data-rm-shortcode-id="99ea91bbf11816455fe9e3668e4f0c4e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/j6fEnhU56aA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span> </p><p>[ <a href="https://www.iros25.org/">IROS 2025</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 06 Feb 2026 17:00:02 +0000</pubDate><guid>https://spectrum.ieee.org/autonomous-warehouse-robots</guid><category>Video-friday</category><category>Autonomous-robots</category><category>Humanoid-robots</category><category>Industrial-robots</category><category>Robot-ai</category><category>Perseverance-rover</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/robotic-arms-on-mobile-bases-sort-crates-on-a-conveyor-belt-in-a-warehouse.png?id=63907821&amp;width=980"></media:content></item><item><title>Ode to Very Small Devices</title><link>https://spectrum.ieee.org/poetry-for-engineers-ode</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/anthropomorphized-miniature-gadgets-standing-on-the-heads-of-two-hex-bolts.jpg?id=63525887&width=2000&height=1500&coordinates=78%2C0%2C79%2C0"/><br/><br/><p>As fairies for the Irish or leeks for Welsh,<br/><span>it’s the secret lives of small hidden machines,<br/></span><span>their junctures, and networks that inspire me:<br/></span><span>Mystic hidden functionaries that make<br/></span><span>our made world live, brave little servo motors,<br/></span><span>whose couplers, whose eccentric fire-filled<br/></span><span>sensors are encased in bakelite with brass<br/></span><span>screws, who stare with red eyes, who gauge moisture,<br/></span><span>who notice tiny motions and respond,<br/></span><span>whose cooling fans call out in white-noise<br/></span><span>registers like older folk singers–I can<br/></span><span>almost hear their earlier songs, their strong voices<br/></span><span>now yelps, their thumps, their throbs, their hum, their chant–,<br/></span><span>they click, they whir, they are sent spinning<br/></span><span>inside like teen girls giggling over boy bands.<br/></span><span>Most of all: ones waiting silently, concealing<br/></span><span>the surprise of their purpose, tasks not yet known,<br/></span><span>their true natures found only in connections.</span></p><p>Those that listen, those that speak,<br/><span>those that control cool and heat,<br/></span><span>those that open doors, those that lock<br/></span><span>all the things that we’ve forgot,<br/></span><span>those that hide, those that disclose<br/></span><span>those embedded in our clothes<br/></span><span>those in our ears, those in our hearts<br/></span><span>those that bring together, those a part<br/></span><span>of divisions, those like birds,<br/></span><span>like parrots that complete our words,<br/></span><span>those like fish, those that entrap,<br/></span><span>those that free, those that freely flap<br/></span><span>in fierce winds, those that replace<br/></span><span>what we have lost, those that see<br/></span><span>at night, in fog, in brightness, in fear,<br/></span><span>those that show what we hold dear,<br/></span><span>those that tempt, those that repel,<br/></span><span>those that buy and those that sell,<br/></span><span>those that keep us alive, those that<br/></span><span>don’t, won’t, couldn’t and cannot.</span></p><p>Parts of one mind, not mine, blunt orchestra<br/><span>of information, bundles of feelers<br/></span><span>reaching out to touch us, teach us, guide us<br/></span><span>to form better futures better understood.<br/></span><span>May your sounds, your chimes, your silence calm us.<br/></span><span>May your tender tendrils touch what we seek.<br/></span><span>Small parts becoming one being intertwined,<br/></span><span>a world in itself, remind us to be kind. 
</span></p>]]></description><pubDate>Fri, 30 Jan 2026 19:02:06 +0000</pubDate><guid>https://spectrum.ieee.org/poetry-for-engineers-ode</guid><category>Poetry</category><category>Robotics</category><category>Type-departments</category><category>Verse-becomes-electric</category><dc:creator>Paul Jones</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/anthropomorphized-miniature-gadgets-standing-on-the-heads-of-two-hex-bolts.jpg?id=63525887&amp;width=980"></media:content></item><item><title>Video Friday: Multitasking Robots Smoothly Do the Things Together</title><link>https://spectrum.ieee.org/multitasking-robot</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/humanoid-robot-holds-small-yellow-bag-near-biohazard-trash-bin-in-a-hallway.png?id=63524908&width=2000&height=1500&coordinates=133%2C0%2C134%2C0"/><br/><br/><p><span>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at </span><em>IEEE Spectrum</em><span> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please </span><a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a><span> for inclusion.</span></p><h5><a href="https://2026.ieee-icra.org/">ICRA 2026</a>: 1–5 June 2026, VIENNA</h5><p>Enjoy this week’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote class="rm-anchors" id="xpc3kfygwis"><em>Westwood Robotics is proud to announce a major update: THEMIS Gen2.5, the world’s first commercial full-size humanoid robot capable of manipulation on the move!</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="46fb0888f9e5404f861e060bdf4c4f09" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/xpC3KfYGwIs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>Now that you mention it, the bit at the end where the robot picks up a can while walking? I haven’t seen a lot of that.</p><p>[ <a href="https://www.westwoodrobotics.io/">Westwood Robotics</a> ] </p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="lqsvtrrtbrs"><em>Last year, Helix showed that a single neural network could control a humanoid’s upper body from pixels. Today, Helix 02 extends that control to the entire robot—walking, manipulating, and balancing as one continuous system.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="b9f90c12f941a85805a47437a690097e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/lQsvTrRTBRs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>Why, yes, I am a normal human, and this is very similar to the default state of my kitchen.</p><p>[ <a href="https://www.figure.ai/">Figure</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="npovmr80scc"><em><a data-linked-post="2659934687" href="https://spectrum.ieee.org/ieee-spectrum-wins-11-awards" target="_blank">Harry Goldstein</a>, our editor in chief, went to meet Sprout from Fauna Robotics. 
He was skeptical at first, but Sprout won him over with its robotic charm.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="7bb8e2c7af09fe415bee2fac1f94523b" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/npOVmR80sCc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://faunarobotics.com/">Fauna Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="ta_ttogdmfa"><em>Kimberly Elenberg is showing how the data collected by <a data-linked-post="2674674488" href="https://spectrum.ieee.org/darpa-triage-challenge-robots" target="_blank">robotic responders </a>can save lives in mass casualty events.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="845574b48804af21ea93e39ca82a9128" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/tA_tToGDMFA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.cs.cmu.edu/news/2024/chiron-second-round">Carnegie Mellon University</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="ttey2j1jnyy">The educational robotics market is tough, but you’ve got to hand it to <a data-linked-post="2650254165" href="https://spectrum.ieee.org/best-robots-of-ces" target="_blank">Sphero</a>—going strong since 2011, which is pretty incredible.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="7b05c6aef6b6bb1b81072f7e54e3b5ba" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/TtEy2J1jNYY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://sphero.com/">Sphero</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="4fljuahsmgi"><em>If you want to fly in crazy conditions, you have to <a data-linked-post="2650278485" href="https://spectrum.ieee.org/photo-essay-tornados-and-frisky-birds-couldnt-stop-these-delivery-drones" target="_blank">flight test</a> in those conditions. 
Here’s how and why we do it!</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d7507888d3974c47b60be85458b94690" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/4flJuahSmgI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.zipline.com/">Zipline</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="09lbks8zplo">I want to be impressed more by the idea of 3D-printing skin and skeleton at the same time, but come on, animals have been doing that for literally hundreds of years without even trying.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="fb34b8a725411398845214eb5b0bfba0" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/09LbkS8zpLo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.jsk.t.u-tokyo.ac.jp/">JSK Lab, University of Tokyo</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="uaba2po-i5g">If there is a market for small bipedal robots that can both ski and be dinosaurs, LimX has it covered.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="058c2dbd8dbef66320764f7cb3a51abe" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/uABA2Po-I5g?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.limxdynamics.com/en/tron1">LimX</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="u-yly7nghsq"><em>How do you remotely control robots that change shape? We introduce a method for user-guided control of <a data-linked-post="2650266702" href="https://spectrum.ieee.org/epfl-developing-connectors-for-modular-floating-robots" target="_blank">modular robots</a> using reconfigurable joint-space joysticks (JoJo) and real-time optimization. We demonstrate this system on two different robots, Mori3 and Roombots. The video shows examples of these robots performing object manipulation, locomotion, human-assistance, and reconfiguration, controlled by our system.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="bebd7cd6c2f469e1a8a0a2315d9eb7d4" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/U-yly7NGhsQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.epfl.ch/labs/rrl/">EPFL Reconfigurable Robotics Lab</a> ] via [ <a href="https://www.nature.com/articles/s41467-025-63706-6" target="_blank">Nature Communications</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="xoheteywrcs"><em>Quadrotor Biplane Tailsitter (QBiT) UAVs at four different sizes (4, 12, 25, and 50 lbs) developed at Texas A&M University. 
QBiT combines the mechanical simplicity of a quadrotor drone with the cruise efficiency of a fixed-wing aircraft.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="138ec66a4f3e7c9a6e7cae4953cd0d18" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/xohEteYwRCs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://avfl.engr.tamu.edu/">Texas A&M University</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="7b1obxjj75q">There’s a new DARPA challenge for “novel drone designs that can carry payloads more than four times their weight, which would revolutionize the way we use drones across all sectors.”</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="52742a3076f4e48a06e74730790b75b7" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/7b1obXjJ75Q?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.darpa.mil/research/challenges/lift">DARPA</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="-znp4jrfu8i">Here are a couple of plenary and keynote talks from IROS 2025, from <a data-linked-post="2667733826" href="https://spectrum.ieee.org/marco-hutter-ai-institute" target="_blank">Marco Hutter</a> and <a data-linked-post="2667878568" href="https://spectrum.ieee.org/ieee-society-boosting-student-membership" target="_blank">Karinne Ramirez Amaro</a>.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="a4ce79733f9d1aba13f6eced3995901b" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/-Znp4JrFu8I?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p><span>[ </span><a href="https://www.iros25.org/">IROS 2025</a><span> ]</span></p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 30 Jan 2026 18:30:02 +0000</pubDate><guid>https://spectrum.ieee.org/multitasking-robot</guid><category>Humanoid-robots</category><category>Video-friday</category><category>Commercial-robots</category><category>Drones</category><category>Educational-robots</category><category>Bipedal-robots</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/humanoid-robot-holds-small-yellow-bag-near-biohazard-trash-bin-in-a-hallway.png?id=63524908&amp;width=980"></media:content></item><item><title>Video Friday: Humans and Robots Team Up in Battlefield Triage</title><link>https://spectrum.ieee.org/darpa-triage-challenge-robot</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/four-legged-robot-with-camera-moving-across-grassy-terrain.png?id=63130606&width=2000&height=1500&coordinates=164%2C0%2C165%2C0"/><br/><br/><p><span>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at </span><em>IEEE Spectrum</em><span> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please </span><a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a><span> for inclusion.</span></p><h5><a href="https://2026.ieee-icra.org/">ICRA 2026</a>: 1–5 June 2026, VIENNA</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><p class="rm-anchors" id="4cc_kg-heha">One of my favorite parts of robotics is watching research collide with non-roboticists in the real (or real-ish) world.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="b82661335e0af4180886e0a12f5c9f9d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/4Cc_kG-HeHA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.darpa.mil/research/challenges/darpa-triage-challenge">DARPA</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="2wh8uss-2vo">Spot will <a data-linked-post="2674366356" href="https://spectrum.ieee.org/wildfire-drones" target="_blank">put out fires</a> for you. Eventually. If it feels like it.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="68e23ee0ffe0f3a5f8cc1ea7632eb66c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/2wH8USs-2vo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://asmedigitalcollection.asme.org/letterstransrobotics/article-abstract/1/3/031004/1229384/Development-of-an-Autonomous-Firefighting?redirectedFrom=fulltext">Mechatronic and Robotic Systems Laboratory</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="vyl-cplnyp0">All those robots rising out of their crates is not sinister at all.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="5b5996b57b5f208087a25bb04a771f83" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/vYl-CPlnYp0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.limxdynamics.com/en">LimX</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="tlq_oynn1sc"><em>The Lynx M20 quadruped robot recently completed an extreme cold-weather field test in Yakeshi, Hulunbuir, operating reliably in temperatures as low as –30°C.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="425c899869ec71cd545f1342fce01eb4" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" 
src="https://www.youtube.com/embed/TlQ_OYNn1Sc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.deeprobotics.cn/en">DEEP Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="jq1vmt5l1zg"><em>This is a teaser video for KIMLAB’s new teleoperation robot. For now, we invite you to enjoy the calm atmosphere, with students walking, gathering, and chatting across the UIUC Main Quad—along with its scenery and ambient sounds, without any technical details. More details will be shared soon. Enjoy the moment.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="96651d42358f3b5b5750d425e3fc0560" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/jq1Vmt5L1Zg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>The most incredible part of this video is that they have publicly available power in the middle of their quad.</p><p>[ <a href="https://publish.illinois.edu/kimlab2020/">KIMLAB</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="xxitvnsi4ei">For the eleventy-billionth time: Just because you can do a task with a <a data-linked-post="2666662286" href="https://spectrum.ieee.org/humanoid-robots" target="_blank">humanoid robot</a> doesn’t mean you should do a task with a humanoid robot.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="2a8519483915cb8c15b9998df8ae3e23" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/xXiTvnsi4EI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.ubtrobot.com/en/">UBTECH</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="smadpijmdjq">I am less interested in this autonomous urban delivery robot and more interested in whatever that docking station is at the beginning that loads the box into it.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="7af231056a3a33a98ced03fd7816e053" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/SMaDPiJMdjQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://unmanned.kaist.ac.kr/">KAIST</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="de06_clqrm0">Okay, so figuring out where <a data-linked-post="2650272183" href="https://spectrum.ieee.org/spot-is-boston-dynamics-nimble-new-quadruped-robot" target="_blank">Spot’s face</a> is just got a lot more complicated.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="22fc46969d1715269ce7259a740d72cf" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/de06_CLqrM0?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a 
href="https://bostondynamics.com/blog/a-new-perspective-for-facilities-inspection/">Boston Dynamics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="da22c-akgzy"><em>An undergraduate team at HKU’s Tam Wing Fan Innovation Wing developed CLIO, an embodied tour-guide robot, just in months. Built on LimX Dynamics TRON 1, it uses LLMs for tour planning, computer vision for visitor recognition, and a laser pointer/expressive display for engaging tours.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="e804d056d2be8c25a80df851a546358d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/DA22C-aKgZY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://arxiv.org/html/2512.05389">CLIO</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="jfufsr_xnqi">The future of work is doing work so that robots can then do the same work, except less well.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="34e1f025d50859ae35866e75b74b3730" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/JfUFSR_xnqI?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://global.agilex.ai/">AgileX</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 23 Jan 2026 17:00:03 +0000</pubDate><guid>https://spectrum.ieee.org/darpa-triage-challenge-robot</guid><category>Robotics</category><category>Video-friday</category><category>Darpa</category><category>Human-robot-interaction</category><category>Quadruped-robots</category><category>Humanoid-robots</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/four-legged-robot-with-camera-moving-across-grassy-terrain.png?id=63130606&amp;width=980"></media:content></item><item><title>Video Friday: Bipedal Robot Stops Itself From Falling</title><link>https://spectrum.ieee.org/video-friday-bipedal-robot</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/prototype-robot-next-to-a-digital-model-both-with-rounded-bodies-and-dome-shaped-heads.png?id=62822757&width=2000&height=1500&coordinates=260%2C0%2C260%2C0"/><br/><br/><p><span>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at </span><em>IEEE Spectrum</em><span> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please </span><a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a><span> for inclusion.</span></p><h5><a href="https://2026.ieee-icra.org/">ICRA 2026</a>: 1–5 June 2026, VIENNA</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><p class="rm-anchors" id="zklqw-etviy">This is one of the best things I have ever seen. </p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="476b2d12fce3cafae560f39d7d09a622" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/zklqW-EtVIY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://publish.illinois.edu/kimlab2020/">Kinetic Intelligent Machine LAB</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="vlvttnvy_mc"><em>After years of aggressive testing and pushing the envelope with U.S. Army and Marine Corps partners, the <a data-linked-post="2650233072" href="https://spectrum.ieee.org/nasa-jpl-team-costar-darpa-subt-urban-circuit-systems-track" target="_blank">Robotic Autonomy in Complex Environments with Resiliency</a> (RACER) program approaches its conclusion. But the impact of RACER will reverberate far beyond the program’s official end date, leaving a legacy of robust autonomous capabilities ready to transform military operations and inspire a new wave of private-sector investment.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="a6443b500c3228dc546d165cac5aef7a" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/vLVtTNVY_Mc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.darpa.mil/news/2026/racer-finish-line">DARPA</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="a_hdbr3g_co">Best-looking humanoid yet.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="13d330c68822e8721f9765e34d7b059e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/a_HdbR3g_co?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://kawasakirobotics.com/eu-africa/news/20200714-01/">Kawasaki</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="0hiqs3tbb5g"><em>COSA (Cognitive OS of Agents) is a physical-world-native Agentic OS that unifies high-level cognition with whole-body motion control, enabling humanoid robots to think while acting in real environments. 
Powered by COSA, Oli becomes the first humanoid agent with both advanced loco-manipulation and high-level autonomous cognition.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="87e6a1a95e3622c88db50afb8b8b6534" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/0hIqs3TBb5g?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.limxdynamics.com/en">LimX Dynamics</a> ]</p><p>Thanks, Jinyan!</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="ls_z60kjvek"><em>The 1X World Model’s latest update is a paradigm shift in robot learning: NEO now uses a physics-grounded video model (World Model) to turn any voice or text prompt into fully autonomous action, even for completely novel tasks and objects NEO has never seen before. By leveraging internet-scale video data fine-tuned on real robot experience, NEO can visualize future actions, predict outcomes, and execute them with humanlike understanding–all without prior examples. This marks the critical first step in NEO being able to collect data on its own to master new tasks all by itself. </em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="2d47ea7a3835cfe3199196e99b8ccf15" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/lS_z60kjVEk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.1x.tech/">1X</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="uuz00ozq_za">I’m impressed by the human who was mocapped for this.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="5d68c459048195b673e4c3801152cdf2" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/uuz00OZq_ZA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://pndbotics.com/">PNDbotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="ddnnl4d1kx8"><em>We introduce the GuideData Dataset, a collection of qualitative data, focusing on the interactions between guide dog trainers, blind and low-vision (BLV) individuals, and their guide dogs. The dataset captures a variety of real-world scenarios, including navigating sidewalks, climbing stairs, crossing streets, and avoiding obstacles. 
By providing this comprehensive dataset, the project aims to advance research in areas such as assistive technologies, robotics, and human-robot interaction, ultimately improving the mobility and safety of visually impaired people.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="18b078b2fbab2934ad05d04b5d95426d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/DDNnL4D1kX8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://guidedogrobot-hgidataset.github.io/">DARoS Lab</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="_v77-uoldiq"><em>Fourier’s desktop Care-Bot prototype is attracting a lot of attention at <a data-linked-post="2674863582" href="https://spectrum.ieee.org/robots-ces-2026" target="_blank">CES 2026</a>! Even though it’s still in the prototype stage, we couldn’t wait to share these adorable and fun interaction features with you.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="38ad9ffbee438201a2a1528faf69dcbe" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/_v77-uoLDIQ?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.fftai.com/">Fourier</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="7_a3bfefcje"><em>Volcanic gas measurements are critical for understanding eruptive activity. However, harsh terrain, hazardous conditions, and logistical constraints make near-surface data collection extremely challenging. In this work, we present an autonomous legged robotic system for volcanic gas monitoring, validated through real-world deployments on Mount Etna. The system combines a quadruped robot equipped with a quadrupole mass spectrometer and a modular autonomy stack, enabling long-distance missions in rough volcanic terrain.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="c3c7088535c4808741867bc044ecdfc1" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/7_a3BFefcJE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://leggedrobotics.github.io/etna-expedition/">ETH Zurich RSL</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="3fixjy2gwtg"><em>Humanoid and Siemens successfully completed a proof of concept (POC) testing humanoid robots in industrial logistics. This is the first step in the broader partnership between the companies. The POC focused on a tote-to-conveyor destacking task within Siemens’s logistics process. 
HMND 01 autonomously picked, transported, and placed totes in a live production environment during a two-week on-site deployment at the Siemens Electronics Factory in Erlangen.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="7a68b154448a7aa95ccd436eb2f94ab3" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/3FIXjy2GWTg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://thehumanoid.ai/">Humanoid</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="qjndiopdnby"><em>Four Growers, a category leader in intelligent ag-tech platforms, developed the GR-200 <a data-linked-post="2650276075" href="https://spectrum.ieee.org/autonomous-robots-plant-tend-and-harvest-entire-crop-of-barley" target="_blank">robotic harvesting</a> platform, powered by FANUC’s LR Mate robot. The system combines AI-driven vision and motion planning to identify and harvest ripe tomatoes with speed and precision.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="133fc9e32f9dd8a9bd7ffcadc058843d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/QJndIoPDnBY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.fanucamerica.com/case-studies/automating-agriculture-greenhouse-turns-to-robots-for-tomato-harvesting">FANUC</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="nhfu5kha2fw"><em>Columbia Engineers built a robot that, for the first time, is able to learn facial lip motions for tasks such as speech and singing. In a new study published in Science Robotics, the researchers demonstrate how their robot used its abilities to articulate words in a variety of languages, and even sing a song out of its AI-generated debut album, “hello world_.” The robot acquired this ability through observational learning rather than via rules. 
It first learned how to use its 26 facial motors by watching its own reflection in the mirror before learning to imitate human lip motion by watching hours of YouTube videos.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="64b31d62efe4b3a9d5e8206d22cc3a10" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/nhFU5KHA2fw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.engineering.columbia.edu/about/news/robot-learns-lip-sync">Columbia</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="p0exoiozi6y">Roborock has some odd ideas about what lawns are like.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="833f8c5a9397b53fe812cf4fe8073d1c" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/p0eXOIOZi6Y?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://newsroom.roborock.com/gl/news/ces-2026-roborock-releases-the-world-s-first-robotic-vacuum-with-wheel-leg-architecture-as-it-joins-hands-with-real-madrid-football-club-">Roborock</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="k_sjgiajhes"><em>DEEP Robotics’ quadruped robots demonstrate coordinated multi-module operations under unified command, tackling complex and dynamic firefighting scenarios with agility and precision.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="8be4e43a12bb146e40ba2af765dc81a1" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/K_sJGIAjhes?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.deeprobotics.us/">DEEP Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="njhdpouccoe"><em>Unlike statically stable wheeled platforms, humanoids are dynamically stable, requiring continuous active control to maintain balance and prevent falls. This inherent instability presents a critical challenge for functional safety, particularly in collaborative settings. This presentation will introduce Synapticon’s POSITRON platform, a comprehensive solution engineered to address these safety-critical demands. 
We will explore how its integrated hardware and software enable robust, certifiable safety functions that meet the highest industrial standards, providing key insights into making the next generation of humanoid robots safe for real-world deployment.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="e696e242cd7e84cd609c348109bce902" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/njHdPOUCcoE?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.synapticon.com/en/products/positron-safety">Synapticon</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="voocufteiaw"><em>The University of California, Berkeley, is world-famous for its AI developments, and one big name behind them is <a data-linked-post="2650253755" href="https://spectrum.ieee.org/ken-goldberg-discusses-telerobots-androids-and-heidegger" target="_blank">Ken Goldberg</a>. Longtime professor and lifelong artist, Ken is all about deep learning while staying true to “good old-fashioned engineering.” Hear Ken talk about his approach to vision and touch for robotic surgeries and how robots will evolve across the board.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d71d8beecc44e7b59f1c9bbc6bf7f4a6" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/VooCuFTEIaw?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.youtube.com/playlist?list=PLCkt0hth826G9AtnOrQsPbKKD5JmdaMXb">Waymo</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 16 Jan 2026 18:30:02 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-bipedal-robot</guid><category>Robotics</category><category>Video-friday</category><category>Bipedal-robots</category><category>Humanoid-robots</category><category>Quadruped-robots</category><category>Industrial-robots</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/prototype-robot-next-to-a-digital-model-both-with-rounded-bodies-and-dome-shaped-heads.png?id=62822757&amp;width=980"></media:content></item><item><title>Video Friday: Robots Are Everywhere at CES 2026</title><link>https://spectrum.ieee.org/robots-ces-2026</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/a-humanoid-robot-stands-in-a-workshop-surrounded-by-yellow-safety-barriers.png?id=62698082&width=2000&height=1500&coordinates=240%2C0%2C240%2C0"/><br/><br/><p><span>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at </span><em>IEEE Spectrum</em><span> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please </span><a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a><span> for inclusion.</span></p><h5><a href="https://2026.ieee-icra.org/">ICRA 2026</a>: 1–5 June 2026, VIENNA</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><blockquote class="rm-anchors" id="sd8ivhpji6g"><em>We’re excited to announce the product version of our Atlas® robot. This enterprise-grade humanoid robot offers impressive strength and range of motion, precise manipulation, and intelligent adaptability—designed to power the new industrial revolution.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d674b10c0938838fe2def9b457d9ce29" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/sd8ivhpjI6g?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p class="shortcode-media shortcode-media-youtube"> <span class="rm-shortcode" data-rm-shortcode-id="d4026917b3e805208b4e4ef8b77db99e" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/rrUHZKlrxms?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://bostondynamics.com/products/atlas/">Boston Dynamics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="suybti4yc-y">I appreciate the creativity and technical innovation here, but realistically, if you’ve got more than one floor in your house? Just get a second robot. 
That single-step sunken living room though....</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="967af367ce60e8b8c44b587901503219" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/SuyBti4YC-Y?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://us.roborock.com/pages/ces-2026">Roborock</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="btjxyrtr9m8">Wow, SwitchBot’s CES 2026 video shows almost as many robots in their fantasy home as I have in my real home.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="f6760b8fcecf22cb90a35467665c76b5" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/btJxyrtR9M8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.switch-bot.com/pages/events-ces-2026">SwitchBot</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="bkuuepmjdc8">What is happening in robotics right now that I derive more satisfaction from watching <a data-linked-post="2650252967" href="https://spectrum.ieee.org/building-better-solar-cells-at-robot-speed" target="_blank">robotic process automation</a> than from watching yet another <a data-linked-post="2673979476" href="https://spectrum.ieee.org/humanoid-robot-olympics" target="_blank">humanoid</a> video?</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d4bff1747272ef7c71f39a26e303c2d6" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/BkUuepmJdc8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://new.abb.com/news/detail/132408/cstmr-xiang-piao-piao-increases-production-efficiency-by-40-percent-thanks-to-abb-automated-system">ABB</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="jzllfrhrc4g">Yes, this is definitely a robot I want in close proximity to my life.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="aab242eba5b8e024ce3d90816c57f7e0" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/JZllfrHRc4g?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.unitree.com/H2">Unitree</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="xxvtjtorxl0"><em>The video below demonstrates a MenteeBot learning, through mentoring, how to replace a battery in another MenteeBot. 
No teleoperation is used.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="a487317b0ac4d3de3b6da4fcdf832901" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/_bjq90duOcM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.menteebot.com/">Mentee Robotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="ywwwcecwiza">Personally, I think we should encourage humanoid robots to fall much more often, just so we can see whether they can get up again.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="424dba74254eee7aa7c1b1db8b527dc5" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/ywWwcecwiZA?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.agilityrobotics.com/">Agility Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="mx0dzgtptbg"><em>Achieving long-horizon, reliable clothing manipulation in the real world remains one of the most challenging problems in robotics. This live test demonstrates a strong step forward in embodied intelligence, vision-language-action systems, and real-world robotic autonomy.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="d200084eabd7ad3a3de53a413d28642d" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/MX0DzGtPtBg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://mmlab.hk/research/kai0">HKU MMLab</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="626qsq1czoc"><em>Millions of people around the world need assistance with feeding. Robotic feeding systems offer the potential to enhance autonomy and quality of life for individuals with impairments and reduce caregiver workload. However, their widespread adoption has been limited by technical challenges such as estimating bite timing, the appropriate moment for the robot to transfer food to a user’s mouth. 
In this work, we introduce WAFFLE: Wearable Approach For Feeding with LEarned Bite Timing, a system that accurately predicts bite timing by leveraging wearable sensor data to be highly reactive to natural user cues such as head movements, chewing, and talking.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="0e79a4e3a0a71147639dcb030e4b984b" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/626qsQ1CZOc?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://sites.google.com/view/bitetiming/">CMU RCHI</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="jypic0aarpg">Humanoid robots are now available as platforms, which is a great way of sidestepping the whole practicality question.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="5886d805ea82159b9807bc413835d953" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/JyPiC0aArPg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://pndbotics.com/humanoid">PNDbotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="ig41e0yqijg"><em>We’re introducing Spatially Enhanced Recurrent Units (SRUs)—a simple yet powerful modification that enables robots to build implicit spatial memories for navigation. Published in the International Journal of Robotics Research (IJRR), this work demonstrates up to +105 percent improvement over baseline approaches, with robots successfully navigating 70+ meters in the real world using only a single forward-facing camera.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="41b5c74e3016a6bc4dcaf0075fa8cc32" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/iG41e0yQIjg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://michaelfyang.github.io/sru-project-website/">ETHZ RSL</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="gxnpy0-zwcu">Looking forward to the <a data-linked-post="2674674488" href="https://spectrum.ieee.org/darpa-triage-challenge-robots" target="_blank">DARPA Triage Challenge</a> this fall!</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ed4d9467100b26eaccd6863737c18bb5" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/GXNPY0-zwcU?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.darpa.mil/research/challenges/darpa-triage-challenge">DARPA</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="u7wnlqt5fs8">Here are a couple of good interviews from the Humanoids Summit 2025.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="ddab8d35b1f477a237d96158af7548ee" 
style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/U7wnLqt5FS8?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p class="shortcode-media shortcode-media-youtube"> <span class="rm-shortcode" data-rm-shortcode-id="b3cd566727abd1e6f5230d1a0f49f74a" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/vdCWKUfoaFk?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://humanoidssummit.com/">Humanoids Summit</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 09 Jan 2026 18:00:04 +0000</pubDate><guid>https://spectrum.ieee.org/robots-ces-2026</guid><category>Robotics</category><category>Video-friday</category><category>Ces-2026</category><category>Humanoid-robots</category><category>Industrial-robots</category><category>Human-robot-interaction</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/a-humanoid-robot-stands-in-a-workshop-surrounded-by-yellow-safety-barriers.png?id=62698082&amp;width=980"></media:content></item><item><title>Video Friday: Watch Scuttle Evolve</title><link>https://spectrum.ieee.org/video-friday-robot-farming</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/robotic-snake-in-grassy-terrain-featuring-gears-and-yellow-accents.png?id=62651449&width=2000&height=1500&coordinates=240%2C0%2C240%2C0"/><br/><br/><p><span>Video Friday is your weekly selection of awesome robotics videos, collected by your friends at </span><em>IEEE Spectrum</em><span> robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please </span><a href="mailto:automaton@ieee.org?subject=Robotics%20event%20suggestion%20for%20Video%20Friday">send us your events</a><span> for inclusion.</span></p><h5><a href="https://2026.ieee-icra.org/">ICRA 2026</a>: 1–5 June 2026, VIENNA</h5><p>Enjoy today’s videos!</p><div class="horizontal-rule"></div><div style="page-break-after: always"><span style="display:none"> </span></div><p class="rm-anchors" id="wbinr4it5ac">I always love seeing robots <a href="https://spectrum.ieee.org/ground-control-robot-insects" target="_blank">progress from research projects to commercial products</a>.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="df39d3617f63dea293fcf219a7b96c00" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/wBinR4It5Ac?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://groundcontrolrobotics.com/">Ground Control Robotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="rwrjncufqcs">Well this has to be one of the most “watch a robot do this task entirely through the magic of jump cuts” I’ve ever seen.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="c856abe0892f115574852f26bc595d1b" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/RwRJNCUFQcs?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.ubtrobot.com/en/">UBTECH</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="yys2pztl07g">Very satisfying sound on this one.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="f3be98b2b7e2a543ea1955c22f8417b6" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/yYS2pZtL07g?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.pudurobotics.com/en">Pudu Robotics</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="tv3xz0a55xg"><em>Welcome to the AgileX Robotics Data Collection Facility, where real robots build the foundation for universal embodied intelligence. Our core mission? 
Enable large-scale data sharing and reuse across dual-arm teleoperation robots of diverse morphologies, breaking down data silos that slow down AI progress.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="7ef69d635e6a19f56756a61d792a550a" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/tv3xz0A55Xg?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://global.agilex.ai/products/pika">AgileX</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="800zalklgmo">I’m not sure how much thought was put into this, but giving a service robot an explicit cat face could be a good way of moderating expectations on its behavior and interactivity.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="a7c74dd111801d3dc98a5e69b3579d2f" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/800zaLKlgmo?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.pudurobotics.com/en">Pudu Robotics</a> ]</p><div class="horizontal-rule"></div><p class="rm-anchors" id="lbw3ylbz6ty">UBTECH says they have built 1,000 of their <a data-linked-post="2674315827" href="https://spectrum.ieee.org/video-friday-baseball-robot" target="_blank">Walker S2 humanoid robots</a>, over 500 of which are “delivered & working.” I would very much like to know what “working” means in this context.</p><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="b6be5bc8aa73dd8fff037aac7460c6d4" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/lBW3YlbZ6tY?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>[ <a href="https://www.ubtrobot.com/en/">UBTECH</a> ]</p><div class="horizontal-rule"></div><blockquote class="rm-anchors" id="uzkhoiwtxkm"><em>Every story has its beginning, and ours started in 2023—a year defined by the unknown. Let technology return to passion; let trials catalyze evolution. Embracing growth, embarking on a new journey. 
We’ll see you at the next stop.</em></blockquote><p class="shortcode-media shortcode-media-youtube"><span class="rm-shortcode" data-rm-shortcode-id="cc1deca515500771c49703eedf4f68bd" style="display:block;position:relative;padding-top:56.25%;"><iframe frameborder="0" height="auto" lazy-loadable="true" scrolling="no" src="https://www.youtube.com/embed/uzKhOIWtxKM?rel=0" style="position:absolute;top:0;left:0;width:100%;height:100%;" width="100%"></iframe></span></p><p>Please, please hire someone to do some HRI (human-robot interaction) design.</p><p>[ <a href="https://pndbotics.com/">PNDbotics</a> ]</p><div class="horizontal-rule"></div>]]></description><pubDate>Fri, 02 Jan 2026 18:00:02 +0000</pubDate><guid>https://spectrum.ieee.org/video-friday-robot-farming</guid><category>Insect-robots</category><category>Robotics</category><category>Video-friday</category><category>Legged-robots</category><category>Humanoid-robots</category><category>Quadruped-robots</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/png" url="https://spectrum.ieee.org/media-library/robotic-snake-in-grassy-terrain-featuring-gears-and-yellow-accents.png?id=62651449&amp;width=980"></media:content></item><item><title>Tech to Track in 2026</title><link>https://spectrum.ieee.org/tech-in-2026</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/futuristic-drone-like-air-taxi-on-helipad-surrounded-by-desert-landscape.jpg?id=62639193&width=2000&height=1500&coordinates=60%2C0%2C60%2C0"/><br/><br/><p>Every September as we plan our <a data-linked-post="2650272013" href="https://spectrum.ieee.org/mostly-right-updates-on-our-2014-predictions" target="_blank">January tech forecast</a> issue, <em><em>IEEE</em></em> <em><em>Spectrum’s</em></em> editors survey their beats and seek out promising projects that could solve seemingly intractable problems or transform entire industries.</p><p>Often these projects fly under the radar of the popular technology press, which these days seems more interested in the personalities driving Big Tech companies than in the technology itself. We go our own way here, getting out into the field to bring you news of the hidden gems that genuinely—as the IEEE motto goes—advance technology for the benefit of humanity.</p><p>A look back at the last 20 years of January issues reveals that while we’ve certainly covered our share of huge tech projects, like the <a href="https://spectrum.ieee.org/at-last-first-light-for-the-james-webb-space-telescope" target="_self">James Webb Space Telescope</a>, many of the stories touch on subjects most people would have otherwise missed.</p><p>Last January, Senior Associate Editor Emily Waltz reported on startups that are piloting <a href="https://spectrum.ieee.org/ocean-carbon-removal" target="_self">ocean-based carbon capture</a>. This issue, she’s back with another CO<span><sub>2</sub></span>-centric story, this time focused on grid-scale storage, which is poised to blow up—literally. Waltz traveled to Sardinia to check out Milan-based <a href="https://spectrum.ieee.org/co2-battery-energy-storage" target="_blank">Energy Dome’s “bubble battery,”</a> which can store up to 200 megawatt-hours by compressing and decompressing pure carbon dioxide inside an inflatable dome.</p><p>This kind of modular, easy-to-deploy energy storage could be especially useful for AI data centers, says Senior Editor <a data-linked-post="2666671774" href="https://spectrum.ieee.org/technology-forecast-2024" target="_blank">Samuel K. Moore</a>, who curated this issue and wrote about <a href="https://spectrum.ieee.org/gravity-energy-storage-will-show-its-potential-in-2021" target="_self">gravity energy storage</a> back in January 2021.</p><p class="pull-quote">Big bubbles could help with grid-scale storage; tiny bubbles can liquefy cancer tumors.</p><p> “When we think about energy storage, our minds usually go to grid-scale batteries,” Moore says. “Yet these bubbles, which are in many ways more capable than batteries, will be sprouting up all over the place, often in association with computing infrastructure.”</p><p>For his story in this issue, Moore dove into the competition between two startups that are developing <a href="https://spectrum.ieee.org/rf-over-fiber" target="_blank">radio-based cables to replace conventional copper cables and fiber optics in data centers</a>. These radio systems can connect processors 10 to 20 meters apart using a third of the power of optical-fiber cables and at a third of the cost. 
The next step is to integrate the radio connections directly with GPUs, to ease cooling burdens and help data centers and the AI models running on them continue to scale up.</p><p>Big bubbles could help with grid-scale storage; tiny bubbles can liquefy cancer tumors, as Greg Uyeno found when reporting on <a href="https://spectrum.ieee.org/ultrasound-cancer-treatment" target="_blank">HistoSonics’ ultrasound treatment</a>. Feared for its aggressive nature and extremely low survival rate, pancreatic cancer kills <a href="https://pmc.ncbi.nlm.nih.gov/articles/PMC9476884/#:~:text=Core%20tip:%20Pancreatic%20cancer%2C%20as,critical%20for%20global%20cancer%20control." target="_blank">almost half a million people</a> per year worldwide. <a href="https://histosonics.com/" target="_blank">HistoSonics</a> uses noninvasive, focused ultrasound to create cavitation bubbles that destroy tumors without dangerously heating surrounding tissue. This year, the company is concluding kidney trials as well as launching pancreatic cancer trials.</p><p>Over the last two decades, <em>Spectrum</em> has regularly covered the rise of drones. In 2018, for instance, we reported that the startup <a href="https://spectrum.ieee.org/zipline-expands-its-medical-delivery-drones-across-east-africa" target="_self">Zipline would deploy autonomous drones</a> to deliver blood and medical supplies in rural Rwanda. Today, <a href="https://www.zipline.com/" target="_blank">Zipline</a> has a market cap of about US $4 billion and operates in several African countries, Japan, and the United States, having completed almost 2 million drone deliveries. In this issue, journalist Robb Mandelbaum takes us inside the <a href="https://spectrum.ieee.org/wildfire-drones" target="_blank">Wildfire XPrize competition</a>, aimed at providing another life-saving service: dousing wildfires before they grow out of control. Zipline succeeded because it could make deliveries to remote locations much faster than land vehicles. This year’s XPrize teams plan to detect and suppress fires faster than conventional firefighting methods.</p><p>In addition to these emerging technologies, we’ve packed this issue with a dozen others, including <a href="https://spectrum.ieee.org/porsche-wireless-ev-charging" target="_blank">Porsche’s wireless home charger for EVs</a>, <a href="https://spectrum.ieee.org/joby-air-taxi" target="_blank">the world’s first electric air taxi service</a>, <a href="https://spectrum.ieee.org/neutral-atom-quantum-computing" target="_blank">neutral-atom quantum computers</a>, <a href="https://spectrum.ieee.org/mesh-network-interoperable-thread" target="_blank">interoperable mesh networks</a>, and <a href="https://spectrum.ieee.org/11-amazing-engineering-events-in-2026" target="_blank">robotic baseball umpires</a>. 
Let’s see which of this year’s picks make it to the big leagues.</p>]]></description><pubDate>Thu, 01 Jan 2026 15:00:02 +0000</pubDate><guid>https://spectrum.ieee.org/tech-in-2026</guid><category>Technology-forecast</category><category>Grid-scale-storage</category><category>Cancer</category><category>Ultrasound</category><category>Radio-frequency</category><category>Drones</category><dc:creator>Harry Goldstein</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/futuristic-drone-like-air-taxi-on-helipad-surrounded-by-desert-landscape.jpg?id=62639193&amp;width=980"></media:content></item><item><title>Teams of Robots Compete to Save Lives on the Battlefield</title><link>https://spectrum.ieee.org/darpa-triage-challenge-robots</link><description><![CDATA[
<img src="https://spectrum.ieee.org/media-library/aerial-drone-robotic-arm-explosion-and-medical-cross-on-abstract-background.jpg?id=62605094&width=2000&height=1500&coordinates=65%2C0%2C66%2C0"/><br/><br/><p>Last September, the <a href="https://www.darpa.mil/" rel="noopener noreferrer" target="_blank">Defense Advanced Research Projects Agency</a> (DARPA) unleashed teams of robots on <a data-linked-post="2650274270" href="https://spectrum.ieee.org/do-we-want-robot-warriors-to-decide-who-lives-or-dies" target="_blank">simulated mass-casualty scenarios</a>, including an airplane crash and a night ambush. The robots’ job was to find victims and estimate the severity of their injuries, with the goal of <a data-linked-post="2667038920" href="https://spectrum.ieee.org/ai-doctor" target="_blank">helping human medics</a> get to the people who need them the most.</p><h3>Kimberly Elenberg</h3><br/><p><a href="https://www.linkedin.com/in/kimberly-elenberg-0b52a595" rel="noopener noreferrer" target="_blank">Kimberly Elenberg</a> is a principal project scientist with the <a href="https://autonlab.org/" rel="noopener noreferrer" target="_blank">Auton Lab</a> of Carnegie Mellon University’s Robotics Institute. Before joining CMU, Elenberg spent 28 years as an army and U.S. Public Health Service nurse, which included 19 deployments and serving as the principal strategist for incident response at the Pentagon.</p><p>The final event of the <a href="https://www.darpa.mil/research/challenges/darpa-triage-challenge" target="_blank">DARPA Triage Challenge</a> will take place in November, and <a href="https://teamchiron.ai/tabs/team/team.html" rel="noopener noreferrer" target="_blank">Team Chiron</a> from <a href="https://www.cmu.edu/" target="_blank">Carnegie Mellon University</a> will be competing, using a squad of quadruped robots and drones. The team is led by <a href="https://www.linkedin.com/in/kimberly-elenberg-0b52a595" rel="noopener noreferrer" target="_blank">Kimberly Elenberg</a>, whose 28-year career as an army and <a href="https://www.usphs.gov/" target="_blank">U.S. Public Health Service</a> nurse took her from combat surgical teams to incident response strategy at the Pentagon.</p><p><strong>Why do we need robots for triage?</strong></p><p><strong>Kimberly Elenberg:</strong> We simply do not have enough responders for mass-casualty incidents. The drones and ground robots that we’re developing can give us the perspective that we need to identify where people are, assess who’s most at risk, and figure out how responders can get to them most efficiently.</p><p><strong>When could you have used robots like these?</strong></p><p><strong>Elenberg: </strong>On the way to one of the challenge events, there was a four-car accident on a back road. For me on my own, that was a mass-casualty event. I could hear some people yelling and see others walking around, and so I was able to reason that those people could breathe and move.</p><p>In the fourth car, I had to crawl inside to reach a gentleman who was slumped over with an occluded airway. I was able to lift his head until I could hear him breathing. I could see that he was hemorrhaging and feel that he was going into shock because his skin was cold. A robot couldn’t have gotten inside of the car to make those assessments.</p><p>This challenge involves enabling robots to remotely collect this data—can they detect heart rate from changes in skin color or hear breathing from a distance? 
If I’d had these capabilities, it would have helped me identify the person at greatest risk and get to them first.</p><p><strong>How do you design tech for triage?</strong></p><p><strong>Elenberg: </strong>The system has to be simple. For example, I can’t have a device that’s going to force a medic to take their hands away from their patient. What we came up with is a vest-mounted Android phone that flips down at chest height to display a map showing the GPS location of every casualty and their triage priority as colored dots, autonomously populated by the team of robots.</p><p><strong>Are the robots living up to the hype?</strong></p><p><strong>Elenberg: </strong>From my time in service, I know the only way to understand true capability is to build it, test it, and break it. With this challenge, I’m learning through end-to-end systems integration—sensing, communications, autonomy, and field testing in real environments. This is art and science coming together, and while the technology still has limitations, the pace of progress is extraordinary.</p><p><strong>What would be a win for you?</strong></p><p><strong>Elenberg:</strong> I already feel like we’ve won. Showing responders exactly where casualties are and estimating who needs attention most—that’s a huge step forward for disaster medicine. The next milestone is recognizing specific injury patterns and the likely life-saving interventions needed, but that will come.</p><p><em>This article appears in the January 2026 print issue as “Kimberly Elenberg.”</em><br/></p>]]></description><pubDate>Wed, 31 Dec 2025 13:00:01 +0000</pubDate><guid>https://spectrum.ieee.org/darpa-triage-challenge-robots</guid><category>5-questions</category><category>Darpa</category><category>Robotics</category><category>Drones</category><category>Quadruped-robots</category><dc:creator>Evan Ackerman</dc:creator><media:content medium="image" type="image/jpeg" url="https://spectrum.ieee.org/media-library/aerial-drone-robotic-arm-explosion-and-medical-cross-on-abstract-background.jpg?id=62605094&amp;width=980"></media:content></item></channel></rss>