Robots

Recent Articles


Video Friday: Skyfall Takes on Mars With Swarm Helicopter Concept

IEEE Spectrum's latest Video Friday showcases innovative robotics, including AeroVironment's Skyfall concept for Mars exploration, the bipedal Walker S2, and the FlashBot Max in-building delivery robot.


What is the Skyfall concept developed by AeroVironment?
The Skyfall concept involves deploying six scout helicopters on Mars to explore the planet. These helicopters are designed to operate autonomously and can be used for tasks such as identifying areas with water or other resources and selecting sites for future astronaut missions.
Sources: [1]
How does the Skyfall concept relate to previous Mars exploration efforts?
The Skyfall concept builds upon the success of NASA's Ingenuity Mars helicopter, which was co-developed by AeroVironment and NASA's Jet Propulsion Laboratory. Ingenuity was the first aircraft to fly under its own power on another planet, demonstrating the feasibility of rotorcraft on Mars.
Sources: [1]
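
To make the swarm idea concrete, here is a toy sketch of how six scouts might split candidate survey sites among themselves by proximity. This is our construction for illustration, not AeroVironment's software; the coordinates and the nearest-scout rule are assumptions.

```python
# Toy swarm task allocation: assign each candidate survey site to the
# nearest of six scout helicopters. Illustrative only; not Skyfall code.
import numpy as np

rng = np.random.default_rng(42)
scouts = rng.uniform(0, 10, (6, 2))   # six helicopter positions [km]
sites = rng.uniform(0, 10, (30, 2))   # candidate survey sites [km]

# Distance from every scout to every site, then nearest-scout assignment.
dists = np.linalg.norm(scouts[:, None, :] - sites[None, :, :], axis=2)
assignment = dists.argmin(axis=0)     # site i -> scout assignment[i]

for k in range(6):
    print(f"scout {k} surveys {np.sum(assignment == k)} sites")
```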

25 July, 2025
IEEE Spectrum

Video Friday: Cyborg Beetles May Speed Disaster Response One Day

IEEE Spectrum's latest Video Friday showcases robotics advances including cyborg beetles that could aid search and rescue, humanoid robots, and mobility-enhancing exoskeletons. It also lists upcoming robotics events and invites readers to submit events for future inclusion.


How are cyborg beetles controlled to assist in disaster response?
Cyborg beetles are equipped with removable microchip backpacks that use electrodes to control their antennae and forewings, allowing precise directional guidance via remote control, such as a video game controller. This enables operators to guide the beetles through complex environments like rubble to locate survivors.
Sources: [1], [2]
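
As a rough illustration of the control pattern described above, the sketch below maps directional commands to stimulation pulses sent over a radio serial link. The actual backpack firmware and command protocol are not public; the port name, command bytes, and pulse durations here are invented for illustration.

```python
# Hypothetical mapping from controller input to electrode pulses.
import time

import serial  # pyserial

# Direction -> (electrode id, pulse duration in s); values are illustrative.
COMMANDS = {
    "left":    (0x01, 0.05),   # pulse one antenna electrode to steer left
    "right":   (0x02, 0.05),   # pulse the other antenna electrode
    "forward": (0x03, 0.10),   # pulse forewing electrodes to move ahead
}

def send_pulse(port: serial.Serial, direction: str) -> None:
    """Send one stimulation pulse for the requested direction."""
    electrode, duration = COMMANDS[direction]
    port.write(bytes([electrode, int(duration * 1000)]))  # id + ms payload
    time.sleep(duration)  # simple pacing between pulses

with serial.Serial("/dev/ttyUSB0", 115200, timeout=1) as radio:
    for step in ["forward", "forward", "left", "forward"]:
        send_pulse(radio, step)
```
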
What advantages do cyborg beetles have over traditional robots in search and rescue missions?
Cyborg beetles naturally possess sophisticated sensing capabilities, soft environmental interactions, and the ability to climb vertical surfaces and maneuver through small, complex spaces like dense rubble. These traits make them more effective than similarly sized robots, which struggle with locomotion on vertical surfaces and navigating chaotic disaster environments.
Sources: [1], [2]

05 July, 2025
IEEE Spectrum

Video Friday: Jet-Powered Humanoid Robot Lifts Off

IEEE Spectrum's latest Video Friday showcases groundbreaking robotics innovations, including a jet-powered humanoid robot's successful vertical takeoff and the introduction of SCUTTLE, a versatile multilegged platform. The article also highlights upcoming robotics events and advancements in AI-driven robotic capabilities.


How does the jet-powered humanoid robot achieve vertical takeoff and maintain stability?
The jet-powered humanoid robot, known as iRonCub3, achieves vertical takeoff by integrating jet propulsion with an advanced whole-body control architecture. It uses AI-based control systems and aerodynamic modeling to maintain dynamic stability during flight. The control pipeline includes a Model Predictive Controller (MPC) that manages thrust and balance, allowing the robot to lift approximately 50 cm off the ground while maintaining stability. This system has been validated through both simulation and real-world experiments, demonstrating accurate tracking and control during vertical takeoff.
Sources: [1], [2], [3]
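
The answer above mentions a Model Predictive Controller managing thrust and balance. Below is a minimal 1-D altitude MPC over a double integrator, lifting a point mass to roughly 50 cm; the mass, horizon, and thrust limits are assumed, and this is a toy in the spirit of the description, not iRonCub3's whole-body controller.

```python
# 1-D altitude MPC sketch with cvxpy: choose thrust over a short horizon
# to reach a 0.5 m hover. Not the actual iRonCub3 controller.
import cvxpy as cp

m, g, dt, N = 70.0, 9.81, 0.05, 20   # mass [kg], gravity, step [s], horizon
z_ref = 0.5                          # target lift-off height: ~50 cm

z = cp.Variable(N + 1)               # altitude trajectory [m]
v = cp.Variable(N + 1)               # vertical velocity [m/s]
u = cp.Variable(N)                   # total jet thrust [N]

cost = 0
constraints = [z[0] == 0, v[0] == 0]
for k in range(N):
    constraints += [
        z[k + 1] == z[k] + dt * v[k],               # kinematics
        v[k + 1] == v[k] + dt * (u[k] / m - g),     # thrust minus gravity
        u[k] >= 0, u[k] <= 2 * m * g,               # thrust limits (assumed)
    ]
    cost += cp.square(z[k + 1] - z_ref) + 1e-4 * cp.square(u[k])

cp.Problem(cp.Minimize(cost), constraints).solve()
print("first thrust command [N]:", u.value[0])  # apply, then re-solve
```
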
What are the main challenges in developing and operating a jet-powered humanoid robot?
Developing and operating a jet-powered humanoid robot involves several challenges, including managing the extreme conditions of jet propulsion such as high air temperatures (up to 700 degrees Celsius) and supersonic air speeds (around 1800 km/h). Safety protocols and experimental procedures are critical to handle these conditions. Additionally, accurately estimating thrust forces is pivotal for stable flight, as uncertainties in the robot’s base pose and thrust can affect control. The system must also handle complex dynamics, estimation errors, and external disturbances, requiring robust control algorithms and continuous improvements in modeling and sensor accuracy.
Sources: [1], [2], [3]
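
Since the answer flags thrust estimation as pivotal, here is a hedged sketch of one standard approach: treat total thrust as a slowly varying hidden state and fuse noisy vertical IMU acceleration with a scalar Kalman filter. The mass and noise values are assumptions, not the team's published estimator.

```python
# Scalar Kalman filter for thrust: measurement model a_z = T/m - g.
import numpy as np

m, g = 70.0, 9.81        # mass [kg] (assumed) and gravity [m/s^2]
q, r = 5.0, 0.5          # process / accel-noise variances (assumed)
T_hat, P = m * g, 100.0  # initial thrust estimate [N] and its variance

def update(T_hat, P, accel_z):
    """One Kalman step on the vertical axis only."""
    P = P + q                      # predict: thrust as a random walk
    z_meas = m * (accel_z + g)     # invert measurement model to thrust units
    K = P / (P + r * m**2)         # gain; accel noise scaled to thrust units
    T_hat = T_hat + K * (z_meas - T_hat)
    P = (1 - K) * P
    return T_hat, P

for a in np.random.normal(0.2, 0.1, 10):   # fake IMU vertical-accel samples
    T_hat, P = update(T_hat, P, a)
print(f"estimated thrust: {T_hat:.1f} N")
```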

20 June, 2025
IEEE Spectrum

Realbotix

The article explores the development of hyper-realistic AI humanoids designed for seamless human interaction, highlighting their potential to revolutionize communication and engagement in various sectors, from customer service to entertainment, while raising important ethical considerations.


How does the Realbotix Robotic AI Vision System enhance human-robot interaction?
The Realbotix Robotic AI Vision System enables humanoid robots to detect human presence and dynamically adjust their facial expressions, creating emotionally engaging and natural responses that reduce the 'uncanny valley' effect. It also supports facial recognition for personalized interactions, real-time object identification, scene awareness, and integrates multimodal AI and conversational AI to provide contextually nuanced conversations.
Sources: [1]
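
Realbotix's vision stack is proprietary, but the pattern described (detect a person, then adjust an expression) can be sketched with off-the-shelf tools. The version below uses MediaPipe face detection and OpenCV; the expression labels and the bounding-box size threshold are our assumptions, not Realbotix's system.

```python
# Presence-aware expression selection, sketched with MediaPipe + OpenCV.
import cv2
import mediapipe as mp

face_detector = mp.solutions.face_detection.FaceDetection(
    min_detection_confidence=0.6)

def choose_expression(frame_bgr):
    """Return a (hypothetical) expression command based on human presence."""
    rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)
    results = face_detector.process(rgb)
    if not results.detections:
        return "neutral"            # nobody in view: relax the face
    # Closer faces occupy more of the frame; greet them more warmly.
    largest = max(d.location_data.relative_bounding_box.width
                  for d in results.detections)
    return "smile" if largest > 0.25 else "attentive"

cap = cv2.VideoCapture(0)
ok, frame = cap.read()
if ok:
    print("expression command:", choose_expression(frame))
cap.release()
```
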
What AI platforms can Realbotix humanoid robots integrate with, and what benefits does this provide?
Realbotix humanoid robots can integrate with major AI platforms such as OpenAI's ChatGPT, Meta's Llama, Google's Gemini, and DeepSeek R1. This integration allows for enhanced conversational abilities in multiple languages, including Spanish, Cantonese, Mandarin, French, and English, and supports both local and cloud-based AI applications. Additionally, Realbotix's proprietary lip sync technology ensures precise mouth movements, improving the realism and accuracy of robotic speech synchronization.
Sources: [1]
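
The integration pattern itself is easy to sketch with one of the named platforms. The snippet below routes an utterance through OpenAI's public chat API; treating this call as the robot's dialogue engine, and handing the reply to a downstream lip-sync layer, is our assumption rather than Realbotix's documented interface.

```python
# Swappable LLM back end for conversation, sketched with the OpenAI SDK.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def robot_reply(user_text: str, language: str = "English") -> str:
    """Ask the back-end model for a reply; a lip-sync layer would then
    time mouth movements to this text downstream."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system",
             "content": f"You are a humanoid robot. Reply in {language}."},
            {"role": "user", "content": user_text},
        ],
    )
    return response.choices[0].message.content

print(robot_reply("Hola, ¿cómo estás?", language="Spanish"))
```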

20 June, 2025
Product Hunt

Someone used a Raspberry Pi to create the robot of their childhood dreams, complete with creepy eyeballs

The article laments the gap between the humanoid machines promised by '80s and '90s pop culture and today's reality of mundane chatbots, then follows one builder who used a Raspberry Pi to create the robot of their childhood dreams, creepy eyeballs included.


What kind of technology is used to create the 'creepy eyeballs' in robotic projects?
Projects like these often use servo motors, Raspberry Pi boards, and sometimes additional components like cameras or ultrasonic sensors to control and animate the robotic eyes. For example, a project might use a Raspberry Pi with a servo driver to control eye movements, and a camera with software like Google’s MediaPipe Face Mesh for eye tracking [2][5].
Sources: [1], [2]
How do robotic eye-following projects typically work?
These projects typically involve using sensors (like ultrasonic sensors or cameras) to detect movement or track objects. The sensor data is then processed by a small controller board, such as a Raspberry Pi or Raspberry Pi Pico, which drives servo motors to move the eyes in response to the detected movement [2][5].
Sources: [1], [2]
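
A minimal version of that loop, assuming a Raspberry Pi with a camera and a hobby pan servo on GPIO 17: MediaPipe Face Mesh locates a face, and the servo turns the eyes toward it. The pin number, camera index, and landmark-to-servo mapping are illustrative choices, not taken from any one project.

```python
# Face-following eye: MediaPipe Face Mesh + gpiozero servo on a Pi.
import cv2
import mediapipe as mp
from gpiozero import Servo

eye_servo = Servo(17)                 # pan servo; value ranges over [-1, 1]
face_mesh = mp.solutions.face_mesh.FaceMesh(max_num_faces=1)
cap = cv2.VideoCapture(0)

try:
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_face_landmarks:
            # Landmark 1 sits near the nose tip; x is normalized to [0, 1].
            x = results.multi_face_landmarks[0].landmark[1].x
            eye_servo.value = (x - 0.5) * 2   # frame center -> centered gaze
finally:
    cap.release()
```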

14 June, 2025
XDA

Video Friday: AI Model Gives Neo Robot Autonomy

IEEE Spectrum's latest Video Friday showcases groundbreaking robotics innovations, including 1X's Redwood AI model and Pudu Robotics' milestone of 100,000 units. The article also highlights upcoming robotics events and features engaging videos on humanoid robots and autonomous systems.


What is the Redwood AI model and how does it enable autonomy for the NEO humanoid robot?
The Redwood AI model is an advanced artificial intelligence system developed by 1X Technologies specifically for its NEO humanoid robot. Redwood enables NEO to autonomously perform household tasks such as laundry, answering doors, and navigating familiar spaces by integrating perception, movement, and interaction capabilities. The model is trained on real-world data from both EVE and NEO robots, allowing it to generalize across tasks, handle unfamiliar objects, and exhibit learned behaviors like hand selection and retrying failed grasps. Redwood also supports whole-body and multi-contact manipulation, enabling NEO to coordinate locomotion and manipulation for complex actions such as bracing or leaning during tasks. The system operates efficiently on NEO’s onboard embedded GPU and integrates with an off-board language model for real-time voice control.
Sources: [1]
How does Redwood AI differ from traditional robotic control systems?
Unlike traditional robotic control systems that separate locomotion and manipulation into distinct modules, Redwood AI fuses these capabilities into a unified system. This allows the NEO robot to perform whole-body and multi-contact manipulation, such as bracing against a wall while opening a heavy door or kneeling to pick up objects, which is difficult for most robots. Redwood’s embodied learning approach enables NEO to learn from real-world interactions and adapt to new environments, improving its autonomy over time. The system also supports mobile bi-manual manipulation, allowing NEO to move and manipulate objects simultaneously, and integrates with a language model for voice-based user intent prediction.
Sources: [1], [2]
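
To make the architectural contrast concrete, the toy below shows the "unified" idea: one policy maps a whole-body observation (base pose, joint states, task) to commands for legs and arms jointly, rather than running separate locomotion and manipulation modules. The shapes and the network are stand-ins of our own; 1X has not published Redwood's architecture.

```python
# Conceptual unified whole-body policy; not Redwood's actual model.
import numpy as np

class UnifiedPolicy:
    """One network for legs + arms, enabling multi-contact behaviors
    (e.g., bracing on a wall while opening a heavy door)."""
    def __init__(self, obs_dim=64, act_dim=30, hidden=128):
        rng = np.random.default_rng(0)
        self.w1 = rng.normal(0, 0.1, (obs_dim, hidden))
        self.w2 = rng.normal(0, 0.1, (hidden, act_dim))

    def act(self, obs: np.ndarray) -> np.ndarray:
        # obs concatenates base pose, joint states, and a task embedding,
        # so leg and arm commands are computed jointly, not in isolation.
        return np.tanh(np.tanh(obs @ self.w1) @ self.w2)

obs = np.zeros(64)                    # placeholder whole-body observation
commands = UnifiedPolicy().act(obs)
print("joint commands for legs AND arms:", commands.shape)
```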

13 June, 2025
IEEE Spectrum
