Chimpanzees have been observed yawning and lying down after seeing an android mimic a yawning facial expression, indicating contagious yawning triggered by an artificial agent. This study is the first to show yawn contagion in response to an inanimate model, suggesting that yawns may serve as a rest cue in addition to a social […]
Soft Robots Learn to Grasp with Human-Like Flexibility
A new study reveals how a soft, compliant robotic hand, built with silicone skin, springs, and bendable joints, can self-organize grasps without needing precise environmental data or complex programming. The ADAPT hand grasped 24 different objects with a 93% success rate using only four programmed motions, adapting naturally through its mechanical flexibility.
Seeing Is Believing: How We Judge AI as Creative or Not
New research shows that people perceive AI systems as more creative when they observe not just the final product, but also the creative process and the robot in action. In a set of controlled experiments using identical drawings, participants consistently rated creativity higher the more they saw of the act itself.
Robot Gender and Design Influence Customer Choices
New research reveals that service robots’ gendered characteristics can shape customer decisions in the hospitality industry. Robots with male-associated traits were more persuasive with women who had a lower sense of personal power, while customers with higher power felt less influenced.
AI Teaches Robots Tasks from a Single How-To Video
Researchers have developed RHyME, an AI-powered system that enables robots to learn complex tasks by watching a single human demonstration video. Traditional robots struggle with unpredictable scenarios and require extensive training data, but RHyME allows robots to adapt by drawing on previous video knowledge.
AI-Powered Brain Implant Lets Paralyzed Man Control Robotic Arm
A new brain-computer interface (BCI) has enabled a paralyzed man to control a robotic arm by simply imagining movements. Unlike previous BCIs, which lasted only a few days, this AI-enhanced device worked reliably for seven months.
AI Mimics Toddler-Like Learning to Unlock Human Cognition
A new AI model, based on the PV-RNN framework, learns to generalize language and actions in a manner similar to toddlers by integrating vision, proprioception, and language instructions. Unlike large language models (LLMs) that rely on vast datasets, this system uses embodied interactions to achieve compositionality while requiring less data and computational power.
What Makes Robots Feel Human? A New Scale Reveals the Secret
Researchers have developed a new scale to measure how human-like robots appear, identifying four key qualities: appearance, emotional capacity, social intelligence, and self-understanding. Robots lacking any of these traits risk being perceived as cold or unsettling, limiting their usefulness in customer service.
Tiny Walking Robots Advance Micro-Optics and Biological Research
Researchers have developed the smallest walking robots, measuring just 2 to 5 microns, capable of interacting with visible light for imaging and force measurement. These magnetically controlled robots can inch forward or swim through fluids while serving as diffraction elements, enabling super-resolution microscopy at scales previously unattainable.
Robots Help Unlock the Mystery of Human Sense of Self
A new study explores how robots can model and test aspects of the human sense of self, offering new insights into this complex phenomenon. Robots can simulate processes like body ownership and agency, or be used in experiments to study how humans perceive robots as social entities.