Breathing Patterns Act as Unique Personal “Fingerprints”
New research reveals that each person has a unique breathing “fingerprint” that can be used to identify them with nearly 97% accuracy. By continuously monitoring nasal airflow over 24 hours using a lightweight wearable, scientists discovered that these patterns also reflect physical and mental health traits.
Oral Bacteria Diversity Linked to Depression Symptoms
A new study reveals that lower diversity of microbes in the mouth is associated with greater symptoms of depression. Researchers analyzed data from over 15,000 U.S. adults, comparing their mental health surveys with saliva samples to assess microbial diversity.
Robot Yawns Spark Contagious Yawning in Chimps
Chimpanzees have been observed yawning and lying down after watching a humanoid robot mimic a yawning facial expression, indicating contagious yawning triggered by an artificial agent. This study is the first to show yawn contagion in response to an inanimate model, suggesting that yawns may serve as a rest cue in addition to a social […]
Soft Robots Learn to Grasp with Human-Like Flexibility
A new study reveals how a soft, compliant robotic hand, built with silicone skin, springs, and bendable joints, can self-organize grasps without needing precise environmental data or complex programming. The ADAPT hand grasped 24 different objects with a 93% success rate using only four programmed motions, adapting naturally through its mechanical flexibility.
Seeing Is Believing: How We Judge AI as Creative or Not
New research shows that people perceive AI systems as more creative when they observe not just the final product, but also the creative process and the robot in action. In a set of controlled experiments using identical drawings, participants consistently rated creativity higher the more they saw of the act itself.
Robot Gender and Design Influence Customer Choices
New research reveals that service robots’ gendered characteristics can shape customer decisions in the hospitality industry. Robots with male-associated traits were more persuasive with women who had a lower sense of personal power, while customers with a higher sense of power were less influenced.
AI Teaches Robots Tasks from a Single How-To Video
Researchers have developed RHyME, an AI-powered system that enables robots to learn complex tasks by watching a single human demonstration video. Traditional robots struggle with unpredictable scenarios and require extensive training data, but RHyME allows robots to adapt by drawing on previous video knowledge.
AI-Powered Brain Implant Lets Paralyzed Man Control Robotic Arm
A new brain-computer interface (BCI) has enabled a paralyzed man to control a robotic arm by simply imagining movements. Unlike previous BCIs, which lasted only a few days, this AI-enhanced device worked reliably for seven months.
AI Mimics Toddler-Like Learning to Unlock Human Cognition
A new AI model, based on the PV-RNN framework, learns to generalize language and actions in a manner similar to toddlers by integrating vision, proprioception, and language instructions. Unlike large language models (LLMs) that rely on vast datasets, this system uses embodied interactions to achieve compositionality while requiring less data and computational power.
What Makes Robots Feel Human? A New Scale Reveals the Secret
Researchers have developed a new scale to measure how human-like robots appear, identifying four key qualities: appearance, emotional capacity, social intelligence, and self-understanding. Robots lacking any of these traits risk being perceived as cold or unsettling, limiting their usefulness in customer service.