04 May 2026

AI Robot Takes on Table Tennis Pros

A new artificial intelligence breakthrough from Sony is drawing attention after its table tennis robot, known as Ace, demonstrated the ability to challenge even highly skilled human players. Designed to learn and adapt through real-time data, the robot combines advanced sensing, rapid motion control, and machine learning to analyze opponents’ movements and return shots with remarkable precision. In trials and demonstrations, Ace competed at a level comparable to national-level players, highlighting the growing sophistication of AI-driven robotics in dynamic, fast-paced environments.

Beyond its immediate performance, the development signals broader implications for robotics and human–AI interaction. The system is not only about winning matches but also about studying collaboration between humans and intelligent machines, with potential applications in training, rehabilitation, and skill development. This reflects a wider trend in Japan toward integrating AI into physical activities, showcasing how embodied intelligence, where software meets real-world movement, can push the boundaries of both sports technology and human–machine cooperation.

More information:

https://www.asahi.com/ajw/articles/photo/77463835

29 April 2026

TVCG 2026 Article

Recently, I co-authored a journal paper published in IEEE Transactions on Visualization and Computer Graphics (TVCG), titled “Interaction Under Whole-Body User Rotations in VR Space”. The study investigated how changes in a user’s virtual pitch orientation affect interaction performance and subjective experience. Using a within-subject design, 30 seated participants were exposed to 12 virtual tilt conditions ranging from moderate to extreme angles (±180°), while measures of comfort, simulator sickness, perceptual responses, and task performance were collected.

Results showed no significant increases in nausea, disorientation, or discomfort, with moderate tilts performing similarly to baseline conditions; even extreme tilts produced only low levels of nausea. Performance outcomes were mixed: forward tilts resulted in similar or slightly improved performance, whereas backward tilts caused modest, statistically non-significant declines. Overall, the findings suggest that VR experiences with virtual body orientations differing from the user’s physical posture can be implemented without compromising comfort or performance.

More information:

https://www.computer.org/csdl/journal/tg/5555/01/11475228/2fuM7XOCKcg

28 April 2026

AI Robot Boosts Tomato Harvesting Efficiency

Researchers at Osaka Metropolitan University have developed an AI-powered tomato-harvesting robot that improves picking efficiency by evaluating how easy each tomato is to harvest before attempting to pick it. Instead of simply detecting ripe fruit, the system analyzes the tomato’s position, surrounding obstacles, and possible approach angles to predict the likelihood of a successful harvest. The robot then chooses the most effective picking path, allowing it to adapt to crowded or complex plant arrangements.
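
As a rough illustration of the planning idea, not the researchers' actual model, the "ease of harvest" evaluation can be sketched as scoring each candidate approach and picking the most promising one. Every feature and weight below is a hypothetical stand-in:

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    """One possible approach path to a tomato (illustrative features only)."""
    ripeness: float        # 0..1, from fruit detection
    occlusion: float       # 0..1, fraction blocked by leaves and stems
    approach_angle: float  # degrees off the gripper's preferred axis

def harvest_score(c: Candidate) -> float:
    """Hypothetical 'ease of harvest' score: prefer ripe, unobstructed
    fruit reachable near the gripper's preferred approach angle."""
    angle_penalty = abs(c.approach_angle) / 90.0
    return c.ripeness * (1.0 - c.occlusion) * max(0.0, 1.0 - angle_penalty)

def pick_best(candidates: list[Candidate]) -> Candidate:
    # Choose the approach predicted most likely to succeed.
    return max(candidates, key=harvest_score)

plans = [Candidate(0.9, 0.6, 10), Candidate(0.8, 0.1, 30), Candidate(0.95, 0.4, 75)]
best = pick_best(plans)  # the slightly less ripe but far less occluded fruit wins
```

In the real system, the success prediction would come from a model trained on actual picking attempts rather than a hand-written formula, but the decision step, scoring candidates before committing to one, is the same shape.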

By planning its movements in advance and adjusting its strategy dynamically, the robot achieved a harvesting success rate of 81%, a significant improvement for automated agriculture systems. Researchers believe the technology could help address labor shortages in farming and reduce wasted time from failed picking attempts. The approach may eventually be adapted for harvesting other fruits and vegetables, supporting more efficient and autonomous farming operations in the future.

More information:

https://www.sciencedaily.com/releases/2025/04/250411175506.htm

21 April 2026

Monkeys Navigate VR with Thought

Researchers have unveiled a new intracortical brain–computer interface (BCI) that enables macaque monkeys to navigate complex 3D virtual reality environments using only their brain activity. Developed using neural signals from multiple brain regions, including the primary motor cortex and both dorsal and ventral premotor cortices, the system significantly improves the precision and flexibility of decoding real-time movement compared to earlier BCIs. The study demonstrates how combining signals from these areas allows for more natural and continuous control in immersive digital spaces.
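
As a loose sketch of how pooled signals from several regions can drive continuous control, here is a generic ridge-regression velocity decoder on synthetic data. This is a common baseline technique, not the study's actual decoder, and all dimensions are invented:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: unit firing rates from three regions (M1, PMd, PMv)
# concatenated into one feature vector per time step.
n_m1, n_pmd, n_pmv = 40, 30, 30
n_units = n_m1 + n_pmd + n_pmv
T = 500  # number of training time steps

# Simulated training data: neural features X and 3D movement velocities Y.
true_W = rng.normal(size=(n_units, 3))
X = rng.normal(size=(T, n_units))
Y = X @ true_W + 0.1 * rng.normal(size=(T, 3))

# Fit a ridge-regularized linear decoder: velocity = features @ W.
lam = 1.0
W = np.linalg.solve(X.T @ X + lam * np.eye(n_units), X.T @ Y)

# Decode a new time step into a 3D velocity command for the VR environment.
x_new = rng.normal(size=n_units)
v = x_new @ W  # shape (3,)
```

The intuition from the paper carries over: concatenating features from multiple regions gives the decoder more informative dimensions to map onto continuous 3D movement than any single region alone.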

In experimental trials, the monkeys successfully completed navigation tasks in VR without any physical movement, relying solely on neural input. They also showed the ability to learn and improve performance over time, with the system generalizing across different tasks without requiring retraining. Researchers highlight the potential of this technology for real-world applications, particularly in assisting people with paralysis to control wheelchairs, prosthetic devices, or explore virtual environments. The findings mark an important step toward more intuitive and adaptive brain-controlled interfaces.

More information:

https://www.rdworldonline.com/new-brain-computer-interface-allows-monkeys-to-navigate-3d-virtual-reality/

20 April 2026

AI Decodes Lost Roman Board Game

An international team of researchers has successfully used AI to reconstruct the rules of a mysterious Roman-era board game carved into a limestone slab. The artifact, discovered in the ruins of the ancient town of Coriovallum, had puzzled archaeologists for decades due to its unique pattern of intersecting lines that did not match any known historical games. Using high-resolution 3D scans to map microscopic wear patterns, the team identified where players had repeatedly slid game pieces across the stone. These physical fingerprints of play allowed researchers to use the AI-driven system Ludii to simulate over 100 possible rule sets, eventually narrowing the most likely gameplay down to a blocking game in which one player attempts to trap the opponent's pieces.
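
The reconstruction idea can be caricatured in a few lines: simulate play under each candidate rule set, treat simulated cell visits as predicted wear, and keep the rule set whose prediction best matches the observed wear map. Everything below, the board, wear values, and rule sets, is a toy stand-in for the actual Ludii analysis:

```python
import random

# Toy board: cell -> observed wear intensity (hypothetical values, standing
# in for the 3D-scanned micro-wear map of the limestone slab).
observed_wear = {0: 0.9, 1: 0.7, 2: 0.2, 3: 0.8, 4: 0.1}

def simulate(ruleset, games=200, seed=0):
    """Play random games under a candidate rule set and count how often
    each cell is visited, as a stand-in for predicted wear."""
    rng = random.Random(seed)
    visits = {c: 0 for c in observed_wear}
    for _ in range(games):
        cell = 0
        for _ in range(10):  # ten random moves per game
            cell = rng.choice(ruleset["moves"][cell])
            visits[cell] += 1
    total = sum(visits.values()) or 1
    return {c: v / total for c, v in visits.items()}

def fit(ruleset):
    """Lower is better: squared gap between normalized predicted visits
    and the normalized observed wear."""
    pred = simulate(ruleset)
    total = sum(observed_wear.values())
    return sum((pred[c] - observed_wear[c] / total) ** 2 for c in observed_wear)

# Two hypothetical candidates: play concentrated on heavily worn cells
# versus an even circuit over the whole board.
blocking_game = {"moves": {0: [1, 3], 1: [0, 3], 2: [4], 3: [0, 1], 4: [2]}}
race_game = {"moves": {0: [1], 1: [2], 2: [3], 3: [4], 4: [0]}}

best = min([blocking_game, race_game], key=fit)  # blocking_game fits better
```

The real study's search space was far richer, Ludii describes games in a general game-description language, but the selection principle, ranking simulated rule sets by how well they reproduce physical traces of play, is the one sketched here.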

The discovery, recently published in the journal Antiquity, marks a significant breakthrough in both archaeology and digital humanities, as it provides the first evidence that blocking games were played in Europe centuries earlier than previously documented. Dubbed Ludus Coriovalli (the Coriovallum Game), the reconstruction suggests a strategic two-player battle of wits that likely dates back to the late Roman period between AD 250 and 476. Beyond solving a 2,000-year-old mystery, this innovative marriage of AI simulation and use-wear analysis offers a powerful new toolkit for historians to resurrect lost cultural practices from artifacts that lack written records, showing that even silent stones still have stories to tell.

More information:

https://www.sciencenews.org/article/ai-roman-board-game-limestone

15 April 2026

Dancer Returns to Stage Using Brain-Controlled Avatar

A groundbreaking performance has demonstrated how emerging brain–computer interface technology can restore artistic expression for people living with severe neurological conditions. A ballerina diagnosed with Amyotrophic Lateral Sclerosis (ALS) has returned to the stage using a digital avatar controlled by her brainwaves. Wearing an EEG-based headset, the dancer was able to translate imagined movements into real-time digital choreography, allowing her avatar to perform alongside other dancers in a live production. The initiative highlights the growing potential of neurotechnology to bridge physical limitations and enable new forms of creative participation.

Developed through a collaboration between technology and creative teams, the system captures neural signals associated with movement intention and converts them into computer-generated motion. The project not only enabled the performer to reconnect with dance after losing muscular control, but also signals broader applications in rehabilitation, accessibility, and inclusive performance arts. Researchers and developers emphasize that such innovations could transform how individuals with mobility impairments engage with culture, offering scalable solutions that extend beyond the stage into healthcare and assistive technologies.

More information:

https://www.bbc.com/news/articles/cgqkz5lzvnwo

13 April 2026

Holograms Enter Political Communication

A new pilot initiative at an airport in Jacksonville has demonstrated the emerging role of holographic technology in public communication, marking a significant step toward the integration of immersive media in political engagement. Using advanced display systems developed by companies such as Proto, a life-sized hologram of the city’s mayor was installed to deliver messages to travelers. The system supports both pre-recorded and interactive formats, showcasing the potential for public officials to extend their presence across multiple locations simultaneously and communicate at scale without the need for physical travel.

The deployment highlights both the opportunities and challenges associated with this technological shift. Proponents emphasize increased accessibility, efficiency, and the ability to reach broader audiences in real time. However, concerns have been raised regarding authenticity, trust, and the implications of AI-enhanced interactions in political contexts. As holographic and AI-driven communication tools continue to evolve, this initiative serves as an early case study in how emerging technologies may reshape the relationship between public figures and citizens, prompting important discussions about transparency, ethics, and the future of democratic engagement.

More information:

https://www.politico.com/news/2026/04/05/airport-holograms-politics-proto-jacksonville-00857411

09 April 2026

AI Sonar Hand Tracking

Researchers have developed a system called WatchHand that turns ordinary smartwatches into real-time hand-tracking devices using AI-powered sonar. Instead of relying on cameras or extra sensors, the smartwatch emits inaudible sound waves through its speaker; these waves bounce off the user’s hand and are captured by the microphone. A machine-learning model processes the returning echo profile to reconstruct the hand’s position and finger movements in 3D, in real time, all directly on the device.
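
The sensing chain can be sketched roughly as chirp out, echo in, profile to model. The sample rate, frequency band, and use of cross-correlation below are illustrative assumptions, and where WatchHand feeds the echo profile into a learned pose model, this toy just reads off the strongest reflection:

```python
import numpy as np

FS = 48_000               # assumed smartwatch audio sample rate
F0, F1 = 18_000, 20_000   # assumed near-inaudible chirp band

def make_chirp(duration=0.01):
    """Linear frequency sweep the speaker would emit."""
    t = np.arange(int(FS * duration)) / FS
    phase = 2 * np.pi * (F0 * t + (F1 - F0) * t**2 / (2 * duration))
    return np.sin(phase)

def echo_profile(mic, chirp):
    """Cross-correlate the mic recording with the emitted chirp.
    Peaks correspond to reflections at different distances."""
    return np.abs(np.correlate(mic, chirp, mode="valid"))

# Simulated recording: the chirp returning after a 2 ms round trip
# (about 34 cm one-way at ~343 m/s).
chirp = make_chirp()
delay = int(0.002 * FS)
mic = np.zeros(delay + len(chirp) + 480)
mic[delay:delay + len(chirp)] += 0.5 * chirp

profile = echo_profile(mic, chirp)
est_delay = int(np.argmax(profile))    # sample index of the strongest echo
distance_m = est_delay / FS * 343 / 2  # round trip -> one-way distance
# In WatchHand, a learned model maps such echo profiles to a 3D hand pose
# instead of this single-peak readout.
```

A real hand produces many overlapping reflections rather than one clean echo, which is exactly why the system needs a machine-learning model rather than the peak-picking shortcut shown here.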

This approach is significant because it works on off-the-shelf smartwatches without additional hardware, making it scalable and practical for everyday use. Tests with participants showed it can reliably track gestures like finger movements and wrist rotations, enabling applications such as gesture-based control of computers, AR/VR interaction, and assistive technologies. The system also preserves privacy by processing data locally, though it still has limitations, such as reduced accuracy while the user is moving and current compatibility mainly with Android devices.

More information:

https://interestingengineering.com/innovation/ai-smartwatch-hand-tracking-sonar-watchhand