14 May 2026

AI Cinema Arrives in Korea

South Korea is preparing to release I’m Popo, described as the country’s first feature-length film created entirely with generative AI. Directed by webtoon artist Kim Il-dong, the 64-minute sci-fi film follows Popo, a police robot designed to protect humanity that instead begins eliminating people it predicts could become future threats. Rather than relying on traditional filming, the movie was largely produced through AI-generated visuals and prompt-based workflows, while professional voice actors provided the dialogue. The film is already being positioned as a milestone in Korea’s rapidly evolving AI cinema movement.

Beyond its technological novelty, I’m Popo has sparked broader debate about the future of filmmaking and the role of AI in creative industries. The film raises ethical questions about algorithmic decision-making, human emotion, and artistic authenticity, while also challenging traditional ideas of authorship and cinematic production. Its release comes amid growing international discussion around AI-generated cinema, with film festivals and industry figures increasingly confronting how artificial intelligence may reshape storytelling, labor, and creativity in the entertainment world.

More information:

https://www.koreatimes.co.kr/entertainment/others/20260507/im-popo-koreas-first-all-ai-film-asks-what-comes-next-for-cinema

11 May 2026

MIT Unveils Virtual Violin Design Tool

Researchers at MIT have developed a physics-based virtual violin that could transform how violins are designed and tested. Unlike conventional digital sound simulators that rely on prerecorded samples, the new computational model recreates the actual physical behavior of the instrument, allowing it to generate realistic violin sounds by simulating how strings, wood, and surrounding air interact. The system enables luthiers to experiment with factors such as wood type, plate thickness, and structural geometry before physically building an instrument.
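The article notes that the current system reproduces plucked-string sounds. MIT's full physical model is far more elaborate, but the basic idea of synthesizing a plucked string computationally can be illustrated with the classic Karplus-Strong algorithm, a much simpler model in which a noise-filled delay line stands in for the string's initial displacement and repeated averaging mimics energy loss. This sketch is purely illustrative and is not the researchers' method:

```python
import random

def karplus_strong(frequency_hz, sample_rate=44100, duration_s=0.5, seed=0):
    """Synthesize a plucked string with the Karplus-Strong algorithm.

    A delay line seeded with noise models the string's initial shape;
    averaging adjacent samples acts as a crude low-pass filter that
    mimics energy loss at the string's ends.
    """
    rng = random.Random(seed)
    n = int(sample_rate / frequency_hz)        # delay-line length sets pitch
    buf = [rng.uniform(-1.0, 1.0) for _ in range(n)]
    out = []
    for _ in range(int(sample_rate * duration_s)):
        out.append(buf[0])
        new_sample = 0.5 * (buf[0] + buf[1])   # simple damping filter
        buf = buf[1:] + [0.996 * new_sample]   # 0.996 adds extra decay
    return out

samples = karplus_strong(440.0)                # a decaying A4 "pluck"
```

Changing the delay-line length changes the pitch, and changing the filter changes the timbre, which is a toy version of the design question the MIT tool addresses: how physical parameters shape the resulting sound.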

The project aims to provide violin makers with a scientific design tool that complements centuries of artisan knowledge. Researchers believe the model could accelerate experimentation and offer new insights into the acoustics behind legendary instruments such as Stradivari violins. While the current system focuses on reproducing plucked-string sounds, future versions may simulate bowed performance as well, potentially opening new possibilities for digital instrument design, acoustic research, and preservation of historical instrument-making traditions.

More information:

https://arstechnica.com/science/2026/05/mits-virtual-violin-offers-luthiers-a-new-design-tool/

08 May 2026

Underwater Drones Aim to Rescue Dying Coral Reefs

Researchers and conservationists are developing autonomous underwater robots to help restore the world’s rapidly declining coral reefs, where traditional restoration methods have struggled to keep pace with climate-driven bleaching events. The technology includes robotic coral planters, AI-powered mapping systems, and automated monitoring vehicles that can identify ideal planting sites and deploy coral seedlings far faster and more cheaply than human divers. One prototype, called the Deployment Guidance System, can plant coral in under a second and could eventually deploy up to a million seedlings at a cost of about $1 each.

Scientists say robotics could transform coral restoration into a large-scale industrial effort, but they caution that technology alone cannot solve the crisis. Researchers are also using robotic systems to identify heat-resistant coral strains capable of surviving warming oceans, while fleets of autonomous drones and underwater vehicles may soon monitor reef health continuously. Despite the promise of automation, experts stress that long-term reef survival still depends on addressing climate change, pollution, and community engagement alongside technological innovation.

More information:

https://www.smithsonianmag.com/innovation/could-underwater-autonomous-robots-save-coral-reefs-180988626/

05 May 2026

Underwater Robot Tracks Whale Communication in Real Time

Scientists have developed a new autonomous underwater robot capable of tracking sperm whale communication in real time, marking a major advance in marine research. The system, created by Project CETI, uses a glider equipped with hydrophones to detect the whales’ distinctive clicking sounds, known as codas, and automatically steer toward them. Unlike traditional tracking methods such as suction tags or fixed sensors, the robot can make decisions underwater as events unfold, allowing it to follow individual whales or groups continuously for extended periods, potentially months.
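Project CETI's onboard processing is not described in detail here, but the core detection idea can be sketched simply: treat clicks as amplitude spikes in the hydrophone stream, then group clicks separated by short inter-click intervals into candidate codas. The threshold, dead time, and interval values below are illustrative assumptions, not the project's actual parameters:

```python
def detect_clicks(signal, sample_rate, threshold=0.5, dead_time_s=0.01):
    """Return times (seconds) of amplitude spikes above threshold.

    dead_time_s suppresses re-triggering on the tail of the same click.
    """
    clicks, last = [], -float("inf")
    for i, s in enumerate(signal):
        t = i / sample_rate
        if abs(s) >= threshold and t - last >= dead_time_s:
            clicks.append(t)
            last = t
    return clicks

def group_codas(click_times, max_ici_s=2.0):
    """Group clicks into codas: runs whose inter-click interval stays short."""
    codas = []
    for t in click_times:
        if codas and t - codas[-1][-1] <= max_ici_s:
            codas[-1].append(t)
        else:
            codas.append([t])
    return codas

# Synthetic hydrophone stream: two 4-click codas separated by silence.
rate = 1000
sig = [0.0] * (rate * 10)
for t in [1.0, 1.2, 1.4, 1.6, 6.0, 6.3, 6.6, 6.9]:
    sig[int(t * rate)] = 1.0
codas = group_codas(detect_clicks(sig, rate))
print([len(c) for c in codas])   # -> [4, 4]
```

A glider doing this onboard could then estimate a bearing toward the detected coda source and adjust its heading, which is the "decisions underwater as events unfold" capability the article describes.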

This breakthrough enables researchers to study whale behavior, social interactions, and communication patterns with unprecedented detail, including how calves learn vocalizations and how whales respond to human-generated noise. The data gathered could inform more effective conservation strategies, such as adjusting shipping routes or fishing practices to reduce disruption. While challenges remain, the technology represents a significant step toward understanding complex marine life communication and improving protection of ocean ecosystems.

More information:

https://www.reuters.com/business/environment/underwater-robot-tracks-sperm-whale-conversations-real-time-2026-05-01/

04 May 2026

AI Robot Takes on Table Tennis Pros

A new artificial intelligence breakthrough from Sony is drawing attention after its table tennis robot, known as Ace, demonstrated the ability to outperform even highly skilled human players. Designed to learn and adapt through real-time data, the robot combines advanced sensing, rapid motion control, and machine learning to analyze opponents’ movements and return shots with remarkable precision. In trials and demonstrations, Ace was able to compete at a level comparable to national-level players, highlighting the growing sophistication of AI-driven robotics in dynamic, fast-paced environments.

Beyond its immediate performance, the development signals broader implications for robotics and human–AI interaction. The system is not only about winning matches but also about studying collaboration between humans and intelligent machines, with potential applications in training, rehabilitation, and skill development. This reflects a wider trend in Japan toward integrating AI into physical activities, showcasing how embodied intelligence, where software meets real-world movement, can push the boundaries of both sports technology and human–machine cooperation.

More information:

https://www.asahi.com/ajw/articles/photo/77463835

29 April 2026

TVCG 2026 Article

Recently, I co-authored a journal paper published in IEEE Transactions on Visualization and Computer Graphics. The paper is titled “Interaction Under Whole-Body User Rotations in VR Space”. The study investigated how changes in a user’s virtual pitch orientation affect interaction performance and subjective experience. Using a within-subject design, 30 seated participants were exposed to 12 virtual tilt conditions ranging from moderate to extreme angles (±180°), while measures of comfort, simulator sickness, perceptual responses, and task performance were collected.

Results showed no significant increases in nausea, disorientation, or discomfort, with moderate tilts performing similarly to baseline conditions; even extreme tilts produced only low levels of nausea. Performance outcomes were mixed, as forward tilts resulted in similar or slightly improved performance, whereas backward tilts caused modest but statistically non-significant declines. Overall, the findings suggest that VR experiences with virtual body orientations differing from the user’s physical posture can be implemented without compromising comfort or performance.

More information:

https://www.computer.org/csdl/journal/tg/5555/01/11475228/2fuM7XOCKcg

28 April 2026

AI Robot Boosts Tomato Harvesting Efficiency

Researchers at Osaka Metropolitan University have developed an AI-powered tomato-harvesting robot that improves picking efficiency by evaluating how easy each tomato is to harvest before attempting to pick it. Instead of simply detecting ripe fruit, the system analyzes the tomato’s position, surrounding obstacles, and possible approach angles to predict the likelihood of a successful harvest. The robot then chooses the most effective picking path, allowing it to adapt to crowded or complex plant arrangements.
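The key idea, evaluating how easy each tomato is to harvest before attempting it, amounts to scoring candidate approaches and skipping fruit that is likely to fail. The heuristic below is a hypothetical stand-in for the robot's learned predictor; the weights, features (occlusion, reach cost), and threshold are all illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    tomato_id: int
    approach_angle_deg: float
    occlusion: float      # 0 = clear path, 1 = fully blocked
    reach_cost: float     # normalized arm-motion cost, 0..1

def success_score(c, w_occlusion=0.6, w_reach=0.4):
    """Toy stand-in for a learned harvest-success predictor:
    penalize blocked approaches and awkward arm motions."""
    return 1.0 - (w_occlusion * c.occlusion + w_reach * c.reach_cost)

def plan_pick(candidates, min_score=0.5):
    """Choose the most promising approach, or return None to skip the
    tomato entirely when every candidate is likely to fail."""
    best = max(candidates, key=success_score)
    return best if success_score(best) >= min_score else None

cands = [
    Candidate(7, 0.0,  occlusion=0.8, reach_cost=0.2),   # leaf in the way
    Candidate(7, 35.0, occlusion=0.1, reach_cost=0.4),   # clear side approach
]
best = plan_pick(cands)
print(best.approach_angle_deg)   # -> 35.0
```

Skipping low-scoring fruit is what reduces "wasted time from failed picking attempts": the robot spends its motion budget only where success is probable.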

By planning its movements in advance and adjusting its strategy dynamically, the robot achieved a harvesting success rate of 81%, a significant improvement for automated agriculture systems. Researchers believe the technology could help address labor shortages in farming and reduce wasted time from failed picking attempts. The approach may eventually be adapted for harvesting other fruits and vegetables, supporting more efficient and autonomous farming operations in the future.

More information:

https://www.sciencedaily.com/releases/2025/04/250411175506.htm

21 April 2026

Monkeys Navigate VR with Thought

Researchers have unveiled a new intracortical brain–computer interface (BCI) that enables macaque monkeys to navigate complex 3D virtual reality environments using only their brain activity. Developed using neural signals from multiple brain regions, including the primary motor cortex and both dorsal and ventral premotor cortices, the system significantly improves the precision and flexibility of decoding real-time movement compared to earlier BCIs. The study demonstrates how combining signals from these areas allows for more natural and continuous control in immersive digital spaces.
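The study's decoder, which combines signals from motor and premotor cortices, is more sophisticated than anything shown here, but the basic principle of turning population firing rates into a movement command can be illustrated with the classic population-vector decoder: each neuron "votes" for its preferred direction, weighted by how far its rate rises above baseline. The tuning values below are invented for illustration:

```python
import math

def population_vector(preferred_dirs_deg, rates, baselines):
    """Classic population-vector decoder: sum each neuron's preferred
    direction, weighted by its firing rate above baseline, and return
    the resulting movement direction in degrees."""
    x = y = 0.0
    for pd, r, b in zip(preferred_dirs_deg, rates, baselines):
        w = r - b
        x += w * math.cos(math.radians(pd))
        y += w * math.sin(math.radians(pd))
    return math.degrees(math.atan2(y, x)) % 360.0

# Four cosine-tuned units with preferred directions at the compass points.
prefs = [0.0, 90.0, 180.0, 270.0]
base  = [10.0, 10.0, 10.0, 10.0]
# Firing rates while the animal intends to move up and to the right:
rates = [18.0, 18.0, 4.0, 4.0]
print(population_vector(prefs, rates, base))   # -> 45.0
```

A real-time BCI repeats a decode like this many times per second, feeding the decoded direction into the virtual environment as a continuous steering command.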

In experimental trials, the monkeys successfully completed navigation tasks in VR without any physical movement, relying solely on neural input. They also showed the ability to learn and improve performance over time, with the system generalizing across different tasks without requiring retraining. Researchers highlight the potential of this technology for real-world applications, particularly in assisting people with paralysis to control wheelchairs, prosthetic devices, or explore virtual environments. The findings mark an important step toward more intuitive and adaptive brain-controlled interfaces.

More information:

https://www.rdworldonline.com/new-brain-computer-interface-allows-monkeys-to-navigate-3d-virtual-reality/