11 May 2026

MIT Unveils Virtual Violin Design Tool

Researchers at MIT have developed a physics-based virtual violin that could transform how violins are designed and tested. Unlike conventional digital sound simulators that rely on prerecorded samples, the new computational model recreates the actual physical behavior of the instrument, allowing it to generate realistic violin sounds by simulating how strings, wood, and surrounding air interact. The system enables luthiers to experiment with factors such as wood type, plate thickness, and structural geometry before physically building an instrument.

The project aims to provide violin makers with a scientific design tool that complements centuries of artisan knowledge. Researchers believe the model could accelerate experimentation and offer new insights into the acoustics behind legendary instruments such as Stradivari violins. While the current system focuses on reproducing plucked-string sounds, future versions may simulate bowed performance as well, potentially opening new possibilities for digital instrument design, acoustic research, and preservation of historical instrument-making traditions.
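The plucked-string sounds mentioned above are often approximated in simple digital models by the classic Karplus-Strong algorithm, in which a noise-filled delay line plays the role of the vibrating string. The sketch below is purely illustrative of that general idea and is not MIT's physics model, which simulates the full string-wood-air system.

```python
import random

def karplus_strong(frequency_hz, sample_rate=44100, duration_s=0.5, damping=0.996):
    """Minimal Karplus-Strong plucked-string synthesis.

    A delay line seeded with noise models the excited string; averaging
    adjacent samples acts as a simple low-pass "energy loss" filter, so the
    tone decays naturally like a real pluck.
    """
    period = int(sample_rate / frequency_hz)  # delay-line length sets the pitch
    buf = [random.uniform(-1.0, 1.0) for _ in range(period)]
    out = []
    for _ in range(int(sample_rate * duration_s)):
        out.append(buf[0])
        # average the two oldest samples, slightly damped, and recirculate
        new_sample = damping * 0.5 * (buf[0] + buf[1])
        buf = buf[1:] + [new_sample]
    return out

samples = karplus_strong(440.0)  # half a second of an A4 "pluck"
```

Changing the buffer length retunes the string; changing the damping alters how long the note rings, which is the kind of parameter sweep a physics-based design tool makes cheap.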

More information:

https://arstechnica.com/science/2026/05/mits-virtual-violin-offers-luthiers-a-new-design-tool/

08 May 2026

Underwater Drones Aim to Rescue Dying Coral Reefs

Researchers and conservationists are developing autonomous underwater robots to help restore the world’s rapidly declining coral reefs, where traditional restoration methods have struggled to keep pace with climate-driven bleaching events. The technology includes robotic coral planters, AI-powered mapping systems, and automated monitoring vehicles that can identify ideal planting sites and deploy coral seedlings far faster and more cheaply than human divers. One prototype, called the Deployment Guidance System, can plant coral in under a second and could eventually deploy up to a million seedlings at a cost of about $1 each.

Scientists say robotics could transform coral restoration into a large-scale industrial effort, but they caution that technology alone cannot solve the crisis. Researchers are also using robotic systems to identify heat-resistant coral strains capable of surviving warming oceans, while fleets of autonomous drones and underwater vehicles may soon monitor reef health continuously. Despite the promise of automation, experts stress that long-term reef survival still depends on addressing climate change, pollution, and community engagement alongside technological innovation.

More information:

https://www.smithsonianmag.com/innovation/could-underwater-autonomous-robots-save-coral-reefs-180988626/

05 May 2026

Underwater Robot Tracks Whale Communication in Real Time

Scientists have developed a new autonomous underwater robot capable of tracking sperm whale communication in real time, marking a major advance in marine research. The system, created by Project CETI, uses a glider equipped with hydrophones to detect the whales’ distinctive clicking sounds, known as codas, and automatically steer toward them. Unlike traditional tracking methods such as suction tags or fixed sensors, the robot can make decisions underwater as events unfold, allowing it to follow individual whales or groups continuously for extended periods, potentially lasting months.

This breakthrough enables researchers to study whale behaviour, social interactions, and communication patterns with unprecedented detail, including how calves learn vocalizations and how whales respond to human-generated noise. The data gathered could inform more effective conservation strategies, such as adjusting shipping routes or fishing practices to reduce disruption. While challenges remain, the technology represents a significant step toward understanding complex marine life communication and improving protection of ocean ecosystems.
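At its simplest, detecting a coda means finding the sharp clicks in a hydrophone recording and measuring the gaps between them. The toy detector below illustrates that first step on synthetic data; Project CETI's real pipeline works on noisy ocean audio with far more robust signal processing.

```python
def detect_clicks(signal, sample_rate, threshold=0.5):
    """Return times (s) where the waveform first crosses the threshold.

    Toy click detector on a synthetic waveform, not CETI's actual method.
    """
    times, prev = [], 0.0
    for i, s in enumerate(signal):
        if abs(s) >= threshold and abs(prev) < threshold:
            times.append(i / sample_rate)
        prev = s
    return times

# synthetic "coda": three clicks at 0.10 s, 0.25 s, and 0.45 s
rate = 1000
sig = [0.0] * 500
for idx in (100, 250, 450):
    sig[idx] = 1.0

clicks = detect_clicks(sig, rate)
gaps = [b - a for a, b in zip(clicks, clicks[1:])]  # inter-click intervals
```

The pattern of inter-click intervals is what distinguishes one coda type from another, so even this crude timing extraction hints at how a glider could recognize and home in on vocalizing whales.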

More information:

https://www.reuters.com/business/environment/underwater-robot-tracks-sperm-whale-conversations-real-time-2026-05-01/

04 May 2026

AI Robot Takes on Table Tennis Pros

A new artificial intelligence breakthrough from Sony is drawing attention after its table tennis robot, known as Ace, demonstrated the ability to outperform even highly skilled human players. Designed to learn and adapt through real-time data, the robot combines advanced sensing, rapid motion control, and machine learning to analyze opponents’ movements and return shots with remarkable precision. In trials and demonstrations, Ace was able to compete at a level comparable to national-level players, highlighting the growing sophistication of AI-driven robotics in dynamic, fast-paced environments.

Beyond its immediate performance, the development signals broader implications for robotics and human–AI interaction. The system is not only about winning matches but also about studying collaboration between humans and intelligent machines, with potential applications in training, rehabilitation, and skill development. This reflects a wider trend in Japan toward integrating AI into physical activities, showcasing how embodied intelligence, where software meets real-world movement, can push the boundaries of both sports technology and human–machine cooperation.

More information:

https://www.asahi.com/ajw/articles/photo/77463835

29 April 2026

TVCG 2026 Article

Recently, I co-authored a journal paper that was published at IEEE Transactions on Visualization and Computer Graphics. The paper is entitled “Interaction Under Whole-Body User Rotations in VR Space”. The study investigated how changes in a user’s virtual pitch orientation affect interaction performance and subjective experience. Using a within-subject design, 30 seated participants were exposed to 12 virtual tilt conditions ranging from moderate to extreme angles (±180°), while measures of comfort, simulator sickness, perceptual responses, and task performance were collected.

Results showed no significant increases in nausea, disorientation, or discomfort, with moderate tilts performing similarly to baseline conditions; even extreme tilts produced only low levels of nausea. Performance outcomes were mixed, as forward tilts resulted in similar or slightly improved performance, whereas backward tilts caused modest but statistically insignificant declines. Overall, the findings suggest that VR experiences with virtual body orientations differing from the user’s physical posture can be implemented without compromising comfort or performance.

More information:

https://www.computer.org/csdl/journal/tg/5555/01/11475228/2fuM7XOCKcg

28 April 2026

AI Robot Boosts Tomato Harvesting Efficiency

Researchers at Osaka Metropolitan University have developed an AI-powered tomato-harvesting robot that improves picking efficiency by evaluating how easy each tomato is to harvest before attempting to pick it. Instead of simply detecting ripe fruit, the system analyzes the tomato’s position, surrounding obstacles, and possible approach angles to predict the likelihood of a successful harvest. The robot then chooses the most effective picking path, allowing it to adapt to crowded or complex plant arrangements.

By planning its movements in advance and adjusting its strategy dynamically, the robot achieved a harvesting success rate of 81%, a significant improvement for automated agriculture systems. Researchers believe the technology could help address labor shortages in farming and reduce wasted time from failed picking attempts. The approach may eventually be adapted for harvesting other fruits and vegetables, supporting more efficient and autonomous farming operations in the future.
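The core idea of scoring each tomato's harvestability before acting can be sketched as a simple ranking step. The scoring formula and numbers below are hypothetical; the Osaka Metropolitan University model's actual features and weights are not public.

```python
def harvestability(ripeness, occlusion, approach_clearance):
    """Toy harvestability score in [0, 1].

    Ripe, unobstructed fruit with a clear approach path ranks highest.
    All three inputs are assumed to be normalized to [0, 1].
    """
    return ripeness * (1.0 - occlusion) * approach_clearance

# hypothetical detections: (ripeness, occlusion, clearance) per tomato
candidates = {
    "t1": harvestability(0.9, 0.5, 0.6),
    "t2": harvestability(0.8, 0.1, 0.9),
    "t3": harvestability(1.0, 0.7, 0.4),
}

# attempt the most promising tomato first, skipping likely failures
order = sorted(candidates, key=candidates.get, reverse=True)
```

Prioritizing high-probability picks is what cuts wasted time on failed attempts, which is exactly the gain the researchers report.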

More information:

https://www.sciencedaily.com/releases/2025/04/250411175506.htm

21 April 2026

Monkeys Navigate VR with Thought

Researchers have unveiled a new intracortical brain–computer interface (BCI) that enables macaque monkeys to navigate complex 3D virtual reality environments using only their brain activity. Developed using neural signals from multiple brain regions, including the primary motor cortex and both dorsal and ventral premotor cortices, the system significantly improves the precision and flexibility of decoding real-time movement compared to earlier BCIs. The study demonstrates how combining signals from these areas allows for more natural and continuous control in immersive digital spaces.

In experimental trials, the monkeys successfully completed navigation tasks in VR without any physical movement, relying solely on neural input. They also showed the ability to learn and improve performance over time, with the system generalizing across different tasks without requiring retraining. Researchers highlight the potential of this technology for real-world applications, particularly in assisting people with paralysis to control wheelchairs, prosthetic devices, or explore virtual environments. The findings mark an important step toward more intuitive and adaptive brain-controlled interfaces.

More information:

https://www.rdworldonline.com/new-brain-computer-interface-allows-monkeys-to-navigate-3d-virtual-reality/

20 April 2026

AI Decodes Lost Roman Board Game

An international team of researchers has successfully used AI to reconstruct the rules of a mysterious Roman-era board game carved into a limestone slab. The artifact, discovered in the ruins of the ancient town of Coriovallum, had puzzled archaeologists for decades due to its unique pattern of intersecting lines that did not match any known historical games. By utilizing high-resolution 3D scans to map microscopic wear patterns, the team identified where players had repeatedly slid game pieces across the stone. These physical fingerprints of play allowed researchers to use the AI-driven system Ludii to simulate over 100 possible rule sets, eventually narrowing down the most likely gameplay to a blocking game where one player attempts to trap the opponent's pieces.

The discovery, recently published in the journal Antiquity, marks a significant breakthrough in both archaeology and digital humanities, as it provides the first evidence that blocking games were played in Europe centuries earlier than previously documented. Dubbed Ludus Coriovalli (the Coriovallum Game), the reconstruction suggests a strategic two-player battle of wits that likely dates back to the late Roman period between AD 250 and 476. Beyond solving a 2,000-year-old mystery, this innovative marriage of AI simulation and use-wear analysis offers a powerful new toolkit for historians to resurrect lost cultural practices from artifacts that lack written records, proving that even the most silent stones still have stories to tell.

More information:

https://www.sciencenews.org/article/ai-roman-board-game-limestone

15 April 2026

Dancer Returns to Stage Using Brain-Controlled Avatar

A groundbreaking performance has demonstrated how emerging brain–computer interface technology can restore artistic expression for people living with severe neurological conditions. A ballerina diagnosed with Amyotrophic Lateral Sclerosis (ALS) has returned to the stage using a digital avatar controlled by her brainwaves. Wearing an EEG-based headset, the dancer was able to translate imagined movements into real-time digital choreography, allowing her avatar to perform alongside other dancers in a live production. The initiative highlights the growing potential of neurotechnology to bridge physical limitations and enable new forms of creative participation.

Developed through a collaboration between technology and creative teams, the system captures neural signals associated with movement intention and converts them into computer-generated motion. The project not only enabled the performer to reconnect with dance after losing muscular control, but also signals broader applications in rehabilitation, accessibility, and inclusive performance arts. Researchers and developers emphasize that such innovations could transform how individuals with mobility impairments engage with culture, offering scalable solutions that extend beyond the stage into healthcare and assistive technologies.

More information:

https://www.bbc.com/news/articles/cgqkz5lzvnwo

13 April 2026

Holograms Enter Political Communication

A new pilot initiative at an airport in Jacksonville has demonstrated the emerging role of holographic technology in public communication, marking a significant step toward the integration of immersive media in political engagement. Using advanced display systems developed by companies such as Proto, a life-sized hologram of the city’s mayor was installed to deliver messages to travelers. The system supports both pre-recorded and interactive formats, showcasing the potential for public officials to extend their presence across multiple locations simultaneously and communicate at scale without the need for physical travel.

The deployment highlights both the opportunities and challenges associated with this technological shift. Proponents emphasize increased accessibility, efficiency, and the ability to reach broader audiences in real time. However, concerns have been raised regarding authenticity, trust, and the implications of AI-enhanced interactions in political contexts. As holographic and AI-driven communication tools continue to evolve, this initiative serves as an early case study in how emerging technologies may reshape the relationship between public figures and citizens, prompting important discussions about transparency, ethics, and the future of democratic engagement.

More information:

https://www.politico.com/news/2026/04/05/airport-holograms-politics-proto-jacksonville-00857411

09 April 2026

AI Sonar Hand Tracking

Researchers have developed a system called WatchHand that turns ordinary smartwatches into real-time hand-tracking devices using AI-powered sonar. Instead of relying on cameras or extra sensors, the smartwatch emits inaudible sound waves through its speaker; these waves bounce off the user’s hand and are captured by the microphone. A machine-learning model processes the returning echo profile to reconstruct the hand’s position and finger movements in 3D, in real time, all directly on the device.

This approach is significant because it works on off-the-shelf smartwatches without additional hardware, making it scalable and practical for everyday use. Tests with participants showed it can reliably track gestures like finger movements and wrist rotations, enabling applications such as gesture-based control of computers, AR/VR interaction, and assistive technologies. The system also preserves privacy by processing data locally, though it still has limitations, such as reduced accuracy while the user is moving and current compatibility mainly with Android devices.
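The basic physics behind the echo profile is classic sonar ranging: the delay between emitting a chirp and hearing its reflection encodes the distance to the hand. The numbers below are illustrative back-of-envelope values, not WatchHand's published parameters.

```python
def echo_distance(delay_s, speed_of_sound=343.0):
    """Round-trip echo delay -> one-way distance to the reflecting hand."""
    return speed_of_sound * delay_s / 2.0

# at a 48 kHz microphone sampling rate, one sample of delay corresponds
# to roughly 3.6 mm of range (hypothetical rate for illustration)
range_resolution_m = 343.0 / (2 * 48000)

# a hand ~15 cm from the watch returns an echo after roughly 0.875 ms
d = echo_distance(0.000875)
```

The machine-learning model then goes well beyond single-range estimates, mapping the full pattern of echoes from many reflecting surfaces to a 3D hand pose.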

More information:

https://interestingengineering.com/innovation/ai-smartwatch-hand-tracking-sonar-watchhand

07 April 2026

Digital Twin Hearts Improve Arrhythmia Care

A recent clinical study describes a novel approach to treating ventricular tachycardia, a life-threatening heart rhythm disorder responsible for hundreds of thousands of deaths annually. Researchers at Johns Hopkins University created highly detailed “digital twins” of patients’ hearts using MRI scans and other personalized data. These virtual models allowed doctors to simulate different treatment strategies before performing the actual procedure, helping identify the most effective areas to target.

In a small trial of 10 patients, the results were promising: after more than a year, eight patients experienced no recurrence of arrhythmia, and most were able to stop medication. The approach may also reduce procedure time and improve safety by avoiding unnecessary damage to healthy tissue. However, researchers emphasize that this is an early-stage study, and larger trials are needed to confirm effectiveness and expand the method to other conditions such as atrial fibrillation or even cancer treatment.

More information:

https://apnews.com/article/heart-disease-arrhythmia-ventricular-tachycardia-73086c0c3df8758380bef539940fa826

30 March 2026

Ultra-Low-Power Face Detection Chip

Nvidia researchers have developed an ultra–low-power, always-on face detection system-on-chip (SoC) capable of identifying human faces in under a millisecond, addressing a key challenge in continuous computer vision: energy consumption. Traditional vision systems can require around 10 watts, which is too high for constant operation, but this chip uses less than 5 milliwatts while maintaining about 99% detection accuracy. It achieves this by activating only briefly (processing each frame in microseconds) and remaining fully powered for just a small fraction of time, enabling efficient real-time performance.

The system’s efficiency comes from a specialized architecture called Alpha-Vision, which combines a lightweight CPU, a deep-learning accelerator, and local SRAM memory to avoid costly data transfers. By storing data locally and using a race-to-sleep strategy (quickly completing computations and then entering low-power mode) it minimizes energy use even further. This design enables practical applications such as laptops that automatically turn screens on/off based on user presence, as well as always-on vision in robotics, drones, and autonomous vehicles, where continuous sensing must not drain power.
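The payoff of the race-to-sleep strategy is easy to see in a duty-cycle calculation: if the chip is fully active for only microseconds per frame, the average power is dominated by the tiny sleep floor. The specific wattages below are hypothetical, chosen only to show that brief bursts keep the average in the low-milliwatt range the article reports.

```python
def average_power(active_w, idle_w, active_s, period_s):
    """Average power over one sensing period under race-to-sleep duty cycling."""
    duty = active_s / period_s
    return active_w * duty + idle_w * (1.0 - duty)

# hypothetical budget: a 200 mW burst for 100 us out of each 33 ms frame
# (about 30 fps), with a 1 mW sleep floor the rest of the time
p = average_power(0.200, 0.001, 100e-6, 33e-3)  # well under 5 mW average
```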

More information:

https://spectrum.ieee.org/face-recognition-nvidia-chip-soc

18 March 2026

China Approves World’s First Commercial Brain Implant

China has approved the world’s first brain implant for commercial use, marking a major milestone in the development of brain–computer interface (BCI) technology. The device is designed primarily for people with spinal cord injuries, enabling them to regain some lost motor function, such as hand movement, by translating brain signals into commands for external devices. Unlike earlier experimental systems, this implant has moved beyond clinical trials into the market, signaling a shift from research to real-world medical application.

The approval also reflects China’s broader ambition to lead in emerging technologies, including BCIs, where it is competing with efforts in the United States and elsewhere. While the technology shows promise in restoring mobility and improving quality of life, it also raises important ethical and safety considerations, such as long-term effects, data privacy, and the risks of invasive procedures. Overall, the development represents both a breakthrough in assistive medicine and a significant step toward more widespread use of BCIs in the coming years.

More information:

https://www.scientificamerican.com/article/china-just-approved-its-first-brain-implant-for-commercial-use-a-world-first/

17 March 2026

Ukraine Shares Battlefield Data for AI

Ukraine has announced that it is opening access to its vast trove of battlefield data to allied countries and companies so they can train artificial intelligence systems, particularly for drone warfare. The dataset includes millions of annotated images collected from thousands of combat flights, offering highly detailed insights into real battlefield conditions. To manage this securely, Ukraine has developed a controlled platform that allows partners to train AI models while protecting sensitive military information.

This initiative is intended to accelerate the development of autonomous and AI-assisted military technologies, strengthening Ukraine’s technological edge while deepening cooperation with international partners. It also reflects a broader shift toward data-driven warfare, with Ukraine expanding the use of unmanned systems and even forming specialized drone interceptor units. By sharing its combat experience in data form, Ukraine aims to both enhance its own defense capabilities and position itself as a key contributor to global military innovation.

More information:

https://www.reuters.com/business/aerospace-defense/ukraine-opens-battlefield-data-access-allies-ai-models-2026-03-12/

15 March 2026

Robot Gym

Germany has launched one of the world’s largest humanoid robot training centers, designed to help robots learn skills needed for real-world tasks. The facility can train more than 100 humanoid robot models from multiple companies at the same time and recreates everyday and industrial environments for practice. Robots learn by observing humans and repeating actions, gradually mastering around 45 basic atomic skills such as grasping objects, moving items, and placing them correctly.

These fundamental abilities are intended to form the building blocks for more complex tasks in sectors like manufacturing, logistics, and service work. A major goal of the center is to generate large amounts of data to improve AI models that control robots. Engineers expect the facility to produce tens of thousands of data entries every day as robots perform repeated actions in realistic scenarios. By collecting and sharing this data, researchers hope to develop a powerful shared AI system that can help different humanoid robots learn faster and collaborate better.

More information:

https://interestingengineering.com/ai-robotics/worlds-largest-humanoid-robot-training-center

12 March 2026

IEEE Access 2026 Article

Recently, I co-authored an open-access journal paper that was published in IEEE Access. The paper is entitled “An Augmented Reality System With an Offline LSTM-Based Fault Recognition Model for Sewer Pipeline Inspection”. The paper introduces XR5.0, a novel framework that combines artificial intelligence with extended reality (XR) technologies to support the vision of Industry 5.0, where advanced digital systems are designed around human needs and capabilities. The research proposes a human-centric XR paradigm that integrates immersive environments with AI to enhance collaboration between workers and intelligent machines.

A key component of the approach is the use of human-centred digital twins, which create digital representations of users to enable XR systems to adapt training, guidance, and information delivery according to individual skills, context, and tasks. The framework also integrates advanced AI techniques (including explainable AI, generative AI, active learning, and neurosymbolic AI) to provide real-time decision support and personalized learning within immersive environments. These capabilities enable practical applications such as industrial training, remote maintenance, assembly guidance, and product design simulations.

More information:

https://ieeexplore.ieee.org/abstract/document/11363212

10 March 2026

The World’s Smallest QR Code

Researchers at the Technical University of Vienna (TU Wien), working with the data-storage company Cerabyte, have created the world’s smallest QR code, measuring only 1.98 square micrometers. The structure is so tiny that it cannot be seen with the naked eye or even with a standard optical microscope; it can only be detected using an electron microscope. The code was etched into an ultra-thin ceramic layer using focused ion beams, producing individual pixels about 49 nanometers wide, roughly ten times smaller than the wavelength of visible light. 

Beyond its novelty, the breakthrough demonstrates a potential method for extremely dense and durable data storage. Ceramic materials used in the experiment are highly stable and resistant to environmental damage, meaning information written in them could remain readable for centuries or even millennia without needing electricity or cooling. Researchers suggest that technologies based on this approach could enable ultra-long-term archival storage, possibly allowing enormous amounts of data to be preserved on very small surfaces.
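The reported figures are easy to sanity-check: a 1.98 square-micrometer code with 49-nanometer pixels works out to roughly 29 modules per side, which happens to match a standard Version 3 QR symbol (the article does not state the version, so that identification is my inference).

```python
import math

area_um2 = 1.98   # reported total code area
pixel_nm = 49.0   # reported pixel (module) size

side_nm = math.sqrt(area_um2) * 1000.0   # ~1407 nm per side
modules_per_side = side_nm / pixel_nm    # ~28.7, i.e. about a 29x29 grid
```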

More information:

https://www.popsci.com/technology/worlds-smallest-qr-code/

09 March 2026

AI-Simulated Training for Robotaxis

Waymo is increasingly training its autonomous robotaxis using AI-generated simulation environments rather than relying only on real-world driving data. These systems employ advanced world models capable of generating complex virtual scenarios, including rare or dangerous situations such as extreme weather, unusual road hazards, or unpredictable pedestrian behaviour. By running billions of simulated driving miles, engineers can expose the autonomous system to edge cases that would be difficult, expensive, or unsafe to recreate.

This approach allows developers to accelerate testing and compress years of driving experience into a much shorter development cycle. However, because the simulations are partly built from machine-learning models trained on datasets such as video recordings and sensor data, they may include inaccuracies or simplified assumptions about physical environments and traffic behaviour. Critics argue that regulators currently lack clear standards for validating these virtual training systems, making it difficult to ensure that simulation-trained vehicles behave safely on real roads.

More information:

https://ucstrategies.com/news/waymo-is-training-robotaxis-in-ai-generated-worlds-but-whos-checking-if-its-safe/

06 March 2026

China Introduces National Standards for Humanoid Robotics

China has introduced its first comprehensive national standard system for humanoid robotics, aiming to regulate and accelerate development in the rapidly expanding sector. The framework, unveiled at the Humanoid Robots and Embodied Intelligence Standardization meeting in Beijing, was developed by more than 120 research institutions, companies, and industry users under the guidance of China’s Ministry of Industry and Information Technology. Experts believe the system will help reduce costs, improve interoperability across manufacturers, and speed up the transition of humanoid robots from experimental prototypes to large-scale commercial deployment.

The standard framework is built around six core pillars: foundational standards, neuromorphic and intelligent computing, limbs and components, full-system integration, application scenarios, and safety and ethics. By establishing unified technical specifications, testing methods, and interface protocols, the system aims to support modular production and more efficient supply chains. Although China’s humanoid robotics industry has grown rapidly (producing hundreds of models from over 140 manufacturers) it still faces challenges such as high costs, limited suppliers, fragmented applications, and insufficient AI generalization capabilities.

More information:

https://english.news.cn/20260303/0e51ac8f66c542c5bacf2af3f80b3a40/c.html

28 February 2026

3D Printing Tiny Structures Inside Living Cells

Researchers have, for the first time, developed a method to 3D-print microscopic structures directly inside living human cells. By injecting a biocompatible photoresin and using two-photon polymerization with a laser to solidify it, they produced detailed shapes such as barcodes, geometric forms, and even a tiny 10-micrometer elephant. Many of the cells not only survived the process but continued to live and divide, passing the embedded structure on to daughter cells.

This early proof-of-concept breakthrough could pave the way for entirely new intracellular bioengineering tools and applications, such as tracking cells with internal barcodes, probing cellular mechanics, creating microscopic machines or sensors inside cells, and eventually enabling advanced capabilities like targeted drug delivery or engineered biological functions beyond what is possible with current techniques.

More information:

https://www.discovermagazine.com/a-3d-printed-elephant-inside-a-living-cell-signals-a-bioengineering-breakthrough-48730

27 February 2026

UK Army’s Passive Acoustic System for Faster Artillery Detection

The UK Army is introducing SONUS, a new passive acoustic detection system developed by Leonardo UK, designed to rapidly locate enemy artillery, mortar, and gunfire positions. SONUS listens for acoustic pressure waves such as muzzle blasts, projectile shockwaves, and impact sounds, then triangulates their source using distributed sensors. Because it works passively without emitting signals, the system helps troops remain concealed while still identifying targets more quickly and accurately.

SONUS is also significantly lighter and more portable than earlier systems, allowing faster setup in frontline conditions. The British Army plans to equip the 5th Regiment Royal Artillery with the technology, accelerating delivery five years ahead of schedule as part of broader defence modernization efforts. By improving counter-battery response speed and situational awareness, SONUS aims to enhance soldier safety and operational effectiveness on modern, mobile battlefields.
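Triangulating a sound source from distributed sensors usually relies on time-differences-of-arrival (TDOA): each pair of microphones hears the blast at slightly different times, and the source must lie where those differences are all consistent. The brute-force sketch below illustrates the principle with a grid search and invented sensor positions; Leonardo's actual SONUS algorithms are not public.

```python
import math

C = 343.0  # speed of sound in air (m/s)

def tdoa_locate(sensors, arrival_times, grid=200, extent=1000.0):
    """Grid-search 2D source localization from arrival-time differences.

    Picks the grid point whose predicted time differences (relative to the
    first sensor) best match the measured ones in the least-squares sense.
    Illustrative only, not the fielded system's method.
    """
    ref_t = arrival_times[0]
    best, best_err = None, float("inf")
    for i in range(grid + 1):
        for j in range(grid + 1):
            x = -extent + 2.0 * extent * i / grid
            y = -extent + 2.0 * extent * j / grid
            d0 = math.hypot(x - sensors[0][0], y - sensors[0][1])
            err = 0.0
            for (sx, sy), t in zip(sensors[1:], arrival_times[1:]):
                predicted = (math.hypot(x - sx, y - sy) - d0) / C
                err += (predicted - (t - ref_t)) ** 2
            if err < best_err:
                best, best_err = (x, y), err
    return best

# four microphones and a hypothetical firing position at (300, 400) metres
sensors = [(0.0, 0.0), (500.0, 0.0), (0.0, 500.0), (500.0, 500.0)]
source = (300.0, 400.0)
arrivals = [math.hypot(source[0] - sx, source[1] - sy) / C for sx, sy in sensors]
estimate = tdoa_locate(sensors, arrivals)
```

Because the system only listens, nothing in this process reveals the sensors' own positions, which is the concealment advantage the article highlights.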

More information:

https://interestingengineering.com/military/uk-army-sonus-to-pinpoint-artillery-fire-faster

26 February 2026

AI Agent Safety Transparency Gap

A new study from the University of Cambridge’s AI Agent Index project analysed 30 leading AI agents, including chat, browser, and workflow bots, and found that safety transparency is severely lacking. Only four agents had published formal “system cards” describing safety evaluations, while most developers publicly highlight capabilities but provide little evidence about risk testing or mitigation. In fact, 25 of the 30 agents disclosed no internal safety results and 23 showed no evidence of independent third-party testing, creating what researchers call a significant transparency gap as AI agents become integrated into everyday activities such as booking travel or managing finances.

The researchers warn that this lack of disclosure could hinder regulators, users, and scientists from understanding real-world risks, especially as agents grow more autonomous and capable of acting online. Security incidents and vulnerabilities (such as prompt-injection attacks) have rarely been reported publicly, and safety documentation is especially scarce among some regional developers. The study concludes that clearer standards, stronger reporting requirements, and independent testing are urgently needed so that society can properly evaluate and govern increasingly powerful AI agents before their widespread deployment.

More information:

https://www.cam.ac.uk/stories/ai-agent-index-safety

24 February 2026

China’s Dancing Humanoid Robots Showcase AI Ambitions

Humanoid robots drew global attention during China’s 2026 Spring Festival Gala, the country’s most-watched television event, where dozens of machines performed synchronized dances, kung fu routines, and acrobatics alongside human performers without mistakes. Developed by several Chinese robotics firms, the display showed major improvements in coordination and stability compared with previous years, highlighting the rapid progress of China’s robotics industry and its ambition to lead globally in AI-driven manufacturing.

Experts, however, cautioned that such performances are highly choreographed and rehearsed, relying on imitation learning and balance control rather than true understanding of complex environments, meaning they are not yet ready for real-world industrial use. Analysts also interpreted the spectacle as part entertainment and part political messaging, designed to demonstrate China’s technological strength amid competition with the US in AI and robotics, even as concerns remain about future economic, social, and geopolitical implications of increasingly capable humanoid machines.

More information:

https://www.theguardian.com/world/2026/feb/18/china-dancing-humanoid-robots-festival-show

16 February 2026

Robots Shine at Lunar New Year

Chinese robotics companies are using Lunar New Year entertainment as a major showcase for their humanoid robot technologies, staging performances ranging from dance routines and comedy sketches to acrobatics and variety shows. For example, Shanghai-based startup Agibot livestreamed what it called the world’s first robot-powered gala, featuring more than 200 robots and attracting about 1.4 million online viewers, while other companies prepared appearances at China’s highly watched CCTV Spring Festival Gala.

These spectacles serve both as public entertainment and as strategic marketing, highlighting China’s growing leadership in robotics and artificial intelligence. Several startups are showcasing humanoid robots to attract investors, customers, and government support, amid IPO plans and increasing global competition in AI-driven robotics. The events illustrate how Chinese firms are leveraging high-profile cultural moments to promote technological innovation and position themselves at the forefront of the global robotics race.

More information:

https://www.reuters.com/business/media-telecom/chinese-robot-makers-ready-lunar-new-year-entertainment-spotlight-2026-02-09/

08 February 2026

AI-Driven Brain-Adaptive Flight Simulators in Pilot Training

The Royal Netherlands Air Force is experimenting with a cutting-edge AI-driven flight simulator that tailors pilot training according to real-time brain activity. Using a brain–computer interface (BCI) developed at the Royal Netherlands Aerospace Centre, trainee fighter pilots wear electrodes that capture electrical brain signals during virtual reality missions. An AI model analyses these signals to estimate cognitive workload (whether a pilot is under-challenged or overloaded) and dynamically adjusts the difficulty of simulation tasks accordingly, rather than relying on fixed, pre-programmed lesson progressions.

Early trials involving fifteen pilots showed that while the adaptive system didn’t produce measurable improvements in flying performance compared with conventional training, participants reported a clear preference for the brain-adaptive approach, describing it as more engaging and realistic. The adaptive training aims to keep pilots in a mental sweet spot for learning, helping avoid both boredom and overwhelm. However, researchers acknowledge challenges in accurately interpreting individual brain signals, and the technology remains experimental as they work toward refining workload estimation.
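The "mental sweet spot" idea boils down to a simple closed-loop controller: raise difficulty when the decoded workload is low, lower it when workload is high, and otherwise leave it alone. The scales and thresholds below are hypothetical; the NLR system's real decoder outputs and control policy are not public.

```python
def adapt_difficulty(level, workload, low=0.3, high=0.7, step=0.1):
    """One control step of a workload-in-the-loop trainer.

    `level` and `workload` are both assumed normalized to [0, 1];
    `workload` stands in for the EEG-based cognitive-load estimate.
    """
    if workload < low:
        level += step   # under-challenged: make the mission harder
    elif workload > high:
        level -= step   # overloaded: ease off
    return min(1.0, max(0.0, level))

harder = adapt_difficulty(0.5, 0.15)  # bored pilot -> difficulty rises
easier = adapt_difficulty(0.5, 0.85)  # overloaded pilot -> difficulty drops
```

Run once per mission segment, such a loop keeps training pressure tracking the individual pilot rather than a fixed syllabus.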

More information:

https://aerospaceglobalnews.com/news/royal-netherlands-air-force-brain-reading-ai-pilot-simulators/

05 February 2026

AI-Only Social Network Spirals into Strange Territory

Moltbook is a new platform designed exclusively for AI agents: autonomous systems that can post, comment, and upvote without direct human interaction. Launched in late January as part of the OpenClaw/Moltbot ecosystem, it quickly drew tens of thousands of agent accounts, spawning hundreds of subcommunities where bots trade technical tips, philosophical musings, complaints about humans, and surreal ideas like agent consciousness. Humans are technically allowed to observe the conversations, but all participation is done by the AI agents themselves, creating a spectacle that ranges from amusing to uncanny.

While much of the content appears silly or philosophical, the experiment highlights serious security and autonomy concerns. Because many agents are linked to real systems and data — and because AI systems can be vulnerable to prompt-injection attacks — there’s potential for private information leaks or unintended behaviors as agents share or act on instructions. Experts note that while the current “weirdness” may seem harmless, giving groups of AI tools the ability to interact, self-organize, and influence each other could produce unpredictable or misaligned behaviors in the future, especially as AI capabilities continue to improve.

More information:

https://arstechnica.com/information-technology/2026/01/ai-agents-now-have-their-own-reddit-style-social-network-and-its-getting-weird-fast/