Researchers at the University of Utah have developed an AI co-pilot system for prosthetic bionic hands that uses advanced sensors and machine learning to make gripping and manipulation more intuitive and natural for users. By equipping commercial prosthetic hands with pressure and proximity sensors and training an AI model to interpret that data, the team built a system that autonomously adjusts finger positions and grip force in real time, significantly improving success rates in tasks such as picking up fragile objects.
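As a rough illustration of that sensing-to-actuation loop, here is a minimal Python sketch in which pressure and proximity readings are fed to a trained model that outputs a grip-force command. The names and the simple linear model below (GripModel, control_step) are hypothetical stand-ins for illustration only; the article does not describe the team's actual software.

import numpy as np

# Illustrative sketch of the loop described above: sensor readings -> learned
# model -> grip-force command. Not the Utah team's actual code or API.

class GripModel:
    """Stand-in for a trained model mapping sensor readings to a grip command."""
    def __init__(self, weights, bias):
        self.weights = np.asarray(weights, dtype=float)
        self.bias = float(bias)

    def predict_force(self, features):
        # Linear placeholder for the learned policy; output clipped to [0, 1],
        # where 0 is a fully open hand and 1 is maximum grip force.
        return float(np.clip(features @ self.weights + self.bias, 0.0, 1.0))

def control_step(model, pressure, proximity):
    """One real-time iteration: turn the latest sensor readings into a grip command."""
    features = np.array([pressure, proximity], dtype=float)
    return model.predict_force(features)

if __name__ == "__main__":
    model = GripModel(weights=[0.6, -0.3], bias=0.4)
    # Light fingertip contact near a detected object: the sketch model
    # commands only a light grip force (about 0.19).
    print(control_step(model, pressure=0.1, proximity=0.9))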
The shared-control approach balances human intention with AI assistance, reducing cognitive burden and addressing a major reason many amputees abandon their prosthetics. Early studies show greater dexterity and precision compared with traditional myoelectric control, and the team is exploring future enhancements like tighter neural integration to further blur the line between artificial and natural limb control as the technology moves toward real-world use.
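The shared-control idea can be pictured as a weighted blend of the user's command and the AI's suggestion; the blending scheme and the autonomy parameter below are assumptions for illustration, not the method reported by the researchers.

def shared_control(user_command, ai_command, autonomy=0.5):
    """
    Blend the user's myoelectric intent with the AI co-pilot's suggestion.
    autonomy = 0 gives the user full manual control; autonomy = 1 defers
    entirely to the AI. All values are normalized grip commands in [0, 1].
    """
    blended = (1.0 - autonomy) * user_command + autonomy * ai_command
    return min(max(blended, 0.0), 1.0)

# Example: the user squeezes hard (0.9) but the sensors suggest a fragile
# object, so the AI proposes a gentler grip (0.3); the blend lands in between.
print(shared_control(user_command=0.9, ai_command=0.3, autonomy=0.6))  # ~0.54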
More information:
https://arstechnica.com/ai/2025/12/scientists-built-an-ai-co-pilot-for-prosthetic-bionic-hands/