X-ray vision has long seemed like
a far-fetched sci-fi fantasy, but over the last decade a team from MIT's
Computer Science and Artificial Intelligence Laboratory (CSAIL) has continually
gotten us closer to seeing through walls. Their latest project, 'RF-Pose', uses
artificial intelligence (AI) to teach wireless devices to sense people's
postures and movement, even from the other side of a wall. The researchers use
a neural network to analyze radio signals that bounce off people's bodies, and
can then create a dynamic stick figure that walks, stops, sits and moves its
limbs as the person performs those actions.
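The article doesn't detail RF-Pose's internals, but pose-estimation systems of this kind typically end by reading joint locations off per-joint confidence maps produced by the neural network. The sketch below illustrates only that final step, with hypothetical names and toy data; it is not MIT's code:

```python
import numpy as np

def extract_keypoints(confidence_maps):
    """Given per-joint confidence maps (J x H x W), return the (row, col)
    peak location of each joint -- the last step in building a stick figure."""
    J, H, W = confidence_maps.shape
    flat = confidence_maps.reshape(J, -1)
    idx = flat.argmax(axis=1)          # flat index of each map's maximum
    return np.stack([idx // W, idx % W], axis=1)

# Toy example: two 5x5 confidence maps with known peaks.
maps = np.zeros((2, 5, 5))
maps[0, 1, 3] = 1.0  # hypothetical "head" joint
maps[1, 4, 0] = 1.0  # hypothetical "left ankle" joint
print(extract_keypoints(maps))  # [[1 3], [4 0]]
```

Connecting the extracted joints with line segments between anatomically adjacent pairs is what yields the dynamic stick figure described above.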
The team says that the system
could be used to monitor diseases like Parkinson's and multiple sclerosis (MS),
providing a better understanding of disease progression and allowing doctors to
adjust medications accordingly. It could also help elderly people live more
independently, while providing the added security of monitoring for falls,
injuries and changes in activity patterns. The team is currently working with
doctors to explore multiple applications in healthcare. Beyond healthcare,
the team says that RF-Pose could also be used for new classes of video games
where players move around the house, or even in search-and-rescue missions to
help locate survivors.
More information: