30 April 2023

AR Art In Sheffield

In February, Sheffield launched one of the world’s biggest AR art trails. Titled “Look Up!,” the trail features four buildings, each paired with a QR code on the sidewalk below. Using a free app, viewers scan the code and follow a series of animated arrows that lead their gaze upward. There, from the roof of the building, they can watch a stick figure made of different-colored balloons drift up, swirl around, and fade into the sky, all through their phone’s screen. Within a week of the launch, more than 1,500 people had downloaded the app and almost 2,000 QR codes had been scanned.

The platform and app were created by a local company called Megaverse, which worked closely with Niantic, the San Francisco company behind Pokémon Go. The virtual artworks themselves were made by two other local firms, Universal Everything and Human Studio. The impetus for the project goes back to a single building in the middle of Sheffield: the John Lewis department store, an anchor of the city since the 1960s, when the building was still known as the Cole Brothers store. Then the pandemic hit, the store closed, and John Lewis withdrew from the building.

More information:

https://www.wired.com/story/sheffield-uk-augmented-reality-art-look-up/

22 April 2023

Sensor Recognises Moving Objects and Predicts Path

Scientists at Finland's Aalto University have developed a neuromorphic visual sensor that can recognize moving objects in a single video frame and anticipate their trajectories. The heart of the sensor is an array of photomemristors, devices that generate an electrical current when exposed to light. Because the current decays gradually after the light is removed, the array retains a trace of recent exposures, providing a dynamic memory of the preceding instants. Conventional motion-detection systems need many components and complex algorithms performing frame-by-frame analyses, which makes them inefficient and energy-intensive. Inspired by the human visual system, the Aalto technology instead integrates sensing, memory, and processing in a single device that can detect motion and predict trajectories. To demonstrate it, the researchers showed the sensor videos that spelled out a word one letter at a time. Because all the words ended with the letter ‘E’, the final frame of every video looked similar.
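
A minimal sketch of that dynamic memory, assuming a simple exponential decay of each pixel's photocurrent (the decay law, the retention factor, and the tiny three-pixel "frames" are illustrative assumptions, not Aalto's measured device behavior):

```python
import numpy as np

# Illustrative photomemristor model: each pixel's photocurrent jumps when
# lit and then decays exponentially, so a single readout superposes fading
# traces of everything the pixel has seen recently.
DECAY = 0.6  # per-frame retention factor (an assumption, not a measured value)

def expose(state: np.ndarray, frame: np.ndarray) -> np.ndarray:
    """Advance the array by one frame: decay the old currents, add new light."""
    return DECAY * state + frame

# Two three-pixel "videos" that end on the identical final frame.
video_a = [np.array([1.0, 0, 0]), np.array([0, 1.0, 0]), np.array([0, 0, 1.0])]
video_b = [np.array([0, 1.0, 0]), np.array([1.0, 0, 0]), np.array([0, 0, 1.0])]

state_a = np.zeros(3)
state_b = np.zeros(3)
for frame_a, frame_b in zip(video_a, video_b):
    state_a = expose(state_a, frame_a)
    state_b = expose(state_b, frame_b)

# The final input frame is the same, but the decayed traces of the earlier
# frames leave different residual currents, so one readout tells them apart.
print(state_a)  # [0.36 0.6  1.  ]
print(state_b)  # [0.6  0.36 1.  ]
```

Because the readout superposes fading traces of earlier frames, a single snapshot of the array carries information about what came before, and in what order.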

Conventional vision sensors couldn’t tell whether the ‘E’ on the screen had appeared after the other letters in ‘APPLE’ or ‘GRAPE’. But the photomemristor array could use the hidden information in the final frame to infer which letters had preceded it, predicting the word with nearly 100% accuracy. In another test, the team showed the sensor videos of a simulated person moving at three different speeds. Not only could the system recognize motion by analyzing a single frame, it also correctly predicted the next frames. Accurately detecting motion and predicting where an object will be are vital for self-driving technology and intelligent transport: autonomous vehicles need accurate predictions of how cars, bikes, pedestrians, and other objects will move in order to guide their decisions. By adding a machine learning system to the photomemristor array, the researchers showed that the integrated system can predict future motion based on in-sensor processing of a single, all-informative frame.
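
Continuing the toy model above, a single decayed readout is enough to tell the two words apart. Here a trivial nearest-template match stands in for the machine-learning readout the researchers attached, and the four-pixel letter encodings are invented purely for illustration:

```python
import numpy as np

DECAY = 0.6  # same illustrative retention factor as in the sketch above

# Invented four-pixel "letter" patterns; only their distinctness matters.
LETTERS = {
    "A": np.array([1.0, 0, 0, 0]), "P": np.array([0, 1.0, 0, 0]),
    "L": np.array([0, 0, 1.0, 0]), "G": np.array([1.0, 1.0, 0, 0]),
    "R": np.array([0, 0, 0, 1.0]), "E": np.array([1.0, 0, 0, 1.0]),
}

def final_state(word: str) -> np.ndarray:
    """Play the word one letter per frame and return the final readout."""
    state = np.zeros(4)
    for letter in word:
        state = DECAY * state + LETTERS[letter]
    return state

# Build one template readout per word, then classify a fresh playback.
templates = {word: final_state(word) for word in ("APPLE", "GRAPE")}
readout = final_state("GRAPE")
guess = min(templates, key=lambda w: np.linalg.norm(templates[w] - readout))
print(guess)  # GRAPE
```

Even though the final ‘E’ frame is identical for both words, the residual currents left by ‘APPL’ and ‘GRAP’ differ, which is the hidden information the array exploits.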

More information:

https://www.aalto.fi/en/news/a-neuromorphic-visual-sensor-can-recognise-moving-objects-and-predict-their-path

21 April 2023

Timing for Autonomous Bus Sounds

Researchers at Cornell University and Linköping University are using sound to improve autonomous buses' navigation and communication capabilities. Working with buses in the Swedish town of Linköping, they found that the timing of sounds is the critical factor in ensuring effective social engagement. The team designed candidate bus sounds through an iterative process.

They played various sounds through a Bluetooth speaker on the outside of a bus to alert pedestrians and cyclists to the vehicle’s approach, analyzed videos of the resulting interactions, and chose new sounds to test based on what they observed. The video analysis showed that timing and duration, rather than sound type, were key to signaling the bus's intentions.
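
That finding suggests a simple design rule: trigger the alert by time-to-arrival rather than by which sound is playing. A minimal sketch of such timing logic, where the lead-time threshold and the speed and distance inputs are assumptions for illustration, not anything from the study:

```python
# Illustrative threshold, not a value measured in the Linköping study.
LEAD_TIME_S = 4.0  # start the sound this many seconds before the bus arrives

def should_sound(bus_speed_mps: float, distance_m: float) -> bool:
    """Decide when to play an approach sound.

    The study's takeaway was that when a sound plays (and for how long)
    matters more than which sound it is, so this trigger depends only on
    time-to-arrival, not on the sound itself.
    """
    if bus_speed_mps <= 0.0:
        return False  # a stationary bus has no approach to announce
    return distance_m / bus_speed_mps <= LEAD_TIME_S

# A bus at 5 m/s: at 30 m it is 6 s away (too early to sound),
# at 15 m it is 3 s away (inside the lead window).
print(should_sound(5.0, 30.0))  # False
print(should_sound(5.0, 15.0))  # True
```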

More information:

https://news.cornell.edu/stories/2023/04/autonomous-bus-sounds-its-all-about-when-not-how