C/AMIGObot: Creating virtual soundscapes with robotic senses

This article first appeared in Ryerson’s Innovation Newsletter.

[Photo: Jason with robot]

Jason Nolan has been immersed in virtual reality (VR) in its various permutations for decades, long before its most recent incarnation in high-definition headsets.

“VR and augmented reality (AR) are most often associated with immersive visual environments, but AR/VR environments run the gamut from text-based simulations that have been around since the early 1980s to vibro-tactile hardware and the present trend in the form of VR glasses,” said Nolan, an associate professor in Ryerson’s School of Early Childhood Studies. “My present AR/VR project focuses on an interactive environment-sensing robot that we are calling ‘C/AMIGObot: A Creative Autonomous Mobile Interactive Generative-music Object roBot,’ which generates sound based on data from over 20 sensors.” These sensors detect environmental information such as proximity to objects and people, ambient noise, and light intensity.
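To make the sensing step concrete, here is a minimal sketch of how raw readings like those the article mentions might be normalized into a common range before being turned into sound. The sensor names, units, value ranges, and the normalization itself are illustrative assumptions; the article does not describe the team’s actual code.

```python
# A minimal sketch: clamp hypothetical raw sensor readings into assumed
# ranges and scale them to [0, 1] so they can later drive synthesis
# parameters. Names, units, and ranges are assumptions for illustration.

def normalize(value: float, low: float, high: float) -> float:
    """Clamp a raw reading into [low, high] and scale it to [0, 1]."""
    value = max(low, min(high, value))
    return (value - low) / (high - low)

# Hypothetical raw readings as (value, assumed_min, assumed_max).
raw_readings = {
    "proximity_cm": (42.0,  0.0,  400.0),   # distance to nearest object
    "ambient_db":   (55.0,  30.0, 100.0),   # ambient noise level
    "light_lux":    (310.0, 0.0,  1000.0),  # light intensity
}

sensor_state = {
    name: normalize(value, low, high)
    for name, (value, low, high) in raw_readings.items()
}

print(sensor_state)
# e.g. {'proximity_cm': 0.105, 'ambient_db': 0.357..., 'light_lux': 0.31}
```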

Nolan’s cross-disciplinary team is finishing the second prototype of the C/AMIGObot and hopes to begin field testing in the new year to assess how this method of “sonifying” spaces might influence how we perceive and understand the physical spaces around us. Nolan is the director of the Responsive Ecologies Lab (RE/Lab) and the Experiential Design and Gaming Environments Lab (EDGE Lab), where the project is housed.

C/AMIGObot’s virtuality is perceived through sound rather than sight, as auditory stimulation of both the space and its participants. C/AMIGObot takes the data its sensors collect and uses it to generate ambient sound that, in turn, represents spaces virtually. “All of this information is processed into data that can then be assigned to various elements of music synthesis, such as generators, oscillators, and circuits,” said Nolan. “This would enable the general public or musicians to create music with the data generated by physical spaces, micro-environmental conditions, and how individuals move in and about the space.”
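As a rough illustration of the mapping Nolan describes, the sketch below assigns two of the normalized readings from the earlier sketch to parameters of a single sine oscillator and renders a short tone to a WAV file. The specific mappings (proximity to pitch, light to loudness), the ranges, and the file output are assumptions, not the project’s actual synthesis design.

```python
# A sketch of the sonification step: assigning sensor data to elements of
# music synthesis such as oscillators. Here an assumed proximity reading
# drives the oscillator's pitch and an assumed light reading drives its
# loudness; both mappings are illustrative guesses.
import math
import struct
import wave

SAMPLE_RATE = 44_100

def sine_oscillator(freq_hz: float, amp: float, seconds: float) -> bytes:
    """Render a sine tone as 16-bit little-endian PCM frames."""
    n = int(SAMPLE_RATE * seconds)
    frames = bytearray()
    for i in range(n):
        sample = amp * math.sin(2 * math.pi * freq_hz * i / SAMPLE_RATE)
        frames += struct.pack("<h", int(sample * 32767))
    return bytes(frames)

# Normalized sensor values in [0, 1], as produced by the earlier sketch.
proximity, light = 0.105, 0.31

# Map proximity to pitch (closer objects -> higher notes) and light to volume.
freq = 220.0 + (1.0 - proximity) * 660.0   # 220-880 Hz
amp = 0.2 + light * 0.6                    # leave headroom below full scale

with wave.open("soundscape.wav", "wb") as wav:
    wav.setnchannels(1)          # mono
    wav.setsampwidth(2)          # 16-bit samples
    wav.setframerate(SAMPLE_RATE)
    wav.writeframes(sine_oscillator(freq, amp, seconds=2.0))
```

In a continuously running system, the mapping would be re-evaluated as readings change, so the soundscape shifts as people and objects move through the space.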

Potential uses for the C/AMIGObot range from helping children understand their learning environments, to modifying the perception of institutional spaces, to giving musicians tools to rethink how musical compositions represent, and interact within, mixed-reality (AR/VR) spaces.

Nolan is autistic, and the project is centred on his curiosity about how young children explore and physically engage with sensory information as the foundation for their learning. It is heavily influenced by the ideas and work of the British musician and producer Brian Eno in generative and ambient music. Nolan believes that moving beyond an “ocular-centric perspective” offers new research, design, and learning opportunities.

“Though I primarily see C/AMIGObot as a learning tool to encourage people to re-think how we perceive spaces, I look forward to supporting new ways of interacting with and through the spaces in which we live,” said Nolan.

About Jason

Director of the RE/Lab & EDGE Lab and associate professor in the School of Early Childhood Studies.