
Post-Bit Human Universe: Signal Transmutation

Guangxi Experimental Art Biennale
Guilin Museum of Arts

Opening: November 5, 2022

Post-Bit Human Universe—Signal Transmutation (PBHU-ST) is an interactive installation that enables audiences in China (Guizhou Normal University), Canada (York University), and the United States (2022 Hackaday Supercon) to immerse themselves in an imagined post-Anthropocene and to interact with the multifunctional artificial intelligence programme "AI Philosopher" via sound, body motion, and text input. Through the AI Philosopher's data-transformation module, PBHU-ST connects human audiences in different geographical locations to the new iteration of PBHU. The PBHU-ST system comprises mixed reality applications, an AI Philosopher interface device, and spatial projection mapping.

Audiences interact with the imaginary scenario of PBHU using VR headsets, interactive spatial projection, and microphones. To enact the signal transmutation between biological intelligence and the artificial intelligence "AI Philosopher," human audiences imitate the chirping of insects into a microphone. As audiences in China make the chirping sound, the corresponding response generated by AI Philosopher is presented on spatial projection scenes both at the exhibit site in China and at Alice Lab (Toronto, Canada). In parallel, AI Philosopher uses pre-set IP cameras to capture the characteristics of audience body-movement trajectories at the exhibit sites in China and Alice Lab, integrating them with the chirping sounds and the related generative content.
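The chirp-to-response loop described above can be sketched as a minimal signal pipeline. All function names and the response mapping here are hypothetical illustrations for the catalogue, not the actual AI Philosopher code:

```python
import math

def chirp_features(samples, sample_rate=16000):
    """Extract a crude 'chirpiness' descriptor from raw audio samples:
    mean amplitude and an estimated zero-crossing rate in Hz."""
    if not samples:
        return {"amplitude": 0.0, "zero_cross_hz": 0.0}
    amplitude = sum(abs(s) for s in samples) / len(samples)
    crossings = sum(
        1 for a, b in zip(samples, samples[1:]) if (a < 0) != (b < 0)
    )
    zero_cross_hz = crossings * sample_rate / (2 * len(samples))
    return {"amplitude": amplitude, "zero_cross_hz": zero_cross_hz}

def philosopher_response(features):
    """Hypothetical stand-in for the AI Philosopher's generator: map the
    chirp descriptor to a text fragment to project at each exhibit site."""
    if features["amplitude"] < 0.01:
        return "silence: the post-anthropocene listens"
    pitch = "high" if features["zero_cross_hz"] > 2000 else "low"
    return f"a {pitch}-pitched signal crosses into the machine world"

# Simulate a short high-frequency 'chirp' (3 kHz tone at 16 kHz sampling).
chirp = [math.sin(2 * math.pi * 3000 * t / 16000) for t in range(1600)]
print(philosopher_response(chirp_features(chirp)))
```

In the installation itself the generated response drives the spatial projection rather than a print statement; the sketch only shows the shape of the audio-in, text-out transformation.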

During the exhibition, we will host a workshop for Hackaday Supercon participants, allowing them to live-code the source code of AI Philosopher and PBHU and thereby communicate remotely with spectators in various geographical locations. The co-creative framework emphasises the machine's role as a central agent in virtual world-building, whose creative and artistic decisions are separate from, yet collaborative with, those of a human actor. Complex worlds can be made from within Virtual Reality (VR) using powerful procedural 3D animation and visual-effects tools that digital artists rely on as industry standards to achieve cinematic results. Presented here is a unique system that combines Artificial Intelligence (AI), VR, and complex content generation, utilising Web- and Cloud-based frameworks to integrate real-time 3D rendering with procedural modelling and dynamic simulation. Collaborative creativity (CC) is thereby made accessible both to multiple human agents through tele-presence and to artificial agents creatively responding to their constantly evolving virtual world.
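The tele-presence arrangement implies a simple fan-out: one generated scene update is mirrored to every exhibit site. A minimal sketch of that pattern follows; the `SceneRelay` class and the site names' role as handlers are illustrative assumptions, not the production Web/Cloud framework:

```python
class SceneRelay:
    """Minimal publish/fan-out: one scene update is delivered, in
    registration order, to every registered exhibit site."""

    def __init__(self):
        self.sites = {}

    def register(self, site_name, handler):
        """Attach a per-site handler (e.g. a projection or VR renderer)."""
        self.sites[site_name] = handler

    def publish(self, scene_update):
        """Send the same update to all sites."""
        for handler in self.sites.values():
            handler(scene_update)

# Illustrative wiring: two sites log what they would render.
relay = SceneRelay()
log = []
relay.register("Guilin Museum of Arts", lambda u: log.append(("Guilin", u)))
relay.register("Alice Lab", lambda u: log.append(("Toronto", u)))
relay.publish("projected response: signal transmuted")
print(log)
```

In the actual system the handlers would push over network transports to the rendering pipelines at each location; the sketch only shows the one-to-many delivery at the heart of the tele-presence design.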


Independent display & HQ

The Fields Institute for Research in Mathematical Sciences (display)
and Alice Lab at York University (HQ)
Toronto, Canada