New Media Art, AI Music, Creative Coding, Generative Art
2019 - 3 Months
Magenta, Unity, Max
Individual Project
MultiPerception is a new media art project using Magenta and Unity. When a participant plays the MIDI keyboard, the computer generates a real-time music visualization. Meanwhile, with machine learning, it can 'perceive' the music played by the user and respond with new melodies and visual effects. It was originally an individual project; I later did a performance related to AI music with other artists for the 2019 Shanghai Science and Technology Festival using MultiPerception's techniques.
A media art project related to music visualization and AI music
MultiPerception - the Original Individual Project
Concept
Our perception of the world is based on multiple senses instead of a single one. When playing music, sound is the main feedback, and music visualization can enrich our perception and understanding of the music. MultiPerception generates real-time animation while participants play the MIDI keyboard. With machine learning, the computer can now 'perceive' and respond to the music played by participants. Meanwhile, MultiPerception also generates visual effects for the music created by the computer, giving participants visual feedback at the same time. Finally, a dialogue between visual and sound, human and computer is formed.
AI MUSIC
Triggered when AI music is being generated
PITCH
Triggered when the pitch bend wheel is tilted
FAST TEMPO
Triggered when the participant plays very fast
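These triggers live in the Max and Unity setup described in the Process section below. As a rough illustration only (not the project's actual patch), the Python sketch below shows how the three triggers could be detected from an incoming MIDI stream; it assumes the mido library, and trigger_effect and ai_is_generating are hypothetical placeholders.

import time
import mido

FAST_TEMPO_WINDOW = 1.0   # look back over the last second
FAST_TEMPO_NOTES = 8      # this many onsets within the window counts as "very fast"

def trigger_effect(name):
    # Placeholder: in MultiPerception this would drive a visual effect in Unity.
    print("trigger visual effect:", name)

def listen(port_name, ai_is_generating=lambda: False):
    note_times = []                                  # onsets of recent note-on events
    with mido.open_input(port_name) as port:
        for msg in port:                             # blocks, yielding MIDI messages
            now = time.time()
            if ai_is_generating():                   # AI MUSIC: Magenta is producing notes
                trigger_effect("AI MUSIC")
            if msg.type == "pitchwheel" and msg.pitch != 0:
                trigger_effect("PITCH")              # PITCH: pitch bend wheel is tilted
            if msg.type == "note_on" and msg.velocity > 0:
                note_times = [t for t in note_times if now - t < FAST_TEMPO_WINDOW]
                note_times.append(now)
                if len(note_times) >= FAST_TEMPO_NOTES:
                    trigger_effect("FAST TEMPO")     # FAST TEMPO: many onsets per second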
Performance
I worked on a new media art performance related to AI music with artists from Tongji University, NYU Shanghai, Shanghai Conservatory of Music and SIVA. Besides the techniques from MultiPerception, we also used AI algorithms, new media instruments and original music made by my teammates. We gave this performance at events such as the 2019 Shanghai Science and Technology Festival and the 2019 Tongji Design Week.
2019 Shanghai Science and Technology Festival
Shanghai Science and Technology Museum
2019 Tongji Design Week Opening Show
Tongji University
Process
MAX
Max is used for music generation and for signal conversion between the MIDI keyboard, Unity and Magenta.
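A Max patch is visual, so it cannot be shown directly as text. As a rough textual analogue of the routing role Max plays, the Python sketch below reads MIDI from the keyboard with mido and forwards note data to Unity and Magenta over OSC with python-osc; the ports and OSC addresses are placeholders, not the project's actual configuration.

import mido
from pythonosc.udp_client import SimpleUDPClient

unity = SimpleUDPClient("127.0.0.1", 9000)     # Unity side: visual effect parameters
magenta = SimpleUDPClient("127.0.0.1", 9001)   # Magenta side: primer notes for the AI

with mido.open_input() as keyboard:            # default MIDI input port
    for msg in keyboard:
        if msg.type == "note_on" and msg.velocity > 0:
            # Forward pitch and velocity to both the visual and the AI side.
            unity.send_message("/note", [msg.note, msg.velocity])
            magenta.send_message("/primer/note", [msg.note, msg.velocity])
        elif msg.type == "pitchwheel":
            unity.send_message("/pitchbend", msg.pitch)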
MAGENTA
I enable the computer to respond to participants with music using Google Magenta (a toolkit for making music and art with machine learning).
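The sketch below is only an illustration of this kind of call-and-response, not my exact setup: it continues a short primer melody with Magenta's pre-trained MelodyRNN (basic_rnn bundle), following the pattern in Magenta's documentation. The bundle path, primer notes and generation length are placeholders, and the imports assume a recent Magenta release where the protobufs live in note_seq.

from magenta.models.melody_rnn import melody_rnn_sequence_generator
from magenta.models.shared import sequence_generator_bundle
from note_seq.protobuf import generator_pb2, music_pb2

# Load the pre-trained model from a downloaded .mag bundle file.
bundle = sequence_generator_bundle.read_bundle_file("basic_rnn.mag")
melody_rnn = melody_rnn_sequence_generator.get_generator_map()["basic_rnn"](
    checkpoint=None, bundle=bundle)
melody_rnn.initialize()

# A one-bar primer standing in for the notes the participant just played.
primer = music_pb2.NoteSequence()
primer.tempos.add(qpm=120)
for i, pitch in enumerate([60, 62, 64, 65]):
    primer.notes.add(pitch=pitch, velocity=80,
                     start_time=i * 0.5, end_time=(i + 1) * 0.5)
primer.total_time = 2.0

# Ask the model to continue the primer for another four seconds.
options = generator_pb2.GeneratorOptions()
options.args["temperature"].float_value = 1.0
options.generate_sections.add(start_time=primer.total_time,
                              end_time=primer.total_time + 4.0)
response = melody_rnn.generate(primer, options)   # a new NoteSequence to play back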
UNITY
I developed the real-time music visualization using Unity's Visual Effect Graph.