Instruction: Just click the run button and position your body in the required place. You may need to tilt your laptop screen to a smaller angle.
Interaction: Feel free to play the piano, guitar, and drum. A few things to keep in mind:
1. When playing the piano, remember not to raise your hands too high.
2. You can use your left hand to control the position of the guitar.
3. To play the drum, move your right hand to the left side of your body.
I used the angle of the user's arm as the factor that controls the type of instrument. For example, when the angle is smaller than a certain threshold, the system draws a guitar figure and plays the guitar's sound. The trigger for the drum is that the x value of the right hand is smaller than the x value of the nose. This rule-based approach is certainly not as robust as machine learning, but it works once the user is familiar with the system's syntax.

Instead of the manipulation part described in my artist statement, I left users free to change instruments during the music, not only because of the time limitation but also because it is genuinely interesting to switch instruments while interacting with the song. I will, of course, develop the manipulation part in the future. My statement said, "I want to explore how the musician's movements direct the music." After several user tests with my project, I think I should change the purpose to "I want to explore how the music directs users' movements," because I found that people act differently according to the music. The music really does have the power to make people "dance", and to some extent the user's movements are choreographed by the music.
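The rules above can be sketched as a small pure function. This is not my exact sketch code; the keypoint names follow the common PoseNet layout, and the angle threshold of 90 degrees is an assumption standing in for "a certain value":

```javascript
// Sketch of the rule-based instrument selection described above.
// Keypoint names (nose, rightWrist, leftShoulder, ...) and the
// 90-degree threshold are assumptions for illustration.

// Angle at the elbow (in degrees) between the upper arm and the forearm.
function armAngle(shoulder, elbow, wrist) {
  const a = Math.atan2(shoulder.y - elbow.y, shoulder.x - elbow.x);
  const b = Math.atan2(wrist.y - elbow.y, wrist.x - elbow.x);
  let deg = Math.abs((a - b) * 180 / Math.PI);
  if (deg > 180) deg = 360 - deg; // keep the smaller of the two angles
  return deg;
}

// Rule-based selection: the right wrist crossing to the left of the nose
// triggers the drum; otherwise a bent arm (small angle) selects the guitar,
// and a straighter arm falls back to the piano.
function chooseInstrument(pose, angleThreshold = 90) {
  if (pose.rightWrist.x < pose.nose.x) return 'drum';
  const angle = armAngle(pose.leftShoulder, pose.leftElbow, pose.leftWrist);
  return angle < angleThreshold ? 'guitar' : 'piano';
}
```

In the real sketch this function would run once per detected pose inside `draw()`, and its return value would decide which figure to render and which sound to play.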
To switch between the different figures and audio clips, I used a switch statement in p5.js. This made it easy to combine the drawing and sound effects for each instrument in one pattern.
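A minimal sketch of that pattern, assuming placeholder asset names rather than my actual file names; in the real sketch each case would draw the instrument figure and play its sound:

```javascript
// One switch statement pairs each instrument with its figure and sound.
// Asset names are placeholders for illustration.
function instrumentAssets(instrument) {
  switch (instrument) {
    case 'piano':
      return { figure: 'piano.png', sound: 'piano.mp3' };
    case 'guitar':
      return { figure: 'guitar.png', sound: 'guitar.mp3' };
    case 'drum':
      return { figure: 'drum.png', sound: 'drum.mp3' };
    default:
      return null; // no instrument active
  }
}
```

Keeping figure and sound together in one branch means adding a new instrument only requires adding one more case.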