Community – Max Patch

The ideas of the collective often get jumbled, and the result is poor decision making.  In this interactive performance, created with Cycling '74 Max, I attempt to demonstrate this idea.  Since politicians are, by definition, the personification of the collective of their respective district, state, or country, many public domain videos from the C-SPAN archive are mashed together into one.

Main Patch


“rec” Subpatch


“editor” Subpatch


“convert to midi” Subpatch


“sound” Subpatch


Using the toggle button labeled “Record,” you can record yourself with the “rec” subpatcher and save the result to a separate folder.  When you have enough videos, preferably 10 or more, you then drag the video folder into the 3 MOVIEFOLDRs.  The videos are then routed to the “editor” subpatcher, where they are manipulated by the ALPHABLENDR, FEEDR, and SLIDR to create the visual aspect of the performance.  The visuals are then routed to a VIEWR for monitoring, a PROJECTR so the performance can be shown on a separate screen, and a RECORDR so the performance can be recorded.

For the sound, the visuals are routed to the “convert to midi” subpatcher, where the RGB channels are converted into MIDI notes.  These notes are then sent to the “sound” subpatcher, where they pass through 3 QUANTIZERs and then a CURVY.  The red channel is patched into the trigger, the green into the shape, and the blue into the scale.  These results are then patched into a FREEVERB and a FEEDBACK DELAY, using the settings shown in the image above.  This signal is finally put through a STEREO for output and a RECORDR for recording purposes.
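The core of the “convert to midi” step is scaling 8-bit color values down to the MIDI range.  The patch does this in Max itself; the following Python sketch is only an illustration of the mapping, with the function name and the simple divide-by-two scaling being my own assumptions:

```python
def rgb_to_midi(r, g, b):
    """Hypothetical sketch: scale 8-bit RGB channels (0-255)
    down to 7-bit MIDI values (0-127)."""
    scale = lambda v: min(127, max(0, v // 2))  # 0-255 -> 0-127
    return scale(r), scale(g), scale(b)

# As in the patch: red drives the trigger, green the shape, blue the scale.
trigger, shape, scale_val = rgb_to_midi(200, 64, 255)
print(trigger, shape, scale_val)  # 100 32 127
```

Any monotonic mapping would work here; the point is just that each color channel becomes an independent control value for the synthesis chain.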

After loading the video folders into the MOVIEFOLDRs and pressing the toggle button labeled “Playback,” the performance begins.  The videos can be changed using the 3 dials above the MOVIEFOLDRs.

Big Idea


My big idea for a Max patch, which subsequently led to the creation of the one above, consisted of a handheld sensor that would track the temperature of the person holding it.  The object would then play a musical note based on that reading.  The object would also be connected to another sensor that, when touched, would play both the user’s note and a different user’s note.  Participants would be put in a room with these sensors and instructed to discover other people’s notes and create chords.  This creates a sort of biofeedback that allows people to discover others in a way they couldn’t before.
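The temperature-to-note idea boils down to a linear mapping from a body-temperature range onto a note range.  This sketch is purely illustrative; the temperature bounds and two-octave note range are assumptions, not values from the original concept:

```python
def temp_to_note(temp_c, low=30.0, high=40.0, note_low=48, note_high=72):
    """Hypothetical sketch: linearly map a temperature in deg C
    onto a two-octave MIDI note range (C3-C5, assumed)."""
    t = min(max(temp_c, low), high)          # clamp to the expected body range
    frac = (t - low) / (high - low)          # 0.0 at low end, 1.0 at high end
    return note_low + round(frac * (note_high - note_low))

# Two users touching a shared sensor would sound both of their notes together.
print(temp_to_note(36.5))  # 64 -- a note near the middle of the range
```

Because each person’s temperature differs slightly, each participant gets a slightly different note, which is what makes the chord-hunting game possible.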

This big idea relates to my final product in that a sense of community is created and people are shown together.  Although the big idea has a more positive message whereas the final is negative, the final could be made closer to the former by setting up cameras in a room and then moving through different live streams in the same way you would with the videos.  Another way would be to have people walk through a room one by one, with a floor toggle at the beginning to start the recording and another at the end to stop it.  When the entire audience is in the room, they can then watch themselves and all the people around them in the video.  If this were pushed further, cameras could be placed around a city and those streams would then be manipulated.

Through the making of this patch, I learned to use the program relatively well despite having little prior experience with it.  In the future, I want to push myself even further in making interactive pieces and installation art with this program.

