Introduction: An Attempt on Live Visual Music

Hello and welcome to my first attempt at making Live Visual Music! My name is Wesley Pena, and I am an Interactive Multimedia major at The College of New Jersey. This instructable is part of a final for my Interactive Music Programming class, where we work at the intersection of technology and music to create something hopefully inventive and fun!

This project combines Max/MSP/Jitter, a visual programming language designed for music; Processing, an open-source language used primarily for making visual designs; and any MIDI keyboard to create some Live Visual Music. In this instructable, I will quickly go over the step-by-step process of how I went about connecting all of the software together, and cover the many possibilities that come with it.



Supplies:

The oscP5 library for Processing

Any capable MIDI instrument

Step 1: Open Sound Control and Communicating With Other Software

One of the beautiful things about Max 8 is that it can communicate with MIDI devices fairly easily, and while there are libraries that let Processing connect to MIDI as well, they don't compare to what Max can do musically with all that data. So you want to use both pieces of software. How do you go about getting them to talk to each other?

To accomplish this, we use a protocol called Open Sound Control (OSC). It lets us send MIDI data, with an address attached, out over the local network, where any other piece of software can listen for it and read it back. With this, we have effectively connected our MIDI keyboard to both Max and Processing!
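To make the protocol less abstract, here is a minimal sketch in plain Python of what an OSC packet actually looks like on the wire: an address pattern, a type-tag string, and big-endian arguments, each padded to 4-byte boundaries. The address "/midi/noteon" and the port number are illustrative assumptions, not names from my patch; in practice Max's [udpsend] and Processing's oscP5 handle all of this encoding for you.

```python
import struct

def osc_pad(b: bytes) -> bytes:
    """Null-terminate and pad to a multiple of 4 bytes, per the OSC spec."""
    b += b"\x00"
    while len(b) % 4 != 0:
        b += b"\x00"
    return b

def osc_message(address: str, *ints: int) -> bytes:
    """Build a minimal OSC message carrying int32 arguments."""
    msg = osc_pad(address.encode("ascii"))
    msg += osc_pad(("," + "i" * len(ints)).encode("ascii"))  # type tags, e.g. ",ii"
    for value in ints:
        msg += struct.pack(">i", value)  # big-endian int32
    return msg

# A note-on for middle C (pitch 60) at velocity 100, roughly what
# Max's [udpsend] would emit for Processing's oscP5 to pick up:
packet = osc_message("/midi/noteon", 60, 100)
# To actually send it: socket.sendto(packet, ("127.0.0.1", 12000))
```

The key idea is just that every message is self-describing: the listener matches on the address pattern, so Max and Processing only need to agree on names like "/midi/noteon" and a port number.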

For a more in-depth guide on how to route the software together, this article by Corey Walo goes over how it's done.

Step 2: Adding Functionality in Max

The cool thing about having separate, specialized pieces of software working together is all the extra functionality you can add. You can create generators, arpeggiators, or custom functions like doubling notes or playing chords with the press of one key. Any function imaginable in Max can be sent over the OSC protocol into Processing for some more fun visuals!
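As one example of the "chords with one key" idea, the note math is tiny: expand each incoming pitch into a triad before sending the notes onward over OSC. This is a hypothetical sketch in plain Python of logic you would normally build as a Max patch; the function name is mine, not from my code.

```python
def major_triad(root: int) -> list[int]:
    """Expand one MIDI note into root, major third, and perfect fifth,
    dropping any note that falls outside the MIDI 0-127 range."""
    return [n for n in (root, root + 4, root + 7) if 0 <= n <= 127]

print(major_triad(60))   # middle C → [60, 64, 67]
print(major_triad(125))  # near the top of the range → [125]
```

In a Max patch the same thing is a few [+ 4] and [+ 7] objects fanned out from the incoming pitch.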

In this project, I added the functionality of an arpeggiator.

Here is a link to my code!
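For readers curious what an arpeggiator boils down to, here is a hedged sketch of the core logic in plain Python (my actual version lives in the Max patch linked above, and works differently in its details): sort the held notes, optionally fold them into an up-down pattern, and cycle through them on each clock tick.

```python
import itertools

def arpeggiate(held_notes, steps, mode="up"):
    """Return `steps` MIDI pitches cycling through the held chord.
    mode 'up' ascends repeatedly; 'updown' bounces without
    repeating the top and bottom notes at the turnarounds."""
    if not held_notes:
        return []
    notes = sorted(held_notes)
    if mode == "updown" and len(notes) > 2:
        notes = notes + notes[-2:0:-1]  # e.g. [60, 64, 67] -> [60, 64, 67, 64]
    cycle = itertools.cycle(notes)
    return [next(cycle) for _ in range(steps)]

print(arpeggiate([60, 64, 67], 8, mode="up"))
# → [60, 64, 67, 60, 64, 67, 60, 64]
```

In the Max version, the "steps" loop is replaced by a [metro] object driving the cycle in real time, and each emitted pitch is sent straight out over OSC.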

Step 3: Coding Visuals in Processing

This is what I affectionately refer to as "The Hard Part". You have the data coming in; now all that's left is the visuals. Handling data that arrives in real time can be disorienting in object-oriented programming, but with a little practice, the visuals you can create with Processing can be truly wonderful.

For my sketch, I intended for a raindrop to fall for every note played on the MIDI keyboard. It may not work exactly as I describe, but that is through no fault of the software.

Here's a zip file with the code!
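To give a feel for the raindrop idea without the drawing code, here is a sketch of the particle logic in plain Python. The names (Drop, spawn_drop) and the gravity constant are illustrative assumptions, not taken from my Processing sketch: each incoming note's pitch is mapped onto the horizontal axis, and gravity accelerates the drop downward each frame.

```python
class Drop:
    GRAVITY = 0.4  # pixels per frame squared; tune to taste

    def __init__(self, x: float):
        self.x, self.y, self.vy = x, 0.0, 0.0

    def update(self):
        """Advance one frame: accelerate, then move."""
        self.vy += self.GRAVITY
        self.y += self.vy

def spawn_drop(note: int, width: float = 800.0) -> Drop:
    """Map MIDI pitch (0-127) onto the sketch's horizontal axis."""
    return Drop(x=note / 127.0 * width)

# An OSC note message arrives; three frames later the drop is falling:
drop = spawn_drop(60)
for _ in range(3):
    drop.update()
```

In Processing, spawn_drop would be called from oscP5's event handler, update would run inside draw(), and an ellipse() at (x, y) would do the rest.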

Step 4: A Taste of What's Possible

Here is what I ended up producing through all of this experimenting. With a little more practice, I'm sure this could have been a much better sketch, but that is not the point of this instructable. By making this, I wanted to show that despite my lack of advanced knowledge in programming visuals, it was still relatively easy to connect the software together. There doesn't have to be a barrier between coding visuals and coding music; it is possible to play with both. I hope that after reading this, you play with it too, and make something better!

Thank you for taking the time to read my instructable, and have some fun!