First of all, this proposition is very open-ended, so we will need to discuss your ideas thoroughly before any work begins. I would be glad to brainstorm and throw ideas back and forth with you. Also, the more work you want to put into it and the better your ideas are, the more I would be willing to pay, so keep that in mind. You could take this idea very in-depth if you wanted to; on the other hand, a much more minimal implementation would be great too!
I would like to take the idea of an audio-responsive visualizer one step further by creating a new visualizer (or upgrading an existing one) with two main features:
- It needs to be designed to handle multiple separate tracks, rather than a single audio signal. These tracks would represent different instruments (drum tracks, guitar tracks, synth tracks, etc.), whether already-recorded stems or a user's live, real-time playing.
- The visuals generated by each of these tracks need to interact with one another in some way, based on separate secondary rule(s) that specifically take into account which individual tracks are interacting.
The idea here is that new emergent behaviors could be generated in an unpredictable fashion, creating something you could watch endlessly and never see the same thing happen twice. For example, if some aspect of one track's visuals overlapped with that of another track, new reverberations might shoot out, affecting both. There's really no limit to the creative things that could be done here.
It would probably be best to think of it first as a simple song visualizer, but one could imagine adding any number of arbitrary rules to the mix that gradually turn it into much more of a game. Each of the tracks could be seen as a different player, all competing with each other, and there could even be ways for them to "win" or "lose". For example, fluctuations in each track's amplitude could cause some territory of the screen to become filled with that track's color. If a track had all of its territory absorbed by the other tracks' colors, it might be effectively eliminated from the "game". There could even be numerical scores in the corners for each track, and certain arbitrary events in the visuals could trigger them to go up or down.
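To make the territory idea a bit more concrete, here is a very rough sketch in Python (purely illustrative; the real thing would live in a Max/Jitter patch, and every name here, such as Track, claim_territory, and the grid size, is just a placeholder) of how per-track loudness could claim cells of the screen and how a track could be knocked out once it owns nothing:

```python
import random

GRID_W, GRID_H = 64, 48      # placeholder "screen" resolution, in territory cells

class Track:
    def __init__(self, name, color):
        self.name = name
        self.color = color
        self.amplitude = 0.0  # updated every frame from this track's envelope
        self.score = 0

def seed_grid(n_tracks):
    """Start with every cell randomly assigned so each track owns some territory."""
    return [[random.randrange(n_tracks) for _ in range(GRID_W)] for _ in range(GRID_H)]

def claim_territory(grid, tracks, cells_per_unit_amp=20):
    """One possible rule: the louder a track is this frame, the more cells it grabs."""
    for i, track in enumerate(tracks):
        for _ in range(int(track.amplitude * cells_per_unit_amp)):
            x, y = random.randrange(GRID_W), random.randrange(GRID_H)
            if grid[y][x] != i:
                track.score += 1   # stealing a cell from someone else scores a point
            grid[y][x] = i

def eliminated(grid, tracks):
    """A track is out of the 'game' once it owns no cells at all."""
    owners = {cell for row in grid for cell in row}
    return [t for i, t in enumerate(tracks) if i not in owners]
```

Each video frame you would update every track's amplitude from its envelope, call claim_territory, render the grid in each track's color, and check eliminated to see whether anyone has been knocked out.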
Then consider that some of these tracks could be playing sounds generated by an instrument you hold in your hands and play, and it becomes a game played and controlled by music itself. This is essentially the novel idea I'm trying to pursue. With your keyboard or drum set as your controller, a whole new angle of intentionality comes into the picture, and the user could "play" the visuals just like a video game. All sorts of emergent strategies could come into existence, and any sort of game you could imagine might evolve. All the while, the soundtrack for this game is being actively written in real time by the players themselves.
I will also give you a template to use for creating your patch.
I'd like to give any developer creating a visualizer free rein over the design they bring to it. That being said, here are some more random points if you really want to go in-depth with the game idea:
- If Jitter can only display things and there's no way to analyze or compare, then that analysis could just be done on the audio before it is visualized. In that case, "making the visuals interact" could be as simple as using, for example, the >~ object to compare the different channels of audio and reacting based on the results (see the first sketch after this list).
- The audio being fed into the patch could potentially come from any source, but in my project the humans jamming in the left inlet will be getting recorded in real time, and those recordings will be the loops coming in the right inlet. A really interesting feedback loop could therefore start occurring, depending on the rules you create.
- There could be NPCs running around to kill or defend, terrain to traverse, levels to beat and environments to explore, ways to lose and ways to win: anything you might expect from an ordinary video game.
- It might be best to think of it as a rhythm game, since the audio is generally going to pulse with the beat of the music playing. You could therefore consider moments when the beat of a computer "team" matches up with that of the player "team" to trigger certain consequences, either beneficial or detrimental (see the second sketch after this list).
- It might help to come up with a central mechanic and think of everything in terms of that, so that every decision that needs to be made (layout, etc.) maximizes the number of interesting ways that mechanic can come into play. With the game woven as densely as possible around one mechanic, the players' control will feel nice and articulate and the gameplay responsive.
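On the >~ point above, here is a minimal sketch of the same idea outside of Max, assuming each track arrives as a block of samples per video frame. It follows each track's level with a simple RMS measurement and, like >~ comparing two signals, reports whenever the first track is louder than the second; the names (rms, compare_tracks, on_overtake) are just placeholders.

```python
import math

def rms(block):
    """Root-mean-square level of one block of samples (a rough envelope follower)."""
    return math.sqrt(sum(s * s for s in block) / len(block)) if block else 0.0

def compare_tracks(block_a, block_b, on_overtake):
    """Rough stand-in for what >~ does in MSP: output 1 while A is louder than B.

    Here, instead of outputting a signal, we call a callback so the visuals can react.
    """
    a_louder = rms(block_a) > rms(block_b)
    if a_louder:
        on_overtake("track A is currently louder than track B")
    return 1 if a_louder else 0

# usage: feed one block per video frame from each track
compare_tracks([0.4, -0.3, 0.5], [0.1, -0.05, 0.08], on_overtake=print)
```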
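And on the rhythm-game point: one simple way to find the moments where the computer "team" and the player "team" line up is to take onset times from each side (however they get detected upstream) and treat any pair that falls within a small time window as a coincidence. The 50 ms window and the function name are assumptions, not requirements.

```python
def coincident_onsets(player_onsets, computer_onsets, window=0.05):
    """Return pairs of onset times (in seconds) that land within `window` of each other.

    Each coincidence could then trigger a beneficial or detrimental game event.
    """
    hits = []
    for p in player_onsets:
        for c in computer_onsets:
            if abs(p - c) <= window:
                hits.append((p, c))
    return hits

# usage: onset lists would come from whatever beat/onset detection the patch uses
print(coincident_onsets([0.00, 0.52, 1.01, 1.49],
                        [0.00, 0.50, 1.00, 1.55]))
# -> [(0.0, 0.0), (0.52, 0.5), (1.01, 1.0)]
```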