🀘

Difficult to see, but are the visuals generated from the input of your synths?

Besides nostr:npub1yfg0d955c2jrj2080ew7pa4xrtj7x7s7umt28wh0zurwmxgpyj9shwv6vg and nostr:npub1h8gzew8am6cezuq7cpjgudldra40hgnruqrqlsrqnxnzs5wjtczqztps02, you could also look into hosting a livestream on nostr:npub1eaz6dwsnvwkha5sn5puwwyxjgy26uusundrm684lg3vw4ma5c2jsqarcgz!? 😎


Discussion

The MIDI sequencer sends MIDI to the synthesizers and samplers; that MIDI is forwarded to a sound card connected to a Raspberry Pi. The Raspberry Pi then turns the MIDI signals into visuals using Processing, a Java-like programming language. Unfortunately it doesn't work while I'm streaming, because the sound card is then connected to the laptop 😅
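For anyone curious what the MIDI-to-visuals part might look like, here's a minimal Processing sketch of the idea, assuming The MidiBus library; the device index and the note-to-shape mapping are placeholders, not the author's actual code:

```java
import themidibus.*;

MidiBus midi;

void setup() {
  size(800, 600);
  background(0);
  MidiBus.list();                  // print available MIDI devices to the console
  midi = new MidiBus(this, 0, -1); // listen on input device 0, no output (assumed index)
}

void draw() {
  // fade previous frames slightly so notes leave trails
  noStroke();
  fill(0, 20);
  rect(0, 0, width, height);
}

// Called by MidiBus on every incoming Note On message
void noteOn(int channel, int pitch, int velocity) {
  float x = map(pitch, 0, 127, 0, width);     // pitch picks the horizontal position
  float d = map(velocity, 0, 127, 10, 120);   // velocity picks the size
  fill(map(channel, 0, 15, 0, 255), 200, 255);
  ellipse(x, height / 2, d, d);
}
```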

I know Processing! I made an interactive art installation with it more than 10 years ago!

Got any references / pictures / code for that? I'd be interested to see!

https://attack.sebastix.nl/

I have a video somewhere as well, I'll send it later when I've found it

Reminds me of Dexter, the series πŸ˜‚