I used to code in Pure Data to create music-reactive visuals. Then last night I started to learn about video synths:
While researching whether I could make one myself, I searched for "raspberry pi video synth" and came across this:
Well, I can only put two links in my post, so check out the critterandguitari site for the EYESY device.
So I was wondering if the code was available somewhere and found this:
I think the PiSound could really be a good starting point for a video synth: you could use the MIDI in to control the synth and the audio in (two mono channels) to react to sound, which would make it quite simple to put together.
Nice, this is great. My research led me to openFrameworks for this type of thing, and I'm glad to see the body of work you've done already.
Do any of these projects use the PiSound? I'm mostly wondering how to get MIDI input into a project. I see there is a library for OF that handles MIDI in, but I'm not sure how to get it from the PiSound into OF.
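My understanding so far is that the PiSound should show up as an ordinary ALSA MIDI device, so I guess I can sanity-check it from a terminal first and then point the OF MIDI addon (ofxMidi, I believe) at the same port. Something like this, though the exact port/client names are a guess on my part:

```
# List raw ALSA MIDI hardware ports - the PiSound should show up here.
amidi -l

# List ALSA sequencer clients/ports (what most MIDI libraries enumerate).
aconnect -l

# Dump incoming MIDI from the PiSound port to the console to confirm that
# data is actually arriving before wiring it into OF.
aseqdump -p "pisound"
```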
Apart from that, awesome work you're doing; this is pretty sleek. It feels like the PiSound would be the perfect platform for a video synthesizer!
Ah, Phosphorm looks amazing. Any idea where I would start to get this working on a Raspberry Pi 4 + PiSound? I'm fairly new to the RPi, so I have no real idea where to start.
Phosphorm is not my synth; you can ask Andrei at the link I posted, though, and I'm sure he would get you squared away. In the link I posted I talk specifically about how to start using one of these synths step by step from scratch, so that should really get you going. The RPi 4 is not supported currently, but there is a lot you can do on the 3B/3B+.
I don't think you are going to be able to run something like the PiSound and a video synth on the same RPi. I'm not totally sure that is what you are asking about, but I just want to make that clear from the get-go.
In my latest distribution of scrawl I've got a couple of my synths and several others loaded up and ready to run by double-clicking icons on the desktop.
It is really as simple as: download, burn to an SD card, put it in, run.
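For anyone who hasn't flashed an image before, the burn step is roughly this on Linux (the image name and device path below are placeholders; double-check the device with lsblk so you don't overwrite the wrong disk):

```
# Find the SD card's device name (e.g. /dev/sdX or /dev/mmcblk0).
lsblk

# Write the downloaded image to the card - adjust both paths for your setup.
sudo dd if=scrawl.img of=/dev/sdX bs=4M status=progress conv=fsync
```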
Why is that? Is it because the PiSound would take up too many resources? I can build a really minimal Linux setup on the RPi if needed (no X or desktop, for example), which is why I was looking into openFrameworks. I started the process and set up a dev environment with the RPi connected to my computer (so that I can compile the openFrameworks project on my desktop and send the code to the RPi), but I have to say that I am lagging behind as far as coding goes…
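To give an idea of what I mean by sending the code over, the sync-and-build part of my loop currently looks something like this (hostname, app name and paths are just placeholders for my setup, and for now the actual build still happens on the Pi while I figure out cross-compiling):

```
# Sync the openFrameworks project sources from the desktop to the Pi,
# skipping the build artefacts.
rsync -av --exclude=obj ~/of/apps/myApps/videoSynth/ \
    pi@raspberrypi.local:/home/pi/of/apps/myApps/videoSynth/

# Build and launch it on the Pi over ssh.
ssh pi@raspberrypi.local \
    'cd /home/pi/of/apps/myApps/videoSynth && make -j4 && ./bin/videoSynth'
```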
I am going through Andrei's resource links at the moment, which are quite interesting:
I would use two Pis for this. Maybe you can figure out how to make both work at the same time, but it has never been something I've been looking to do, so I can't really help. I would imagine you will not have enough computing power to run both things at once, but that is just a guess.
There is a ton of information out there about RPis, though, so maybe you'll make some headway!
I haven't been programming on the RPi for a little while now, as I've been focused on the physical side of video synthesis, but I'm sure I'll come back to it, as I still have at least two unfinished free video synth projects to release.
Ah right, I see. I've just ordered an RPi 3 + HiFiBerry DAC+ ADC HAT… hopefully that will work. I had just hoped that, since I already had the RPi 4 + PiSound, I could get it working with that.
I'm playing around with an RPi 3, using Pure Data and GEM to do some video input processing and blob-tracking to control sound from Pd in real time. So far, I'm surprised it is working OK. Once I get what I need working, I'll try to add the Pisound and move it over to the Pi 4 to get a bit more grunt. I imagine I'll definitely be adding fans in that scenario. The patch is not generating graphics, except to show what's going on with the tracking while tweaking things.
It will hopefully be a little box with a camera and an audio out to mount up high and make sound that responds to people walking below.
Ultimately, it will run headless and be remotely tweakable. Frame rates often plummet to slide-show levels but I think I can optimise things a lot more.
I had a few setbacks but I think I’m getting it to work.
It seems the key thing was not to use the new built-in Deken externals installer to install GEM. When I did that, some libraries were broken and a few other things weren't right.
Starting from a standard Patchbox installation set up to boot into the desktop, I opened a terminal and ran:
sudo apt-get install gem
sudo apt-get install iemmatrix
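As a quick sanity check afterwards (just my habit, not something from the Patchbox docs), you can start Pd headless with Gem loaded and watch the console for load errors, then Ctrl+C to quit:

```
pd -nogui -stderr -lib Gem
```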
There is so much conflicting info out there, I hope I’m not adding to it.
I have found it capable of basic movement and blob-tracking so far. I am running the audio and the graphics in separate processes. For audio, I'm using the default Pd as installed by Patchbox, with realtime priority etc. For video, I'm running a separate instance with -nrt and -nosound in order to relinquish realtime priority and keep DSP switched off. This stops the sound getting glitchy when the graphics get busy.
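In case the setup is useful to anyone else, the launch commands look roughly like this (the patch names are placeholders for my own patches):

```
# Audio instance: keeps real-time priority, as set up by Patchbox.
pd -rt audio-patch.pd &

# Video/GEM instance: gives up real-time priority and switches audio/DSP off,
# so heavy frames can't glitch the audio instance.
pd -nrt -nosound -lib Gem video-patch.pd &
```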
It is running unattended, but the graphics instance seems to crash often in the middle of the night.
Because it's set up on site, and I can only VNC in to mess with it via my phone's hotspot, I have not found out why this happens yet. It may be that I need to get a Pi with more RAM.
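One thing I plan to try is wrapping the graphics instance in a small watchdog so it restarts itself after a crash and leaves a log behind; nothing Patchbox-specific, just a generic shell loop with placeholder names and paths:

```
#!/bin/sh
# Restart the video instance whenever it exits and keep its output,
# so the overnight crashes leave some trace to look at the next day.
while true; do
    pd -nrt -nosound -lib Gem video-patch.pd >> /home/patch/video-synth.log 2>&1
    echo "pd exited with status $? at $(date)" >> /home/patch/video-synth.log
    sleep 5
done
```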
Here is a recording of it in action, responding to people’s movement underneath the big sculpture. The tracking camera is not far from where this video was shot.
I tried it on an extra Pi 4 but can't get past a black screen. I'm also messing around with a really weird long screen, though, so I may wait and try it on something with standard resolutions. I'll try it on my PiSound once I've had a chance to back up the system, as I've been playing around with MODEP quite a bit lately.
I don't think you'll see anything on the GUI, as it will be used for the video synth; you should only be able to connect to it through SSH or the web interface. My Raspberry Pi is fried and I used my last three on another project, which is now gone. I had to order a new Raspberry Pi, so I can't test it for a while.
I've got the web interface up, but when I try to start it, X errors out. I tried installing on the PiSound, but the install script seems to expect the pi user rather than patch, which makes me think I might have better luck with the Pi 4.
Did it produce any errors? If not, it may still work: /home/pi is symlinked to /home/patch to handle some of the cases where the 'pi' name is assumed.
The systemd .service files should then be edited to use patch for both.
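For example (the service name here is hypothetical; use whatever the install script created):

```
# Open the unit for editing and change the account it runs under,
# i.e. in the [Service] section set:
#   User=patch
#   Group=patch
sudo systemctl edit --full video-synth.service

# Reload and restart so the change takes effect.
sudo systemctl daemon-reload
sudo systemctl restart video-synth.service
```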
The way I've solved it for the MODEP .deb packages was to create a dedicated modep user for running the relevant services, so they can be installed on both Raspbian and Patchbox. I'm not sure whether the same solution would work in your case too. Just in case, here's the modep-common package on which modep-mod-host and modep-mod-ui depend, so that the modep user exists by the time they get installed:
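To illustrate the idea only (this is a rough sketch, not the actual modep-common maintainer script), the postinst of such a package can create the user before anything else needs it:

```
#!/bin/sh
# Hypothetical postinst sketch: make sure a dedicated system user exists
# before the services that run as it get installed.
set -e
if ! id modep >/dev/null 2>&1; then
    adduser --system --group --no-create-home modep
fi
```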