Help getting the output audio buffer from MODEP, sent via websocket

Hello -

I’m new here and still in awe at discovering this community and the potential of what Patchbox + MODEP make possible.

A project I wanted to try is broadcasting the output audio buffer from MODEP to a local (or remote) webpage via websockets. Then in a browser I can do fun stuff with the waveform using three.js, glsl, etc. I’m new to Linux, though, and struggling to get this to work. Not sure of the best place to start. Here’s what I’ve tried so far:

  1. Run a local Node.js server on the Pi. Capture the output audio stream from MODEP and send it out as a float array via websockets. I’ve confirmed this will all work fine for my use case, except I can’t find a Node.js library that will let me capture the output from MODEP. Several come up when I google this question (like this one, for example), but none seem to work (or I don’t understand how to make them work).

  2. Reverse engineer the MOD UI server and have it send the output audio buffer out via a websocket connection. It looks like it already uses websockets to control mod-host effects, but I’m unclear whether the MOD UI even has access to the audio buffer.
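For option 1, the float-array conversion step I had in mind can be sketched roughly like this (a sketch only — the capture source, e.g. piping raw s16le PCM from a command-line tool into the Node process, and the websocket server setup are my assumptions and not shown):

```javascript
// Sketch: convert a raw signed 16-bit little-endian PCM buffer into a
// Float32Array normalized to [-1, 1), ready to send over a websocket.
// How the PCM bytes arrive (arecord, jack_capture, etc.) is an assumption.
function pcm16ToFloat32(buf) {
  const out = new Float32Array(buf.length / 2);
  for (let i = 0; i < out.length; i++) {
    out[i] = buf.readInt16LE(i * 2) / 32768; // scale to [-1, 1)
  }
  return out;
}

// Example: two samples, 0 and -32768 (full-scale negative)
const samples = pcm16ToFloat32(Buffer.from([0x00, 0x00, 0x00, 0x80]));
console.log(samples[0], samples[1]); // 0 -1
```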

Is there a better, 3rd option? Again my goal is just to get the output audio waveform sent via a websocket connection from the Pi. Any hints much appreciated! Thx!

I actually found another way to get the audio buffer, using the Web Audio API in a browser. I can get the microphone input with something like this.

By default this is just the input audio, not the processed audio that MODEP outputs. However, in patchage, if I plug an effect’s output into the PulseAudio JACK Source, then I can see the final waveform.
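Roughly, the browser side looks like this (a sketch under my assumptions — node sizes and the normalization helper are my own; the getUserMedia part is guarded so it only runs in a browser):

```javascript
// Sketch of the browser-side capture: grab the (PulseAudio-routed) input
// with getUserMedia, feed it through an AnalyserNode, and read the
// time-domain waveform every animation frame.
function bytesToWaveform(bytes) {
  // AnalyserNode.getByteTimeDomainData fills values in [0, 255], with
  // silence at 128; normalize to [-1, 1] before shipping over a websocket.
  return Array.from(bytes, (b) => b / 128 - 1);
}

// Browser-only section (skipped outside a browser environment).
if (typeof navigator !== 'undefined' && navigator.mediaDevices) {
  navigator.mediaDevices.getUserMedia({ audio: true }).then((stream) => {
    const ctx = new AudioContext();
    const analyser = ctx.createAnalyser();
    analyser.fftSize = 2048;
    ctx.createMediaStreamSource(stream).connect(analyser);

    const bytes = new Uint8Array(analyser.fftSize);
    (function poll() {
      analyser.getByteTimeDomainData(bytes);
      const waveform = bytesToWaveform(bytes); // send this via websocket
      requestAnimationFrame(poll);
    })();
  });
}
```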

This is almost perfect for me, because I can still send the waveform to node.js via websockets and along to other remote clients. The remaining problem is that when I change my effect in MODEP, the patches are rearranged (which I can see in patchage) and either MODEP crashes or I’m no longer monitoring the correct signal.

Not sure if this is a smart path to pursue or not. Cool that it works at all I suppose!

Hi dskill. I’m a newbie here.

Have you tried just playing an audio file through ALSA or PulseAudio instead?
This would help determine whether those libraries are reading from the PulseAudio output rather than from JACK.

If that works, then maybe connecting the JACK sink to the PulseAudio output sink can do the job.

I can take a look at my RPi tomorrow and see what I can find in that regard.

I haven’t tried playing audio directly to see what those libraries are reading from. I’ll give that a shot.

From patchage, if I disconnect the system capture_1 and capture_2 ports from the PulseAudio JACK Source front-left and front-right ports, then the waveform in my browser .js code goes silent. I’m not familiar enough to know if that answers the question of what those libraries are reading from.

> If that works, then maybe connecting the JACK sink to the pulseaudio output sink can do the job.
I don’t see a JACK sink in patchage, but maybe there’s another way to make that connection?

I’ll keep poking. Thx for the suggestions!

Here’s a screenshot:

Oh wow - shortly after posting this I accidentally corrupted my OS (I disabled the MODEP module and then re-enabled it, and something went sideways in that process).

But… happy accident: I think I was on an older version, because after starting over with a fresh image I now have an incredibly convenient node called mod-monitor, which does exactly what I’m after.

So now my question boils down to this:
Is there some way I can configure MODEP to automatically connect the mod-monitor node to the PulseAudio JACK Source node on startup (as in the screenshot below)?


You may try adding a systemd service that uses jack_connect to make the necessary connection, and have it start/be enabled once the modep-mod-host service is running:
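Something along these lines might work (a sketch only — the unit name and the JACK port names are placeholders; check the output of jack_lsp for the exact port names on your system):

```ini
# /etc/systemd/system/modep-autoconnect.service  (name is a placeholder)
[Unit]
Description=Connect mod-monitor to the PulseAudio JACK Source
# Start after (and together with) the MODEP host service
Requires=modep-mod-host.service
After=modep-mod-host.service

[Service]
Type=oneshot
# Required so the correct JACK server is found
EnvironmentFile=/etc/environment
# Give JACK a moment to expose the ports before connecting.
# Port names below are guesses - verify with `jack_lsp`.
ExecStartPre=/bin/sleep 5
ExecStart=/usr/bin/jack_connect "mod-monitor:out_1" "PulseAudio JACK Source:front-left"
ExecStart=/usr/bin/jack_connect "mod-monitor:out_2" "PulseAudio JACK Source:front-right"

[Install]
WantedBy=multi-user.target
```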


There are probably more properties required to make this work smoothly; see the systemd.unit(5) Linux manual page and search the internet for information on how to get a service to run every time another service gets started.

One key thing in MODEP’s case is that EnvironmentFile=/etc/environment must be specified within the [Service] section, so the correct JACK server is found.

Another easily overlooked thing is that you have to use absolute paths as much as possible in your .service file - for example, use /usr/bin/jack_connect instead of jack_connect to run it.

Oops! I read this super helpful response but failed to follow up. Thank you! I started migrating my work to a SuperCollider module instead of relying on MODEP. That made it a little easier to send the waveform directly to the browser via OSC instead of relying on the Web Audio microphone input APIs. Depending on how far I get, I may return to this initial implementation using MODEP. Much appreciated!
