Latency Measurement with jack_iodelay

Hello,

I’m trying to run a latency measurement with the jack_iodelay tool, but so far it only reports that the signal is below threshold. Here is the output of my JACK connections as well as jack_iodelay:

patch@patchbox:~ $ jack_lsp -c
system:capture_1
   system:playback_1
system:capture_2
   system:playback_2
system:playback_1
   system:capture_1
system:playback_2
   system:capture_2
system:midi_capture_1
   mod-midi-merger:in
system:midi_playback_1
   mod-midi-broadcaster:out
system:midi_capture_2
   mod-midi-merger:in
system:midi_playback_2
   mod-midi-broadcaster:out
system:midi_capture_3
   mod-midi-merger:in
system:midi_playback_3
   mod-midi-broadcaster:out
mod-midi-broadcaster:in
mod-midi-broadcaster:out
   system:midi_playback_1
   system:midi_playback_2
   system:midi_playback_3
   system:midi_playback_4
mod-midi-merger:in
   system:midi_capture_1
   system:midi_capture_2
   system:midi_capture_3
   system:midi_capture_4
mod-midi-merger:out
   mod-host:midi_in
mod-host:midi_in
   mod-midi-merger:out
system:midi_capture_4
   mod-midi-merger:in
system:midi_playback_4
   mod-midi-broadcaster:out
system:midi_playback_5
system:midi_capture_5
patch@patchbox:~ $ jack_iodelay
new capture latency: [0, 0]
new playback latency: [0, 0]
Signal below threshold...
Signal below threshold...
Signal below threshold...

The output is physically connected to the input with a jack cable. I tried tweaking the input and output gain knobs of the Pisound, but to no avail.

I’m running on:

Linux patchbox 4.19.71-rt24-v7l+ #1 SMP PREEMPT RT Wed Mar 11 17:15:58 EET 2020 armv7l

MODEP module v1.8.0

Any hints on what I am missing?

Did you make the JACK connections to the jack_iodelay client? You can do so using jack_connect or using software like Patchage.

Good point. The following was missing:

patch@patchbox:~ $ jack_connect jack_delay:out system:playback_1
patch@patchbox:~ $ jack_connect jack_delay:in system:capture_1

Now the measurement works. I was a bit surprised by the result, though. I measured the 64 samples / 48 kHz / 2 periods configuration and got:

235.286 frames      4.902 ms total roundtrip latency
     extra loopback latency: 43 frames
     use 21 for the backend arguments -I and -O ??

I was expecting roughly 2 * 64 / 48000 + ~1 ms (CODEC) ≈ 3.7 ms. jackd is running in sync mode, so I assumed no extra buffering period:

patch@patchbox:~ $ ps -aux | grep jack
(...) /usr/bin/jackd -t 2000 -R -P 95 -d alsa -d hw:pisound -r 48000 -p 64 -n 2 -X seq -s -S

How should I interpret this: is the Pisound CODEC adding 3.3 ms of latency? Or is jackd somehow introducing additional buffering, given that it reports 43 frames of extra latency (235.286 - 43 ≈ 192 frames expected?)?
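Just to sanity-check the numbers, here is the arithmetic spelled out (a back-of-the-envelope sketch; the way I split measured latency into "software buffering" and "extra" is my own interpretation of the jack_iodelay output, not something the tool guarantees):

```python
RATE = 48000    # jackd -r
PERIOD = 64     # jackd -p

def frames_to_ms(frames, rate=RATE):
    """Convert a frame count to milliseconds at the given sample rate."""
    return frames * 1000.0 / rate

measured = 235.286   # jack_iodelay: total roundtrip latency, in frames
extra = 43           # jack_iodelay: "extra loopback latency", in frames

# Subtracting the reported extra latency from the total should leave
# only the software-side buffering.
software = measured - extra
print(f"software buffering: {software:.1f} frames "
      f"= {software / PERIOD:.1f} periods "
      f"= {frames_to_ms(software):.2f} ms")
print(f"extra loopback:     {frames_to_ms(extra):.2f} ms")
```

This puts the software part at about 3 periods (192 frames, ~4 ms) rather than the 2 periods (128 frames) I expected, while the 43-frame extra is under 1 ms.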

By the way, I manually started jackd in async mode and the latency was roughly the same (0.02 ms slower):

patch@patchbox:~ $ jackd -t 2000 -R -P 95 -d alsa -d hw:pisound -r 48000 -p 64 -n 2 -X seq -s
   236.206 frames      4.921 ms total roundtrip latency
        extra loopback latency: 44 frames
        use 22 for the backend arguments -I and -O

In sync mode I got:

   172.273 frames      3.589 ms total roundtrip latency
        extra loopback latency: 44 frames
        use 22 for the backend arguments -I and -O

It looks like the position of -S is important: before the backend selection it enables jackd’s sync mode, but once the parameters start describing the ALSA backend config, -S is instead interpreted as the option to use 16-bit rather than 32-bit samples.

exec /usr/bin/jackd -S -t 2000 -R -P 95 -d alsa -d hw:pisound -r 48000 -p 64 -n 2 -X seq -s -S

As I understand it, the reported ‘extra loopback latency’ is the time spent in the codec; this value is consistent with the expected time (~1 ms, covering both input and output processing). The rest of the time is spent in software buffers.
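For the sync-mode run this interpretation adds up (again just arithmetic on the numbers reported above; the split into buffer and codec time is an assumption):

```python
RATE = 48000     # jackd -r
PERIOD = 64      # jackd -p
NPERIODS = 2     # jackd -n

measured = 172.273               # sync-mode jack_iodelay result, in frames
buffers = PERIOD * NPERIODS      # expected software buffering: 128 frames
codec = measured - buffers       # what remains should be codec/loopback time
print(f"codec/loopback: {codec:.1f} frames = {codec * 1000.0 / RATE:.2f} ms")
```

That leaves roughly 44 frames (~0.92 ms) for the codec, matching both the reported ‘extra loopback latency’ and the ~1 ms estimate.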

By the way, I wasn’t aware of the sync flag before. It does report better latency. Is it a good idea to have it on by default, or are there some possible caveats?

Nice catch :slight_smile: By the way, you probably kept the second -S at the end by mistake.

When running in sync mode, the normal ping-pong buffering is used. In async mode, there is an extra period of buffering to avoid glitches from plugins that didn’t finish their processing within the real-time window; basically, it tries to mask those glitches under heavy load. To quote falkTx from here:

PS: For those wondering what “sync/async mode” is…
Basically the audio engine we use (JACK2) uses an async audio model by default, where the audio renders to a non-active buffer and plugins that were able to finish rendering on time get their buffer copied into the real/active one. This is to prevent misbehaving plugins from causing audio glitches, the audio from such plugins will just not be used. So on a parallel chain of plugins, audio still keeps running except for the chain that includes the bad plugin.
The latency added by this async mode is the same as one audio period.
When using sync mode, the plugins render directly into the active audio buffer. This has lower latency, but makes xruns much more noticeable (one bad plugin can ruin the entire audio graph, even if disconnected)

I think it would be a nice idea to add this option to MODEP’s jack configuration and leave it on by default. Xruns should not happen anyway under normal circumstances.