Jack Bridge Software

By Thorgal - May 3rd 2010

Contents

  • 1 Introduction
  • 2 The ALSA Loopback 'Sound card'
  • 3 Building an asoundrc file
  • 4 The Jack Bridge
  • 5 Alternative Setup: hardware and software based solution
  • 6 Measuring the latency introduced by the Loopback device

Introduction

Some people may use PCs where the Jack Audio Connection Kit is running all the time. As one of these users myself, my DAW PC uses a very light WM (fluxbox) without any of the audio layers provided by the more feature-rich WMs (like KDE or Gnome). I have jack started at login and, hopefully, it never goes down until the next PC shutdown. Since I do not want any other audio layer such as Pulseaudio, etc., in between Jack and ALSA in my case (it could well be FFADO for firewire devices), this has the slight disadvantage of making non-jackified applications unusable.

So how can one provide a permanent bridge between non-jackified applications and Jack? Well, there are different ways. One can be purely hardware: enable another soundcard (e.g. the onboard sound chip) and physically link it to your DAW soundcard, if like me you have dedicated audio h/w for DAW operations. Let this extra soundcard be the default (ALSA index 0) so that apps like flashplayer, skype, etc., use it by default. However, this may limit the number of h/w IOs of your DAW soundcard for your pro work. I do not like this physical link because my RME Multiface II has only 8 analog mono INs, while the onboard sound chip (Intel HDA) has no digital output that I could link to the Multiface digital input. I would be forced to patch two INs of the Multiface to the stereo output of the onboard chip. That is too expensive to consider in terms of physical IOs.

The alternative is a software solution, or a mix of h/w and s/w (as used in my final setup). So, I was looking for a solution in the form of permanent Jack clients, playback and capture ideally, or at least playback, since I can use the capture device of a second soundcard like the onboard chip or anything else (I will clarify this further down but it is not necessarily an average setup). In terms of software solutions, the ALSA jack PCM plugin is in my opinion not ideal because the Jack client will disappear as soon as the application stops outputting audio. Furthermore, this PCM plugin has not been updated (except lately by Torben Hohn, but the patch is not widespread) and I found the plugin quite buggy / unstable in many situations.

It was not until recently, as I was fiddling with the ALSA Loopback device, that I saw a way to achieve what I needed.

The ALSA Loopback 'Sound card'

The ALSA Loopback sound card is a virtual soundcard that is created once the ALSA kernel module snd-aloop is loaded. This virtual soundcard, as its name indicates, sends the output signal of applications using it back to itself, so one has a chance to e.g. record this signal from the same device. Simply imagine that you have a physical link between one OUT and one IN of the same device.

By default, the Loopback sound card consists of 2 devices, each composed of 8 subdevices. Once the kernel module snd-aloop is loaded, you can verify that the sound card has been created:
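For example:

    cat /proc/asound/cards    # the Loopback card should show up here
    aplay -l                  # lists its playback devices and subdevices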

Note that you can control the number of subdevices with the module option pcm_substreams (8 by default). You can always set it to 2 only, if you wish, at loading time. As an example, here is my ALSA module config file (/etc/modprobe.d/sound.conf on my debian-based DAW):
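Something along these lines (the indexes below are only an example; adapt them to your own cards):

    # /etc/modprobe.d/sound.conf (illustrative)
    options snd-aloop     index=0 pcm_substreams=2   # loopback as index 0
    options snd-hdsp      index=1                    # RME Multiface (DAW card)
    options snd-hda-intel index=2                    # onboard chip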

As you can see, I do fix indexes even though ALSA and Jack can work with names only. It is motivated by the special position of index 0, the ALSA default device that flashplayer will try to use.

Compiling snd-aloop if needed

Update: it may not be needed any longer as of kernel 2.6.38 ...

It may well be that the ALSA Loopback kernel module was not included in your distribution's kernel package (this is the case in e.g. Debian, as far as I know). This is no bother as we can easily compile it. Note that there is no way around it, since the loopback ALSA module is not part of the kernel baseline in general. So unless your kernel packager has done the following work, you will have to do it yourself ...

Warning: I tried alsa-driver 1.0.21 against 2.6.33.5-rt22 and while it compiled fine, it would not load at all, even when forced. So don't waste your time with this version combo.

Make sure you really don't have it installed already. Better to check first :)
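    modinfo snd-aloop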

If modinfo reports nada, time to check that you have installed the kernel headers corresponding to your presently running kernel. I'll leave this to you as this is very distro dependent. In Debian-based distros, the package is called something like linux-headers-xxx and must match the installed kernel (package linux-image-xxx).

Time to make a backup of the installed kernel modules. Example:
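Something like this (the path is just an example):

    sudo cp -a /lib/modules/$(uname -r)/kernel/sound \
               /lib/modules/$(uname -r)/kernel/sound.orig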

Prerequisite: you of course need a compiler and other tools. In Debian-based distros, you can check that you have a package called build-essential installed:
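    dpkg -s build-essential | grep Status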

If not, just get it:
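    sudo apt-get install build-essential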

Now grab the alsa-driver source code (same version as your installed ALSA, in my case 1.0.23, which I will use in my description) from the ALSA website, uncompress and untar it, and cd to the alsa-driver top dir. Here is a command summary ...
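Something along these lines (check the ALSA website for the exact download URL):

    wget ftp://ftp.alsa-project.org/pub/driver/alsa-driver-1.0.23.tar.bz2
    tar xjf alsa-driver-1.0.23.tar.bz2
    cd alsa-driver-1.0.23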

Now you have to configure the source package for compilation. To help you, look at what ALSA modules are currently loaded:
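    lsmod | grep ^snd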

And check what card they correspond to by typing
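One way (the configure script may present this differently in your version):

    ./configure --help | less    # look for the --with-cards option and its card list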

You will see a big list of possible cards. Pick the ones you are interested in. As an example, this is how I configured the alsa-driver source on my DAW system:
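Something like this, for an RME hdsp-based DAW (the exact card list is a guess):

    ./configure --with-cards=hdsp,aloop,hda-intel --with-sequencer=yes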

and on my laptop:
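Something like:

    # illustrative: onboard HDA plus the loopback
    ./configure --with-cards=hda-intel,aloop --with-sequencer=yes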

So, feel free to configure it the way you want. Once you have configured the ALSA driver source, you just go through the usual sequence:
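    make
    sudo make install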

It will normally install all the compiled modules into the correct location of your kernel installation. Now check that the kernel knows about the loopback module:
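    sudo depmod -a
    modinfo snd-aloop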

Alrighty, time to load it. But before that, shut down all audio apps (including firefox). Once done, do this:
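    sudo modprobe snd-aloop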

Now, see if it works:
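    cat /proc/asound/cards    # the Loopback card should now be listed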

If all was cool and dandy, just add snd-aloop to /etc/modules. (If you wish, you can give the loopback soundcard another name than 'Loopback' via a modprobe option, but I kept the default throughout the entire HOWTO and there is no need to change it.)

In case anything went wrong and you wish to go back to your previous ALSA installation, no problem:
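Just put the backup back and refresh the module dependencies (using the backup path from the example above):

    sudo rm -rf /lib/modules/$(uname -r)/kernel/sound
    sudo mv /lib/modules/$(uname -r)/kernel/sound.orig \
            /lib/modules/$(uname -r)/kernel/sound
    sudo depmod -a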

Understanding the ALSA Loopback sound card structure

Well, this is not too difficult to grasp. This virtual sound card consists of 2 devices:

  • hw:Loopback,0
  • hw:Loopback,1

If an application outputs its audio to one of the subdevices, e.g. hw:Loopback,0,0, the audio will be available as input on the corresponding subdevice hw:Loopback,1,0, because the whole point of this card is to send the signal back to itself.

So the generic principle is that an output signal to subdevice hw:Loopback,i,n becomes an input signal from hw:Loopback,j,n with i,j ∈ {0,1}, i ≠ j (n being the subdevice number).

Building an asoundrc file

The goal is to create a default ALSA plug device out of the Loopback card. For a complete software solution, we need one PCM playback device, so ALSA apps can send audio to it, one PCM capture device, so ALSA apps can get audio from it, and then we combine these 2 PCMs into a nice full duplex 'plug' device.

Note that the underlying goal is this: I want the audio of my jack system capture ports (from my RME card) to be available at the ALSA capture device, and vice-versa: hear from my jack system playback ports what ALSA apps are playing back to the ALSA playback device. Tricky...

asoundrc definition

The asoundrc below should work in most situations.
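Something along these lines should do; the PCM names (loop_play, loop_rec, aduplex, cloop, ploop) and the ipc_key numbers are only examples, adapt them as you like:

    # playback: ALSA apps -> dmix -> loopback subdevice 0,0
    pcm.loop_play {
        type dmix
        ipc_key 219345               # any unique number
        slave.pcm "hw:Loopback,0,0"
    }

    # capture: loopback subdevice 0,1 -> dsnoop -> ALSA apps
    pcm.loop_rec {
        type dsnoop
        ipc_key 219346
        slave.pcm "hw:Loopback,0,1"
    }

    # combine both into one full duplex device
    pcm.aduplex {
        type asym
        playback.pcm "loop_play"
        capture.pcm "loop_rec"
    }

    # the looped-back ends, reserved for alsa_in / alsa_out (the Jack bridge)
    pcm.cloop {
        type plug
        slave.pcm "hw:Loopback,1,0"
    }
    pcm.ploop {
        type plug
        slave.pcm "hw:Loopback,1,1"
    }

    # and make the duplex device the ALSA default
    pcm.!default {
        type plug
        slave.pcm "aduplex"
    }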

In summary: ALSA applications talk to the default plug device, which sends playback through dmix to hw:Loopback,0,0 and takes capture through dsnoop from hw:Loopback,0,1; the looped-back ends of these streams, hw:Loopback,1,0 and hw:Loopback,1,1, are left free for the Jack bridge described below.

This asoundrc is very generic and one can of course tailor it in terms of sample rate, audio format, buffer size, etc. One can find the relevant parameters in the ALSA-lib documentation.

Some HTML5 browsers (e.g. Firefox 30, Chrome 38) will fail to open the pcm.!default device for audio playback. This can be fixed by using pcm.card0 instead:
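Presumably by exposing the same duplex device under the extra name card0, along these lines (an assumption on my part):

    pcm.card0 {
        type plug
        slave.pcm "aduplex"
    }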

Here is an example applicable to my DAW. I left some notes so you understand the extra stuff. I also removed unnecessary dsnoop's and dmix's, because when you analyse things a bit more, you realize that some of the ALSA PCMs will only be used by one single client (alsa_in/out), so there is no need to use dmix / dsnoop there. Dmix only makes sense for the ALSA playback PCM because you can have more than one client outputting to ALSA at the same time. Anyway, note the hardware parameters I have added so that it matches my RME Multiface II requirements. For the dmix buffering parameters, read on below.
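The tailored playback dmix would look roughly like this (the format and buffer numbers below are placeholders, see the latency section further down):

    # dmix with explicit slave hardware parameters (placeholder values)
    pcm.loop_play {
        type dmix
        ipc_key 219345
        slave {
            pcm "hw:Loopback,0,0"
            rate 96000          # match the Jack / RME sample rate
            format S32_LE
            period_size 1024
            buffer_size 4096
        }
    }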

Testing our new default ALSA device

Save this asoundrc config into $HOME/.asoundrc, but make sure before that you are not overwriting an existing asoundrc file (back up whatever you have if it already exists).
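For example:

    [ -f ~/.asoundrc ] && cp ~/.asoundrc ~/.asoundrc.bak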

OK, now we can test it from the command line. If jack is running on your other hardware (RME card in my case), you of course will not hear anything since we have not yet bridged our default ALSA device to the jack graph.
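For example, with speaker-test (any ALSA-only player will do):

    speaker-test -D default -c 2 -t wav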

You can use another app (aplay for example). The idea is that an ALSA app using the default device we have just created will not spit out error messages and will play along nicely. Try for example lmms using the ALSA default :)

The Jack Bridge

OK, this is where it will get a little bit confusing because of the loopback nature of the virtual device ;)

Creating permanent Jack clients using alsa_in and alsa_out

Since we used subdevice 0,0 for playback and subdevice 0,1 for capture, remember that signals from these subdevices will be available by loopback on the corresponding subdevices, respectively 1,0 and 1,1 in this case. So the trick for jack is to use alsa_in and alsa_out on the latter subdevices :) Brilliant, isn't it? :D

Let's do it from the terminal
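Something like this (the device and client names match the asoundrc above):

    # cloop: makes what ALSA apps play available as jack ports
    alsa_in  -j cloop -d cloop &
    # ploop: feeds the ALSA capture device from jack
    alsa_out -j ploop -d ploop &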

I hope you start to see the underlying idea. Once these clients show up in the graph, when an ALSA app plays back to subdevice 0,0 (the default ALSA device defined in our asoundrc), the signal will be available in subdevice 1,0, which alsa_in listens to. The 'cloop' client we created can now be connected to the jack system output ports and, o miracle, you will hear your ALSA app :)

In order to avoid the warning messages from alsa_in/out, you can add the relevant parameters, e.g. (my case):
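Something along these lines, for a 2-channel 96 kHz setup (illustrative values):

    alsa_in  -j cloop -d cloop -c 2 -r 96000 -p 512 -n 2 -q 1 &
    alsa_out -j ploop -d ploop -c 2 -r 96000 -p 512 -n 2 -q 1 &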

On the other hand, if you connect a jack system input port to the 'ploop' client created by alsa_out, the signal is sent to loopback subdevice 1,1, which will be looped back to subdevice 0,1. This subdevice is nothing but our ALSA capture device, defined in asoundrc :). So now you can record, say, your bass or guitar or voice (from your jack hardware) into an ALSA app that does not support jack. I tried skype, and it works just fine. You can also try the command line app called ecasound, which supports jack but also ALSA. It is a VERY convenient tool to have around (see further down).

The beauty of it is two-fold:

  • permanence of the 'cloop' and 'ploop' clients (if you shut down your ALSA app, cloop and ploop will remain, always listening)
  • if jack crashes, it will bring down cloop and ploop but will not disrupt the ALSA apps, since they only talk to the loopback soundcard, which is completely independent of the jack environment :D

Create scripts to automate bridge initialization via QjackCtl

The creation of the 'p/cloop' clients can be automated, together with their connection to jack system ports. Here is my script:
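A minimal sketch of such a loop2jack script (the port numbers are examples, adjust them to your own hardware and taste):

    #!/bin/bash
    # loop2jack - start the bridge clients and wire them to the jack system ports

    alsa_in  -j cloop -d cloop -q 1 > /dev/null 2>&1 &
    alsa_out -j ploop -d ploop -q 1 > /dev/null 2>&1 &
    sleep 1

    # what ALSA apps play -> speakers
    jack_connect cloop:capture_1 system:playback_1
    jack_connect cloop:capture_2 system:playback_2
    # microphone / instrument -> ALSA capture device
    jack_connect system:capture_1 ploop:playback_1
    jack_connect system:capture_2 ploop:playback_2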

Note that I used -q 1 as an option to alsa_in/out. This has to do with the resampling quality. At 2.3 ms latency, 96 kHz sample rate on a 2 x 2.4 GHz dual core CPU system and using Jack2, I get low CPU usage (1-2%) and the quality is reasonable. If you push it to 2, 3 or 4, the CPU usage will increase quite a lot at small buffering / latency.

In qjackctl (which I use, YMMV), go to Options -> Execute after server startup and add
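assuming you saved the script above as $HOME/bin/loop2jack and made it executable:

    $HOME/bin/loop2jack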

OK, now that you have added all this, save the qjackctl config, quit and restart it. Start jack, and you should see the ploop and cloop clients in the graph with the connections between the ports we chose in the loop2jack script.

Test it: open say lmms, load a demo project, play it :)

And voila! Try skype, which you can record in ardour if you want (don't forget to connect your jack system mic directly to the ardour track which you had connected to the 'cloop' client, or you will miss recording what you are saying to the other person ;)

So this was a pure software solution, and it has the benefit that all your jack input ports are available to the ploop jack client, so that ALSA apps can record the audio coming from these jack ports via the looped-back device. Of course, the loop stuff has latency (the default dmix and dsnoop buffering is quite big), but who cares? ... Well actually, I did care a little, so I revisited some things and estimated the latency added by the Loopback device. I also tweaked a hybrid solution where the ALSA capture PCM uses real hardware (onboard chip or extra soundcard). Just read below.

Alternative Setup: hardware and software based solution

As mentioned in my introduction, I happen to have an onboard chip (Intel HDA) but also a USB webcam with a built-in mic. It would be a shame not to use their recording capability in some way, especially since I tend to use skype from my DAW PC quite often.

Adding extra h/w inputs in asoundrc

So instead of using the Loopback device for the ALSA capture (all the stuff related to 'ploop' in the previous asoundrc), I simply declared the extra h/w in the asoundrc. So I removed all the ploop stuff, including the now useless Loopback subdevices used for ALSA capture and alsa_out, and added hw PCM devices for the Intel chip and the USB webcam.
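Roughly like this; the card names ('Intel' and the webcam id) are examples, check arecord -l for the ones on your system:

    # capture PCMs on real hardware instead of the Loopback
    pcm.intel {
        type dsnoop
        ipc_key 219350
        slave.pcm "hw:Intel,0"
    }
    pcm.usb {
        type dsnoop
        ipc_key 219351
        slave.pcm "hw:U0x46d0x81b,0"
    }

    # duplex default: Loopback for playback, the webcam for capture
    pcm.aduplex {
        type asym
        playback.pcm "loop_play"
        capture.pcm "usb"
    }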

Testing the new ALSA capture

This one was easy to test. I made sure that my asoundrc default card used the 'usb' pcm capture (see above). I then fired up skype and set it to use the 'Default' device for everything. I fired up QasMixer (a nice Qt4-based mixer if you don't know it) and controlled the capture level of the USB webcam from there (I do not let skype control my levels). Then I tried the skype test call and made sure it recorded my voice. The webcam is by the way a Logitech Webcam C310, which I am satisfied with. It works out of the box in linux thanks to its USB video class compliance.

Note that the same thing can be done with the Intel HDA capture (the 'intel' pcm capture defined in the previous asoundrc), provided that you plug a mic into its input jack of course :)

Measuring the latency introduced by the Loopback device

For measuring the latency, I had to be able to provide both Jack and ALSA with a common audio source.

Playback only

First, I fired up jackd and alsa_in on 'cloop' (just as before). Then, I used ecasound as the middle-man to allow measuring the eventual delay in ardour (which I am comfortable with; you can of course use another jack-enabled recording software if you want).

Here is the ecasound command line, very simple:
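Something along these lines, reading from jack and writing to our default ALSA device (the exact ecasound device syntax may differ, check its man page):

    ecasound -i jack -o alsa,default &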

Then in qjackctl, I connected a sound source like my microphone, available at system:capture_3, to ecasound:playback_1/2.

In ardour, I created two tracks: one mono track accepting audio from the same system capture port, and a stereo track connected to the 'cloop' client. Indeed, since ecasound outputs to the default ALSA device, the cloop client should have the audio by loopback. I then recorded something in ardour so that both tracks contained data coming from the same audio source. I compared the resulting waveforms and observed a delay of 120 ms when the dmix parameters are set to default.

Another way is to use the click sound ardour provides, instead of a microphone as mentioned above. Just connect the ardour click output ports to ecasound's input ports, and connect the click ports to one of the ardour tracks; the other one should still receive the cloop client data. If you fiddle with the dmix parameters in the .asoundrc, you will obtain various delays. It is therefore up to you to decide how the period_size and buffer_size params must be set.

A 120 ms delay is not bad at all considering the huge buffering dmix configures by default. But remember that dmix really sucks at small buffering, so you have to choose a reasonably large one. I ended up using the following:
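The idea is simply to set the dmix slave buffering explicitly in the .asoundrc, something like this (the numbers are placeholders, tune them against xruns on your own system):

    slave {
        pcm "hw:Loopback,0,0"
        period_size 1024
        buffer_size 4096
    }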

My tuned setting gives me a final Loopback latency of ~35 ms, while dmix does a good job without choking.

Capture and Playback

If you are using the complete software solution (ALSA playback and capture via cloop and ploop), then you can still use ecasound as an intermediate tool. Just fire it up in this way:
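Again along these lines, this time both reading from and writing to the default ALSA device (capture arrives via ploop's loopback, playback goes out towards cloop's):

    ecasound -i alsa,default -o alsa,default &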

In qjackctl, connect the ardour click ports to the 'ploop' ports. This will allow ecasound to record the ardour click via the looped-back ploop audio. Then ecasound will output it to the default ALSA pcm playback, which alsa_in collects via the cloop client.

In ardour, just like in the setup above, have two tracks, one receiving the internal ardour click directly, the other connected to the cloop ports. Arm the tracks for recording, enable the click, activate the transport. You will see audio data in both tracks, one delayed of course (the one connected to cloop). With my tuned asoundrc, I get an overall Loopback latency well below 100 ms (approximately 75 ms). That is not bad at all for the whole purpose of the Loopback bridge.

If low latency is a concern, don't use ALSA-only apps, use jackified apps :D

Final word

I hope all this was clear enough. The idea behind this was to use a h/w capture device instead of the Loopback device. This reduces the role of the Loopback device to ALSA playback only, and removes the need for alsa_out, sparing some CPU and jack process cycles. At the moment, I am using my USB webcam for capture because I only need ALSA capture for skype. The Intel HDA is available as well but I don't really need it. It is connected to my patch panel though, so I can always use it if the need arises (unlikely).

Troubleshooting

If you have pulseaudio on your machine, better kill it, otherwise this setup doesn't work.
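One way, assuming a stock PulseAudio install (older setups keep the client config in ~/.pulse/client.conf):

    pulseaudio --kill
    # and keep it from respawning by adding this line to client.conf:
    # autospawn = no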

Before we dive into specific music applications, I need to provide a little background information about audio and MIDI support on Linux.

If you’re coming from Mac OS X or Windows, you may not have heard very much about the Linux way of doing audio and MIDI. Seems like the “mainstream media” don’t want to have much to do with Linux. Linux has a very well-developed infrastructure for audio and MIDI. Linux audio is a “stack” (a layer cake) with audio/MIDI applications on top:

  • Audio applications
  • JACK (Jack Audio Connection Kit)
  • ALSA (Advanced Linux Sound Architecture)
  • Linux kernel

You probably haven’t heard about JACK and ALSA before, so a little explaining is in order.

The Advanced Linux Sound Architecture (ALSA) uses the kernel to implement low-level — but extremely powerful — audio and MIDI features. ALSA provides several useful applications, but I like to think of ALSA as a tool to build higher level tools. ALSA is the layer that supports “soundcards,” which is the Linux catch-all term for hardware audio interfaces, MIDI interfaces, and more. Go to the ALSA project homepage to get more information from the developer’s perspective.

You are far more likely to interact with the Jack Audio Connection Kit (JACK) than ALSA. JACK is an audio/MIDI server that provides audio and MIDI services to JACK-based applications (i.e., applications using the JACK API). The list of JACK-enabled applications is impressive. In fact, this list is a rather good summary of the audio and MIDI applications that are available on Linux! Check out the JACK project page to get more information from the developer’s point of view. End-users (us normal people) should read the JACK FAQ which covers some of the finer points about JACK.

ALSA utils

The ALSA utility applications are collectively known as “ALSA utils.” Use the apt-get command to download and install the ALSA utils:
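    sudo apt-get install alsa-utils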

Here is a list of the ALSA utility applications:
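One quick way to see the full list on your own system is to ask the package database:

    dpkg -L alsa-utils | grep bin/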

Let’s take a look at a few of these applications in action.

Test speaker output

Although not strictly part of ALSA utils, speaker-test is a quick way to make sure that the built-in Raspberry Pi audio output is properly connected and configured.

First, connect the RPi2 audio output to your powered monitors using a 3.5mm to whatever patch cable. The Raspberry Pi built-in audio can be routed to either the 3.5mm audio jack (“analog”) or to the HDMI port. Enter the command:
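    amixer cset numid=3 N    # numid=3 is the Pi's "PCM Playback Route" control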

to route the built-in audio. Replace “N” with one of the following choices:

  • 0: automatic
  • 1: 3.5mm audio jack (analog)
  • 2: HDMI

In this case, use N=1 to route the audio to the 3.5mm audio jack. Then, run the command:
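    speaker-test -t sine -f 440 -c 2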

to send a 440Hz tone to the audio output. You should hear a test tone from your speakers.

If you don’t hear a test tone, double check your connections. You may need to add the current user to the audio group: sudo adduser XXX audio, where “XXX” is the user’s name. (I don’t believe this is strictly necessary.)

Play an audio file

Once speaker output is working, why not play an audio file? The aplay program plays an audio file. It supports just a handful of audio formats: voc, wav, raw or au. The default format is WAV.
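For example, with a stereo WAV file of your own (test.wav is just a placeholder name):

    aplay -c 2 test.wav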

The -c option specifies two channels. (The default is one channel of audio.)

If you listen carefully, you’ll notice that the built-in audio is a little bit noisy. I’ll get into the issue of audio quality in a future blog entry.

The command aplay -l displays a list of all sound cards and digital audio devices.

ALSA mixing

There are two ALSA utility mixer applications: amixer and alsamixer. amixer is a command line tool that controls one or more soundcards. The command (which does not have any command line arguments):
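    amixer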

displays the current mixer settings for the default soundcard and device as shown below:

The output shows a list of the simple mixer controls at your disposal.

The alsamixer application is a bit more visual. alsamixer turns the terminal window into a visual mixer. Try:
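    alsamixer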

and see. Start alsamixer in one window and play an audio file in a different window. Use the UP and DOWN arrows to control the playback gain (volume). Use the escape key (ESC) to exit alsamixer.

MIDI patch-bay

ALSA provides a virtual MIDI patch-bay that lets you interconnect MIDI senders and receivers. MIDI data is communicated from sender ports to receiver ports. A port may belong to either a MIDI hardware interface or a software application. The virtual patch-bay allows for very flexible, powerful MIDI data routing.

The aconnect utility application both displays the status of the virtual patch-bay and makes connections. First off, we need to know the available sender and receiver ports. The command:
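    aconnect -i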

displays a list of the sender ports including external MIDI input ports. External MIDI input ports (-i) are ALSA sender ports because they send MIDI data to ALSA receiver ports. I connected a Roland UM-2ex MIDI interface to one of the RPi’s USB ports and got the following output with aconnect -i:

The UM-2ex has one 5-pin MIDI IN (client 20, port 0).

Likewise, the command:
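    aconnect -o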

displays a list of the receiver ports including external MIDI output ports. External MIDI output ports (-o) are ALSA receiver ports because they receive MIDI data from ALSA sender ports. Here is the output when the UM-2ex is connected:

The UM-2ex has two 5-pin MIDI OUTs (client 20, port 0 and port 1).

The notions of sender and receiver may seem a little confusing especially in the context of external MIDI INs and OUTs. Please keep in mind that “send” and “receive” are defined with respect to ALSA itself (and ALSA objects).

Now, I want to really blow your mind. Let’s connect both the Roland UM-2ex and an M-Audio Keystation Mini 32 to the RPi2. Here is the output generated by aconnect -i:

We can see the MIDI IN for the UM-2 and the Keystation.

Here is the output generated by aconnect -o:

We see the MIDI OUTs for the UM-2 and the Keystation.

Let’s patch the Keystation (client 24, port 0) to the MIDI OUT of the UM-2ex (client 20, port 0):
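    aconnect 24:0 20:0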

The sender port is (24:0) and the receiver port is (20:0). MIDI messages are sent from the Keystation to the MIDI OUT of the UM-2ex. If you physically connect the MIDI IN of a tone module or synthesizer to the UM-2’s MIDI OUT, you can now play the tone module or synth using the Keystation. Guess what we just built? A USB MIDI to 5-pin MIDI bridge. Ever need to control an old school 5-pin MIDI synth using a new school USB-only MIDI controller? Now you can with Raspberry Pi and ALSA!

Run aconnect -l to display the connection status. Here is the output for the virtual patch bay:

The output shows the connection from the Keystation to the UM-2ex.

To break the connection, run the command:
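    aconnect -d 24:0 20:0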

Run aconnect -l, again, and you’ll see that the connection has been removed.

More resources

If you’re a long-time reader of my site, you know that I blogged about the USB to 5-pin MIDI bridge technique before. If you have a Raspberry Pi and know how to run aconnect, you have a bridge!

The Ardour folks have two good articles about JACK on Linux (here and here).

New to Linux (Raspbian Jessie) on Raspberry Pi? Then be sure to check out my article about getting started with Raspbian Jessie and Raspberry Pi.