Project Looking Glass

Introduction

Newspaper Club has two offices: one in Glasgow and one in London. Glasgow is the HQ, where all the customer service, logistics and operational stuff happens. And in London, we develop and manage all the products and services, designing the site, writing code and so on.

We chat all day long between us in a couple of Campfire rooms, and we’re not at the size where that’s a bottleneck or difficult to manage. But it’s nice to have a more ambient awareness of each other’s comings and goings, especially on the days when we’re all heads down in our own work, and there isn’t as much opportunity to paste funny videos in Campfire.

I wanted to make something to aid that. A two-way office-to-office video screen. I wanted it to be always on, with no dialling up required, and for it to automatically recover from network outages. I wanted the display to be big, but not intrusive. I didn’t want a video conference. I didn’t want people to be able to log in from home, or look back through recorded footage. I wanted to be able to wave at the folks in the other office every morning and evening, and for that to feel normal.

Here’s what we came up with:

Looking Glass #3

Looking Glass #2

There’s a Raspberry Pi at each end, each connected to a webcam and a monitor. You should be able to put a pair together for under £150, if you can find a spare monitor or two. There’s no sound, and the video is designed to look reasonable, while being tolerant of a typical office’s bandwidth constraints.

Below, I’ll explain how you can make one yourself.

There’s obvious precedent here, the most recent of which is BERG’s Connbox project (the writeup is fantastic — read it!), but despite sharing studio space with them, we’d never actually talked about the project explicitly, so it’s likely I just absorbed the powerful psychic emanations from Andy and Nick. Or the casual references to GStreamer in the kitchen.

Building this has been a slow project for me, tucked into odd evenings and weekends over the last year. It’s been through a few different iterations of hardware and software, trying to balance the price and availability of the parts, complexity of the setup, and robustness in operation.

I really wanted it to be cheap, because it felt like it should be. I knew I could make it work with a high spec ARM board or an x86 desktop machine (that turned out to be easy), but I also knew all the hardware and software inside a £50 Android phone should be able to manage it, and that felt more like the scale of the thing I wanted to build. Just to make a point, I guess.

I got stuck on this for a while, until H264 encoding became available in the Raspberry Pi’s GPU. Now we have a £25 board that can do hardware accelerated simultaneous H264 encoding/decoding, with Ethernet, HDMI and audio out, on a modern Linux distribution. Add a display and a webcam, and you’re set.

The strength of the Raspberry Pi’s community cannot be overstated. When I first used it, it ran an old Linux kernel (3.1.x), missing newer features and security fixes. The USB driver was awful, and it would regularly drop 30% of the packets under load, or just lock up when you plugged the wrong keyboard in.

Now, there’s a modern Linux 3.6 kernel, the USB driver seems to be more robust, and most binaries in Raspbian are optimised for the CPU architecture. So, thank you to everyone who helped make that happen.

Building One Yourself

The high level view is this: we’re using GStreamer to take a raw video stream from a camera, encode it into H264 format, bung that into RTP packets over UDP, and send those to another machine, where another instance of GStreamer receives them, unpacks the RTP packets, reconstructs the H264 stream, decodes it and displays it on the screen.
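As a sketch, the two ends look roughly like this. These aren’t the exact pipelines from the scripts — the resolution, bitrate and port here are illustrative values, and you’d tune them for your network:

```shell
# Sender: webcam -> GPU H264 encode -> RTP packetise -> UDP
# (values here are examples; the repo scripts are the canonical versions)
SEND="v4l2src device=/dev/video0 \
  ! video/x-raw,width=640,height=480,framerate=25/1 \
  ! omxh264enc target-bitrate=300000 control-rate=variable \
  ! h264parse \
  ! rtph264pay config-interval=1 pt=96 \
  ! udpsink host=127.0.0.1 port=5000"

# Receiver: UDP -> RTP depacketise -> GPU H264 decode -> screen
RECV="udpsrc port=5000 caps=application/x-rtp,media=video,encoding-name=H264,payload=96 \
  ! rtph264depay ! h264parse ! omxh264dec ! eglglessink sync=false"

# Run each with gst-launch-1.0, one per machine:
#   gst-launch-1.0 $SEND
#   gst-launch-1.0 $RECV
```

The RTP caps on the receiving side matter: UDP packets carry no stream description, so the receiver has to be told out-of-band that it’s getting H264 in RTP.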

Install Raspbian, and using raspi-config, set the GPU to 128MB RAM (I haven’t actually tried it with 64MB, so YMMV). Do a system upgrade with sudo apt-get update && sudo apt-get dist-upgrade, and then upgrade your board to the latest firmware using rpi-update, like so:

sudo apt-get install git-core
sudo wget http://goo.gl/1BOfJ -O /usr/bin/rpi-update && sudo chmod +x /usr/bin/rpi-update
sudo rpi-update

Reboot, and you’re now running the latest Linux kernel, and all your packages are up to date.

GStreamer in Raspbian wheezy is at version 0.10, and doesn’t support the OMX H264 encoder/decoder pipeline elements. Thankfully, Defiant on the Raspberry Pi forums has built and packaged up GStreamer 1.0, including all the OMX libraries, so you can just apt-get the lot and have it up and running in a few seconds.

Add the following line to /etc/apt/sources.list:

deb http://vontaene.de/raspbian-updates/ . main 

And then install the packages:

sudo apt-get update
sudo apt-get install gstreamer1.0-omx gstreamer1.0-plugins-bad gstreamer1.0-plugins-base gstreamer1.0-plugins-good gstreamer1.0-plugins-ugly gstreamer1.0-tools gstreamer1.0-x

We also need the video4linux tools to interface with the camera:

sudo apt-get install v4l-utils v4l-conf 
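Before streaming, it’s worth checking what your camera can actually produce — the v4l2-ctl tool from that package will tell you, which saves a lot of guesswork when picking a resolution and framerate the Pi can cope with:

```shell
# List the pixel formats, resolutions and frame rates the webcam offers
v4l2-ctl --device=/dev/video0 --list-formats-ext

# Show the capture format currently negotiated
v4l2-ctl --device=/dev/video0 --get-fmt-video
```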

To do the actual streaming I’ve written a couple of small scripts to encapsulate the two GStreamer pipelines, available in tomtaylor/looking-glass on GitHub.

By default they’re set up to stream to 127.0.0.1 on port 5000, so if you run them both at the same time, from different consoles, you should see your video pop up on the screen. Even though this doesn’t look very impressive, you’re actually running through the same pipeline that works across the local network or internet, so you’re most of the way there.

The scripts can be configured with environment variables, which should be evident from the source code. For example, HOST="foo.example.com" ./video-server.sh will stream your webcam to foo.example.com to the default port of 5000.

To launch the scripts at boot time we’ve been using daemontools. This makes it easy to just reboot the Pi if something goes awry. I’ll leave the setup of that as an exercise for the reader.
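For the record, a daemontools service can be as simple as a run script. The paths and hostname below are examples — adjust them to wherever you’ve checked out the repo and whatever the other office’s Pi is called:

```shell
#!/bin/sh
# /etc/service/looking-glass/run -- daemontools (svscan) supervises this,
# restarting the pipeline whenever it exits, e.g. after a network outage.
# Path and HOST are example values; point HOST at the Pi in the other office.
cd /home/pi/looking-glass || exit 1
exec env HOST="other-office.example.com" ./video-server.sh
```

Make the script executable and svscan will pick it up; killing the process (or pulling the network cable) just causes a restart, which is exactly the self-recovering behaviour we wanted.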

You don’t have to use a Raspberry Pi at both ends. You could use an x86 Linux machine, with GStreamer and the various codecs installed. The scripts support overriding the encoder, decoder and video sink pipeline elements to use other elements supported on your system.

Most x86 machines have no hardware H264 encoder support, but we can use x264enc to do software encoding, as long as you don’t mind giving over a decent portion of a CPU core. We found this works well, but needed some tuning to reduce the latency. Something like x264enc bitrate=192 sync-lookahead=0 rc-lookahead=10 threads=4 option-string="force-cfr=true" seemed to perform well without too much lag. For decoding we’re using avdec_h264. In Ubuntu 12.04 I had trouble getting eglglessink to work, so I’ve swapped it for xvimagesink. You shouldn’t need to change any of this if you’re using a Pi at both ends though – the scripts default to the correct elements.
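Put together, an x86 invocation might look something like this. The pipeline fragments are the ones discussed above, but the environment variable names are my guesses for illustration — check the scripts themselves for the actual names they read:

```shell
# Hypothetical override variable names -- consult the scripts for the real
# ones. The element strings are the x86 software pipeline described above.
ENCODER='x264enc bitrate=192 sync-lookahead=0 rc-lookahead=10 threads=4 option-string="force-cfr=true"'
DECODER='avdec_h264'
SINK='xvimagesink'

# Then stream to the Pi in the other office (example hostname):
#   HOST="pi.example.com" ENCODER="$ENCODER" ./video-server.sh
```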

Camera-wise, in Glasgow we’re using a Logitech C920, which is a good little camera, with a great image, if a touch expensive. In London it’s a slightly tighter space, so we’re using a Genius Widecam 1050, which is almost fisheye in angle. We all look a bit skate video, but it seemed more important to get everyone in the shot.

You’ll probably also need to put these behind a powered USB hub, otherwise you’ll find intermittent lockups as the Raspberry Pi can’t provide enough power to the camera. It’s not the cheapest, but the Pluggable 7-port USB hub worked well for us.

The End

It works! The image is clear and relatively smooth – comparable to Skype, I’d say. It can be overly affected by internet weather, occasionally dropping to a smeary grey mess for a few seconds, so it definitely needs a bit of tuning to dial in the correct bitrate and keyframe interval for lossy network conditions. It always recovers, but can be a bit annoying while it works itself out.

And it’s fun! We hung up a big “HELLO GLASGOW” scrawled on A3 paper from the ceiling of our office, and had a good wave at each other. That might get boring, and it might end up being a bit weird, and if so, we’ll turn it off. But it might be a nice way to connect the two offices without any of the pressures of other types of synchronous communication. We’ll see how it goes.

If you make one, I’d love to hear about it, especially if you improve on any of the scripts or configuration.
