Station Ident

Newspaper Club Exhibition

Much like Alice, every now and again someone says to me: "you still doing that newspaper thing?" Yes, yes I am! Still!

It's year five now, pretty much. I didn't think I'd ever do a thing this long. I might never again. But it turns out businesses are hard, especially when they involve atoms and even more so if you want to be profitable, legal and have good customer service. Not that much of that is to do with me.

We've just launched one of the things I naively thought we'd be doing in the second year: selling print-on-demand newspapers from our site, with a nice profit sharing arrangement. This means you can, for example, grab a bunch of your favourite posts from your blog, put them into ARTHR to do a quick layout, print a single copy to check it all over, then start selling it from our site. We take our cut and send you your cut shortly after.

And we're running a small beta test of a personalised newspaper service. I can't say too much about that yet (other than it's a lot of fun), because we're still working out the shape of it, but if you think something like that might work for [your large media organisation], we should have a chat.


So it's good. And hard. But good.

adequate responses

"adequate responses, including of a technical and technological nature"

From. Pretty much describes my job.

Roll 13, 14

Still shooting film. Still enjoying it. If I get one good shot out of a roll I'm happy.

Roll 13/02

Roll 13/31

Roll 14/23

I've been using Eye Culture in Bethnal Green for processing and scans. High res JPEGs are about £6/roll. They do a good job.

The sound of binary


I really enjoyed this post by Oona Räisänen about decoding a signal hidden in the raw video from a police chase. But then I had a look through the rest of her blog, and it's chock full of brilliant forensics on pervasive radio transmissions. Like decoding the real-time bus timetable broadcasts, and listening into wireless keyboards. The ether is alive with the sound of binary.

Facts Not Opinions

Facts not opinions

Facts not opinions, inscribed on Kirkaldy Testing Museum.

1. I love Matt Edgar's posts about historical engineers. It's like a mini In Our Time, without the posh people.
2. I'd like to see more mottos inscribed on buildings. Mottos, not slogans.

Recent quotes

If a cook touches a sauce, it gets passed through a sieve.

Love that.

Sometimes all you need is for someone to see what you are planning and not look bemused.

On Bill Drummond and Jimmy Cauty, from the book about the KLF with the long subtitle.

Perhaps more than anything they did, The Manual led to the pair being perceived as cynical media manipulators rather than random followers of chaos. In a sense, this was always inevitable when they became successful because the public narrative believes that success comes from knowing what you are doing. The equally common phenomenon of stumbling upwards is rarely recognised.

From the same book.

Almost as much a condo as a car.

From this video about the Pontiac Stinger, found via Fosta. If anyone wants me to do a presentation about feature creep, I am ready now.

Summer 2013






Measures / Countermeasures

Presence Orb

A couple of days ago it appeared that a company called Renew are trialling tracking smartphones using a device called the Presence Orb ("A cookie for the real world") placed in recycling bins across the City of London.

The mechanism by which they do this hasn't been discussed in much detail in the media, so I thought I'd have a go at unpacking it.

Every device that connects to a wired or wireless network has a Media Access Control (MAC) address that uniquely identifies it. It's unique worldwide, and written into the read-only memory on a chip.

The MAC address is a 48-bit number, usually described as six pairs of hexadecimal digits, such as 01:23:45:67:89:ab. The first three pairs are a manufacturer-specific prefix, so Apple has 00:03:93, Google has 00:1A:11 and so on. The list is freely available and many manufacturers have multiple prefixes.
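For example, you can pull the manufacturer prefix out of an address with a quick bit of shell (the address here is made up):

```shell
# Take the first three pairs of a MAC address -- the manufacturer (OUI) prefix
mac="00:03:93:67:89:ab"
oui=$(echo "$mac" | cut -d: -f1-3)
echo "$oui"   # prints 00:03:93
```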

MAC addresses are only used for communicating between devices on the same network, and they're not visible outside of that (on the wider internet, for example).

Your smartphone, knowing that you prefer to be on a fast wireless network rather than 3G, will regularly ask nearby wireless networks to announce themselves, and if it finds one you've saved previously, it'll connect to it automatically.

This is called a probe request, and like all packets your device sends it contains your MAC address, even if you don't connect to a network. They're sent every few seconds if you're actively using the phone, or every minute or so if it's on standby with the display off. The frequency varies between hardware and operating system.

By storing the MAC address and tracking the signal strength across a number of different receivers, you can get an estimate of position, which will be reasonably accurate along a 1D line like a street, and less so in a more complex 2D space.
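As a rough illustration of the signal strength part, a log-distance path loss model turns a reading into a distance estimate. The reference values here (−50 dBm at 1 metre, path loss exponent 2) are assumptions for the sake of the example; real environments need calibration:

```shell
# Estimate distance (metres) from a signal strength reading using a
# log-distance path loss model: d = 10 ^ ((rssi_at_1m - rssi) / (10 * n))
rssi=-70
awk -v rssi="$rssi" 'BEGIN {
  rssi_at_1m = -50   # assumed reading at 1 metre from the receiver
  n = 2              # assumed path loss exponent (free space)
  print 10 ^ ((rssi_at_1m - rssi) / (10 * n))
}'
# prints 10, i.e. roughly ten metres away
```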

Renew claim that during a single day in the trial period they detected 106,629 unique devices across 946,016 data points from 12 receivers. Given the entire working population of the City of London is ~350,000, that seems high, but I guess they're not mentioning that they pick up everyone in a passing bus, taxi or lorry too.

They make sure to mention that the MAC address doesn't reveal a name and address or other data about an individual, but because your MAC address never changes, as coverage grows you could be tracked from your bus stop, to your tube station, and out the other side, to your office. And then that could be correlated against store cards using timestamps to tie to a personal identity. Or perhaps you'd like to sign into a hotspot using Facebook? And so on.

Of course, you can opt-out.

But here's the thing. Even though the MAC address is baked into the hardware, you can change it.

On OS X, you'd do something like this to generate a random MAC address:

openssl rand -hex 6 | sed 's/\(..\)/\1:/g; s/.$//'

And then to change it:

sudo ifconfig en0 ether 00:11:22:33:44:55

Replacing 00:11:22:33:44:55 with the address you just generated. On most devices it'll reset back to the default when the machine is restarted.

(This technique will also let you keep using hotspots with a 15-minute free period. Just rotate your MAC and connect again.)

To all intents and purposes, you are now on a different device on the network. And if you didn't care about using your wireless connection for anything useful, you could run a script to rotate that every few seconds, broadcasting probe requests to spoof any number of devices, from any manufacturer(s) you wish to be.
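A sketch of what that rotation might look like, on OS X. The interface name en0 is an assumption, and this version just prints the commands rather than running them — remove the echo to do it for real, somewhere you don't mind losing connectivity:

```shell
#!/bin/sh
# Rotate through random MAC addresses (dry-run sketch).
random_mac() {
  openssl rand -hex 6 | sed 's/\(..\)/\1:/g; s/.$//'
}

for i in 1 2 3; do
  mac=$(random_mac)
  # Drop the echo to actually change the address
  echo "sudo ifconfig en0 ether $mac"
  sleep 1
done
```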

The same packet that contains your MAC address also contains a sequence number, an incrementing and rotating 12-bit integer used to check ordering of messages. The Presence Orb could use that to discard a series of messages that follow a sequence. In turn, you could randomise that.

They might check signal strength, discarding multiple messages with similar volume. In turn, you'd randomise power output, appearing to be a multitude of distances from the receiver.

Then there are beam-forming antennas, scattering off buildings and so on, to ruin attempts to trilaterate your signal. Stick all this on a Raspberry Pi, put it in a box, plug it into a car battery, tuck it under an alcove, and walk away.

If you ensure your signal strength is kept within bounds, and traffic low enough not to disrupt other genuine users of nearby wireless networks, I believe this is legal, and it'd effectively ruin Renew's aggregate data, making traffic analysis impossible.

It's still unclear whether what Renew is doing is legal. I am not a lawyer, but I suspect we'd need a clarification from the ICO as to whether the combination of MAC address and location is personal information and regulated by the Data Protection Act, as is suggested by the EU's Article 29 Working Party.

It seems likely that the law will be a step behind location tracking technology, for a while at least. And while that's the case, chaff is going to be an important part of maintaining privacy. The tools are there to provide it, if we want to.

Project Looking Glass


Newspaper Club has two offices: one in Glasgow and one in London. Glasgow is the HQ, where all the customer service, logistics and operational stuff happens. And in London, we develop and manage all the products and services, designing the site, writing code and so on.

We chat all day long between us in a couple of Campfire rooms, and we're not at the size where that's a bottleneck or difficult to manage. But it's nice to have a more ambient awareness of each other's comings and goings, especially on the days when we're all heads down in our own work, and there isn't as much opportunity to paste funny videos in Campfire.

I wanted to make something to aid that. A two-way office-to-office video screen. I wanted it to be always on, with no dialling up required, and for it to automatically recover from network outages. I wanted the display to be big, but not intrusive. I didn't want a video conference. I didn't want people to be able to log in from home, or look back through recorded footage. I wanted to be able to wave at the folks in the other office every morning and evening, and for that to feel normal.

Here's what we came up with:

Looking Glass #3

Looking Glass #2

There's a Raspberry Pi at each end, each connected to a webcam and a monitor. You should be able to put a pair together for under £150, if you can find a spare monitor or two. There's no sound, and the video is designed to look reasonable, while being tolerant of a typical office's bandwidth constraints.

Below, I'll explain how you can make one yourself.

There's obvious precedent here, the most recent of which is BERG's Connbox project (the writeup is fantastic — read it!), but despite sharing studio space with them, we'd never actually talked about the project explicitly, so it's likely I just absorbed the powerful psychic emanations from Andy and Nick. Or the casual references to GStreamer in the kitchen.

Building this has been a slow project for me, tucked into odd evenings and weekends over the last year. It's been through a few different iterations of hardware and software, trying to balance the price and availability of the parts, complexity of the setup, and robustness in operation.

I really wanted it to be cheap, because it felt like it should be. I knew I could make it work with a high spec ARM board or an x86 desktop machine (that turned out to be easy), but I also knew all the hardware and software inside a £50 Android phone should be able to manage it, and that felt more like the scale of the thing I wanted to build. Just to make a point, I guess.

I got stuck on this for a while, until H264 encoding became available in the Raspberry Pi's GPU. Now we have a £25 board that can do hardware accelerated simultaneous H264 encoding/decoding, with Ethernet, HDMI and audio out, on a modern Linux distribution. Add a display and a webcam, and you're set.

The strength of the Raspberry Pi's community is not to be understated. When I first used it, it ran an old Linux kernel (3.1.x), missing newer features and security fixes. The USB driver was awful, and it would regularly drop 30% of the packets under load, or just lock up when you plugged the wrong keyboard in.

Now, there's a modern Linux 3.6 kernel, the USB driver seems to be more robust, and most binaries in Raspbian are optimised for the CPU architecture. So, thank you to everyone who helped make that happen.

Building One Yourself

The high level view is this: we're using GStreamer to take a raw video stream from a camera, encode it into H264 format, bung that into RTP packets over UDP, and send those to another machine, where another instance of GStreamer receives them, unpacks the RTP packets, reconstructs the H264 stream, decodes it and displays it on the screen.
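Expressed as gst-launch-1.0 pipelines, the idea looks roughly like this. The element names and caps are illustrative (and RECEIVER_IP is a placeholder) — the scripts described below handle the real details:

```shell
# Sender: camera -> H264 (hardware encoder on the Pi) -> RTP -> UDP
gst-launch-1.0 v4l2src device=/dev/video0 \
  ! videoconvert \
  ! omxh264enc \
  ! h264parse \
  ! rtph264pay config-interval=1 pt=96 \
  ! udpsink host=RECEIVER_IP port=5000

# Receiver: UDP -> unpack RTP -> H264 decode (hardware) -> screen
gst-launch-1.0 udpsrc port=5000 \
  caps="application/x-rtp,media=video,encoding-name=H264,payload=96" \
  ! rtph264depay \
  ! h264parse \
  ! omxh264dec \
  ! eglglessink
```

These pipelines need the camera and display attached, so treat them as a sketch of the shape rather than something to paste verbatim.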

Install Raspbian, and using raspi-config, set the GPU to 128MB RAM (I haven't actually tried it with 64MB, so YMMV). Do a system upgrade with sudo apt-get update && sudo apt-get dist-upgrade, and then upgrade your board to the latest firmware using rpi-update, like so:

sudo apt-get install git-core 
sudo wget -O /usr/bin/rpi-update && sudo chmod +x /usr/bin/rpi-update

Reboot, and you're now running the latest Linux kernel, and all your packages are up to date.

GStreamer in Raspbian wheezy is at version 0.10, and doesn't support the OMX H264 encoder/decoder pipeline elements. Thankfully, Defiant on the Raspberry Pi forums has built and packaged up GStreamer 1.0, including all the OMX libraries, so you can just apt-get the lot and have it up and running in a few seconds.

Add the following line to /etc/apt/sources.list:

deb . main

And then install the packages:

sudo apt-get update 
sudo apt-get install gstreamer1.0-omx gstreamer1.0-plugins-bad gstreamer1.0-plugins-base gstreamer1.0-plugins-good gstreamer1.0-plugins-ugly gstreamer1.0-tools gstreamer1.0-x

We also need the video4linux tools to interface with the camera:

sudo apt-get install v4l-utils v4l-conf

To do the actual streaming I've written a couple of small scripts to encapsulate the two GStreamer pipelines, available in tomtaylor/looking-glass on GitHub.

By default they're set up to stream to on port 5000, so if you run them both at the same time, from different consoles, you should see your video pop up on the screen. Even though this doesn't look very impressive, you're actually running through the same pipeline that works across the local network or internet, so you're most of the way there.

The scripts can be configured with environment variables, which should be evident from the source code. For example, HOST="" ./ will stream your webcam to to the default port of 5000.

To launch the scripts at boot time we've been using daemontools. This makes it easy to just reboot the Pi if something goes awry. I'll leave the setup of that as an exercise for the reader.
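For reference, a daemontools service is just a directory under /etc/service containing an executable run script, which supervise restarts automatically if it exits. Something like this — the paths and the script name here are assumptions, so adjust them to wherever you put the scripts:

```shell
#!/bin/sh
# Saved as /etc/service/looking-glass/run and made executable.
# setuidgid (from daemontools) drops privileges to the pi user;
# exec hands the process over so supervise can manage it directly.
exec setuidgid pi /home/pi/looking-glass/transmit.sh
```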

You don't have to use a Raspberry Pi at both ends. You could use an x86 Linux machine, with GStreamer and the various codecs installed. The scripts support overriding the encoder, decoder and video sink pipeline elements to use other elements supported on your system.

Most x86 machines have no hardware H264 encoder support, but we can use x264enc to do software encoding, as long as you don't mind giving over a decent portion of a CPU core. We found this works well, but needed some tuning to reduce the latency. Something like x264enc bitrate=192 sync-lookahead=0 rc-lookahead=10 threads=4 option-string="force-cfr=true" seemed to perform well without too much lag. For decoding we're using avdec_h264. In Ubuntu 12.04 I had trouble getting eglglessink to work, so I've swapped it for xvimagesink. You shouldn't need to change any of this if you're using a Pi at both ends though - the scripts default to the correct elements.

Camera-wise, in Glasgow we're using a Logitech C920, which is a good little camera, with a great image, if a touch expensive. In London it's a slightly tighter space, so we're using a Genius Widecam 1050, which is almost fisheye in angle. We all look a bit skate video, but it seemed more important to get everyone in the shot.

You'll probably also need to put these behind a powered USB hub, otherwise you'll find intermittent lockups as the Raspberry Pi can't provide enough power to the camera. It's not the cheapest, but the Plugable 7-port USB hub worked well for us.

The End

It works! The image is clear and relatively smooth - comparable to Skype, I'd say. It can be overly affected by internet weather, occasionally dropping to a smeary grey mess for a few seconds, so it definitely needs a bit of tuning to dial in the correct bitrate and keyframe interval for lossy network conditions. It always recovers, but can be a bit annoying while it works itself out.

And it's fun! We hung up a big "HELLO GLASGOW" scrawled on A3 paper from the ceiling of our office, and had a good wave at each other. That might get boring, and it might end up being a bit weird, and if so, we'll turn it off. But it might be a nice way to connect the two offices without any of the pressures of other types of synchronous communication. We'll see how it goes.

If you make one, I'd love to hear about it, especially if you improve on any of the scripts or configuration.


Roll Five

This is the first roll of film that I've processed and scanned by my own fair hand. I think film processing falls in the same bucket as audiophilia — lots of kit, thousands of variables to fiddle with, and just enough science to justify almost any decision you want.

Roll 5/16

Roll 5/23

Roll 5/25

Roll 5/33

Roll 5/34

Roll 5/36