Overhead
A Raspberry Pi–powered LED matrix that listens to live aircraft transponders and narrates what's flying over our flat — in the voice of David Attenborough.
The idea
My girlfriend loves planes. Every time we're out and about I'll notice her looking at her phone, then up at the sky. That can only mean one thing: she's on Flight Radar, looking at what's going overhead. It's an adorable special interest, and one I've started to share.
I wanted to make her something. A way to see what's flying over our flat, at a glance, the moment it happens, without reaching for her phone. Back in university I'd built a spectrum analyser on a 64x64 Adafruit LED matrix — 4,096 tiny LEDs, each one capable of any colour, about the size of your palm. So the seed of the idea was there: could I get plane data onto one of these boards?
My first attempt was just a Mac app hooked up to a free API that sent a push notification whenever a plane was nearby. It worked. It was also really annoying. But it proved the concept, and it made me think — what if instead of borrowing someone else's data over the internet, we could listen to the planes ourselves?
Every commercial aircraft constantly broadcasts its position, altitude, speed, and heading via radio transponder. A USB receiver called an RTL-SDR dongle, paired with a 1090 MHz antenna, can pick all of it up. Real signals, from real planes, received live from a flat in North London. I had no idea if this would actually be feasible. Turns out it was shockingly easy.
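In practice the decoding is usually handled by an off-the-shelf tool like dump1090, which turns the raw 1090 MHz messages into JSON. Here's a minimal sketch of consuming that output — the field names (`alt_baro`, `gs`, `flight`) follow dump1090-fa's conventions, and the altitude threshold is illustrative, not the project's actual logic:

```python
def nearby_aircraft(data, max_alt_ft=20000):
    """Filter decoded ADS-B records (dump1090-style JSON) down to
    aircraft with a known position below max_alt_ft -- i.e. the ones
    plausibly worth putting on the board."""
    out = []
    for ac in data.get("aircraft", []):
        alt = ac.get("alt_baro")
        if alt is None or alt == "ground":
            continue  # no altitude yet, or taxiing -- not "overhead"
        if "lat" in ac and "lon" in ac and alt < max_alt_ft:
            out.append({
                "hex": ac["hex"],                       # 24-bit ICAO address
                "callsign": ac.get("flight", "").strip(),
                "alt_ft": alt,
                "speed_kt": ac.get("gs"),               # ground speed in knots
            })
    return out
```

In a pipeline like this, the dict would come from polling dump1090's `aircraft.json` endpoint every second or so.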
I ordered everything — a Raspberry Pi 5, the LED matrix, the antenna, a bonnet board to wire it all together — and spent the week before it arrived building the software pipeline on my Mac against a simulated version of the matrix. The moment hardware started showing up, I was locked in.

Getting it working
The first few days were pure problem-solving. The Pi wouldn't connect to Wi-Fi because macOS had put garbage on the clipboard instead of my actual password, and the Pi accepted it without complaint, then failed silently. The LED matrix library didn't support the Pi 5's chip architecture, so I had to swap it out and write a translation layer. The matrix and Pi kept crashing because they were sharing a power supply and bright pixels drew too much current. Each problem felt like a wall until it didn't. Reflash the card. Switch the library. Separate the power supplies.

Within a couple of days, the antenna was decoding live aircraft signals, the Pi was processing them, and the matrix was lighting up. The pipeline worked. But what it was showing was ugly.

Making it feel right
Getting data onto a screen is an engineering problem. Making it feel like something you'd actually want to look at is a design problem. And it's much harder.
The raw transponder signal gives you almost nothing useful — a hex code, a cryptic callsign like "BAW226", altitude in feet, speed in knots. No airline name, no aircraft type, no route. So I built layers of translation: callsign prefixes mapped to airlines, type codes decoded into aircraft names, and flight databases cross-referenced for origin and destination. Even then there was a London-specific problem — "going to London" is useless when you live here. Which airport? That needed its own lookup.
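The translation layer itself is mostly lookup tables. Here's a toy version — a handful of hand-picked entries standing in for the real databases, with names and output format of my own choosing:

```python
# Tiny stand-ins for the real lookup tables, which would hold hundreds
# of airline callsign prefixes and ICAO aircraft type codes.
AIRLINES = {"BAW": "British Airways", "RYR": "Ryanair", "EZY": "easyJet"}
TYPES = {"A388": "Airbus A380", "B744": "Boeing 747-400", "B738": "Boeing 737-800"}

def describe(callsign, type_code):
    """Turn a raw callsign like 'BAW226' plus a type code into a
    human-readable description, falling back to the raw codes."""
    prefix, number = callsign[:3], callsign[3:]
    airline = AIRLINES.get(prefix, prefix)
    aircraft = TYPES.get(type_code, type_code)
    return f"{airline} {number} ({aircraft})"
```

The route and London-airport lookups work the same way, just against bigger tables cross-referenced from flight databases.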
I laid it all out on the board like a departures board — callsign, carrier, route, altitude, speed. Rows and columns, crossfades between pages. It worked. It also felt like a spreadsheet with animations.
The fonts were bothering me too. The defaults looked terrible at this size — inconsistent widths, poor spacing, no personality. I happened to be playing Pokémon FireRed on the Switch, and noticed how lovely the in-game font was — designed for the Game Boy Advance's tiny 240x160 screen, every pixel carefully considered for legibility at low resolution. I got a TrueType version rendering on the matrix with anti-aliased edges. On a normal screen you wouldn't notice, but on an LED matrix where every pixel is physically visible, those soft edges are transformative. Suddenly the board had a visual identity.
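The anti-aliasing itself is simple in principle: render the glyphs at a higher resolution, then average down, so hard edges become intermediate brightness levels. A sketch of that downsampling step in pure Python — the real renderer would use a proper TrueType rasteriser such as Pillow; this only shows the principle:

```python
def downsample_2x(bitmap):
    """Average each 2x2 block of a 1-bit glyph bitmap (rendered at 2x
    the target size) into one 0-255 grey pixel. Blocks only partly
    covered by the glyph come out as intermediate greys -- the soft
    edges that matter when every pixel is physically visible."""
    out = []
    for y in range(0, len(bitmap), 2):
        row = []
        for x in range(0, len(bitmap[0]), 2):
            covered = (bitmap[y][x] + bitmap[y][x + 1]
                       + bitmap[y + 1][x] + bitmap[y + 1][x + 1])
            row.append(covered * 255 // 4)  # 0, 63, 127, 191 or 255
        out.append(row)
    return out
```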

But the layout was still the problem. And then it clicked: what if I stopped trying to arrange data and started writing sentences?

The personality
With sentences now typing out across the board, the question became: what should it sound like?
The first personality I tried was Matty Matheson — loud, unhinged chef energy. "HOLY CRAP a British Airways A350 just TORE out of Heathrow!" It was funny for about a day, then it got exhausting. Every sentence was dialled up to 11. The structure got repetitive.
So I switched to David Attenborough. And everything fell into place. Planes became wildlife. Routes became migration paths. Airlines became species. A 747 is "a noble elder, increasingly rare in these parts." A Ryanair 737 is "the hardy sparrow, thriving where others dare not venture." An A380 is "the apex predator of these skies." Same data, completely different character.
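Under the hood, that character is a persona prompt wrapped around the decoded data. The sketch below is my guess at the shape of it — the wording, function name, and field names are illustrative, not the project's actual prompt:

```python
def build_prompt(plane):
    """Assemble a narration request for the LLM. Persona text and
    field names here are illustrative, not the real prompt."""
    return (
        "You are a hushed nature documentarian who narrates aircraft "
        "as if they were wildlife: routes are migrations, airlines are "
        "species. In two warm, observational sentences, describe "
        f"{plane['airline']} flight {plane['number']}, "
        f"a {plane['type']} passing at {plane['alt_ft']} ft "
        f"and {plane['speed_kt']} knots."
    )
```

Swapping personalities then means swapping only the persona preamble; the data plumbing stays identical.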

When a new plane enters range, the board doesn't just update — it does a full-screen colour splash inspired by Pokémon wild encounters. The screen fills with the airline's brand colour and a word like "EYES UP" or "SPOTTED" appears knocked out in black. There are five animation styles that cycle randomly. It turns each sighting into an event — a moment that makes you look up.


The board remembers what it's seen, too. Every aircraft gets logged in a local database. When a repeat visitor shows up, the LLM that writes the narration knows, and reacts warmly. First-timers get introduced. At set times through the day it shows wrap-up summaries — a morning report, an afternoon update, an evening wind-down — each narrated by Attenborough from the real stats.
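The memory can be as simple as an append-only SQLite table keyed on the aircraft's ICAO hex code. A sketch, assuming that approach — schema and function names are mine, not the project's:

```python
import sqlite3

def open_log(path=":memory:"):
    """Open (or create) the local sightings log."""
    db = sqlite3.connect(path)
    db.execute("""CREATE TABLE IF NOT EXISTS sightings (
                      hex      TEXT NOT NULL,
                      callsign TEXT,
                      seen_at  TEXT DEFAULT CURRENT_TIMESTAMP)""")
    return db

def log_sighting(db, hex_code, callsign):
    """Record a sighting and return how many times we've seen this
    airframe before -- 0 means a first-timer worth introducing."""
    prior = db.execute("SELECT COUNT(*) FROM sightings WHERE hex = ?",
                       (hex_code,)).fetchone()[0]
    db.execute("INSERT INTO sightings (hex, callsign) VALUES (?, ?)",
               (hex_code, callsign))
    db.commit()
    return prior
```

The same table also gives the daily wrap-ups their stats for free: counts, first and last sightings, busiest hour — all one `GROUP BY` away.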

After 10pm, the personality shifts. He's still there, but whispering. Conspiratorial. Cargo flights become "nocturnal hunters." It's funny because nobody's watching.
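Mechanically, the shift is just a clock check that swaps which persona variant gets used. A sketch — the cutoff times and mode names here are assumptions:

```python
from datetime import time

def narrator_mode(now):
    """Pick the persona variant from the time of day. After 10pm
    (and before 6am) the board switches to the whispering night
    voice; both cutoffs are illustrative."""
    if now >= time(22, 0) or now < time(6, 0):
        return "night-whisper"
    return "daytime"
```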

Two Claudes
One workflow detail worth calling out: I had Claude running on both my Mac and on the Pi simultaneously. Mac Claude was the architect — designing systems, writing the renderer, managing code. Pi Claude was the field engineer — testing against real hardware, introspecting live APIs, catching things that only break on ARM. When one hit a wall, I'd relay the problem to the other. When Mac Claude pushed a fix, Pi Claude would pull and test. I was the bridge between two AI instances troubleshooting the same problem from different angles. It cut what would've been hours of manual hardware debugging down to minutes, and it became a genuinely interesting creative rhythm.

The result
It sits on my desk in a deep picture frame with a smoked acrylic face. The tinted glass hides the individual LEDs when they're off but lets the light through cleanly when they're on — it looks like a single glowing panel, not a grid of dots. The antenna sits on top. It looks intentional.
Planes aren't always overhead — sometimes twenty minutes pass between sightings. So in its resting state it's a clock, with a tiny pixel plane orbiting the perimeter every thirty seconds. The day, the date, the temperature. It earns its spot on the desk. And because planes appear at irregular intervals, each sighting genuinely feels like something — a flash of colour, Attenborough typing out his observation, then calm again.
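The orbiting plane is a small geometry exercise: map a step counter onto the border of the 64x64 grid. One way to do it (function name mine) — a lap is 4 × 63 = 252 border pixels, so a thirty-second lap means advancing roughly every 120 ms:

```python
def perimeter_pos(step, size=64):
    """Map a step counter to an (x, y) pixel on the border of a
    size x size matrix, walking clockwise from the top-left corner."""
    n = size - 1          # last valid index
    step %= 4 * n         # one full lap visits 4*(size-1) border pixels
    if step < n:
        return (step, 0)              # top edge, left to right
    if step < 2 * n:
        return (n, step - n)          # right edge, top to bottom
    if step < 3 * n:
        return (3 * n - step, n)      # bottom edge, right to left
    return (0, 4 * n - step)          # left edge, bottom to top
```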

This project made me think about how much opportunity there is to make mundane things feel alive. A clock, a thermometer, a plane tracker — none of these are new ideas. But giving them personality, unpredictability, a sense of humour — that changes what they are entirely.
I've got a few more ideas I want to bring to life.
In the meantime, Laura loves it. It hasn't curbed her time on Flight Radar — she now uses it as a supplementary tool: a heads-up that something's going over the flat, before she reaches for her phone to find out more.