It turns out that there is a woeful lack of local astronaut training facilities, so I built my daughters a toy spaceship for Christmas:
This is the first bit of real woodwork that I’ve attempted since I was about 8 (when my Dad showed me how to hold a hammer) and I’m pleased with how it turned out. It took around 10 weeks of long evenings (and some weekend time), plus another 75 hours of programming time (during my daily commute). The project was partly inspired by Jeff Highsmith’s excellent Kids Room Spacecraft in MAKE magazine, and was initially fuelled by an unexpectedly strong coffee from a Shoreditch Baristas leading to some midnight sketching.
There’s a functional control panel, with sound effects, a countdown timer and various lights (powered by a Raspberry Pi and an Arduino Mega). The ‘Auto Launch’ button triggers a countdown and a 2-minute launch to space (with sounds from Apollo 11). My 4-year-old’s favourite is the ‘Waste Dump’ switch, which plays a loud toilet flush, just like Jeff’s spacecraft for MAKE. Amongst other things, there is also a ‘Make Tea’ switch (well, this is a British invention…)
The second cockpit has a blank control panel for now, so that my daughters can suggest ideas for it. On the list so far are: a navigation computer, docking controls and a button to play Twinkle, Twinkle, Little Star.
The structure is built from scratch out of plywood, painted with a couple of coats of primer and then sprayed with satin white to give a slightly glossy finish. The interior is padded with blue panels cut from pet car-protection blankets, velcro’d to the wood. The spacecraft consists of the nose, the tail and two ‘crew modules’, which are held together with more velcro – they can be detached to fit precisely at the ends of my daughters’ beds for storage.
Everything had been done in secret, in the garage. When my daughters opened their last presents on Christmas Day (white cotton overalls, with ironed-on NASA patches, Union Jack flags and velcro’d-on name tags) and went upstairs to change into them, my wife and I hastily brought the spaceship into the lounge, set it up and called them downstairs. There was total confusion for a few seconds – before excitement took over – and then they climbed in to begin their first mission.
I’ve used GitHub, Bitbucket and Unfuddle for hosted Git repositories – and they’re good at what they do – but until recently hadn’t found a good solution for a locally hosted Git repository server. I stumbled across Gitblit, which fits the bill perfectly, and even runs acceptably on a Raspberry Pi. It’s written in Java, so you’ll need to install the Oracle JDK 8. For the record, I used the Raspbian OS and installed the Linux Gitblit GO distribution. Gitblit includes basic issue tracking and repository documentation (via Markdown).
I’d hoped to invite a couple of friends round for some stargazing, but it clouded up shortly after I’d set up the telescope. I managed to get this shot of Jupiter with my smartphone held up to a new eyepiece.
Jupiter is about 11 Earths in diameter.
I’ve been reading up on ‘image stacking’, where you combine tens or hundreds of frames (either still or from video) to improve the detail visible in images – so fingers crossed for some clear nights ahead.
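For the curious, here’s a rough sketch of the core idea in Python – just a mean stack of already-aligned frames (real stacking software also aligns and weights frames, which I’m glossing over; the toy ‘scene’ below is made up purely to show the noise reduction):

```python
import numpy as np

def stack_frames(frames):
    """Average a list of aligned greyscale frames (2-D arrays).

    Random sensor noise falls off roughly as 1/sqrt(N) as frames are
    averaged, so the stacked image shows fainter detail than any
    single frame does.
    """
    return np.mean(np.stack(frames), axis=0)

# Toy demonstration: the same 'scene' with fresh random noise per frame.
rng = np.random.default_rng(42)
scene = np.full((4, 4), 100.0)
frames = [scene + rng.normal(0, 10, scene.shape) for _ in range(100)]
stacked = stack_frames(frames)

# The stacked frame deviates from the true scene far less than any
# single noisy frame does.
print(abs(stacked - scene).mean() < abs(frames[0] - scene).mean())
```

With 100 frames the residual noise is roughly a tenth of a single frame’s – which is why even wobbly smartphone video of a planet can stack into something surprisingly sharp.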
After a couple of months with hardly any time to work on my personal projects, it feels great to be developing again! This is a screenshot of XPDisplay, showing the new airport information display on the map tab:
The airport information is read from the same apt.dat file that XPlane uses, so everything is kept nice and consistent.
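Reading apt.dat is simpler than it sounds – it’s a plain-text file of numbered row codes. Here’s a minimal Python sketch of pulling out land airports (per the X-Plane apt.dat spec, a row code of 1 opens a land-airport record; seaplane bases, heliports and runway rows use other codes, which this sketch skips – and the sample line below is illustrative, not real data):

```python
def parse_airports(lines):
    """Extract (icao, elevation_ft, name) tuples from apt.dat-style text.

    A land-airport header line looks like:
        1 <elevation ft> <deprecated> <deprecated> <ICAO> <name...>
    Everything else (runways, taxiways, etc.) is ignored here.
    """
    airports = []
    for line in lines:
        parts = line.split()
        if len(parts) >= 6 and parts[0] == "1":
            elevation = int(parts[1])
            icao = parts[4]
            name = " ".join(parts[5:])
            airports.append((icao, elevation, name))
    return airports

sample = [
    "1   249 1 0 EGXX Example Field",  # hypothetical airport record
    "100 29.0 ...",                    # runway rows (code 100) skipped
]
print(parse_airports(sample))  # [('EGXX', 249, 'Example Field')]
```

XPDisplay does rather more than this, of course, but the nice thing about reading the same file as X-Plane is that there’s no second copy of the airport data to drift out of date.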
No photo unfortunately, but last night I successfully observed supernova SN 2014J – a new supernova in the galaxy M82 that was only discovered on January 21st – using my 8-inch Meade LX200 telescope and a 15mm Super Plössl eyepiece.
I’m only just getting the hang of using the telescope, so finding M82 (the “Cigar Galaxy”) was tricky. The galaxy appeared as an elongated smudge, only just visible, but with a few stars around. I made a rough sketch of what I saw and compared it with a finder chart and Stellarium so that I could be sure I’d found the right place.
It’s pretty cool to see light from a dying star that’s travelled 11.5 million light years.
I took this using a normal point-and-shoot camera held up to the eyepiece of my telescope last night. No red spot on Jupiter – that would have appeared around midnight (Jupiter rotates completely in 10 hours – crazily fast) – but you can see the bands of cloud quite clearly. It might have been a clearer picture but for the thin cloud (on our planet) and the condensation that formed and reformed every few minutes on the telescope.
The scenery covers most of the main objects on the airfield, although there is always room for improvement (in particular the main hangar needs remodelling from scratch). Bengeo water tower is included, as it is a prominent local feature that pilots sometimes use for navigation.
Here are two more traces, generated from a whole day’s worth of data (29th June 2013). I’ve added some more ground locations, as well as the places that the ‘stacks’ are named after (BOVINGDON, LAMBOURNE, BIGGIN and OCKHAM). Happily, they coincide nicely with the loops that indicate where aircraft are circling. Colours here are assigned randomly to different aircraft.
The second trace is coloured according to altitude (purple corresponds roughly to 35,000ft–40,000ft, down through green, yellow and orange to red, which corresponds to roughly 2,000ft–5,000ft).
You can clearly see the high-altitude routes (purple) as well as the stacks and approach routes into Heathrow. I expect that the routes into and out of Gatwick would also be visible if the range of my receiver were better. I’ve tried building a custom aerial, but haven’t had any luck with it so far.
For completeness, here is a trace coloured according to speed. Purple indicates 500-600kts, down through green, yellow and red at about 100-150kts.
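The colouring itself is just a linear mapping from a value (altitude or speed) onto a hue sweep. A minimal Python sketch of the idea, using the standard library’s colorsys (this isn’t my original Groovy code – just an illustration of the red-to-purple gradient the traces use):

```python
import colorsys

def value_to_rgb(value, lo, hi):
    """Map a value in [lo, hi] to an RGB colour from red (low values)
    through yellow and green up to purple (high values), by sweeping
    the HSV hue from 0.0 (red) to 0.75 (violet)."""
    frac = max(0.0, min(1.0, (value - lo) / (hi - lo)))
    hue = 0.75 * frac
    r, g, b = colorsys.hsv_to_rgb(hue, 1.0, 1.0)
    return (round(r * 255), round(g * 255), round(b * 255))

print(value_to_rgb(2000, 2000, 40000))   # lowest altitude -> (255, 0, 0), red
print(value_to_rgb(40000, 2000, 40000))  # highest altitude -> violet
```

The same function covers the speed trace too – just feed it 100–600kts instead of 2,000–40,000ft.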
One of my trusty Raspberry Pis has an SDR (Software Defined Radio) USB stick, which is currently tuned in to aircraft ADS-B transponders (which basically means that the Pi receives the position, altitude, speed and heading of every commercial aircraft within 50–100km – sometimes up to 200km, if the conditions are right).
The Pi now logs this data to a file every 10 seconds. I copied the file off the Pi this evening and, with a bit of Groovy scripting, was able to generate these traces of the aircraft seen over the last 8 hours or so:
You can clearly see the Lambourne, Biggin and Ockham holding stacks, where the planes circle around for a while, descending until given a landing slot.
The aircraft disappear from my screen below 1500ft or so, which is why you can see lots of planes approaching Heathrow (from the east/right) but they all seem to disappear before they get there. Similarly with Gatwick, many aircraft disappear from the right (east) as they arrive and many appear to the left (west) as they leave.
The data and the processing aren’t perfect, but it makes for a pretty picture for now.
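The trace-building step is mostly a grouping exercise. Here’s the gist in Python rather than the Groovy I actually used (and the log format below is a guess for illustration – comma-separated timestamp, ICAO address, latitude, longitude – since the real format isn’t shown here):

```python
from collections import defaultdict

def build_traces(log_lines):
    """Group logged position reports into one trace per aircraft.

    Assumes (hypothetically) comma-separated lines of:
        timestamp, icao24, lat, lon
    Each trace is sorted by time so a drawn line follows the flight path.
    """
    traces = defaultdict(list)
    for line in log_lines:
        ts, icao, lat, lon = line.strip().split(",")
        traces[icao].append((float(ts), float(lat), float(lon)))
    return {icao: sorted(points) for icao, points in traces.items()}

log = [
    "1372500000,4CA123,51.50,-0.10",
    "1372500010,4CA123,51.52,-0.12",
    "1372500000,400F01,51.30,0.05",
]
traces = build_traces(log)
print(len(traces))            # 2 aircraft seen
print(len(traces["4CA123"]))  # 2 points in that aircraft's trace
```

From there it’s just a matter of projecting lat/lon to screen coordinates and drawing a polyline per aircraft, coloured however you fancy.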
I’ve been playing with a Raspberry Pi, Logitech USB mic, USB-powered speakers and a USB SDR TV tuner stick, combined with Stephen Hickson’s fantastic voicecommand system. (The Pi is the rainbow-striped box at the back; the flat black thing in the middle is a powered USB hub, into which is plugged the black TV tuner stick, which has an aerial (positioned on the small jar). The Rubik’s Cube is just there for scale.)
First, speech is sent as raw audio to a Google API which returns text. It would be better to do this on the Pi, but my experiments (with Julius) have shown that Google (with their gargantuan computing grid) is much better in terms of both speed and accuracy. (Since the microphone has an on/off button, audio is only sent to Google when I so choose.)
Second, the text is compared with a list of known commands (see the voicecommand website for more details). If a match is found, the corresponding script is run. (This is how the ‘weather forecast’ and ‘have I got mail’ commands work.) If no match is found, the text is sent off to Wolfram Alpha, which returns a text answer.
Finally, the results from Wolfram Alpha, or the appropriate script, are sent off to another Google API to turn them into an audio file, which is then played out over the speakers. I have tried using espeak, but again, Google’s API currently does a better job.
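The middle step – match a command or fall back to a general query – boils down to a simple dispatch. A toy Python sketch of that control flow (this is just to illustrate the idea; voicecommand itself works differently under the hood, and the command phrases and replies below are made up):

```python
def handle(text, commands, fallback):
    """Dispatch recognised speech: if a known command phrase appears in
    the text, run its action; otherwise hand the raw text to a fallback
    (a Wolfram Alpha query, in the setup described above)."""
    lowered = text.lower()
    for phrase, action in commands.items():
        if phrase in lowered:
            return action()
    return fallback(text)

commands = {
    "weather forecast": lambda: "Rain later, clearing overnight.",
    "have i got mail": lambda: "You have 2 unread messages.",
}
fallback = lambda text: "Asking Wolfram Alpha: " + text

print(handle("What's the weather forecast?", commands, fallback))
print(handle("How far away is Jupiter?", commands, fallback))
```

The nice property of this shape is that adding a new command is just another entry in the table – no changes to the dispatch logic.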
The whole thing is reasonably fast, given everything that is involved. Occasional internet latency spikes delay responses from the script for 10 seconds or so, but in my experience they are rare.
The live aircraft information is received using the Software Defined Radio (SDR) technique: an RTL2838 TV tuner USB stick with the rtl-sdr and dump1090 software, which provides a nice JSON interface over HTTP. A Python script queries this interface on demand and computes the nearest couple of aircraft to my location, then gathers some supplementary information from the internet before reading out the response.
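The nearest-aircraft calculation is a great-circle distance plus a sort. A Python sketch of that step (the haversine formula is standard; the dict field names are illustrative – the exact keys in dump1090’s JSON output may differ):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two lat/lon points."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearest(aircraft, my_lat, my_lon, n=2):
    """Return the n closest aircraft; each is a dict assumed to carry
    'lat'/'lon' keys (aircraft without a position yet are skipped)."""
    return sorted(
        (a for a in aircraft if a.get("lat") is not None),
        key=lambda a: haversine_km(my_lat, my_lon, a["lat"], a["lon"]),
    )[:n]

planes = [
    {"flight": "BAW123", "lat": 51.6, "lon": -0.2},
    {"flight": "EZY456", "lat": 52.4, "lon": 1.0},
]
print([a["flight"] for a in nearest(planes, 51.5, -0.1)])  # BAW123 is closer
```

The real script then takes the winner’s callsign and looks up route details online before composing the spoken answer.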
Future plans include: adding commands to play music, add items to a shopping list, read news headlines and much more. My four-year-old daughter’s most recent request was for the AnswerBox to gain wings and fly around the room on request. There’s probably a Python library for that. Hmmm…
About 10 years ago, I wrote Alternate Universe MUD, a free text-based online adventure game with a space theme and an emphasis on exploration and discovery.
Note: for younger/less geeky readers, a MUD is a Multi User Dungeon, i.e. a game that you could play (with other people, no less!), which often revolved around dungeons, monsters and combat. Massively Multiplayer Online Role-Playing Games (MMORPGs) like World of Warcraft can trace their history back to these kinds of games.
The game features spaceships, several worlds, an astronomical observatory, mudmail and mud-wide-web terminals, a stock market, various robots and even an auditorium where Shakespeare’s plays are performed by bots every few hours.
Over 2000 players discovered Alternate Universe over the first few years (not bad for something that started out just as an experiment). Most players just played, but others contributed code, some helped build the world from the inside and one even ran an in-game monthly newspaper. I ended up meeting some of them in real life too, as a result of all this.
The game server itself was written from scratch in Java and has had many homes, but thanks to Oracle’s release of Java 8 for ARM in December, it now runs on a Raspberry Pi (and it runs happily too, using less than a third of the 512MB of memory – AU was mostly written on a 400MHz HP laptop with roughly half the CPU speed of the Pi, so I’m not that surprised).
(the black thing poking out by the yellow S-Video socket is an external temperature sensor – I hope to wire this up to the MUD, so that the external temperature drives some behaviour or description inside the game)
To play Alternate Universe, go to the command-line and type telnet alternateuniverse.dyndns.org 1063
If you are playing from another Pi, you may first need to install the ‘telnet’ program: sudo apt-get install telnet
I hope you have fun playing. Drop me a MUD-mail if you enjoy the game!