Category: pi

Raspberry Pi – AnswerBox

I’ve been playing with a Raspberry Pi, a Logitech USB mic, USB-powered speakers and a USB SDR TV tuner stick, combined with Stephen Hickson’s fantastic voicecommand system. (The Pi is the rainbow-striped box at the back; the flat black thing in the middle is a powered USB hub, into which is plugged the black TV tuner stick, which has an aerial (positioned on the small jar). The Rubik’s Cube is just there for scale.)

First, speech is sent as raw audio to a Google API which returns text. It would be better to do this on the Pi, but my experiments (with Julius) have shown that Google (with their gargantuan computing grid) is much better in terms of both speed and accuracy. (Since the microphone has an on/off button, audio is only sent to Google when I so choose.)
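
Under the hood, the speech-to-text step boils down to something like the following (a rough sketch only – it assumes the old, unauthenticated Google speech API endpoint that voicecommand relied on at the time, which may have changed or been retired since, and plughw:1,0 is just where my USB mic happens to show up):

arecord -D plughw:1,0 -f cd -t wav -d 3 speech.wav   # record 3 seconds from the USB mic
flac -f speech.wav -o speech.flac                    # the API expects FLAC audio
curl -s -X POST --header 'Content-Type: audio/x-flac; rate=44100' \
     --data-binary @speech.flac \
     'https://www.google.com/speech-api/v1/recognize?lang=en-us'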

Second, the text is compared with a list of known commands (see the voicecommand website for more details). If a match is found, the corresponding script is run. (This is how the ‘weather forecast’ and ‘have I got mail’ commands work.) If no match is found, the text is sent off to Wolfram Alpha, which returns a text answer.
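
Out of interest, the Wolfram Alpha fallback is essentially just a web query like the one below (a sketch rather than exactly what voicecommand does – you need your own AppID from developer.wolframalpha.com, and the question text is only an example):

curl -s -G 'http://api.wolframalpha.com/v2/query' \
     --data-urlencode 'appid=YOUR_APPID' \
     --data-urlencode 'format=plaintext' \
     --data-urlencode 'input=what is the tallest mountain'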

Finally, the results from Wolfram Alpha, or the appropriate script, are sent off to another Google API to turn them into an audio file, which is then played out over the speakers. I have tried using espeak, but again, Google’s API currently does a better job.
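
The text-to-speech step looks roughly like this (a sketch assuming the unofficial Google Translate TTS endpoint that was popular at the time – it is unsupported and rate-limited, so treat it as illustrative; mpg123 is one of several players that will do):

TEXT='The weather today is mostly sunny'
curl -s -A 'Mozilla/5.0' -G 'http://translate.google.com/translate_tts' \
     --data-urlencode 'tl=en' --data-urlencode "q=$TEXT" -o answer.mp3
mpg123 answer.mp3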

The whole thing is reasonably fast, given everything that is involved. Occasional internet latency spikes can delay a response by 10 seconds or so, but in my experience they are rare.

The live aircraft information is received via Software Defined Radio (SDR), using an RTL2838 TV tuner USB stick with the rtl-sdr and dump1090 software; dump1090 provides a nice JSON interface over HTTP. A python script queries this interface on demand and finds the couple of aircraft nearest to my location, then gathers some supplementary information from the internet before reading out the response.
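
If you want to poke at the same data yourself, dump1090’s web interface can be queried directly (a sketch – it assumes dump1090 was started with --net and is serving on its default port 8080, and the exact field names vary between versions):

curl -s http://localhost:8080/data.json
# returns a JSON array of aircraft, roughly of the form
# [{"hex":"400f01","flight":"BAW123","lat":51.47,"lon":-0.45,"altitude":12000,"speed":410, ...}, ...]
# The python script loads this list, computes the great-circle distance from my
# location to each aircraft that has a position, and reads out the nearest couple.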

The scripts that make all this work are available on GitHub.

Future plans include commands to play music, add items to a shopping list, read news headlines and much more. My four-year-old daughter’s most recent request was for the AnswerBox to gain wings and fly around the room on request. There’s probably a python library for that. Hmmm….

MudPi

About 10 years ago, I wrote Alternate Universe MUD, a free text-based online adventure game with a space theme and an emphasis on exploration and discovery.

Note: for younger/less geeky readers, a MUD is a Multi User Dungeon, i.e. a game that you could play (with other people, no less!), which often revolved around dungeons, monsters and combat. Massively Multiplayer Online Role-Playing Games (MMORPGs) like World of Warcraft can trace their history back to these kinds of games.

The game features spaceships, several worlds, an astronomical observatory, mudmail and mud-wide-web terminals, a stock market, various robots and even an auditorium where Shakespeare’s plays are performed by bots every few hours.

More than 2000 players discovered Alternate Universe in its first few years (not bad for something that started out just as an experiment). Most players just played, but others contributed code, some helped build the world from the inside and one even ran an in-game monthly newspaper. As a result of all this, I ended up meeting some of them in real life too.

The game server itself was written from scratch in Java and has had many homes, but thanks to Oracle’s release of Java 8 for ARM in December, it now runs on a Raspberry Pi. It runs happily too, using less than a third of the Pi’s 512MB of memory – AU was mostly written on a 400MHz HP laptop with about half the CPU speed of the Pi, so I’m not that surprised.


(the black thing poking out by the yellow S-Video socket is an external temperature sensor – I hope to wire this up to the MUD, so that the external temperature drives some behaviour or description inside the game)

Play!
To play Alternate Universe, go to the command line and type
telnet alternateuniverse.dyndns.org 1063
If you are playing from another Pi, you may first need to install the ‘telnet’ program:
sudo apt-get install telnet

I hope you have fun playing. Drop me a MUD-mail if you enjoy the game!

RPi CPU load and temperature logging to Cosm

Here is a 15-minute recipe to get your Raspberry Pi logging data to Cosm.com, which provides a RESTful API to query the data and produce customized charts.

I use two scripts that cron runs every 5 minutes: one logs data to a CSV file, and the other uploads the data to Cosm and then deletes the CSV file. That way, if my internet connection goes down for a while, the logged data is not lost. I keep a copy of these scripts in a separate directory for each variable (‘datastream’) that I monitor, which makes them easier to manage.

CPU load

First, create an account on Cosm.com (it’s free & quick).

Now set up a new feed (a feed is a collection of datastreams; each datastream is a series of timestamped data points, aka a ‘time series’):

  • “+ Device/Feed”
  • What type of device/feed do you want to add?: Choose ‘Something Else’
  • Step 1 – Data: Choose ‘No, I will push data to Cosm’
  • Step 2 – Title: think of a relevant name for your feed, e.g. “MyDesktopPi”
  • Step 3 – Tags: give it any relevant tags that might help you find it in future (this is only really useful for public feeds)
  • Step 4 – Create. Make a note of the feed ID – we’ll use it later.

Once you’ve created a feed, you can add datastreams to it. A datastream represents a value that your Pi will monitor over time, like temperature or CPU load (or internet connectivity, washing machine activity, presence of your mobile phone on your LAN, etc.).

  • “+ Datastream”
  • ID: This doesn’t need to be numeric – you can enter something like ‘Pi_CPU_Load’. Make a note of this datastream ID too.
  • Tags: e.g. ‘pi cpu load’
  • Units: ‘Capacity’ – this is free text
  • Symbol: leave blank

At this point, your feed is public, i.e. anyone can view the current data. This may be fine, but if you want to change it, scroll down to the ‘Feed Status’ section at the bottom of the page.

Now, at the bottom of the screen, click the green ‘Save Changes’ button. (This is an area of the Cosm UI that needs work, IMHO, as you expect to find this button near the data that you’re editing.)

You can get back to the edit feed / add datastream page at any time by clicking the little cog icon on the right hand side of the feed name and choosing ‘edit’.

Now that we’ve defined our feed and datastream, we need to give our script permission to upload data to our Cosm datastream. This is done by generating an API key.

  • In the top-right of the cosm web page, click the ‘Keys’ icon.
  • Click the ‘+ Key’ icon, give the key a label (ID), e.g. ‘MyDesktopPi_UploadScriptKey’, and choose feed restrictions: ‘Use any feed (including my private feeds)’ and access privileges: ‘all’, then click ‘create’.
  • Make a note of the long alphanumeric API key string, as we’ll use that in a moment.
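
Before wiring the key into the scripts, you can check that it works with a quick request for your feed’s metadata (a sketch – substitute your own feed ID and key):

curl --header "X-ApiKey: your-long-alphanumeric-api-key-here" http://api.cosm.com/v2/feeds/your-feed-id-here.json

If the key and feed ID are right, this returns the feed as JSON rather than an ‘unauthorized’ error.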

The last thing to do is to create the scripts on the Pi that will upload data to cosm.

Log in as ‘pi’.

cd ~
mkdir cosm
cd cosm
mkdir load
cd load

Install curl:
sudo apt-get install curl

Using your favourite text editor, create the file ‘log.sh’:
#!/bin/bash
####################################################
# Please customize these values appropriately:
LOCATION=/home/pi/cosm/load
#VALUE=$( uptime | awk -v FS="[, ]" '{print $18}' )
# alternatively:
VALUE=$( cat /proc/loadavg | awk '{print $2}' )
####################################################
TIME=`/bin/date -u +%FT%XZ`
if [[ "$VALUE" == "" ]]
then
    VALUE=0
fi
echo "$TIME,$VALUE" >> $LOCATION/cosm.csv
exit 0

.. and save it. The line VALUE=$( cat /proc/loadavg | awk '{print $2}' ) simply takes the second number from the /proc/loadavg file, which is the 5-minute average of the CPU load.
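
For example, on my Pi /proc/loadavg looks something like this at the moment (your numbers will obviously differ) – the second field, 0.12 here, is the 5-minute average:

cat /proc/loadavg
0.08 0.12 0.10 1/98 2710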

Create a file called ‘upload-cosm.sh’:
#!/bin/bash
####################################################
# Please customize these values appropriately:
LOCATION=/home/pi/cosm/load
API_KEY='your-long-alphanumeric-api-key-here'
FEED_ID='your-feed-id-here'
DATASTREAM_ID='your-datastream-id-here'
####################################################
COSM_URL=http://api.cosm.com/v2/feeds/$FEED_ID/datastreams/$DATASTREAM_ID/datapoints.csv
sleep 2 # gives any data logging scheduled at the same time a chance to run
echo $COSM_URL
# --fail makes curl return a non-zero exit status on an HTTP error, so the CSV
# file is only deleted if the upload actually succeeded
curl -v --fail --request POST --header "X-ApiKey: $API_KEY" --data-binary @$LOCATION/cosm.csv $COSM_URL
if [ $? -eq 0 ]
then
    rm $LOCATION/cosm.csv
    #echo "Would delete file now."
fi
#echo "Done"

.. and save that too. Exit the text editor, and make all the .sh scripts executable with:
chmod u+x *.sh

(I’ll assume that you’ve changed LOCATION, API_KEY, FEED_ID and DATASTREAM_ID appropriately for your system)

The log.sh script will append to a file called cosm.csv every time it is run. You can try it now if you like:
./log.sh
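
Each run appends one timestamped value, so after a couple of runs cosm.csv contains lines like these (example values only) – one ‘timestamp,value’ pair per line, which is the CSV format the Cosm datapoints API expects:

cat cosm.csv
2013-02-03T10:05:02Z,0.12
2013-02-03T10:10:02Z,0.08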

To schedule these scripts, we edit the ‘crontab’ file, which tells cron which scripts to run and when. My favourite editor is ‘joe’; yours might be ‘vi’, ‘emacs’ or another. The first line below makes sure that crontab uses your editor to edit the crontab file:

export EDITOR=joe
crontab -e

Append these lines to the end of your crontab file (leaving a blank line at the end):

*/5 * * * * /home/pi/cosm/load/log.sh
*/5 * * * * /home/pi/cosm/load/upload-cosm.sh

*/5 means ‘run every 5 minutes’. See the crontab(5) man page for more details.

Save the crontab file and exit your editor. Now you can either wait for five minutes, or simply run the upload script with:
./upload-cosm.sh

With luck, you should now be able to reload your Cosm.com page and see the data uploaded to your datastream as a chart.

CPU temperature

Simply copy the ~/cosm/load directory:
cp -R ~/cosm/load ~/cosm/temperature
cd ~/cosm/temperature

Edit the log.sh script, replacing the LOCATION and VALUE lines with:

LOCATION=/home/pi/cosm/temperature
# VALUE is in degrees Celsius
VALUE=$( cat /sys/class/thermal/thermal_zone0/temp | awk '{print $1/1000}' )

This takes the first value from /sys/class/thermal/thermal_zone0/temp (in fact there is only one number) and divides it by 1000 to get degrees C (the raw value is in thousandths of a degree).
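
For example (illustrative value only):

cat /sys/class/thermal/thermal_zone0/temp
47078

.. which log.sh records as 47.078 (degrees C).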

Edit the upload-cosm.sh script with your temperature datastream ID (update the LOCATION too), and add to crontab:

* * * * * /home/pi/cosm/temperature/log.sh
*/5 * * * * /home/pi/cosm/temperature/upload-cosm.sh

(I’ve chosen to record the temperature every minute, but upload the values every 5 minutes)

Note the daily variation, even though this is monitoring CPU temperature! I guess the Pi is in a south-facing room which warms up during the day, but I didn’t expect to see this so clearly. The dropouts/spikes that you see in the data are caused by occasional erroneous values returned by the temperature sensor.

If you find this useful, please post a comment indicating the type of data that you’re monitoring (and maybe the line of script that captures the data?).

Thanks for reading!

Dansette