Working with GPX files in Python

Extracting specific track segments from a track

I have an i-Blue 747A+ GPS logger which I use to track my runs (amongst other things). Afterwards I use BT747 to retrieve the data from the device and create a GPX file of the run, which I then upload to Endomondo to get nice graphs and statistics.

I need to modify the GPX file slightly before I can do so, however: I use the button on the device to mark the beginning and end of the run, and these marks appear as waypoints in the GPX file. BT747 creates separate track segments (within a single track) between each waypoint, but Endomondo ignores them. I therefore need to extract the single track segment covering the actual run and create a new GPX file with just that segment. To do this I wrote a Python script, splittrack.py. It uses the gpxdata library to parse the input file, locates any track segments which match a set of time, distance, displacement and speed criteria1, and outputs a new GPX file containing just those segments.

% splittrack.py mgorven-20121015_0109.gpx > run-20121014.gpx
Reading mgorven-20121015_0109.gpx
<TrackSegment (23 points)> covers 21m over 0:00:22 at 1.0m/s average speed with 4m displacement
<TrackSegment (904 points)> covers 3018m over 0:15:03 at 3.3m/s average speed with 8m displacement
Adding <TrackSegment (904 points)>
<TrackSegment (4 points)> covers 3m over 0:00:03 at 1.3m/s average speed with 3m displacement
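The selection logic itself is simple. The actual script uses the gpxdata library; as a rough illustration, here is a minimal sketch of the same idea written against the more widely known gpxpy library, with made-up threshold values (the real script's criteria differ):

import sys
import gpxpy
import gpxpy.gpx

# Illustrative thresholds only; splittrack.py uses its own criteria
MIN_DISTANCE = 1000      # metres covered
MIN_DURATION = 600       # seconds
MIN_SPEED = 2.0          # metres per second average

def displacement(segment):
    # Straight-line distance between the first and last points
    if len(segment.points) < 2:
        return 0
    return segment.points[0].distance_2d(segment.points[-1])

def looks_like_a_run(segment):
    distance = segment.length_2d() or 0
    duration = segment.get_duration() or 0
    speed = distance / duration if duration else 0
    return (distance >= MIN_DISTANCE and duration >= MIN_DURATION
            and speed >= MIN_SPEED)

with open(sys.argv[1]) as f:
    gpx = gpxpy.parse(f)

output = gpxpy.gpx.GPX()
track = gpxpy.gpx.GPXTrack()
output.tracks.append(track)

for src_track in gpx.tracks:
    for segment in src_track.segments:
        if looks_like_a_run(segment):
            track.segments.append(segment)

print(output.to_xml())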

Integrating heart rate data

I recently bought an Oregon Scientific WM100 heart rate logger. It listens to the broadcasts from a heart rate strap2 and records a measurement every 2 seconds. I retrieve the data using the wm100 driver for Linux, which writes a CSV file like this:

Name,2012-10-14T18:08:27
Description,
Date,10/14/2012
Time,18:08:27
SamplingRate,2
HeartRate
,80
,78
,76
,75

In order to get this data into Endomondo, I needed to combine the GPS trace and the HRM data into a single file in a format which Endomondo accepts. I initially started implementing a library for the TCX format3, but then discovered that there is a GPX extension for including heart rate data which Endomondo accepts. So I wrote a script in Python, wm100gpx.py, which reads the input GPX and CSV files, merges the heart rate measurements into the GPX records, and outputs a new GPX file.

% wm100gpx.py 2012-10-14T18:08:27.csv < mgorven-20121015_0109.gpx > run-20121014.gpx

The entries look like this:

<trkpt lat="37.392051" lon="-122.090240">
  <ele>-44.400761</ele>
  <time>2012-10-15T01:20:13Z</time>
  <extensions>
    <gpxtpx:TrackPointExtension>
      <gpxtpx:hr>175</gpxtpx:hr>
    </gpxtpx:TrackPointExtension>
  </extensions>
</trkpt>
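The merging step is mostly bookkeeping. Here is a minimal sketch of the idea (not the actual wm100gpx.py) using the standard library's ElementTree: it reads the start time and sampling rate from the CSV header, then matches each track point to a heart rate sample by its time offset. It assumes the CSV timestamp and the GPX timestamps are in the same timezone, which the real script has to account for.

import csv
import datetime
import xml.etree.ElementTree as ET

GPX_NS = "http://www.topografix.com/GPX/1/1"
TPX_NS = "http://www.garmin.com/xmlschemas/TrackPointExtension/v1"
ET.register_namespace("", GPX_NS)
ET.register_namespace("gpxtpx", TPX_NS)

def read_heart_rates(path):
    """Return (start_time, interval_in_seconds, list_of_rates) from a wm100 CSV."""
    with open(path) as f:
        rows = list(csv.reader(f))
    start = datetime.datetime.strptime(rows[0][1], "%Y-%m-%dT%H:%M:%S")
    interval = int(rows[4][1])                        # SamplingRate row
    rates = [int(r[1]) for r in rows[6:] if len(r) > 1 and r[1]]
    return start, interval, rates

def merge(gpx_path, start, interval, rates):
    tree = ET.parse(gpx_path)
    for trkpt in tree.iter("{%s}trkpt" % GPX_NS):
        when = datetime.datetime.strptime(
            trkpt.find("{%s}time" % GPX_NS).text, "%Y-%m-%dT%H:%M:%SZ")
        index = int((when - start).total_seconds() // interval)
        if 0 <= index < len(rates):
            ext = ET.SubElement(trkpt, "{%s}extensions" % GPX_NS)
            tpx = ET.SubElement(ext, "{%s}TrackPointExtension" % TPX_NS)
            hr = ET.SubElement(tpx, "{%s}hr" % TPX_NS)
            hr.text = str(rates[index])
    return tree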


  1. I actually initially wrote this to find tracklogs of runs amongst all my tracklogs. 

  2. I use a strap from an entry level Nike triax C3 heart rate monitor watch. 

  3. Which is quite exhaustive... 

Migrating SMSes from Nokia to Android

My wife recently got a Samsung Exhibit II 4G Android phone to replace her aging Nokia E63. Migrating her contacts was accomplished fairly easily by exporting them to CSV with Nokia PC Suite and then importing them into Google Contacts. Migrating SMSes was not so trivial however.

Other approaches

There are a couple of methods floating around the web, but none of them was suitable. This one uses Gammu to retrieve the SMSes from the Nokia, and then a script to convert them to an XML format readable by SMS Backup & Restore. It turns out that Gammu doesn't work on Symbian S60v3 devices, however. This script can convert SMSes exported by the MsgExport app to XML for SMS Backup & Restore, but I didn't feel like paying for it or dealing with the Ovi Store. VeryAndroid is a Windows application which can convert SMSes from Nokia PC Suite CSV format and sync them directly to an Android device, and Nokia2AndroidSMS can convert SMSes from the Ovi Suite database to XML for SMS Backup & Restore. I didn't want to deal with more Windows software though, so I decided to write my own.

Formats

I already had the Nokia PC Suite installed and was using it to migrate contacts, so I decided to work with the CSV output it generates for messages. A received SMS looks like this:

sms,deliver,"+16501234567","","","2012.06.13 19:13","","Leaving now"

and a sent SMS looks like this:

sms,submit,"","+16501234567","","2012.06.13 19:11","","Where are you?"

The fields are:

  • 0: "sms" for SMSes (MMS is presumably different)
  • 1: "deliver" for received messages, "submit" for sent messages
  • 2: Sender's phone number (blank for sent messages)
  • 3: Recipient's phone number (blank for received messages)
  • 5: Date and time in "YYYY.MM.DD HH:MM" format and local timezone
  • 7: Message body

Fields 4 and 6 are always empty for SMSes (they are probably used for MMSes, one being the message subject).

I also decided to generate XML for the SMS Backup & Restore app. The XML format looks like this:

<?xml version='1.0' encoding='UTF-8' standalone='yes' ?>
<?xml-stylesheet type="text/xsl" href="sms.xsl"?>
<smses count="1">
  <sms
    protocol="0"
    address="+16501234567"
    date="1341025384351"
    type="1"
    subject="null"
    body="Leaving now"
    toa="null"
    sc_toa="null"
    service_center="+12063130025"
    read="1"
    status="-1"
    locked="0"
    date_sent="null"
    readable_date="Jun 29, 2012 8:03:04 PM"
    contact_name="(Unknown)"
  />
</smses>

but can be reduced down to this:

<?xml version="1.0" encoding="UTF-8"?>
<smses count="1">
    <sms
        protocol="0"
        address="+16501234567"
        date="1341025384351"
        type="1"
        body="Leaving now"
        read="1"
        status="-1"
    />
</smses>

The attributes of the <sms> element are:

  • protocol: Always "0" (possibly different for MMS)
  • address: Sender or recipient phone number
  • date: Date and time in milliseconds since 1 January 1970
  • type: "1" for received message, "2" for sent messages
  • body: Message body
  • read: "1" if the message has been read
  • status: Always "-1"

The script

I implemented a script called [nokia2android.py] in [Python] to convert one or more CSV files to this XML format.

./nokia2android.py received.csv sent.csv > android.xml
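The core of the conversion is straightforward. The sketch below captures the gist of it (it is not the actual nokia2android.py): it maps the CSV fields described above onto the reduced set of <sms> attributes, treating the CSV timestamps as local time.

#!/usr/bin/env python3
import csv
import sys
import time
from xml.sax.saxutils import quoteattr

def to_millis(stamp):
    # "YYYY.MM.DD HH:MM" in local time -> milliseconds since the epoch
    return int(time.mktime(time.strptime(stamp, "%Y.%m.%d %H:%M")) * 1000)

messages = []
for path in sys.argv[1:]:
    with open(path) as f:
        for row in csv.reader(f):
            if not row or row[0] != "sms":
                continue
            sent = (row[1] == "submit")
            messages.append((row[3] if sent else row[2],    # address
                             to_millis(row[5]),              # date
                             2 if sent else 1,               # type
                             row[7]))                        # body

print('<?xml version="1.0" encoding="UTF-8"?>')
print('<smses count="%d">' % len(messages))
for address, date, msgtype, body in messages:
    print('  <sms protocol="0" address=%s date="%d" type="%d" body=%s '
          'read="1" status="-1" />'
          % (quoteattr(address), date, msgtype, quoteattr(body)))
print('</smses>')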

The XML file can then be transferred to the Android device (using USB or Bluetooth) and stored in /sdcard/SMSBackupRestore. It will then be presented as an option after selecting the Restore button in the app.

Ibid finally released!

After over a year of development, we have finally released Ibid. Ibid is a general purpose chat bot written in Python. We've suffered from a bit of feature creep, so despite being a 0.1 release it speaks 7 messaging protocols and has over 100 features provided by plugins. I think we also have an excellent architecture and a very developer-friendly plugin API. The 0.1.0 release can be downloaded from Launchpad or installed from our PPA.

Playing with Python and IRC

I wrote three IRC bots in [Python][] this last week (although one was a rewrite). They probably aren't very useful to most people, but I'm going to share them anyway in case someone finds them interesting.

The first one was prompted by [Adrian][], who is maintaining a [countdown][] until his wedding as a factoid in [Spinach][]. Since [Knab][] doesn't actually support countdowns, it has to be updated manually. This clearly isn't the Right Way to do this, and so I hacked together a [script][irccountdown] which connects to IRC and teaches Spinach the updated factoid. I run this as a daily [cronjob][] to keep the countdown up to date.

As is usually the case with Python, there was already a library for accessing IRC, namely [irclib][]. It isn't documented very well, but it has a couple of example scripts which are fairly easy to follow. It follows an event-based model, so you write functions which will be called when certain events occur (such as receiving a message).
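For a flavour of that event-based style, here is a minimal sketch of a bot along those lines. It uses the modern `irc` package (the successor to irclib), and the server, channel and countdown text are made-up placeholders; the real script talks to Spinach to update the factoid rather than answering queries itself.

import irc.bot

class CountdownBot(irc.bot.SingleServerIRCBot):
    def __init__(self):
        # Placeholder server, port and nick
        super().__init__([("irc.example.net", 6667)], "countdownbot", "countdownbot")

    def on_welcome(self, connection, event):
        # Called once the server has accepted our connection
        connection.join("#example")

    def on_pubmsg(self, connection, event):
        # Called for every channel message; event.arguments[0] is the text
        if event.arguments[0].strip() == "!countdown":
            connection.privmsg(event.target, "42 days to go")

if __name__ == "__main__":
    CountdownBot().start()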

The final of the [Currie Cup][] was held on Saturday, and my team (the [Sharks][]) won. I followed the match online using [SuperSport's][] live score site1. I then thought that it would be cool to have the score announced on IRC whenever it changed, and since I was bored I wrote a simple [bot][rugby] to do this. It worked well, but was very simple: it only supported one hardcoded channel and one hardcoded game.

Since I was also bored on Sunday, I [rewrote][rugbybot] this bot properly. I added a subscription mechanism so that channels and users can subscribe and unsubscribe to games by sending the bot a command. It mostly works, except for listing the available games (there aren't any rugby games coming up, which means I can't test it ;-) ). Games are specified by the ID used by SuperSport's site, and finding the right ID is currently a manual process.


  1. I'm not really a sports fan — I just enjoy bragging when we do win ;-) 

CLUG Political Compass graph

A couple of people on #clug were updating their Political Compass scores, which prompted me to jump on the bandwagon and do the test myself. I came out with the following scores.

Economic Left/Right: -4.38
Social Libertarian/Authoritarian: -2.21

I then thought that it would be interesting to compare everyone's scores on a graph, so I wrote a [Python][] [script][py] to get the scores from Spinach and a [Gnuplot][] [script][p] to plot them.

CLUG Political Compass

To add yourself to the graph, tell Spinach your score in the following format. The graph is regenerated every hour.

cocooncrash.political_compass is -4.38 / -2.21 (2008/09/14)
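The scraping side is simple enough to sketch. The snippet below is an illustration, not the actual script: it pulls the nick and the two scores out of factoid lines in the format above and writes a data file that a Gnuplot script can plot. The factoids.txt input is a stand-in for however the factoids are actually fetched from Spinach.

import re

# Matches factoid lines such as:
#   cocooncrash.political_compass is -4.38 / -2.21 (2008/09/14)
FACTOID = re.compile(
    r"^(?P<nick>\S+)\.political_compass is\s+"
    r"(?P<econ>-?\d+(?:\.\d+)?)\s*/\s*(?P<social>-?\d+(?:\.\d+)?)")

def parse(lines):
    for line in lines:
        m = FACTOID.match(line.strip())
        if m:
            yield m.group("nick"), float(m.group("econ")), float(m.group("social"))

# Write a whitespace-separated file that Gnuplot can plot, e.g.
#   plot "scores.dat" using 2:3:1 with labels
with open("factoids.txt") as infile, open("scores.dat", "w") as out:
    for nick, econ, social in parse(infile):
        out.write("%s %.2f %.2f\n" % (nick, econ, social))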

Downloading Google Talk logs

I used Google Apps to host mail for this domain for a while, and wanted to close down the account since I don't use it anymore. Before I did that I wanted to move all the data onto my server. Transferring the emails was fairly straightforward using [POP3][], but I couldn't find a way to download the [Google Talk][] logs. [Gmail][] handles the logs as emails, but they aren't accessible using either POP3 or [IMAP][].

I therefore wrote a [Python][] script which downloads the logs via the web interface. On [Jeremy's][] [suggestion][] I used [BeautifulSoup][] to parse the [HTML][] this time, which worked very well. The script works with both Google Apps and normal Gmail, although my account got locked twice while downloading the 3500 logs it contained.

Python Decorators

For my Masters project I need a method by which the user can specify which functions should be run on an SPE1. This method should be simple, clear and easy to turn on and off. I stumbled upon a blog post a little while ago (I think it was this one) which explained decorators in Python; they are the perfect tool for the job. Decorators are used to transform functions without changing the function itself or the calls to it.

def spe(func):
    # The decorator: wrap func so that calling it compiles and runs it on an SPE
    def run(*args):
        # compile() here is the project's own SPE compiler, not the builtin
        return compile(func, *args)
    return run

@spe
def sub(a, b):
    return a - b

print sub(2, 4)

The spe function is the actual decorator. The @spe line applies the decorator to the sub function. Implicitly, the following assignment is made:

sub = spe(sub)

The sub function is wrapped by the spe decorator, so all calls to sub (such as the print line) go through the wrapper instead. The decorator creates and returns a new function called run which will (eventually) cause the original function to be compiled and executed on an SPE. This means that running a function on an SPE will be as simple as adding @spe before the function definition2, without having to change the way in which the function is called. Turning it off is as simple as commenting out that line, and it is fairly clear what is happening.


  1. Trying to make this decision automatically would be a massive project in itself and would probably be worse than a human decision. 

  2. There will probably be some restrictions on what the function may contain, but that's a different matter. 

Masters project overview

Since I might be posting entries regarding my Masters project, I thought that I would provide a brief overview of the project to put it in perspective. I am doing my MSc in Electrical Engineering at UCT as part of the ACE group headed by Prof. Inggs. The group is based at the CHPC, which is part of the Meraka Institute, which in turn is part of the CSIR. The group's research is focused on developing new platforms and methods for HPC.

My project is to investigate the suitability of the Cell processor for HPC. The Cell processor is found in the PlayStation 3 and in BladeCenters, and is a very powerful processor. It achieves this performance by using two different types of cores. One type (the PPU) is a general-purpose core capable of running an operating system, while the other (the SPU) is designed specifically to crunch numbers.

The disadvantage of this architecture is that it is very difficult to program for. When using the IBM Cell SDK, the user needs to write separate programs for each type of core, and needs to manage the SPEs manually as well as take care of all memory transfers. This requires a good knowledge of the architecture, and results in a lengthy development process and unportable code.

For the Cell processor to be a successful platform in HPC, the development process must be made easier while still making efficient use of the Cell's capabilities. There are a number of commercial and non-commercial tools which aim to do this using a variety of methods. I have looked into these tools and have not found one which is both effective and open.

I therefore aim to create my own platform with which to program the Cell processor. The idea is to use [Python][] as the end user language, and to make a backend which transparently runs certain functions on the SPEs. This will involve converting the functions into C or C++, adding code to manage execution on an SPE and perform the required memory transfers, compiling the result with the [GCC][] compiler, and then executing it.

It is quite an ambitious plan, and there are a lot of potential pitfalls. If it succeeds however, I think that it will be a very easy way to develop for the Cell processor while still having portable code.
