SuperGenPass and Password Composer are password generators: they derive a different password for each site you use from a single master password. This gives you the convenience of remembering only one password along with the security of using a different (and strong) password for each site, so a compromise of one account doesn't expose all the others.
Most password generators are implemented as browser extensions or bookmarklets, since they are most frequently needed in a web browser. I've been wanting to start using a password generator, but I wanted to be sure that I could access my accounts even without a web browser available. The two situations I could think of were a command-line-only system (e.g. over SSH) and my cellphone.
Surprisingly, I couldn't find a command line implementation of SuperGenPass, so I wrote one in Python. I also couldn't find any J2ME or Symbian implementations, so I wrote my own in J2ME. Both support subdomain stripping and configurable password lengths; neither supports salted passwords.
I chose SuperGenPass over Password Composer because it uses a better scheme. Password Composer only uses hex characters, whereas SuperGenPass uses a base64-encoded hash. SuperGenPass also hashes the password multiple times (which slows down a brute-force attack on the master password) and imposes complexity requirements on the generated password (which reduces the chance that the generated password itself can be brute-forced).
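As a sketch of the scheme, from my reading of the published algorithm (the reference implementation may differ in details such as the exact round count and base64 character substitutions):

```python
import base64
import hashlib

def b64_md5(s):
    # One round: MD5, base64-encoded, with SuperGenPass-style
    # substitutions so the output stays alphanumeric-friendly.
    digest = base64.b64encode(hashlib.md5(s.encode()).digest()).decode()
    return digest.replace('=', 'A').replace('/', '8').replace('+', '9')

def acceptable(pw):
    # Complexity requirements: starts with a lowercase letter and
    # contains at least one uppercase letter and one digit.
    return (pw[0].islower()
            and any(c.isupper() for c in pw)
            and any(c.isdigit() for c in pw))

def generate(master, domain, length=10, min_rounds=10):
    # Hash repeatedly: at least min_rounds times, and then until the
    # truncated result satisfies the complexity requirements.
    candidate = '%s:%s' % (master, domain)
    rounds = 0
    while rounds < min_rounds or not acceptable(candidate[:length]):
        candidate = b64_md5(candidate)
        rounds += 1
    return candidate[:length]
```

The repeated hashing is what makes offline brute-forcing of the master password more expensive, and the `acceptable` loop is what guarantees the complexity properties of the output.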
Following on from yesterday's post, I decided to try to implement proper content negotiation. After a fair amount of time spent getting to grips with [Lua], I ended up with a [script] which works very nicely. It implements [server driven] content negotiation for [media types][mime].
The basic idea of content negotiation is that a resource (e.g., this [graph])
exists in multiple formats (in this case, [SVG][graph-svg], [PNG][graph-png] and
[GIF][graph-gif]). When a user agent requests the resource, it indicates which
formats it understands by listing them in the `Accept` header of the request.
(The following description assumes basic familiarity with HTTP headers.)
The script works by searching the directory for files with the requested name
but with an additional extension (each of which is a variant). The [media
type][mime] of each variant is inferred from its extension.
Some browsers include wildcard entries such as `*/*` or `image/*` in their `Accept` headers, which match whole ranges of media types.
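The selection logic amounts to scoring each variant against the client's preferences, including wildcards and `q` values. A simplified sketch of the algorithm in Python (this is not the Lua script itself, just an illustration of server driven negotiation):

```python
def parse_accept(header):
    # Parse e.g. "text/html;q=0.9, image/*;q=0.5, */*;q=0.1" into a
    # dict mapping media type (or wildcard) to its quality value.
    prefs = {}
    for part in header.split(','):
        fields = part.strip().split(';')
        q = 1.0
        for param in fields[1:]:
            name, _, value = param.strip().partition('=')
            if name == 'q':
                q = float(value)
        prefs[fields[0].strip()] = q
    return prefs

def negotiate(accept_header, variants):
    # variants maps media type -> filename on disk.
    prefs = parse_accept(accept_header)

    def score(media_type):
        # Exact match wins, then a type wildcard, then */*.
        wildcard = media_type.split('/')[0] + '/*'
        return prefs.get(media_type,
                         prefs.get(wildcard, prefs.get('*/*', 0.0)))

    best = max(variants, key=score)
    return variants[best] if score(best) > 0 else None
```

If no variant is acceptable, a real server would respond with 406 Not Acceptable rather than returning nothing.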
To install the [script], download it and save it somewhere convenient (such as your Lighttpd configuration directory).
URLs shouldn't really contain file extensions (like `.html`), since the format of a resource is an implementation detail which can change.
Doing the same for static files (i.e. files served directly by the webserver)
isn't straightforward, because most webservers use the file extension to
determine the MIME type to send in the `Content-Type` header.
I decided to try to find a solution to this for my webserver of choice,
Lighttpd. Lighttpd has a module which embeds a [Lua] interpreter and
allows you to write scripts which modify (or even handle) requests. So I wrote
a [script] which searches the directory for files with the same name as
requested but with an extension. This means that any file can be accessed with
the file extension removed from the URL while still having the correct MIME
type sent.
The script currently chooses the first matching file, which means that having
multiple files with the same name but different extensions doesn't do anything
useful. The proper method, however, is to do [content negotiation],
which chooses the format based on the preferences indicated by the HTTP
client.
To use this script, download it and save it somewhere convenient.
I recently came across an article which tries to address the problem of SSH agent forwarding with screen. Briefly, the problem is that reattaching a screen instance breaks agent forwarding because the required environment variables aren't present in the screen instance. The solution given didn't quite work for me though because I use an SSH wrapper script which automatically runs screen.
My solution is to write a screen wrapper script which stores the
environment variables in a file before attaching, so that shells inside the
screen session can restore them. I like to put wrapper scripts in a directory
near the front of my `PATH`, so that they shadow the real commands.
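A minimal sketch of the idea in Python (the file location `~/.screen-agent-env` and the variable list are my own choices, not from the original article):

```python
import os
import shlex
import subprocess

# Hypothetical location for the saved agent environment.
ENV_FILE = os.path.expanduser('~/.screen-agent-env')
SAVED_VARS = ('SSH_AUTH_SOCK', 'SSH_CONNECTION')

def save_agent_env(path=ENV_FILE):
    # Write the forwarded-agent variables as shell 'export' lines, so
    # a shell inside screen can restore them with: . ~/.screen-agent-env
    with open(path, 'w') as f:
        for var in SAVED_VARS:
            if var in os.environ:
                f.write('export %s=%s\n' % (var, shlex.quote(os.environ[var])))

def screen_wrapper(args=()):
    # Save the current environment, then hand over to the real screen.
    save_agent_env()
    return subprocess.call(['screen'] + list(args))
```

Each new SSH login overwrites the file with a fresh `SSH_AUTH_SOCK`, so shells inside an old, reattached screen session just source the file to pick up the new agent socket.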
One of the most effective anti-spam measures one can implement is to enforce valid use of SMTP and other standards. Spam clients are interested in sending messages as quickly as possible, and so usually don't bother with actually implementing standards correctly. In this post I shall describe the various checks which can be used, show how to implement these checks in Postfix, and describe how to ensure that your mail server passes these checks when sending mail.
Reverse DNS Entries
RFC 1912 states that "every Internet-reachable host should have a name" and "make sure your PTR and A records match". This can be checked by performing a Forward Confirmed reverse DNS lookup. This check can be done as soon as a client connects, before any SMTP dialogue takes place, which means rejected clients never even see the mail server's banner.
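In Postfix this check maps onto the client restriction list. A sketch of the relevant configuration (the restriction names are from Postfix's documentation; the surrounding policy is illustrative):

```
smtpd_client_restrictions =
    permit_mynetworks,
    reject_unknown_client_hostname
```

In recent Postfix versions, `reject_unknown_client_hostname` rejects clients whose IP address has no PTR record, whose PTR name doesn't resolve, or whose name resolves to a different address, i.e. it enforces forward-confirmed reverse DNS.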
I've recently been doing a lot of hacking on Knab, which is the software behind the #clug IRC bot, Spinach. I've contributed a number of new modules, most of which are running on Spinach and are available from the main Bazaar repository.
This module is basically a calendar feature which can store and retrieve events such as birthdays. It also handles recurring events (specified either as recurrence rules or as multiple dates).
This module discovers the URL which a shortened URL redirects to.
This module creates a short URL using is.gd.
This module retrieves an HTTP URL and returns the result of the request.
This module gets definitions using Google.
This module retrieves commit messages from a Subversion repository.
This module summons people by sending them a message via Jabber.
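The is.gd module, for example, boils down to a single call to is.gd's `create.php` endpoint. A minimal sketch (the injectable `fetch` parameter is my own addition, so the function can be exercised without network access):

```python
from urllib.parse import urlencode
from urllib.request import urlopen

API = 'https://is.gd/create.php'

def shorten(long_url, fetch=None):
    # is.gd's simple format returns the short URL as plain text.
    if fetch is None:
        fetch = lambda url: urlopen(url).read().decode().strip()
    query = urlencode({'format': 'simple', 'url': long_url})
    return fetch(API + '?' + query)
```

The real Knab module would also need error handling for is.gd's failure responses, which I've left out here.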
I wrote three IRC bots in [Python] this last week (although one was a rewrite). They probably aren't very useful to most people, but I'm going to share them anyway in case someone finds them interesting.
The first one was prompted by [Adrian], who is maintaining a [countdown] until his wedding as a factoid in [Spinach]. Since [Knab] doesn't actually support countdowns, it has to be updated manually. This clearly isn't the Right Way to do this, and so I hacked together a [script][irccountdown] which connects to IRC and teaches Spinach the updated factoid. I run this as a daily [cronjob] to keep the countdown up to date.
As is usually the case with Python, there was already a library for accessing IRC, namely [irclib]. It isn't very well documented, but comes with a couple of example scripts which are fairly easy to follow. It uses an event-based model: you write functions which are called when certain events occur (such as receiving a message).
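The general shape of that model is a dispatcher mapping event names to handler functions. This is purely illustrative, not irclib's actual API (irclib registers handlers via methods like `add_global_handler`):

```python
class EventDispatcher:
    # Minimal illustration of the event-driven style irclib uses.
    def __init__(self):
        self.handlers = {}

    def add_handler(self, event, func):
        # Register func to be called whenever 'event' occurs.
        self.handlers.setdefault(event, []).append(func)

    def dispatch(self, event, *args):
        # Invoke every handler registered for this event.
        for func in self.handlers.get(event, []):
            func(*args)
```

A bot then registers a handler for, say, public messages and reacts to each one as it arrives, while the library's main loop drives the dispatching.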
The [Currie Cup] final was held on Saturday (my team, the [Sharks], won), and I followed the match online using [SuperSport's] live score site. I then thought it would be cool to have the score announced on IRC when it changed, and since I was bored I wrote a simple [bot][rugby] to do this. It worked well, but was very basic: it only supported one hardcoded channel and one hardcoded game.
Since I was also bored on Sunday, I [rewrote][rugbybot] this bot properly. I added a subscription mechanism so that channels and users can subscribe to and unsubscribe from games by sending the bot a command. It's mostly working, except for listing the available games (there aren't any rugby games coming up, which means I can't test it ;-) ). Games are specified by the ID used by SuperSport's site, and finding the right ID is currently a manual process.
I follow the main feeds of a couple of social news sites (namely Digg, Reddit and Muti). When I find an article I like, I go back and vote it up on the site. However, when I come across good articles via other sources, I don't submit them to these news sites (or try to find out if they've already been submitted), simply because it's too much effort.
When I started aggregating my activity on these sites on my blog and on FriendFeed, I needed a way to share pages that I didn't get to via one of these social news sites. I ended up setting up Delicious because I found a plugin for Konqueror which made it easy to bookmark pages.
I still wanted to solve the original problem though, and so started looking for an easy way to submit links to these sites from Konqueror. Konqueror has a feature called service menus which allows you to add entries to the context menu of files. I then needed to work out how to submit links to these services, which turned out to simply involve loading a URL with a query parameter specifying the link you want to share.
I created entries for Reddit, Digg, Muti, Delicious, Facebook and Google Bookmarks. These take you to the submission page of the service, where you can fill in the title. Digg and Reddit will show existing submissions if the link has already been submitted.
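A Konqueror service menu is just a `.desktop` file. A sketch of what the Reddit entry might look like (the keys follow KDE's service menu format, and the submit URL is Reddit's public submission endpoint; my actual file may differ):

```
[Desktop Entry]
Type=Service
ServiceTypes=KonqPopupMenu/Plugin,text/html
Actions=submitReddit

[Desktop Action submitReddit]
Name=Submit to Reddit
Exec=kfmclient openURL "http://www.reddit.com/submit?url=%u"
```

Konqueror substitutes the current URL for `%u`, so clicking the menu entry opens the submission page with the link pre-filled.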
I often share links on IRC, and wondered if I could integrate that with my menu. It turns out that WeeChat has a control socket, and I can send messages by piping them to it. I therefore wrote a script which prompts me for a headline or excerpt using kdialog, and then sends the link to the specified channel. My menu now looks like this:
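Sending to WeeChat's control socket boils down to writing a line of the form `buffer *text` to the pipe. A sketch (the pipe location matches WeeChat's default `~/.weechat/weechat_fifo_<pid>` naming, and the `fifo` parameter is my own addition for testability):

```python
import glob
import os

def send_to_weechat(buffer_name, text, fifo=None):
    # WeeChat's fifo plugin reads lines of the form "buffer *text",
    # e.g. "irc.freenode.#clug *check this out: http://example.com"
    if fifo is None:
        pipes = glob.glob(os.path.expanduser('~/.weechat/weechat_fifo_*'))
        fifo = pipes[0]  # assume a single running WeeChat instance
    with open(fifo, 'w') as f:
        f.write('%s *%s\n' % (buffer_name, text))
```

The script behind the menu entry just combines this with a `kdialog --inputbox` prompt for the headline.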
If you want to set this up yourself, download share.desktop and put it in your Konqueror service menus directory.
Joe gave a very intriguing talk on lifestyle design. Some of the points included "work less" and "cheat", which I didn't really agree with, but the basic idea of doing the things you love was good. Jonathan then spoke about actually doing something with your ideas, which was quite inspiring. I don't really get "big" ideas, but I'm going to try anyway.
The other Jonathan showed us Half Price Tuesdays which is an idea he's been working on. I'm helping with the alpha test and it's looking very promising. Kerry-Anne then did a fantastic slideshow karaoke prepared by Jonathan. She gave us some tips on how to survive a GeekDinner talk, but unfortunately needs to implement some of those tips herself :-P