Monitoring London 2012 ticket availability

It isn’t often that my line of work and main passions coincide, but the frustration of trying to land the London 2012 Olympic tickets I wanted gave me cause to find a technical solution to the problem.

I’ve blogged before about my limited success in getting tickets for the Games. Most of all I wanted tickets, any tickets, for track cycling. These are like gold dust: the venue only holds 6,000 people, and it is the sport in which Team GB has the best chance of medals.

A while ago tickets started to be issued as venue layouts became finalised. In addition, the official resale process kicked in. This meant that an occasional drip-feed of tickets started to become available on the ticketing web site. There was no advance notice; it was pure luck if something happened to be available when you looked.

This kind of scenario lends itself nicely to some scripting, so I started looking into the HTML source of the ticketing site to work out how to automate the discovery of ticket availability. The general approach is simple: each session has a unique code, and you can perform a search for that session. The site then either tells you that tickets are currently unavailable, or allows you to select the session and go on to the ticket selection process.

Therefore a simple script that could poll the session search results page and check for the indication of availability would do the trick.
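Here’s a sketch of it (the search URL and the “currently unavailable” marker below are placeholders; the real values came from digging through the site’s HTML):

```perl
#!/usr/bin/perl
# ticket_watch.pl - poll the London 2012 session search page for a session
# code and send an email alert when the "unavailable" marker disappears.
# Sketch only: the URL format and marker text are placeholders, not the
# site's real markup.
use strict;
use warnings;
use Getopt::Std;
use LWP::Simple qw(get);

my %opts;
getopts( 'e:', \%opts );
my $email   = $opts{e};
my $session = shift @ARGV;
die "Usage: $0 -e <email address> <session code>\n"
    unless defined $email && defined $session;

# Placeholder search URL; the real one came from the site's HTML source
my $url  = "http://www.tickets.london2012.com/search?code=$session";
my $page = get($url) or die "Failed to fetch $url\n";

# If the page no longer carries the 'unavailable' marker, fire off an alert
if ( $page !~ /currently unavailable/i ) {
    open my $mail, '|-', '/usr/sbin/sendmail -t'
        or die "Cannot run sendmail: $!\n";
    print $mail "To: $email\n";
    print $mail "Subject: London 2012 tickets may be available: $session\n\n";
    print $mail "Check now: $url\n";
    close $mail;
}
```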

The starting point was this web site monitoring script, found via Google.

Usage is simple:

ticket_watch.pl -e <your email address> <session code>

I utilised an Amazon EC2 micro instance running Linux, then set up cron jobs to check the nine track cycling sessions every five minutes. Using cron isn’t perfect: if tickets for a session do become available for a period of time, your inbox can get busy with continual alert emails, but that is a minor inconvenience. What’s more annoying is the way the ticketing site works: there is a significant lag between tickets selling out and the search results page indicating the fact. These two issues can combine to mean that you continue to get alert emails for quite a while after tickets sell out.
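For reference, each of the nine sessions had a crontab entry along these lines (the path, address and session code are all illustrative):

```
# Poll one track cycling session every five minutes; the script emails on a hit
*/5 * * * * /home/ec2-user/ticket_watch.pl -e me@example.com CT001 >/dev/null 2>&1
```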

So, does it work? Well, I’ve had it running for about two weeks, and there have certainly been a lot of tickets drip-feeding in during that time. There is no pattern to when, so I could wake up to alert emails sent during the night, or receive them whilst walking the dogs or otherwise not in a position to do anything about it. However, today I was sat at my desk at work when an alert email came through. I jumped on the site and managed to grab a top-price ticket to the very final track medal session, featuring the Women’s Sprint, Women’s Omnium and Men’s Keirin. All three events offer very solid possibilities of GB medals.

So, mission accomplished, and my EC2 instance has now been retired. There are still tickets coming in though, and time to grab them, so if you are also a bit of a geek, feel free to make use of this script!

RTE News on social software and IBM

RTE, Ireland’s national broadcaster, visited IBM Dublin on Monday 1st December for a piece on social software within the enterprise. It went out on the News On Two programme the same day. Yours truly is included in a couple of arty shots and a quick vox-pop.

Featured is the internal IBM research project called Beehive. Beehive is, as Mike described in the clip, a kind of Facebook for the Enterprise, allowing employees to share the lighter side of life within the firewall. It’s great at fostering relationships with people you may work with but would never otherwise get the chance to know and meet.

Additionally, Beehive acts as a fantastic proving ground for ideas and research which may eventually end up as part of Lotus Connections, or other IBM products. Indeed, Connections is heavily built on the productization of ideas and research from our own experiences within IBM. There’s much more on Connections over at the Synchronous blog (including the odd post from me).

Photoshop CS3 and the 7-Point System

Up until now I’ve mainly been using Adobe Lightroom to process my RAW photos for white balance, exposure and tonal correction, along with removing dust spots and sharpening. Whilst I’ve got pretty used to Lightroom, I’ve often found myself wanting to do a bit more. I have a copy of Photoshop Elements 4 on my Mac and sometimes take photos out to it to do a little more with levels and layers, but I’ve never really invested much effort into that side of things. Whilst I always try to get things right in-camera, I’ve become increasingly interested in improving my post-processing skills.

Adobe Photoshop CS3

Adobe Photoshop CS3 is pretty much the de facto standard for photo editing, but the problem is that it costs a lot. I’ve never been one for the illegal use of software, mainly because I work in the industry, so up to this point I have simply got by with what I have, plus a few great low-cost tools like FDRTools, Calico and NoiseNinja.

So, I recently stumped up and bought Photoshop CS3. I also got a couple of books: Adobe Photoshop CS3 for Photographers by Martin Evening, and Scott Kelby’s 7-point System for Adobe Photoshop CS3.

Scott Kelby's 7-Point System for Adobe Photoshop CS3

Whilst Evening’s book is a typically comprehensive overview of the features and functions relevant to photographers, the Kelby book is structured in a different way. It concentrates on a core set of functions and techniques: processing in Camera Raw (or Lightroom), curves, shadow/highlight adjustments, painting with light, channel adjustments, layer blending, layer masks and sharpening. It applies these techniques to 21 photos, each in its own chapter. It is basically teaching by rote: you can download the RAW files he uses and follow along, with each chapter taking you through (more or less) the same set of actions until they become ingrained. Along the way he introduces other techniques, but never digresses too far from the seven key concepts.

I’ve found this very useful in terms of helping me learn what to apply in which circumstances. There’s such a wide array of tools in Photoshop that there are effectively many ways to achieve the same end result. Already I look back at how I used to do things in Elements and know it wasn’t the right way. The real value I’m getting from the 7-point system is a solid base skill set that I know when and how to deploy for the result I want to achieve. Without that, armed only with the Evening book (or any similar one, including other Kelby efforts), the information overload would be a problem: it is hard to relate so much material to what you would need to do to take one particular photo from start to finish.

Of course, the danger is that the 7-point system becomes too formulaic and the look it inevitably produces becomes too familiar. I think the challenge is to use it as a basis, but then extend that knowledge with your own style and ideas, and with other techniques that you pick up along the way.

Wacom Bamboo

To end, CS3 has also been great fun to use with one of my Christmas presents: a Wacom Bamboo graphics tablet. I can’t wait to revisit some of my back-catalogue of photos and work them through CS3!

Lotusphere 2008

Lotusphere 2008 Beach Party 2

This entry is probably overdue and a little past its sell-by date by now, but I’ve only just had the time to catch my breath in the last week or so since getting back from Orlando.

My first ever trip to a Lotusphere conference was simply all I expected it to be and more. I had the chance to meet lots of people I’ve worked with for the last six months or so, and others with whom I’ve built up a relationship over Twitter, Facebook, LinkedIn, Connections, Beehive and various other social software tools. In addition, I got to see work I’ve been leading shown as part of the Lotus Connections 2.0 announcements, sessions and demos during the week. Staffing the Connections area in the Meet the Developers lab gave me the opportunity to talk to a wide variety of customers, either already using Connections or considering doing so.

The reception that the new features of Connections 2.0 got was generally very pleasing, and on a selfish front the Home page appeared to go down well. I’ll be contributing a more detailed blog entry on the Home page to the synch.rono.us blog in the near future.

Aside from all that the conference experience was pretty unique in my experience of these things. There definitely is a great community spirit around Lotus products, and I got the impression that it is stronger than ever. A few other notes:

  • Aer Lingus transatlantic from Dublin to Orlando was convenient and generally very good. Legroom in economy was on a par with American, and the service much better.
  • The Walt Disney World Swan and Dolphin resort is a great conference facility. I’d put it ahead of anywhere else I’ve experienced, including various Vegas venues.
  • It was nice to have a chance to exercise the favourable Euro-Dollar exchange rate!
  • It helps if you can get by with very little sleep for a whole week.

As ever, my photos from the event are available on Flickr.

Firefox beats Internet Explorer

Whilst browsing through the data collected for this blog by Google Analytics in December, I noticed that, for the first month ever (well, since February, when I started hosting the blog myself), Firefox was used by more visitors than Internet Explorer:

Firefox beats IE

Only by a small margin, but interesting nonetheless. With Firefox 3 in beta and the next version of IE a good way off, it will be intriguing to see if this trend continues in 2008. Admittedly this is a small blog, and therefore a small sample size; the visitors here are probably slightly more inclined towards Firefox (or Safari, or others) than IE anyway.

Update: Actually, with the inclusion of December 31st, IE snuck back into the lead by 3 visits.

Lotus Connections team blog

I’ve tended not to use this blog to talk about work much, so I’ll gladly point readers off to a new blog from the Lotus Connections product team: Synch.rono.us

I work with Suzanne, Joe and David on Lotus Connections and they are just the right people to bring you news and insight into where the product is going, and how it may benefit you if you happen to be looking for a social software platform for the enterprise. You never know, I might pop up there now and again with a guest entry!

Project Zero and why community-driven commercial development is good

Whilst many colleagues have already blogged about it, I thought I’d do a quick post about Project Zero. The aim of going public with the site is to provide a community-based development aspect to some work that IBM has been tinkering with for a little while now. The goal of Project Zero is to define a revolution in dynamic web application development by bringing together scripting in Groovy and PHP with RESTful web services, Atom feeds and (say it quietly) even Java.

The interesting thing about the whole project is the approach it is taking towards community-driven commercial development. Project Zero is not going to be open-source software, so the immediate question is: what value is there in making its development community-driven? I think the best way to explain why socialising the Zero technology core at this stage is a good thing is to compare it with another IBM technology vision which started in a similar way, namely Service Component Architecture (SCA).

As I’ve written about before, SCA started life within IBM as the brainchild of a couple of people. From there it grew, and a lot of work was done by some talented people (and me) to prototype it, define the programming model and turn it into a technology on which we could build a product. It became the basis of WebSphere Process Server (and thus WebSphere ESB) and really lies at the core of those products. Along the way IBM started to work with BEA, and then other vendors, to open the spec up, to the point where it is now going through OASIS.

So, we’ve had products in the market built on top of a lot of internal work to define SCA. The problem, however, is that SCA has evolved much more quickly in the open than it probably would have done had it remained entirely within IBM. Kept internal, we would have been able to manage the trade-off between features and functions and other considerations, such as API compatibility from release to release. Problems like the latter are likely to become more significant as the openly evolved SCA diverges more quickly from the original IBM implementation. (Note: that’s not to suggest that customers with WPS/WESB will suddenly break as and when those products adopt the openly developed SCA spec!)

Now compare this with the community-driven approach being taken by Zero. Once again, the initial core ideas have come from a small number of people within IBM, and they have been socialised a bit within the company (indeed, there was and still is a significant amount of input into Project Zero from my old stamping ground in Hursley, some of which I was lucky enough to be involved with when I was there). The difference is that there was no IBM product deliverable driving the creation of Zero. Its whole raison d’être is to radically simplify the process of web application development, and the early and hopefully significant input of the wider community can only help those working on it to better understand the problems, the issues and their solutions.

By getting this input much earlier than ever before in the way these types of projects are incubated within IBM, the result should be a technology which is more feature-rich, stable, performant and better documented than ever before, which future IBM products can utilise, and which will help our customers be more successful. As a customer (or even a prospective one) you get much more input into what we do as a development organization as well.

Facebook platform introduces Twitter app

Further to my recent entry on Federated Status, Facebook have launched their Facebook Platform, which allows applet-style gadgets to be embedded into one’s Facebook page. One such gadget integrates Twitter and Facebook, but sadly not to the extent I’d hoped for in my previous post. Rather, it allows you to put an extra gadget onto your page to display your tweets (but not those of your Twitter friends), which is then visible to anybody who can see your FB profile. It only displays your last tweet, and clicking edit on the gadget brings up an error page saying they are still ironing out a few issues, so hopefully more functionality will appear.

In addition to the gadget, the profile owner can also tweet directly from FB, and see their Twitter friends’ tweets.

What it doesn’t appear to do is offer any integration between the status I can set at the top of my FB profile and Twitter. It would be nice to have the option to set one from the other, most probably FB -> Twitter rather than the other way around.

Federated status

Unusually, I am awake very early this morning, probably because Lana isn’t here at the moment. As such, I logged on and sent a message to Twitter. Then I logged onto Facebook to see what’s happening and updated my status there as well. Finally, I took a look at the new Plazes interface (thanks Andy and Roo; Andy has more on it here) and, lo and behold, you can now augment your current plaze with what you are doing.

So that’s three places where I independently had to type “Wondering why I am up at 5:20am”. There is an obvious opportunity for a federated status system, whereby you update your status in one place and it propagates out to the various social sites, as well as allowing your IM clients to pick that state up. Obviously in the case of Twitter there is value in doing more than just automatically sending status-type messages, but it would still be useful.

Looking around Google, I’m not surprised that others have thought the same, but so far there only seem to be partial solutions to the problem (e.g. Twitterbook, which updates Twitter with your Facebook status, and a PHP script to allow Adium to retrieve from Facebook as well).

It would be nice not to have to use piecemeal solutions, but instead to have a single web service in which I could publish my status and have it federated out to my designated subscribers. Unfortunately, this would rely on suitable APIs being available, and looking at Facebook as one example, it doesn’t allow status updating via its API. Whilst you could nominate one site as your “master” site in this case, that would fall down as soon as you added another site with no update API. Thinking about it, what’s really needed is a standard, simple ability for all these sites to grab a status Atom feed and automatically update from the latest entry in it. Then it is just a case of deciding which site is the master (the feed creator) and which are the slaves (the feed consumers).
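As a sketch of the slave side (the feed URL is hypothetical, and a real consumer would push the result through whatever update mechanism each site offers):

```perl
#!/usr/bin/perl
# Sketch of a federated status 'slave': poll the master site's status Atom
# feed and adopt the newest entry as the current status. The feed URL is
# a placeholder; pushing $status onwards is left to each site's own API.
use strict;
use warnings;
use URI;
use XML::Feed;

my $feed = XML::Feed->parse( URI->new('http://example.com/status.atom') )
    or die 'Could not fetch or parse feed: ' . XML::Feed->errstr . "\n";

# Atom feeds conventionally list the newest entry first
my ($latest) = $feed->entries;
my $status = $latest ? $latest->title : '';

print "Current status: $status\n" if $status;
# ...a real slave would now call the consuming site's update API with $status
```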

ETA:

Twitterfeed is interesting.
And somebody has found a way to push status updates to Facebook via their mobile service.

The long CD ripping slog

CDs

As part of the preparation for our move over to Dublin, I’m in the process of ripping all of my not insignificant collection of CDs.

In fact, having owned a whole series of digital players over the past seven years or so, I’d already ripped a significant number of them, and for the past two years I’ve bought stuff pretty much exclusively from iTunes. However, those CD rips were typically 192kbps MP3s and don’t sound particularly good.

Spurred by the desire to be able to store the CDs away at my folks’ place rather than ship them over to Dublin, I decided a while ago to make a concerted effort to re-rip them all, and this time to make the digital copies as future-proof as possible.

So, armed with my iMac, its 500GB disk, about 500 albums and the same number of singles, I made a determined start. The trouble is that, a few months later, I have only got about a third of the way through 🙁

To future-proof, I am ripping to FLAC using Max. I’m using the highest quality setting (compression level 8) and the comparison ripper, which makes multiple passes over each sector and then compares their hashes. All of this slows the actual ripping process down, but does produce good results.
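Conceptually, the comparison rip boils down to something like this toy illustration (read_pass() is a stand-in for a real read from the CD drive):

```perl
#!/usr/bin/perl
# Toy illustration of a comparison rip: read the same sector range twice,
# hash each pass, and only accept the audio once the digests agree.
use strict;
use warnings;
use Digest::MD5 qw(md5_hex);

sub read_pass {
    my ($sector) = @_;
    # In reality this would seek to $sector and read raw audio frames
    return "raw-audio-for-sector-$sector";
}

my $sector = 1000;
my ( $pass1, $pass2 );
do {
    ( $pass1, $pass2 ) = ( read_pass($sector), read_pass($sector) );
} until ( md5_hex($pass1) eq md5_hex($pass2) );

print "Sector $sector verified: both passes hash identically\n";
```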

I’m then taking the FLAC files and chucking them through XLD to produce 320kbps VBR MP3s using the LAME encoder. These are then imported into my iTunes library. Eventually they will also be shoved up onto the disk attached to my Linksys NSLU2, which runs the TwonkyVision UPnP server. This in turn allows me to access my music library from my Xbox 360, which is connected to my hi-fi.
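For the command-line curious, the equivalent conversion with the stock flac and lame tools looks roughly like this (the LAME switches are illustrative of a high-bitrate VBR encode, rather than XLD’s exact preset):

```sh
# Decode the FLAC to WAV on stdout, then encode with LAME's top VBR preset
flac --decode --stdout track.flac | lame -V 0 - track.mp3
```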

There are two other, non-technical, reasons for the slow progress. Firstly, I’ve started with the large number of compilation CDs I have. A lot of these are double, or even triple, disc albums, and in general they have more tracks per disc than normal albums. Secondly, and most infuriatingly, the track info held in the MusicBrainz database used by Max isn’t terribly good when it comes to compilation albums. A lot of the time the CD artist will be “Various Artists”, but the artist field for each individual track will not be set; instead people just seem to put the track title as “artist / track”, which needs manual correction. I estimate this has been the case for about 50% of the CDs I’ve done so far.

Thankfully I am now at the end of the compilations and am getting onto albums and selected singles. These should go past much more quickly thanks to more accurate track info and fewer tracks per disc.

A few stats:

CDs ripped so far: 261
Number of tracks: 3906
Total size of FLAC files: 96.83 GB
Typical size per FLAC file: 25 MB
Total size of MP3 files: 14.04 GB
Typical size per MP3 file: 3.5 MB