Sunday 19 August 2012

PyConAU 2012 - Day 2

The keynote speaker for the morning was Kenneth Reitz, who is well known in the Python community as the author of the Requests package, and as an advocate and example of spending effort on producing succinct and clean APIs. This talk was a huge highlight of the conference, and (once the videos are up) the kind of content you'd want to show to junior developers to encourage them to be Python developers, adopting all the ideals of the Zen of Python. I certainly left this one feeling (a tiny bit) guilty that I don't think about this as much as I should.

Kenneth was also pushing the idea that all developers in the community should consider contributing to python-guide.org, which aims to show developers, new and existing alike, what the current best practices are, and in some cases the one obvious way to do things. This point resonated with a number of attendees, and I believe there could well be a collaborative project coming out of it, working on content explaining how to mirror the cheeseshop/PyPI.

The talk on funcargs (and pytest) by Brianna Laugher was excellent. I definitely need to watch the video, and have a think about whether any of the concepts described should be incorporated into the test code I maintain. I'm very happy with nose, but test generators sound quite attractive.
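
For anyone who hasn't seen them, this is roughly what the nose-style test generator pattern looks like (a from-memory sketch of my own, not anything shown in the talk; pytest's funcargs are a different mechanism again):

def check_even(n):
    assert n % 2 == 0

def test_evens():
    # nose collects each yielded (callable, args) pair as a separate test case
    for i in range(0, 10, 2):
        yield check_even, i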

After a late change in the program, Russell Keith-Magee gave an impromptu talk on Django core, and the things you can do to maximise your chances of getting your changes added. This talk was pretty far out from what I'd normally go to. My work doesn't involve webapps, so I don't use Django, and don't know if I'll ever need to, so I'm sure I missed a lot of the subtleties. However, it did give me a little bit of an idea of what the difference between django-contrib and django-core is, and it's always good to have a feel for the major groups in a community. There was also a reference to a great document on API maturity.

Rhydwyn Mcguire (another MPUG regular) gave a great overview of the current state of RPy, a bridge between the R statistical language and Python. I'm sure the information on developing new code using RStudio will come in handy, as will the list of "features" to be wary of in integrating the two languages. Personally, I think that R looks horrible compared to the pandas library.

Alex Sharp from OrionVM gave the last talk that I saw all of before rushing out to the airport. He spoke about what it takes to make cloud computing work. I've seen a few cloud presentations before, and this one was the first that I've seen the presenter talk about probability distributions, statistics, bloom filters and the types of tricks that you need to process truly massive data sets in practice. Being someone who loves code and math, this one was right up my alley.

Oh, and my talk was in there somewhere....

Saturday 18 August 2012

PyConAU 2012 - Day 1

The easy first day of the conference, at least from my perspective =), is over. It was really great to see a good range of content, and in particular science content, that obviously appeals to a broad audience.

Mark Ramm (now from Canonical) was the keynote speaker for the first session of the morning. I've had the chance to listen to Mark previously, but this talk was particularly good. His main message was "don't waste your life", and to do that by focusing on things where you can really make a difference. For him, that meant trying to find a way to prevent project failures by minimising technical and marketing risk. His suggestion was to focus the work you do by testing and measuring, rather than guessing or blindly asking people questions (market research). The questions and discussion that followed the talk on how to design good experiments were also excellent.

The first session of the morning for me was given by Tennessee Leeuwenburg, one of the regular team from the Melbourne Python Users Group (MPUG), on "Visualising Architecture". The talk focused on giving developers practical ways of learning and writing libraries, and of understanding and improving existing projects. This included using tricks like associating components with locations in the project file system and making good use of visualisations to communicate and justify the decisions you want to make.

After lunch I went to Ed Schofield's talk on "What's new in Python for Science and Engineering", which was one of the talks that I was most looking forward to. Ed covered a huge number of topics, but there were a few really new things I need to look into in the near future. These included:

  • New (for me) libraries for CUDA and parallelisation: Theano and Copperhead (see the sketch after this list)
  • The talks for SciPy2012 are now online.
  • The ipython web notebook now has resizable figures!
  • There were also a few new interesting Cython things to look at in the future 
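
To remind myself what the Theano workflow feels like, here's a tiny from-memory sketch (my own example, not from Ed's talk): build a symbolic expression, take its gradient, and compile both into a callable function.

import theano
import theano.tensor as T

x = T.dvector('x')
y = (x ** 2).sum()          # symbolic expression: sum of squares
grad = T.grad(y, x)         # symbolic gradient of y with respect to x

f = theano.function([x], [y, grad])
print(f([1.0, 2.0, 3.0]))   # sum of squares (14), and gradient [2, 4, 6]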

The final talk of the day that I went to, "An Unexpected Day" by Aaron Iles, was amazing. I respect the huge amount of work that must have gone into the preparation for the talk, which ended up being presented as one part drama and two parts programming presentation. The talk focused on how to make best use of python and ctypes when they were the only available option for solving a problem. The only thing I'm disappointed about is that the talk didn't get recorded.


PyConAU 2012 - Day 0

PyCon Australia has started for 2012! The venue for the conference is beautiful, looking out over the water in Hobart, with an amazing view of all the boats. After registration it was time to grab swag bags, and get ready for CodeWars, the first event of the conference.

The very Monty Python-like conference t-shirt
CodeWars is a team programming challenge. The first few questions revolved around some tricky decoding of hidden messages using python modules. The final task of the night was open ended, with the result decided by audience applause. This was a great way to catch up with old friends, meet some new people, and settle into PyCon.

Tuesday 14 August 2012

Python Powered Computational Geometry at PyConAU

I'll be speaking with a colleague at PyConAU 2012 this weekend. We'll be giving a 30 minute talk on Python Powered Computational Geometry. The emphasis will be on how the right python tools can help you rapidly prototype solutions, visualise results, and generally get stuff done efficiently, without worrying too much about implementation details.

The full talk description as published is as follows (I've tacked a small illustrative code sketch on after it):

Computational Geometry is the study of geometry with the support of appropriate algorithms, and influences a broad range of fields of science, engineering and mathematics including: Computational Fluid Dynamics (CFD), Finite Element Modelling (FEM), Computer Aided Design / Modelling (CAD/CAM), Robotics, Computer Graphics and Collision Detection.

While it is possible to quickly implement naive versions of computational geometry algorithms in any language, robust and efficient algorithms can be expensive and time consuming to write, debug and maintain. The advent of a number of tools, including new bindings for the Computational Geometry Algorithms Library (CGAL) and the ipython notebook, makes Python an ideal tool for experimentation with fast and numerically robust algorithms.

The presentation will include demonstrations of a number of common two- and three-dimensional computational geometry algorithms:
  • Triangulations
  • Mesh refinement
  • Intersection testing
  • Alpha shapes
  • Convex hulls
  • Constructive Solid Geometry
  • Minkowski Sums
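
As a small taste of a couple of the operations on that list, here's an illustrative sketch using scipy.spatial - which isn't necessarily one of the libraries we'll be demonstrating, just something most people will already have installed:

import numpy as np
from scipy.spatial import Delaunay, ConvexHull

points = np.random.rand(30, 2)      # 30 random points in the unit square

tri = Delaunay(points)              # a triangulation of the point set
hull = ConvexHull(points)           # the convex hull of the same points

print(tri.simplices)                # triangles, as indices into points
                                    # (older scipy calls this tri.vertices)
print(hull.vertices)                # hull vertices, in counter-clockwise order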

Saturday 23 June 2012

A Centenary of Turing

One hundred years ago today, June the 23rd, 1912, Alan Turing was born. He is arguably one of the most significant mathematicians born in the 20th century. If you're interested in Turing, I strongly recommend Andrew Hodges' biography, Alan Turing: The Enigma.

Amongst his many contributions to mathematics, science and cryptography, perhaps the most significant from my perspective relates to the 1936 paper "On Computable Numbers, with an Application to the Entscheidungsproblem", in which he introduced what is now known as a Turing Machine, the theoretical underpinning of the modern computer. A great summary of the concepts in that work is available in Charles Petzold's book The Annotated Turing: A Guided Tour Through Alan Turing's Historic Paper on Computability and the Turing Machine

For me, Turing stands as a reminder of many things:

His work demonstrates the greatness of modern science - in particular how important independent thinking and hard work are, as well as demonstrating how cross-pollination of ideas from math, science and engineering is capable of bringing about remarkable, and in some cases world-changing, ideas.

I believe that Turing was, in part, inspired to greatness by his peers, from school and beyond, as well as obviously leaving behind a legacy that has inspired so many. I've been blessed with mentors of my own, both in work and in life. This year, I've taken on the responsibility for mentoring someone, and am sharply reminded how important this role can be.

Turing's story ended in tragedy - he was persecuted by the British authorities of the time for indecency (homosexuality), and was punished with chemical castration. His death two years later was (very strongly) suspected to have been suicide, caused by consumption of a cyanide-laced apple. For me, this highlights the need to remain vigilant for unjust prejudices, both around me, and in my own thinking.

Happy birthday, Turing. You remain an inspiration.

Friday 15 June 2012

Visualisation of Kuhn Triangulations

One of the things that got briefly referred to in the under-actuated robotics course was the excellent paper "Variable Resolution Discretization in Optimal Control" by Munos & Moore. I'd had a copy of the paper lying around for a while, because it was mentioned in Steven LaValle's book Planning Algorithms.

That paper recommends a more efficient method of interpolation using simplexes rather than hypercubes. The advantage is that simplex interpolation requires (n+1) points rather than 2^n points in n dimensions.
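
Concretely, my reading of the interpolation scheme on the Kuhn triangulation of the unit hypercube boils down to something like this rough Python sketch (illustrative only, and quite possibly not exactly how the paper's authors would write it):

import numpy as np

def kuhn_interpolate(x, f):
    """Interpolate f at x in [0, 1]^n using the (n + 1) vertices of the
    Kuhn simplex containing x, rather than all 2^n corners of the hypercube."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    order = np.argsort(-x)              # coordinate indices, largest first
    xs = x[order]
    # Barycentric weights: 1 - largest, then successive differences, then smallest.
    weights = np.empty(n + 1)
    weights[0] = 1.0 - xs[0]
    weights[1:n] = xs[:-1] - xs[1:]
    weights[n] = xs[-1]
    # Walk the simplex vertices: start at the origin corner, then flip one
    # coordinate to 1 at a time, in decreasing order of x.
    vertex = np.zeros(n, dtype=int)
    value = weights[0] * f(tuple(vertex))
    for k, axis in enumerate(order, start=1):
        vertex[axis] = 1
        value += weights[k] * f(tuple(vertex))
    return value

# A linear function should be reproduced exactly: 0.3 + 2 * 0.7 = 1.7
print(kuhn_interpolate([0.3, 0.7], lambda v: v[0] + 2 * v[1]))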

One of the best things about the Munos & Moore paper is a really great explanation of how interpolation in simplexes using Kuhn triangulation works, as well as a number of really great diagrams. I've attached some (pretty horrible) Mathematica code that I've used to reproduce something similar to one of the explanatory figures from that paper.

(* The m-th cartesian aligned basis vector in an n dimensional space. *)
BasisVector[n_, m_] := Map[If[# == m, 1, 0] &, Range[n]]

(* Takes an ordered list of basis functions which describe the order \
to walk the tetrahedra boundary in *)
SimplexVecs[perm_] := Accumulate[
  Join[
   {ConstantArray[0, Length[perm]]}, 
   Map[BasisVector[Length[perm], #] &, perm]
   ]
  ]

(* Dimension of the space; n = 3 is assumed by TetrahedraGeom3D below *)
n = 3;

(* Generate all 3d tetrahedra (Kuhn simplexes) in the unit cube *)
TetrahedraGeom3D[] := Map[
   GraphicsComplex[
     SimplexVecs[#],
     Polygon[Subsets[Range[n + 1], {n}]]
     ] &,
   Permutations[Range[n]]
   ];

SimplexCenter[gc_] := Mean[gc[[1]]]

(* Generate all the geometries *)
gs = TetrahedraGeom3D[];

(* Shift them a little bit away from the cube center*)
gs = Map[
   Translate[#, SimplexCenter[#] - {1/2, 1/2, 1/2}] &,
   gs
   ];

(* Generate a 'nice' list of colors *)
mycolors = {Red, Green, Blue, Cyan, Magenta, Yellow};

(* Compose the graphics together, and render *)
Show[MapThread[
  Graphics3D[{Opacity[0.4], #2, #1}] &,
  {gs, mycolors}
  ]]

Saturday 26 May 2012

Minimum time value iteration solution of the double integrator

I've spent the last week of spare time working on a value iteration problem for the underactuated robotics subject I've been following. In particular, this has involved finding the optimal policy for getting the undamped double integrator to the origin in minimum time (github gist of code). A MATLAB / Octave solution is available on the OCW site, so I don't feel too bad publishing this here.
This has been something a bit different for me, considering how little vectorised code I've written in the past, but it has already had a big impact on some things I'm doing for a number of projects. The performance benefit is substantial, but it makes code harder to read, write, debug and test.
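
As a trivial illustration of that trade-off (nothing to do with the gist itself), compare a plain Python loop with its vectorised numpy equivalent:

import numpy as np

values = np.random.rand(1000, 1000)

# Loop version: obvious to read, painfully slow in pure Python.
total = 0.0
for row in values:
    for v in row:
        if v > 0.5:
            total += v

# Vectorised version: one line and much faster, but the intent is less explicit.
total_vec = values[values > 0.5].sum()

assert np.allclose(total, total_vec)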

Sunday 13 May 2012

Python simulation of the simple pendulum

Following on from the previous post, I've created some simulations of the simple pendulum without damping or control input. This was intended more as a test of my understanding of how to use the scipy integration module, and to make sure the matplotlib plotting code worked correctly.

I'm pretty pleased with the results, and you can see a plot of the homoclinic orbits of the pendulum starting from different initial states, and some videos of what that means, if you don't naturally think in phase space.
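
The code is just a variation on the theme below - a simplified sketch rather than the exact code behind the plots - integrating the undamped pendulum with odeint from a few initial states and plotting the result in phase space:

import numpy as np
import matplotlib.pyplot as plt
from scipy.integrate import odeint

g, l = 9.81, 1.0

def pendulum_deriv(x, t):
    # x[0] is the angle theta, x[1] is the angular velocity
    return np.array([x[1], -(g / l) * np.sin(x[0])])

ts = np.linspace(0.0, 10.0, 2000)
for theta0 in (0.5, 1.5, 2.5, 3.1):
    xs = odeint(pendulum_deriv, [theta0, 0.0], ts)
    plt.plot(xs[:, 0], xs[:, 1])

plt.xlabel('theta')
plt.ylabel('d theta / dt')
plt.show()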

Saturday 12 May 2012

Python simulation of the Van der Pol Oscillator

One of the things I'm doing at the moment is watching an excellent series of MIT OCW lectures by Assoc. Prof. Russ Tedrake on underactuated robotics. I've learned a lot watching the videos, which pick up at a point where some parts of my doctoral studies finished.

Just to pick something small that is at least tangentially related to the course, I've attached a python snippet for simulating the Van der Pol oscillator. This system is significant in that it exhibits limit cycles, which are an important tool for reasoning about walking robots.

import numpy as np
import matplotlib.pyplot as plt
from scipy.integrate import odeint

# Damping parameter of the oscillator
mu = 0.2

def van_der_pol_oscillator_deriv(x, t):
    # x[0] is the position, x[1] is the velocity
    nx0 = x[1]
    nx1 = -mu * (x[0] ** 2.0 - 1.0) * x[1] - x[0]
    res = np.array([nx0, nx1])
    return res

ts = np.linspace(0.0, 50.0, 500)

xs = odeint(van_der_pol_oscillator_deriv, [0.2, 0.2], ts)
plt.plot(xs[:,0], xs[:,1])
xs = odeint(van_der_pol_oscillator_deriv, [-3.0, -3.0], ts)
plt.plot(xs[:,0], xs[:,1])
xs = odeint(van_der_pol_oscillator_deriv, [4.0, 4.0], ts)
plt.plot(xs[:,0], xs[:,1])
plt.gca().set_aspect('equal')
plt.savefig('vanderpol_oscillator.png')
plt.show() 
Which, when simulated, looks something like this:

Thursday 15 March 2012

Hard Real-Time Motion Planning for Autonomous Vehicles online now

It's been a while since I've checked, but Swinburne University have published a low-quality copy of my PhD thesis - "Hard Real-Time Motion Planning for Autonomous Vehicles" - online at their research bank. If you don't want to download the whole thing, you can look at the summary here.

During my candidature I had the honour of working with the team developing the Wayamba Unmanned Underwater Vehicle, a three-metre-long, flat-fish form factor research vehicle.

Image Taken from http://www.defence.gov.au/news/navynews/editions/4821/topstories/story08.htm

Tuesday 13 March 2012

PyCon - Day 5 (Sprints Day 1)

Today was the first time I've ever sprinted, and I can honestly say it wasn't at all what I expected. I was expecting frantic energy, and I mistakenly assumed that the project maintainers would turn up with a huge number of tickets that would get allocated to anyone who came. I also hoped (expected?) that there would be someone experienced in the project who would help me get started.

Instead it was a cross between a design meeting and a pretty regular work day, and while everyone was friendly and very helpful, they were also busy trying to get their own work done.

It's no-one's fault - I just wasn't anywhere near prepared enough to actually be useful helping on the project that I wanted to work on. In part, I guess that some of that is about trying to break into a community - there needs to be a degree of trust before you get commit rights to a major repository. So I'm happy to watch most of this one from the sidelines, with the understanding that things will go better at future conferences.

Instead, I've had a good day working on a related topic to my project of interest, which I'm hoping may end up with nice enough results to put in for a PyCon AU talk.

We'll see how it goes.

Monday 12 March 2012

PyCon - Day 5 (Conference Day 3)

I have to admit that the feeling of being washed out has continued, and with the change of daylight saving, I almost missed breakfast and the keynote entirely.

Guido's address to PyCon started the morning officially, and he reiterated the key point that this was a time for consolidation and community building, and that we as python developers should already know the answers to any questions trolls may post on various forums. He also indicated that modifications to the core language in the future would need to be demonstrated more rigorously, either in alternative distributions, or through the use of import hooks.

The poster session / job fair was excellent, but in my opinion, the academic and mathematics posters far outshone anything else. I took lots of photos, and I'm happy to discuss them with anyone who is interested.

I literally dragged myself to three talks today, which isn't to say anything about the speakers, merely the grueling schedule of the conference:
- Writing GIMP plugins in Python
- Build A Python-Based Search Engine
- Parsing sentences with the OTHER natural language tool: LinkGrammar

All of which were interesting, but none of which seemed earth-shattering.

The conference finished with lightning talks, which really deserve a post of their own, but suffice to say, the github links and URLs mentioned will be enough to keep me busy for a few weeks when I get home.

The culmination of the conference was a raffle (what Australians would call a give-away), with Frisbees tossed out into the audience to decide winners. I will state, I did catch a Frisbee - but gave it up to someone else with a tighter grip.

I'll admit in advance that this is reasonably subjective, but it is interesting to try to consider what the most important developers in the python community have been interested in over the past three days. So, based (almost completely) on the descriptions of the talks in the conference program, the relative proportion of talks by field is something like:

The Python Language, Standard Library and Packaging - 25%
Web topics / Cloud / Networking - 24%
Math / Science / Data Manipulation - 12%
Testing / Debugging - 10%
Electronics / Robotics - 7%
Other - 12%

This result surprised me a little, because I expected to see a much higher proportion of the web/cloud/networking topics than what really seemed to be present.

At dinner time I found Richard Jones from the Melbourne Python Users Group who was about to go to dinner, and invited myself to go along. Richard ended up deciding that sleep prior to sprinting was more important than food (or more alcohol).

I ended up on a table at a Mexican restaurant with Ian Ozsvald, Mike Mueller (of Python Academy), Ricardo Kirkner and Jack Diederich, amongst others. This was an awesome experience, where I got to pick the brains of so many speakers from the conference.

A number of margaritas later, I've finally arrived back at the hotel, wondering exactly how I'm going to manage to be up in a condition for sprinting in the morning. Regardless, this has probably been the best, and most valuable, night of the whole PyCon.

PyCon - Day 4 (Conference Day 2)

The second conference day definitely had a different feel from the first day. I'm not sure whether it is being away from home, the time difference, or some very big days, but I'm feeling pretty tired.

The PyCon committee had asked the morning's keynote speaker David Beazley to speak on something "diabolical" - something that he has a reputation for after great talks at the past few python events on "programming the superboard" and "the challenges of the python GIL". I think he picked the perfect topic, talking about PyPy (the Python-in-Python project), which has the potential to have an enormous impact on performance for all code and infrastructure written in Python.

Having attended the PyPy talk earlier in the week, and having tried to go through some of the same very challenging process of getting PyPy to build within the last week, the talk struck a chord with me. PyPy is so exciting, yet seems to require an enormous amount of effort just to get started adding even small features.

Straight after the keynote I raced over to the exhibitors hall, because I'd managed to finish the Google challenges last night, which meant I earned my pins, and a T-shirt with the slogan on the back "Python: Programming the way Guido Intended".

The talk schedule worked out a little differently today; in particular, I had the flexibility to go to talks that were just plain cool rather than ones related to work:

- The Journey to Give Every Scientist a Supercomputer
- Python for Makers
- Pragmatic Unicode, or How do I stop the pain?
- Python and HDF5 - Fast Storage for Large Data
- Militarizing Your Backyard with Python: Computer Vision and the Squirrel Hordes
- Using fabric to standardize the development process

One of the big standouts for the day was Ned Batchelder's talk on Pragmatic Unicode, which provided a consistent approach to dealing with Unicode throughout your programs. It also seemed far more pragmatic than Joel Spolsky's essay "The Absolute Minimum Every Programmer Should Know About Unicode".
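
My one-paragraph takeaway (my paraphrase, not Ned's slides) is the "unicode sandwich": decode bytes to text as soon as they enter your program, work with text everywhere inside, and encode back to bytes only at the output boundary. Something like this, where the file names are just placeholders:

with open('names.txt', 'rb') as f:
    names = f.read().decode('utf-8')        # decode at the input boundary

greeting = u"Hello, " + names.strip()       # plain text everywhere in between

with open('greeting.txt', 'wb') as f:
    f.write(greeting.encode('utf-8'))       # encode at the output boundary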

I was impressed by the "Militarizing Your Backyard with Python: Computer Vision and the Squirrel Hordes" talk, both for using Python for everything from the highest to the lowest level, and because the speaker managed to get very impressive machine learning results for squirrel classification from what I assumed would have been a very small and coarse set of features.

The other big thing that happened was that, in response to the fabric talk, one of the delegates came off the floor and gave a really succinct answer about the roles of provisioning tools like chef and puppet versus fabric. The answer seems to boil down to the fact that there is some cross-over, but fabric is best suited to remote management, while puppet/chef are best suited to keeping your machines in a consistent state.
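
For my own reference, the remote management role looks something like this hypothetical minimal fabfile (the host names are placeholders, and this is my sketch rather than anything shown in the talk):

# fabfile.py - run with: fab uptime
from fabric.api import run, env

env.hosts = ['web1.example.com', 'web2.example.com']

def uptime():
    # Executes the command over ssh on each host in env.hosts
    run('uptime')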

Sunday 11 March 2012

PyCon - Day 3 (Conference Day 1)

Today was a very big day, and to be honest I'm feeling a little wrung-out tonight. Because this was the first full day of the conference we got swag this morning, which had more than enough cool stuff in it to make me very happy.

The keynotes were great (yeah for awesome dancing robots on stage!), as was seeing the 2200+ people in the Grand Ballroom.


The first keynote speaker Stormy Peters' messages on community building, inclusiveness, and building tools that enable end-users to create and control their data in the ways that they want and need made a lot of sense.

I've followed the second keynote speaker Paul Graham's work for a long time - he's a big part of the reason that I do things the way I do in python, and for my interest in Lisp and functional programming techniques. To hear him speak about Y Combinator was an honour. The message that I took away from the keynote was that big things can grow out of small ideas, and that we, as developers, really already know what the important problems are.

During the talks I didn't follow a particular track, but rather tried to concentrate on maths and sciences talks that I knew relate strongly to the things that I work on, or could work on in the future (graph processing, machine learning, PyZMQ, Sage & pandas), but I also went to Raymond Hettinger's talk on subclassing, and Dave Brondsema's talk on decorators and context managers.

I managed about five minutes in the exhibitors hall, and grabbed some great deals from the O'Reilly store at what ended up being about a third of the price of the same books off the shelf in Australia.

I also popped in on the end of the Testing In Python miniconf/open space session, which was an insane combination of informative, hilarious and a bit wrong (lab coats, beer and goat jokes). But the 100 people in the room were a testament to the dedication of the attendees on a Friday night at 9pm, when there were at least two different parties running with free beer.

Update

Matt Spitz's talk on practical machine learning had a number of good code samples, you can find his git repository on machine learning from baseball stats on github, and his slides at slideshare.

The talk on Sage included a link to sagenb.com, which hosts an online version of the Sage math notebook for experimentation and sharing with your peers.

Wes McKinney's slides on vbench and the pandas data manipulation and transformation library are available on slideshare, and his code is available from github.

Saturday 10 March 2012

PyCon - Day 2 Tutorials (Social Networks & High Performance Python II)

I had two tutorials on today, the first on "Social Network Analysis", and the second on "Advanced Python II".

Social Network Analysis is a fairly loose term that groups a whole bunch of ideas together. A social network is something where you have information about which relationships exist between individuals, and what the properties of those relationships are. From a computer science perspective, that lends itself to a graph-theoretic approach to analysing social networks. The second stage is then the use of statistics to draw some conclusions about the data (identify key members in the network, identify anomalous behaviour, compare networks, and lots of other things).
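
As a concrete (if toy) example of the graph-theoretic side, this is the sort of thing networkx makes easy - my choice of library for the illustration; the tutorial isn't tied to it:

import networkx as nx

G = nx.Graph()
G.add_edges_from([('alice', 'bob'), ('alice', 'carol'),
                  ('bob', 'carol'), ('carol', 'dave')])

print(nx.degree_centrality(G))         # who has the most relationships?
print(nx.betweenness_centrality(G))    # who sits between the most people?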

The talk happened in two interleaved parts: a live coding demo of how to retrieve data from online social networking services (twitter & crunchbase), and a slightly more theoretical discussion of the principles of network analysis.

I think that perhaps this tutorial was targeted at a slightly different audience to the one I was expecting. I would have been happy with a pretty rigorous mathematical description of the statistics, and how to deal with that in python, but judging by the reactions, I think other people may have had a different opinion.

The second talk on High Performance Python blew me away completely. The thought of getting to hear Travis Oliphant (previously at Enthought, and now at Continuum Analytics) speak about numpy was a big deciding factor for coming, and he (and his team) didn't disappoint.

The talk was actually split into four parts: an introduction on how to write efficient numpy code (vectorisation, making best use of the numpy data structures); a second section on numexpr (a tool for optimising small but critical numpy expressions); a third section that was an impressive live demo of a particle simulation, used to demonstrate the measurement and prediction of the performance of numpy code; and a fourth section describing numba, a new LLVM-based tool for writing optimised numpy ufuncs (the name comes from NumPy + Mamba).
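
The flavour of numexpr is easy to show with a tiny snippet of my own (illustrative, not from the tutorial): you hand it the whole expression as a string and it evaluates it in a single pass over the arrays, instead of numpy building a temporary for each intermediate step.

import numpy as np
import numexpr as ne

a = np.random.rand(1000000)
b = np.random.rand(1000000)

plain = 2.0 * a + 3.0 * b                   # numpy: temporaries for each step
fast = ne.evaluate("2.0 * a + 3.0 * b")     # numexpr: one fused pass

assert np.allclose(plain, fast)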

After the tutorials finished, I ran into Ned Batchelder in the lobby. I'm a big fan of Ned's coverage.py and cog modules, and also of his blog. After that, I picked a random table and started talking to people, and it just so happened that they were the Sourceforge development team - who very kindly took me out to a lovely dinner at an Indian restaurant.

Update

One of the libraries mentioned in the social media tutorial was tweepy, a library for pulling data from twitter

The other was requests, an improved HTTP library to replace the (somewhat) broken urllib2.
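
The difference in feel is obvious even in a couple of lines (the URL here is just a placeholder):

import requests

r = requests.get('http://example.com/')
print(r.status_code)
print(r.text[:100])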

There was some discussion in the Adv. Python II tutorial about combining numexpr with the Intel Math Kernel Library.

I was flipping through some of Enthought's products looking for the Continuum Analytics github repository and came across Chaco, which looks to be an alternative 2d plotting library.

Thursday 8 March 2012

PyCon - Day 1 Tutorials (ipython & pypy)

After a pretty good night's sleep, I was up for 7am registration in the conference centre. I made some new friends and got to meet two of the rockstars of the Python community, Raymond Hettinger & Dave Beazley.

I could say a lot about the tutorials, but I don't really need to because you'll be able to watch both of them.

As a taster, the slides from the ipython tutorial are available online.

I was pretty blown away by the pypy tutorial; my thesis involved generation and analysis of assembly code traces in a smallish code-base. That effort was challenging enough for me to have huge respect for what the pypy team have done (and are continuing to do) with regard to the scope, complexity and quality of the outcomes that they are achieving. Even better, I found out they have a partial numpy implementation working! Yeah.

Updates

One of the TAs at the talk, Paul Ivanov, mentioned an interoperability layer between vim and ipython.

The attendee sitting next to me in the pypy talk was using a really pretty editor that looked a lot like vim, with some cool extra features: it was Sublime Text.

Wednesday 7 March 2012

PyCon - Day 0

I've arrived safely in Santa Clara. The city itself is amazing. From a nerdy perspective it was awesome, almost like the internet coming alive. I saw buildings for Yahoo, McAfee, some big game development studios and more. It's very obviously a technology-focused city; even the advertising is for software and software-related services.


It's also very clean. It reminds me a lot of Canberra (without the roundabouts): very flat, with perfect footpaths and manicured grass edges.

I got back to the hotel and caught up with one of the other Australians I knew was coming; it was very nice to see a familiar face after a long day. I stopped and had a late lunch with him and another delegate. I don't think I'm spilling anything big, but the news is that the total numbers have crept over 2000 for the conference, with over 1900 registrations for delegates.

Wi-fi isn't working in my room at the moment, so after a quick nap I'm posting this from the lobby on the conference network. In the time from late lunch to dinner, the hotel lobby has gone from being filled with 'suits' to being filled with pythonistas.

Thursday 1 March 2012

PyCon 2012

If anyone is interested, I'm attending US PyCon 2012 in Santa Clara.

Post a comment if you're interested in catching up!

Tuesday 14 February 2012

Building ipython from source

Late last year I saw a demonstration of one of the newest versions of ipython at the Melbourne Python Users Group. It was perhaps one of the most astonishing pieces of open source software I've seen, with a new web-based notebook interface that is very similar to what you would expect in a commercial package like MATLAB or Mathematica.

At the moment the newest versions of ipython aren't in the Debian/Ubuntu repositories (at least for the versions of Ubuntu I care about). What follows is a makefile that documents the steps that I've gone through to get the newest version installed on a close to clean installation of Oneiric (11.10).

It's worth noting that this process is still somewhat manual because the versions of some of the dependent packages are changing so quickly, and it is likely that some manual intervention will be required.

ZEROMQ=zeromq-2.1.11
SIP=sip-4.13.2
PYQT4=PyQt-x11-gpl-4.9.1
TORNADO=tornado-2.2

install:
    sudo apt-get install python-scipy \
        python-matplotlib \
        mayavi2 \
        git-core \
        build-essential \
        python-setuptools \
        python-dev 
    sudo easy_install nose \
        pexpect \
        pygments

zeromq:
    sudo apt-get install uuid-dev \
        libtool \
        autoconf 
    wget -c http://download.zeromq.org/${ZEROMQ}.tar.gz
    tar zxvf ${ZEROMQ}.tar.gz
    cd ${ZEROMQ} && ./configure
    cd ${ZEROMQ} && make 
    cd ${ZEROMQ} && sudo make install
    sudo ldconfig
    sudo easy_install pyzmq

sip:
    wget -c http://www.riverbankcomputing.co.uk/static/Downloads/sip4/${SIP}.tar.gz
    tar zxvf ${SIP}.tar.gz
    cd ${SIP} && python configure.py
    cd ${SIP} && make
    cd ${SIP} && sudo make install

qt4:
    sudo apt-get install libqt4-dev
    wget -c http://www.riverbankcomputing.co.uk/static/Downloads/PyQt4/${PYQT4}.tar.gz
    tar zxvf ${PYQT4}.tar.gz
    cd ${PYQT4} && python configure.py
    cd ${PYQT4} && make
    cd ${PYQT4} && sudo make install

tornado:
    wget -c https://github.com/downloads/facebook/tornado/${TORNADO}.tar.gz
    tar zxvf ${TORNADO}.tar.gz
    cd ${TORNADO} && sudo python setup.py install
    
ipython: 
    git clone https://github.com/ipython/ipython.git
    cd ipython && sudo python setup.py install