PhD in Computer Science
Tuesday 28th May 2013
What is the fastest computing device you have in your house? A laptop computer, an old desktop, or maybe even your mobile phone? What if it were actually your television set or your Sky+ box (if you have one, of course)?
Chances are that it is not yet your phone, your television or a set-top box, but today those devices routinely pack the same sort of computing punch as supercomputers that cost a billion dollars two decades ago, and they are likely to get more powerful still.
Your phone, the Sky box and the HD television all fall into the category of "real-time" computing devices: if a word processor document on your laptop takes a second longer than you expect to open, it is a pain, but it doesn't represent a service failure. If your mobile routinely took too long to decode the digital signals it received over its radio connection, it would be useless: real-time devices have to meet time deadlines, or else the service is completely broken (so-called "hard" real-time) or the quality of service starts to decline ("soft" real-time).
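The distinction can be sketched in a few lines of code. This is purely illustrative - the `decode_frames` function and its parameters are my own invention, with a dummy callable standing in for the real decoding work - but it captures the difference: a hard real-time system treats a missed deadline as outright failure, while a soft real-time system counts the miss as lost quality and carries on.

```python
import time

def decode_frames(frames, deadline_s, hard=False):
    """Run each frame's decoding work against a per-frame deadline.

    Illustrative sketch only: each item in `frames` is a callable
    standing in for the real signal-decoding work.
    """
    misses = 0
    for frame in frames:
        start = time.monotonic()
        frame()  # the decoding work for this frame
        elapsed = time.monotonic() - start
        if elapsed > deadline_s:
            if hard:
                # Hard real-time: a missed deadline is a service failure.
                raise RuntimeError("hard deadline missed: service broken")
            # Soft real-time: quality declines, but the stream continues.
            misses += 1
    return misses
```

In the soft case the caller might use the returned miss count to drop resolution or skip frames; in the hard case there is nothing sensible left to do, which is why the sketch simply gives up.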
Mobile telecoms and TV signals are now routinely dispatched in an encoded digital form, and so the devices in your home need to be able to decode the signals fast enough for you to hear the sound or watch the picture. And, if you think of your phone, the device may also need to run email programs, drive video cameras and seamlessly move your connection from tower to tower as you travel.
So devices need to be able to run multiple tasks in parallel, they need to meet deadlines while they are doing it and they need to do all they can to keep battery life long and power consumption to a minimum.
Of course, not all real-time devices, even in your home, need to work hard to keep power consumption down. In a very real sense your latest-generation video games console - whether from Sony, Nintendo or Microsoft - could be the fastest computing device in the house: a real-time machine and a power guzzler.
This image, which I prepared in the course of my work on my PhD, shows how real-time devices are catching up with, and even surpassing, more traditional computing devices when we measure computing power in "millions of instructions per second" (MIPS). The black circles, based on data collated by Hans Moravec and available here, show the rising power of what were once called "microcomputers" and would now be called "desktops" from the mid-1970s on. The red crosses, based on data I have collected myself (with the exception of the iPad2 figure), represent real-time devices sold or given to consumers, such as games consoles and set-top boxes.
Using MIPS is a crude comparison - today's multiprocessor devices would rarely, if ever, be able to run flat out without waiting for new instructions and data to be fed in from memory - but it is not totally unreasonable. On top of that, for the real-time devices at least, we are relying on manufacturers' claims rather than any rigorous testing. But few would dispute the broad outline of the pattern.
The advance of the black circles seems steady, while that of the video games consoles is close to spectacular. Other real-time devices have not improved as rapidly, but they are certainly closing the gap.
A new round of video games console wars is about to begin, and the mobile phone companies are seeking to boost data rates and phone capabilities. And don't forget 3-D television...
Wednesday 13th February 2013
As a part-time PhD student with a full-time job I have to ration my trips to the campus at York, and this week, I confess, I made the trip with less enthusiasm than usual - pressure at work, and a feeling that I had run into something of a brick wall as I began to consider real experiments and what my supervisor calls "inventions".
But I came away feeling both intellectually and physically reinvigorated and wishing I'd been there for longer.
Simply sitting in the library makes a huge difference: being surrounded by books and by students working (York's undergraduates seem to treat the library a bit more seriously than those at Edinburgh did in the 1980s - certainly the noise level is generally lower) is a stimulus all of its own. But more importantly, my supervisor was able to put my thoughts and worries into context and restore some confidence.
My research is into operating systems for "Network-on-Chip" (NoC) devices. NoCs are likely to be an important part of the future of computing: in theory at least, they will continue to deliver faster and more computationally powerful devices despite the growing problems of heat and power demand in more traditional system designs.
But NoCs are also a very young technology, and questions of how to get the best out of them are largely open. Do we apply the experience of existing parallel computing devices, or try something more radical still? In essence, my supervisor urged me to consider whether my assumptions, based on existing computing paradigms, were the correct ones.
As for the physical side - coming to York I once again hired a bike and used that to get around. Returning home I had some ache in my legs but a sense of having made some real progress.