Thursday 19 February 2009

Text is the new GUI?

We've got a software student (from another university) working on his final-year project at my day job. He is busy adding a 'speech' interface to our laboratory tracking system. The idea is that while the scientist has their hands in the fume cupboard they don't want to be messing with a mouse or a keyboard, so why not engage with the computer by voice? This is, after all, how they did it in the old future in the movies.

Alas, the student has got a bit sidetracked into the excitement of speech recognition and synthesis. In an attempt to get him to some sort of conclusion, I have suggested that most of the academic benefit of the project could be had with just text input and output (although useless in the fume cupboard). Once you go there, it starts you thinking about how we interact with systems by voice. We are, of course, now used to listening to SatNav, and some folk order their phones to phone the wife.

While pondering this and George's earlier post on Twitter Library fees, I remembered an article about how fans have put together Twitter accounts for their favourite TV characters, so that we can see when they are having a sandwich during the week. It made me wonder whether we might soon be engaging with various systems through the power of text rather than super 3D graphical interfaces.

This would be a shame, as much of the design thinking in web design and business application design has been about mice and windows, more or less. In Liverpool this has led to some success for the hybrid 'programmer/graphic designer'; perhaps if we are going to deal in a flow of text it will be the hybrid 'programmer/DJ', or at least 'programmer/linguist', who leads the way.

We increasingly surround ourselves with a flow of consciousness through Twitter, Facebook and the like. Surely this is going to include tweets from machines.

"Your fridge is enjoying a quiet day."
"Your car is worrying that its service is due this week."
"Your door notes that Fido is standing at it and wants to go out."

This will lead us to want to push application outputs into Twitter-like streams, and to have our apps respond to our own tweets. Text (or speech) may be the new GUI. Of course, we will have to know a lot more about parsing and extracting meaning and identity from these streams of consciousness.
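To make the parsing point concrete, here is a minimal sketch in Python. It assumes a hypothetical "Your &lt;device&gt; &lt;status&gt;." message format, based purely on the example tweets above; a real system would need proper natural-language handling rather than a regular expression.

```python
import re

# Hypothetical machine-tweet format: "Your <device> <verb> <rest>."
MACHINE_TWEET = re.compile(r"^Your (?P<device>\w+) (?P<verb>\w+) (?P<rest>.+?)\.?$")

def parse_machine_tweet(text):
    """Extract (device, status) from a machine tweet, or None if it doesn't match."""
    match = MACHINE_TWEET.match(text.strip())
    if not match:
        return None
    device = match.group("device")
    status = f"{match.group('verb')} {match.group('rest')}"
    return device, status

for tweet in [
    "Your fridge is enjoying a quiet day.",
    "Your car is worrying that its service is due this week.",
]:
    print(parse_machine_tweet(tweet))
```

Even this toy version shows the shape of the problem: identity (which device is speaking) and meaning (what it is saying) have to be teased apart before any app can respond.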

Thursday 12 February 2009

FOAF and political social graphs

While catching up on some blogs I follow, I noticed that the Semantic Web-ite Ivan Herman posted comments regarding the US Congress SpaceBook – a US political answer to Facebook. He, in turn, was commenting on a post by ProgrammableWeb – the website dedicated to keeping us informed of the latest web services, mashups, and Web 2.0 APIs.

From a mashup perspective, SpaceBook is pretty incredible, incorporating (so far) 11 different Web APIs. However, for me SpaceBook is interesting because it makes use of semantic data provided via FOAF and the XFN microformat. To do this SpaceBook makes good use of the Google Social Graph API, which aims to harness such data to generate social graphs. The Social Graph API has been available for almost a year but has had quite a low profile until now. Says the API website:
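For readers unfamiliar with XFN: it declares these public connections with nothing more than `rel` attributes on ordinary links, which is exactly what crawlers like the Social Graph API pick up. The names and URLs below are placeholders, purely for illustration:

```html
<!-- On alice.example.org: XFN rel values declare public connections -->
<a href="http://twitter.com/alice" rel="me">my Twitter account</a>
<a href="http://bob.example.org/" rel="friend met">Bob's site</a>
```

The `rel="me"` value asserts that two pages belong to the same person, while values like `friend` and `met` describe relationships to other people.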
"Google Search helps make this information more accessible and useful. If you take away the documents, you're left with the connections between people. Information about the public connections between people is really useful -- as a user, you might want to see who else you're connected to, and as a developer of social applications, you can provide better features for your users if you know who their public friends are. There hasn't been a good way to access this information. The Social Graph API now makes information about the public connections between people on the Web, expressed by XFN and FOAF markup and other publicly declared connections, easily available and useful for developers."
Bravo! This creates some neat connections. Unfortunately – and as Ivan Herman regretfully notes – the generated FOAF data is inserted into Hillary Clinton's page as a page comment, rather than as a separate .rdf file or as RDFa. The FOAF file is also a little limited, but it does include links to her Twitter account. More puzzling for me, though, is why the embedded XHTML metadata does not use Qualified Dublin Core! Let's crank up the interoperability, please!
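For the record, qualified Dublin Core can be embedded in an XHTML head using the standard `meta` and `link` element convention, along these lines (the values here are illustrative, not taken from the actual page):

```html
<head>
  <link rel="schema.DC" href="http://purl.org/dc/elements/1.1/" />
  <link rel="schema.DCTERMS" href="http://purl.org/dc/terms/" />
  <meta name="DC.creator" content="Clinton, Hillary Rodham" />
  <meta name="DC.date" scheme="DCTERMS.W3CDTF" content="2009-02-12" />
</head>
```

The `link` elements bind the `DC.` and `DCTERMS.` prefixes to their namespaces, and the `scheme` attribute is what qualifies the plain element – here stating that the date follows the W3CDTF encoding.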

Friday 6 February 2009

Information seeking behaviour at Google: eye-tracking research

Anne Aula and Kerry Rodden have just published a posting on the Official Google Blog summarising some eye-tracking research they have been conducting on Google's 'Universal Search'. Both are active in information seeking behaviour and human-computer interaction research at Google and are well published within the related literature (e.g. JASIST, IPM, SIGIR, CHI, etc.).

The motivation behind their research was to evaluate the effect that incorporating thumbnail images and video within a result set has on user information seeking behaviour. Previous information retrieval eye-tracking research indicates that users scan results in order, scanning down their results until they reach a (potentially) relevant result, or until they decide to refine their search query or abandon the search. Aula and Rodden were concerned that the inclusion of thumbnail images might distract the "well-established order of result evaluation". Some comparative evaluation was therefore the order of the day.
"We ran a series of eye-tracking studies where we compared how users scan the search results pages with and without thumbnail images. Our studies showed that the thumbnails did not strongly affect the order of scanning the results and seemed to make it easier for the participants to find the result they wanted."
A good finding for Google, of course; but most astonishing is the eye-tracking data. The speed with which users scanned result sets and the number of points on the interface they scanned was incredible. View the 'real time' clip below. A dot increasing in size denotes the length of time a user spent pausing at that specific point in the interface or result set. Some other interesting discoveries were made – the full posting is essential reading.