Tons of fun with eye-tracking of #Articulate resources #edtech #uoltech
Another day, another project and another thing learned, as they say ;). Alina (it’s so good to have a clever family, I’m telling you!) very kindly gave me a quick demonstration of how cool an eye-tracking system is. After a rapid calibration procedure, you’re up and running (although, I have to say, it’s hard to keep a straight face at first, especially if you’ve seen the movie Wall-E – see the photo below to get what I mean).
What I was curious to see was how the eye-tracker would interpret my browsing of an Articulate web resource, and I was very impressed by the result (although we didn’t even use it to its full capability – e.g. we didn’t connect a webcam and mic to show me in picture-in-picture format as I browsed, or record any comments I may have made). So here is a 1-minute silent movie of me browsing a resource created for the National Network for Interpreting.
Another clever thing that Alina’s eye-tracker does (a Tobii X120 – I thought you might appreciate this info, so I did ask ;)) is generate heat maps. These maps show graphically how long you spent staring at various areas of the screen – a red area is one you looked at quite a bit, while green ones are those you merely glanced over. Very cool, and the applications for e-learning are quite significant, as Alina’s project will show (on this page, Alina appears under the 2010-2011 University of Leeds Teaching Fellows tab).
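If you’re curious what’s going on under the hood, the idea behind these heat maps is simple enough to sketch in a few lines of Python: each gaze fixation deposits a Gaussian “blob” of heat at its screen coordinates, weighted by how long you dwelled there, so spots you stared at accumulate high values (the red areas) while brief glances stay low (the green ones). This is just a toy illustration under my own assumptions – the function name and the sample fixation data are made up, and it’s not the Tobii X120’s actual output format or algorithm.

```python
import math

def gaze_heatmap(fixations, width, height, sigma=2.0):
    """Accumulate duration-weighted Gaussian blobs on a pixel grid.

    fixations: list of (x, y, duration_ms) tuples (illustrative format,
    not what the Tobii software actually exports).
    """
    grid = [[0.0] * width for _ in range(height)]
    radius = int(3 * sigma)  # only compute the Gaussian near each fixation
    for fx, fy, dur in fixations:
        for y in range(max(0, fy - radius), min(height, fy + radius + 1)):
            for x in range(max(0, fx - radius), min(width, fx + radius + 1)):
                d2 = (x - fx) ** 2 + (y - fy) ** 2
                grid[y][x] += dur * math.exp(-d2 / (2 * sigma ** 2))
    return grid

# Made-up example: two long fixations on one spot, one brief glance elsewhere.
fixes = [(5, 5, 800), (5, 5, 600), (15, 5, 120)]
heat = gaze_heatmap(fixes, width=20, height=10)

# The dwelled-on spot accumulates far more "heat" than the glanced-at one.
print(heat[5][5] > heat[5][15])  # → True
```

A visualiser would then map those accumulated values onto a colour scale (low = green, high = red) and overlay the result on a screenshot of the page.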
Will post more when the results start coming through, or you can get in touch with Alina directly – she’s @gr82tweet.