Archive for November, 2011

Genius talk by Ed Hutchins

November 7, 2011 Leave a comment

Ed Hutchins—surfer, pilot, MacArthur “Genius Grant” Fellow, and pioneer of the distributed cognition (dCog) approach to HCI—gave a talk titled “Digital Cognitive Ethnography of the Airline Flight Deck” in which he discussed his experiences conducting ethnographic studies of commercial flight decks, most notably the newly unveiled 787. It was great to see many of the engineering design methods that we talk about in HCI classes in use while building and evaluating multi-million dollar systems: artifact collection, eye-tracking, video logging, but also plain old observations with checklists that Ed and his team fill out while sitting behind the pilots. Ed contrasted the digital tools now used in measuring, quantifying, and visualizing important project data with stories about how he used to walk around with a heavy tape recorder and notepad many years ago.

One thing that struck me from the talk was his description of the 787 development effort as the “best collection of notes that he’s seen taken on a project” (rough quote). He talked a lot about the tools and advances in the dCog approach—but he also had great praise for the team that was working on the project. The team seemed to know each other well (many were current or recent lab members), and they had a native language speaker on the team whom Ed described as vital to the project’s success. To me, it seemed that the tight team with a common lingua franca regarding the research approach and methods was as important as, or more important than, any tool or method. And Ed was clearly well-connected with the team as individuals and as a group, sharing stories about seemingly every one of them.

Ed also talked briefly about where dCog is going (though he noted that he could have given another whole talk on that). His focus over the last 5–10 years has been on how embodiment connects to dCog; specifically, how the indexicality of gestures and other embodied actions can be captured and interpreted. He acknowledged the ease with which video data can be captured…and the great difficulty in processing it quickly and cheaply. He’s looking to sensor data, innovations in eye-tracking, and Kinect-like motion-capture systems as ways not only to capture more data but also to index it. This will be a gateway to more rapid debriefings and more meaningful analyses.

Also worth noting was this closing quote: “23 years of studying flying makes me more comfortable about flying”. Good to know we’re going in the right direction!