Designing interfaces for users with impairments is challenging, especially when those users are children. Both gathering information about their specific needs and understanding how they interact with the environment usually require the aid of intermediaries and extensive, intensive observation. However, given the users’ often complex personal histories, it is fundamental to involve them directly in Human-Computer Interaction (HCI) research, design and evaluation.
G. Armagno, M. Bonilla, S. Marichal, T. Laurenzo. Designing interfaces for children with motor impairments: An ethnographic approach. Proceedings of the XXIX International Conference of the Chilean Computer Science Society (SCCC 2010), Santiago de Chile, Chile.
Together with T.A. Gustavo Armagno, we started working with two undergraduate students, Marcela Bonilla and Sebastián Marichal, with the idea of creating computer-vision-based interaction schemes that would allow children with motor impairments to use OLPC’s XO computers. (Uruguay, through its governmental project Plan Ceibal, was the first country to place a full order of OLPC XO computers. In October 2009, it was also the first to achieve ‘full deployment’ status, after successfully delivering a computer to every publicly schooled child between 6 and 12 years old. It is now targeting secondary education, youngsters from 12 to 18 years old, as well.)
After a first ethnographic study carried out at the public school Dr. Ricardo Caritat, the only public school in Montevideo specifically for children with motor impairments, we found that:
i. None of the XOs received by the school came with software specifically designed for users with impairments. However, teachers managed to include the XO in their daily classroom activities.
ii. Most of the children have difficulty finding and using the different programs that come with the XO. Even the simplest activities, like drawing a line in a painting program or writing their name in a text editor, can only be carried out with the aid of a teacher and, in some cases, cannot be executed by the children at all.
iii. Teachers systematically attribute these problems to the XO’s ergonomics (small screen, keyboard and touchpad), its software interaction design, the lack of suitable software accessibility helpers, and the limited availability of accessibility peripherals.
iv. The XOs potentially increase children’s autonomy, allowing for new modes of interaction with other children and with their environment.
As a first response to the problems uncovered by the ethnographic study, we designed a perceptual-interaction-based accessibility framework that allows for the instantiation of programs (in XO jargon, activities) whose interaction is based entirely on the optical recognition of images by the XO.
A small hardware periscope was built so that the XO’s frontal camera can see what happens in the space in front of the keyboard, addressing the fact that some users are unable to hold their hands in the air.
The interaction with all the prototypes follows the same scheme, supported by our framework’s ability to recognize certain drawings: the user shows or occludes an image within the webcam’s field of view, triggering a certain action from the computer.
The implemented prototypes are of three types: Selector, Interactive Storytelling, and Rock-Paper-Scissors. (The software was programmed in C++, and the computer vision subsystem was implemented by using and extending ARToolKitPlus, a fiducial-marker-based computer vision library.)
Selector prototypes work by letting the users choose from a number of options printed on a sheet of paper. These prototypes allow associating arbitrary actions with the markers. So far, we have tested three prototype flavors: a faux keyboard that associates drawings with some keyboard keys (e.g. the arrow and enter keys); a bookmark, where the user can navigate to predefined web sites by touching the corresponding drawing; and a launcher menu, from which the user can run XO activities.
The interactive storytelling application allows the child to complete a story being narrated by the computer. The user is presented with a number of cards with drawings of animals, objects and actions and, by showing the images to the computer, can choose between different branches of the story or answer specific questions the computer asks (e.g. Which animal barks? or Who did the girl run into while walking in the forest?).
The Rock-Paper-Scissors prototype will be discussed in another post.