May 2011 Articles

On Course: Computing for Social Change

Computer Science 380: Sustainability and Assistive Computing

In a new computer science seminar offered last fall, students created the first stage of a program to translate American Sign Language (ASL) into English text. In another project, they developed a series of interactive charts and games to educate everyone from schoolchildren to policy makers about the global fisheries crisis.

Taught by visiting assistant professor Eric Eaton, the course surveyed the use of computing for positive social and environmental change, including technologies to improve the lives of people with disabilities and computational tools for making decisions about the use of natural resources. Students divided into two groups for their course projects, choosing topics inspired by initial assigned readings.

“This was a highly interdisciplinary course, challenging students to reach outside traditional computer science and apply their studies to real problems facing our society,” said Eaton, who was a senior research scientist in the Artificial Intelligence research group at Lockheed Martin Advanced Technology Laboratories and part-time faculty at Swarthmore College before coming to Bryn Mawr. Students learned firsthand the importance of both collaborating on a research team and addressing multiple audiences. The first portion of the course covered computational sustainability, a new field that brings together scientists with complementary skills to develop mathematical models for understanding and stabilizing ecosystems. The second portion addressed assistive computing, a field that applies computer science to problems in healthcare and medical informatics.

Signing for the hearing

The goal of “ASL-to-Text” by Caitlyn Clabaugh ’13, Alex Funk ’11, Steph Tran ’13, and Asha Habib ’13 is to improve communication among deaf, hard-of-hearing, and hearing people by eliminating the currently necessary third-party translator.

“The use of a translator is not only costly but intrusive,” said Habib, who consulted with Jami N. Fisher, ASL Program Coordinator at Penn Language Center, about Deaf culture for the project. “Texting has become a staple for communication in the deaf community, but it uses English, which is usually a second language for deaf people, and it is slower to type than to sign.

“We learned that deaf people don’t always consider deafness to be a disability,” said Habib. “They want to have access to what the majority has but don’t want the majority to feel they have to accommodate them.”

Video cell phones have great potential for ASL, but the low bandwidth of U.S. wireless telephone networks (unlike those in Asia and Europe) does not support visually intelligible signing. ASL semantics are subtle, and facial expressions are extremely important, sometimes conveying more than the signs themselves.

The group decided to combine software that trains a computer to recognize visual cues and objects with a video interface similar to Skype, Oovoo, iChat, and Gchat, services heavily used in the deaf community.

Wearing a colored glove, and seated in front of a green screen to minimize background distraction, the user signs a letter of the ASL alphabet in front of a webcam. The image is captured and represented as a set of data features, which are then compared to previous training data of ASL signs, and the associated letter is printed inside the interface on the screen.
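The matching step described above can be sketched as a simple nearest-neighbor classifier: each captured frame is reduced to a feature vector and compared against labeled training examples. This is a minimal illustration only; the feature values, training entries, and function names below are hypothetical placeholders, not the project's actual code or data.

```python
import math

# Hypothetical training set: feature vectors (e.g., statistics computed from
# the colored-glove pixels in a frame) paired with the ASL letter they encode.
TRAINING_DATA = [
    ([0.12, 0.80, 0.33], "A"),
    ([0.55, 0.21, 0.90], "B"),
    ([0.70, 0.65, 0.10], "C"),
]

def euclidean(a, b):
    """Distance between two feature vectors of equal length."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify_sign(features):
    """Return the letter of the training example closest to `features`."""
    best_letter, _ = min(
        ((letter, euclidean(features, example))
         for example, letter in TRAINING_DATA),
        key=lambda pair: pair[1],
    )
    return best_letter

# A captured frame whose features fall near the "A" training example
# would be labeled "A" and printed inside the interface:
print(classify_sign([0.11, 0.78, 0.30]))
```

In practice the project would use many training examples per letter and richer image features, but the compare-to-training-data logic follows this pattern.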

“We learned that it is really disrespectful to drop your eyes when a person is signing, so we put the text on the side of the screen rather than below,” said Funk.

The scope of ASL-to-Text was necessarily small for this course, but the group hopes to expand the project to recognize a larger vocabulary of signs, including those that incorporate motion, and run tests with fluent ASL users.

Don’t eat this fish

In the last 50 years, new fishing technologies have produced a rapid depletion of key stocks and degraded marine and freshwater ecosystems, reported Jenny Chen ’11. “Large factory ships can stay out for weeks at a time; small fish that can’t be used as food are caught in small mesh nets and wasted. The crisis also affects birds that feed on these fish,” she said.

Setting quotas has not worked, Chen said: human demand overrides them, and while many people are aware that overfishing is a problem, they do not know how they are contributing to it. Her group decided to create ways to educate consumers, both children and adults, and to help policy makers and environmental organizations make decisions.

Chen, Marissa Mocenigo ’11, and Samantha Wood ’11 created a multiple-choice game to teach schoolchildren which species of fish are endangered and should not be eaten. “Fish Food” uses data on 62 different species from the Monterey Bay Aquarium’s Seafood Watch and asks the player to guess the status of each: good, caution, or avoid. “If a child comes home to find his or her mother serving bluefin tuna for dinner, the child might mention something to the parent, making the connection between the knowledge gained through the game and the actions taken in everyday life,” said Wood.
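The quiz logic described above can be sketched in a few lines: each species maps to one of the three ratings named in the article, and a player's guess is checked against it. Apart from bluefin tuna, which the article treats as a fish to avoid, the entries below are hypothetical placeholders, not actual Seafood Watch data, and the function names are illustrative only.

```python
# Hypothetical subset of the 62-species table; statuses are placeholders
# except bluefin tuna, which the article implies should be avoided.
SPECIES_STATUS = {
    "bluefin tuna": "avoid",
    "example species a": "good",
    "example species b": "caution",
}

def check_guess(species, guess):
    """Compare a player's guess ("good", "caution", or "avoid") with the
    recorded status. Returns (is_correct, actual_status)."""
    actual = SPECIES_STATUS[species.lower()]
    return guess.lower() == actual, actual

# A player who guesses "avoid" for bluefin tuna would be told they are right:
print(check_guess("Bluefin Tuna", "avoid"))
```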