Welcome to the real world, Iron Man
By Angela Herring | May 1, 2013

"If Tony Stark can do it in a cave with scraps," said Kyle Dumont, then, well, he and the other members of his capstone team could, too.

No matter that the main character of Iron Man is just that, a character: six months ago, the five electrical and computer engineering seniors decided they were going to create something just as fantastic, if not as fantastical, as the action hero's robotic arm. They wanted to build something that doctors, schoolteachers, or virtually anyone could use to manipulate objects in a virtual space the same way Iron Man can, using nothing but simple hand gestures.

What they created, the Gesture Operated Computer Aided Design tool, or goCAD, grants the average Joe access to technically challenging design software with a simple swipe of the hand through the air. The goCAD team earned third place in the electrical and computer engineering capstone competition this spring. One judge noted that it had taken him three years to master a particular design program and that he would have been grateful for their tool.

But goCAD is just one example of what the team's real innovation could do, said Greg Andrus, the group leader. Using the Microsoft Kinect, the team's tool can detect a user's hand in space and then follow it as the user performs any of a series of predefined gestures. The approach could be used for design software, but it could also allow a brain surgeon to easily rotate a three-dimensional MRI image, or a calculus teacher to explain derivatives more intuitively than she can on a classroom whiteboard.

"You start off with what the camera sees, which is everything," said Samantha Kerkhoff, who worked on the program's low-level coding.
"Then you determine where the fingertips are, then you determine if they're moving in a specific pattern, which is a gesture."

Dumont built a library of algorithms that help the Kinect identify fingertips in space, and AJ Mills developed a method for predicting where those fingertips will be from one image frame to the next. Another team member, Sam Herec, figured out how to turn the positions of those fingertips into actual, recognizable gestures, which Kerkhoff then translated into code. Finally, Melissa Milner wrote the code that translates Kerkhoff's information into something the pre-existing design software can actually use.

"Every engineer wants to become a Stark," Herec said. "But he's a pretty bad engineer because he's fictional. He works on his own and doesn't let anyone else in."

"But in real life," Andrus chimed in, "you get good projects when everyone works together."

While they won't be using goCAD to hurl half-ton enemy robots to their doom, the students hope that open-sourcing their software libraries will enable developers around the globe to use them to make our interactions with the cyberworld a little more intuitive.
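Kerkhoff's camera-to-gesture description maps onto a three-stage pipeline: find the fingertip in each frame, predict its motion between frames, and match the resulting path against predefined gestures. The sketch below is purely illustrative and assumes toy lists of (x, y) points in place of real Kinect depth frames; every function name here is invented for the example and is not the team's actual code.

```python
def detect_fingertip(frame):
    # Stand-in for Dumont's fingertip-finding algorithms: a "frame" is
    # just a list of (x, y) candidate points, and we take the topmost one.
    return max(frame, key=lambda p: p[1])

def predict_next(prev, curr):
    # Stand-in for Mills' frame-to-frame prediction: linearly extrapolate
    # the fingertip's position into the next frame.
    return (2 * curr[0] - prev[0], 2 * curr[1] - prev[1])

def classify_gesture(path):
    # Stand-in for Herec's gesture matching: a mostly horizontal fingertip
    # path counts as a swipe, which a downstream layer (Milner's code)
    # could translate into a command for the design software.
    dx = path[-1][0] - path[0][0]
    dy = path[-1][1] - path[0][1]
    if abs(dx) > 2 * abs(dy):
        return "swipe_right" if dx > 0 else "swipe_left"
    return None

# Three simulated frames in which the fingertip sweeps to the right.
frames = [[(0, 5), (1, 1)], [(2, 5), (1, 1)], [(4, 5), (2, 1)]]
path = [detect_fingertip(f) for f in frames]
print(predict_next(path[0], path[1]))  # (4, 5): where frame 3's tip lands
print(classify_gesture(path))          # swipe_right
```

In the real system, the first stage would consume depth images rather than point lists, and the gesture matcher would compare against a whole library of predefined gestures instead of a single swipe rule, but the division of labor between the stages mirrors the one the article describes.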