School of Engineering Students Leading Robotics Research at St. Thomas

Kelly Engebretson '99 M.A.

August 15, 2012

Three St. Thomas electrical engineering students have been hard at work this summer doing high-tech tinkering on one of the university’s first robots.

Senior Ryan Delaney, junior Nate Webster and sophomore Mitch Hoffmann have been working on the “TurtleBot” since early June (though Delaney and Webster have worked together on related projects since last year). The robot, which comes with mapping and navigation capabilities, runs open-source software, which means users can “tinker,” or rather, have creative license to program it to do what they wish. According to the TurtleBot’s website, the bot “can explore your house on its own, build 3D pictures, bring you food, take panoramas, and more.”

Mitch Hoffmann works in the basement of O’Shaughnessy Science Hall. (Photo by Mike Ekern ’02)

Delaney, Webster and Hoffmann are using their TurtleBot to map all four floors of O’Shaughnessy Science Center (OSS) and the building’s basement. But their overarching goal is to use those maps to program the bot to navigate autonomously from their lab in OSS’s basement to School of Engineering Dean Dr. Don Weinkauf’s office on the first floor.

Delaney said “mapping is easy” – the basement took about 30 minutes – and the original intent was to map just the basement of OSS. “To make things interesting, we added to our plans and decided to program it to do multifloor navigation.”

But the team ran into a not-so-easy roadblock when the TurtleBot encountered an elevator.

“The gap between the floor and the elevator is confusing to the robot,” Delaney explained. Dr. Jim Ellingson, a St. Thomas engineering professor and the project’s lead adviser, added, “The challenge is that the TurtleBot wants a fixed environment, and the elevator is not fixed. It’s open sometimes and closed others. It takes 40 seconds for the TurtleBot to compute that the door is open – and by then the door shuts.
… It’s a bit like ‘The Matrix’; there’s the reality, then there’s the perception of it” as far as how the TurtleBot calculates differences between open spaces and physical objects.

The team, which also is advised by Dr. Chris Greene and Dr. Kundan Nepal of the School of Engineering, has been working on the elevator dilemma for the past two weeks so the TurtleBot can navigate to Weinkauf’s office one floor above their lab. (As smart as the TurtleBot is, it can’t climb stairs.) Delaney said they are programming it to wait for the “up” button to be pushed by a team member, then to detect when the door opens and to drive itself inside. Because it does not have arms, a team member will need to accompany it on its trip to the first floor and push the floor button. The robot then will need to detect when the doors reopen and drive itself out, and it’s up to the team to “teach” these steps to the robot. Greene said last week that he believes “they’re getting very close.”

A close-up of the “TurtleBot.”

At first glance, the TurtleBot could pass for a children’s side table from Ikea. Just 11 inches tall, the bot consists of three round, white shelves, each 31 centimeters wide, which can be mounted in various configurations and are supported by four aluminum dowels.

Ellingson and Greene said most robots are not humanoid in appearance, as science-fiction films would have us believe; however, the TurtleBot’s parts, Ellingson noted, have components comparable to those on humans. The base of the TurtleBot is, essentially, the better-known Roomba, made by iRobot, which is sold for home vacuuming. It contains three wheels, which serve as the robot’s “legs” and allow it to navigate.

The robot’s “eyes” are a Microsoft Kinect sensor, a black, horizontal bar mounted beneath the top shelf. The Kinect generates the data used for mapping spaces in 2D.
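The elevator routine the team describes – wait for a human to press the call button, detect the open door, drive in, ride, then detect the doors reopening and drive out – amounts to a small state machine. The sketch below is an illustration only, not the team’s actual software: the depth readings and the open-door threshold are hypothetical stand-ins for what the Kinect would report when looking at a closed versus an open door.

```python
# Sketch of the elevator sequence described above. All sensor values and the
# threshold are hypothetical; a real TurtleBot would read depth frames from
# the Kinect rather than lists of numbers.

DOOR_OPEN_DEPTH_M = 1.5  # assumed: an open doorway reads "deeper" than a closed door


def door_is_open(depth_readings_m):
    """Treat the door as open when the average depth ahead exceeds the threshold."""
    return sum(depth_readings_m) / len(depth_readings_m) > DOOR_OPEN_DEPTH_M


def elevator_sequence(depth_frames):
    """Run the wait/enter/ride/exit steps over a series of depth 'frames'.

    Returns the list of drive actions the robot would take, in order.
    """
    actions = []
    state = "WAIT_FOR_DOOR"           # a team member has already pressed "up"
    for frame in depth_frames:
        if state == "WAIT_FOR_DOOR" and door_is_open(frame):
            actions.append("drive_in")
            state = "RIDING"          # a team member pushes the floor button
        elif state == "RIDING" and not door_is_open(frame):
            state = "DOORS_CLOSED"    # elevator moving between floors
        elif state == "DOORS_CLOSED" and door_is_open(frame):
            actions.append("drive_out")
            break
    return actions


# Simulated frames: closed door, open door, closed again (riding), open again.
frames = [[0.4, 0.5], [2.0, 2.2], [0.4, 0.4], [2.1, 2.3]]
print(elevator_sequence(frames))  # ['drive_in', 'drive_out']
```

The human-in-the-loop steps (pressing the call and floor buttons) stay outside the state machine, exactly as in the team’s plan, since the armless robot cannot perform them.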
It “sees” by scanning every measurable part of a space using infrared light waves that bounce off objects (such as walls) to calculate depth. It also relies on an RGB (red, green, blue) camera for color readings to complete its mapping.

Lastly, a PC (a laptop that rests on the TurtleBot’s top shelf) serves as its “brain.” Once the PC has mapped the floor, the team, from a PC in its lab, “talks” to the robot by using a mouse to point it to its destination, or by typing in coordinates. Once the mapping is complete, the bot can use its own judgment, so to speak, to determine its route.

Ellingson, who is Delaney’s undergraduate adviser, believes “robotics gets students excited about engineering,” a sentiment that seems to run through this team.

Hoffmann, who learned of the project through Nepal, said, “I’ve always had an interest in robotics and coding, and this project was the perfect gateway for me to begin exploring that career path. I will only be in my second year of college this fall, so I am just beginning to feel around for what I want to do in the future, and this project was a great stepping stone. I get to have fun while working with a talented crew of professors and students.”

Webster, who studied robotics in high school and plans to work in the field after he graduates, believes “mobile robotics are important because they are becoming more integrated with our world. It’s a field that will only grow and become more important.”

Delaney, who will graduate next spring, imagines most households in the future will have robots to perform household chores and other tasks, like walking the family dog. He is excited to see where robotics as a field will go.

Code for the “TurtleBot.”

He is pleased that prices for devices such as the TurtleBot have come down enough to make them more widely accessible. “The last three to four years has seen an explosion in (robotics) growth and development, which has brought costs down. With this $1,000 platform, smaller schools like St.
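That “own judgment” step – choosing a route through the mapped floor once a destination is clicked or typed in – is, at bottom, a path search over the map. As a rough illustration only (not the TurtleBot’s actual planner, which comes with the platform’s navigation software), here is a breadth-first search over a toy occupancy grid, where 1 marks a wall and 0 open floor:

```python
from collections import deque

# Toy occupancy grid: 0 = open floor, 1 = wall. A real map built from Kinect
# data would be far larger and probabilistic; this is only an illustration.
GRID = [
    [0, 0, 0, 1],
    [1, 1, 0, 1],
    [0, 0, 0, 0],
    [0, 1, 1, 0],
]


def plan_route(grid, start, goal):
    """Breadth-first search from start to goal (cells are (row, col) pairs).

    Returns a shortest list of cells from start to goal, or None if the
    goal cannot be reached.
    """
    rows, cols = len(grid), len(grid[0])
    queue = deque([start])
    came_from = {start: None}  # each visited cell maps to its predecessor
    while queue:
        cell = queue.popleft()
        if cell == goal:
            # Walk the chain of predecessors back to the start.
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in came_from):
                came_from[(nr, nc)] = cell
                queue.append((nr, nc))
    return None


print(plan_route(GRID, (0, 0), (3, 3)))  # a 7-cell route around the walls
```

Breadth-first search always returns a shortest route on a uniform grid, which is why, so to speak, the bot never takes the scenic way around a wall.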
Thomas can participate in this advancement. … We are doing the exact same things that bigger schools like MIT and Stanford are doing with half-million-dollar robots; basically, somebody has an idea, and you see if you can do it.”

What else is in store for St. Thomas’ TurtleBot? Greene said he’d like to see it give tours of OSS. Nepal “would love to see us move toward an autonomous robotic vehicle capable of navigating the outdoors as well as the indoors.” Similarly, Ellingson would like to see the TurtleBot go outdoors, but noted that the Kinect camera does not work well outside due to constantly shifting light conditions.

So it seems there are many possibilities. As Delaney said, “the most interesting part of this project is that we don’t yet know where it’s going.”

Editor’s note: Ryan Delaney, Nate Webster and Mitch Hoffmann’s research was funded in part by a Summer Housing grant.