It's been a month since University of Toronto undergraduate student Samin Khan and his teammate from the University of Ontario Institute of Technology were on the world stage, pitching their smartARM to a panel of industry judges at the Imagine Cup, Microsoft's annual student innovation competition.
The hook of their pitch was a demonstration of how the robotic prosthetic hand can recognize and grasp objects, such as a set of keys. But when they needed it most, their technology didn't work. Khan and his teammate, Hamayal Choudhry, were sure they had lost the competition.
“At the Canadian finals, there was applause in the middle of our presentation because people were just so blown away by it,” says Khan, a University College student who is entering his fourth year of studies in cognitive and computer science.
“Feedback from Microsoft was that these things happen all the time in presentations," he says. "We moved on quickly and didn’t get too nervous. They could tell that we'd thought through our idea from beginning to end.”
Their pitch, with its one flaw, resulted in Canada’s first-ever win in the Microsoft competition. The team won a mentoring session with Microsoft CEO Satya Nadella, US$85,000 in cash and a $50,000 Microsoft Azure grant to continue developing their technology.
“Making it to top three, and being able to showcase our idea on a global stage, we were happy, regardless of how things turned out,” says Khan.
U of T’s Nina Haikara spoke to Khan about the win and what’s next for their innovation, smartARM.
How did you and Choudhry, students from two different institutions, meet and decide to form a team for Imagine Cup?
In middle school we were part of a gifted program of about 30 students. We hadn't seen each other in five or six years when we met again at UofTHacks [where Microsoft was sponsoring one of the hackathon’s challenges].
Hamayal and I started sharing our philosophies for the tech industry. We were tired of seeing innovation in areas that seemed like they weren't necessary – for example, slimming down the next cellphone, or making a sleeker car. We think innovation is impactful when it improves the lives of people that need it most.
Hamayal is in the mechatronics engineering program at UOIT and I'm coming from the world of cognitive and computer science, with a software background. I asked him, 'Hey, why don't we sign up for the Microsoft challenge while you are here. Let's try to make something in the next 36 hours [at UofTHacks].'
Did you receive any institutional support?
In terms of the skill sets that I've developed, there are professors that have impacted me at U of T. I’ve been doing research with Yang Xu, an assistant professor in the computer science department and cognitive science program [at University College] over the past semester, and started to learn some machine learning programming with him that was very helpful.
Ishtiaque Ahmed [an assistant professor in the department of computer science] inspires me to focus on technology that empowers people. A lot of his work is bringing technology back to Bangladesh and using it to give voice to the people who need it the most. My NSERC USRA project with him is designing a platform for students to collectively report their academic success and social well-being in a way that smoothly collects and analyzes the data and then provides compelling visualization for potential intervention.
Once I started getting more active on social media, I received so much support from U of T’s social media accounts, especially during the world finals.
How did you identify this problem as a non-amputee?
The creative process was very messy. We just threw ideas around.
Hamayal had done some work with robotics before and 3D printing. He suggested, 'Hey, I can make some motors move a hand.'
Going back to our idea of making innovation that's impactful, as well as considering both of our skill sets, we thought: what if we tried to co-ordinate some motor movements with some computer vision?
With some quick online searches, we found that this could actually be used by amputees. And if we look at what's available for amputees right now, the costs go into tens of thousands of dollars because they use complex neuromuscular interfacing.
While we were simultaneously coming up with this idea and working on a presentation for the judges at UofTHacks, we started to realize that this project might have some big potential to help a lot of amputees and also address this really big gap within the prosthetics industry. Microsoft’s encouragement of our idea inspired us to keep going.
What’s most unique about smartARM’s technology?
Robotic vision hasn't been applied to amputees, as far as we know. The technology we used was largely pre-existing, using some of Microsoft's API [application programming interface], and Raspberry Pi and some mechatronics applications. Stitching it together and using a 3D printed model in order to help amputees was the innovation. Also, 3D printing and computer vision are very inexpensive.
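The pipeline Khan describes (a camera image, a cloud vision API for object recognition, and motors actuating a 3D-printed hand) could be sketched roughly as follows. This is a hypothetical illustration only, not the team's actual code: the function names, grip presets, and servo angles are all invented, and the cloud vision call is stubbed out.

```python
# Hypothetical sketch of a recognize-then-grasp loop like the one described:
# classify an image of the object, then map the label to a preset grip.
# All names and values here are illustrative assumptions.

# Preset grips: servo angles (degrees) per finger, thumb to pinky.
GRIPS = {
    "keys": [40, 70, 70, 80, 80],    # pinch grip for small objects
    "bottle": [20, 30, 30, 30, 30],  # wide cylindrical grasp
}
DEFAULT_GRIP = [0, 0, 0, 0, 0]       # open hand when object is unrecognized

def classify_image(image_bytes):
    """Stand-in for a call to a cloud image-recognition service.
    A real system would send the image over the network; here we
    return a fixed label for demonstration."""
    return "keys"

def choose_grip(label):
    """Map a recognized object label to a set of servo angles."""
    return GRIPS.get(label, DEFAULT_GRIP)

def grasp(image_bytes):
    """One cycle of the loop: recognize the object, pick a grip."""
    label = classify_image(image_bytes)
    return label, choose_grip(label)
```

On real hardware, the returned angles would drive the hand's servos (for example via the Raspberry Pi's GPIO pins); the sketch stops at choosing the grip.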
So what advice would you give to students interested in competing on the world stage?
People ask me how I juggled this project with the other things I'm doing. I think the biggest thing that keeps me motivated is making sure I'm held accountable by the people around me. A lot of people have amazing, interesting ideas, but executing them is a whole other thing. And getting to the execution stage really involves connecting with the right people and getting the right encouragement.
So what's next for smartARM?
First thing, I think we want to optimize the boundary between functionality and cost. That's our biggest concern. As it is right now, it has that basic functionality, but we want to make sure that when we do roll this product out, it doesn't disappoint the amputees who are using it.
We want to continue working with our subject expert, Annelisa Bowry. She's part of the War Amps and we want to talk to more of her peers and get an idea of what they would want to see.
With prosthetics, you have the socket and you have the prosthesis, and the prosthesis in this case is smartARM. The medical device itself is actually just the socket, which is just placed comfortably over their residual limb. We want to figure out a way to interface that in the most comfortable way, and start some trials.
We have hopes of expanding this to other areas.
Who knows? There might be a smartLeg.