Helping People with Needs

Doug Moore pulls back the curtain on a new technology group in Plano that’s working on a very different kind of mobility

August 23, 2017

All in the Family -- Doug Moore (far right) and TMC Senior Advisor Yutaka Takaoka join Romy and Gaby Camargo for a photo with Toyota's Human Support Robot.

Earlier this year, Toyota Motor North America formed the Technology for Human Support Group, based in Plano. We’d heard it had something to do with developing robots and other technologies that could help people who need assistance with common household tasks – an intriguing concept that invited further inquiry. So we tracked down Doug Moore, senior manager of this new endeavor. And he offered this glimpse of our goals for a not-too-distant future that promises to usher in a different kind of mobility.
 
Driver’s Seat: How did you get your start at Toyota?
 
Doug Moore: I came on board six years ago as a researcher in the Partner Robotics Group. They were focused on home applications of the technology. For example, we have an aging society. How can a robot help in these spaces?
 
Most team members probably know that Toyota Motor Corporation (TMC) has been active in robotics research in Japan for some time. But who knew that similar work was being carried out here in the US?
 
I certainly didn’t know about it before I started the interview process for the job.
 
How did you end up in this very specialized field?
 
I earned my undergraduate degree in mechanical engineering at the University of California, San Diego. Back then, I thought I’d get a job with Porsche Design or be working on Formula 1 cars. While completing my undergraduate work at UCSD, I had a chance to immerse myself in the interaction of brain-based artificial intelligence (AI), robotics and the real world at the Neurosciences Institute in San Diego. It was a great experience, extending my mechanical engineering background to innovations like neural modeling and robotics. Dr. Gerald Edelman, the director of the institute at the time, always said, “the brain is embodied and the body is embedded,” which meant you needed a robot in the real world to test AI. So it wasn’t just about the technology but how we interact with it. It turned out to be the perfect training ground for my work here at Toyota.

Wearable Technology -- BLAID, an experimental device designed to help the vision impaired, was Moore's first project after he joined Toyota six years ago.

Sounds like you chose the right path. Breakthroughs in AI and robotics are very much in the news these days. What role is the Technology for Human Support Group playing in all of this?
 
Basically, we exist to help realize Toyota’s vision of Mobility for All by taking the advanced research that our group, the Partner Robot Group and TRI (Toyota Research Institute) are doing and seeing if we can turn it into practical applications that could potentially be commercialized as end products. Outside of the Toyota community, we also collaborate with universities on specialized research projects.
 
Our scope is mobility, which is really broad. Robots are part of the mix. But so are other tools, such as wearable technology that could possibly be developed to assist people who may need help with their daily activities.
 
Are you referring, in part, to BLAID?
 
Yes. That was the first Toyota project I worked on and the first personal mobility technology project for the Partner Robot Group in North America. One of the key steps in the investigation of that idea was to better understand the experiences of people who need assistance understanding their environment. So we partnered with Lighthouse for the Blind in San Francisco, among others. For example, we attended special meetups where we talked to the visually impaired community about what their world is like.
 
What did you learn from that interaction?
 
The big takeaway for me was that we have to set aside our assumptions about people with needs of any kind. The tendency is to place greater limits on them than they do on themselves. The fact is, they are super capable people who just have this thing, whatever it might be, that requires some assistance. By applying the right technology in the right way, we can help to make a difference in their lives.
 
BLAID was researched from that understanding. It’s a conceptual device that, if it is ever developed for market, we envision someone could wear around their shoulders. It would be equipped with cameras that detect a person’s surroundings and then communicate through speakers and vibration motors. In turn, a person could interact with the device through voice recognition and buttons.
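As a purely illustrative sketch of the interaction loop Moore describes (cameras in; speech and vibration out), the flow might look like this. Every class and method name here is invented for illustration; BLAID's actual design is not public.

```python
# Hypothetical sketch of a wearable assist device's sense/communicate
# loop, as described in the interview. All names are invented; this is
# not BLAID's real implementation.

class WearableAssistSketch:
    def sense(self, camera_frames):
        """Vision stand-in: treat each labeled frame as a detected feature."""
        return [f for f in camera_frames if f.get("label")]

    def communicate(self, feature):
        """Describe a detected feature via speech and a vibration cue."""
        side = "left" if feature["bearing"] < 0 else "right"
        return {
            "speech": f"{feature['label']} ahead on your {side}",
            "vibration": side,
        }

demo = WearableAssistSketch()
features = demo.sense([{"label": "door", "bearing": -10}])
cue = demo.communicate(features[0])
# cue -> {"speech": "door ahead on your left", "vibration": "left"}
```

The point of the sketch is the division of labor: perception produces labeled features, and a separate layer translates them into the audio and haptic cues a wearer would actually experience.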


Focus Group -- Moore (third from left) organized "lab meetups" with members of Lighthouse for the Blind that helped lay the foundation for his team's work on BLAID. Other current Toyota team members include Ryan Klem (far left) and Mark Boire (second from right).

Sounds amazing. And it could one day be available for purchase?
 
That remains our passion, but making the leap from concept to commercial product raises a lot of questions, such as FDA (Food and Drug Administration) requirements. Figuring all of that out is one of the reasons our group was formed.
 
Toyota’s Human Support Robot (HSR) has also been getting a lot of media attention lately. Your team is involved in that project, too, right?
 
Yes. HSR originated at TMC. We’ve been focused on developing practical applications of it as well as other types of robots.
 
Now, I should be clear that these are not the robots Toyota uses in its plants to build cars. A factory is a super constrained space with a well-defined environment. Within that, you can program each robot to do one thing over and over again.
 
Our mission is to create robots that can help people in their day-to-day lives.  They have to operate within people’s homes, which are definitely not controlled environments.
 
How do you even begin to account for all of the variables?
 
It’s highly complex. We’re approaching this challenge from four different angles:
 
1) vision, or how the robot perceives its environment;
2) navigation, or how the robot moves in space;
3) communication, so things like speech to text and text to speech; and
4) motion planning, or how the robot behaves.
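The four angles above map naturally onto a layered control loop. As a minimal, hypothetical sketch (not Toyota's actual software stack; every function is a placeholder), a home robot might chain them together like this:

```python
# Hypothetical control loop tying together the four areas above:
# vision, navigation, communication, and motion planning.
# All functions are placeholders, not a real HSR API.

def perceive(sensor_data):
    """Vision: build a simple map of labeled objects from sensor input."""
    return {obj["name"]: obj["position"] for obj in sensor_data}

def parse_request(utterance):
    """Communication: naive speech-to-intent (real systems use ASR/NLU)."""
    return utterance.lower().replace("please ", "").split()[-1]

def plan_path(world, target):
    """Navigation: look up a goal position for the requested object."""
    return world.get(target)

def plan_motion(goal):
    """Motion planning: turn a goal position into a coarse action sequence."""
    return ["navigate_to", goal, "grasp", "return"]

world = perceive([{"name": "cup", "position": (2, 3)}])
target = parse_request("Please bring the cup")
actions = plan_motion(plan_path(world, target))
# actions -> ["navigate_to", (2, 3), "grasp", "return"]
```

Even in this toy form, the separation shows why an uncontrolled home is hard: each layer has to tolerate errors and surprises from the one before it.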
 
As humans, there’s so much we take for granted when it comes to interacting with one another. With a robot, you have to go deep into the details. Still, you can’t program for every contingency. You have to figure out how to make it possible for the robot to learn over time.
 
For example, if a robot snuck up behind you and surprised you, you’d probably react in a certain way. Can the robot pick up on that reaction and change its behavior the way a human would? Can the robot make mistakes and learn from them, in the same way that we learn? It’s about using AI to simulate natural learning.
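The kind of adaptation Moore describes, noticing a startled reaction and changing behavior, can be caricatured as a single feedback update. This is a toy sketch under invented assumptions, not how the team's learning systems actually work:

```python
# Toy illustration of learning from a person's reaction. A single
# parameter (approach speed) is nudged down whenever the person seems
# startled. Real robot learning is far more sophisticated.

def adjust_speed(speed, startled, learning_rate=0.5):
    """Reduce approach speed after a startled reaction; otherwise keep it."""
    if startled:
        return speed * (1 - learning_rate)
    return speed

speed = 1.0
for reaction in [True, True, False]:  # two startled encounters, then one calm
    speed = adjust_speed(speed, reaction)
# speed -> 0.25 after the two startled encounters
```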
 
Unlike BLAID, development of HSR began in Japan. What we’re trying to do is explore the possibilities for using the technology for different settings and needs.
 
So, for example, last fall we conducted an informal evaluation of the performance of one robot in the home of Romy Camargo. Romy’s been paralyzed from the neck down since he was wounded on the battlefield in Afghanistan in 2008. Romy and his wife Gaby founded Stay in Step, a spinal cord injury recovery center. Toyota’s sponsorship of that center gave us the idea of seeing how the HSR could help Romy, with the full support of Romy, Gaby and their son. It took a lot of customized programming and engineering, but we were able to confirm that HSR can be a big help with everyday tasks such as opening doors and bringing food from the kitchen. Romy said the robot could help him regain some of his independence and live more freely – which would be a huge win for him and for us.


TMC R&D -- Moore's counterparts at Toyota Motor Corporation in Japan are also working on innovative mobility technology, such as the Welwalk WW-1000 robot that's designed to aid in the rehabilitation of individuals with lower limb paralysis.

Where does HSR go from here?
 
We’ve been in a bit of a holding pattern since the day we spent with Romy while we figure that out. But one possibility is to use emerging computer vision research that can pick up on the subtle cues communicated through a person’s face, such as the tilt of the head or the opening of the eyes.
 
We’re making progress, but we still have a long way to go. It’s kind of like when the first dishwashers came out. The dishes didn’t always get perfectly clean. But they work a lot better now. Or think about how far vehicles have come since they were first invented.
 
The thing about robots is that when you use the term, people immediately think of Rosie from “The Jetsons,” or R2-D2 and C-3PO from “Star Wars.” We, however, think of them as tools that can make a big difference in people’s lives. So we focus as much, if not more, on the person than we do on the technology. What do they need? What’s the best solution?
 
Seems like a huge ask for a small team.
 
Well, we’re not alone in this search. Other automotive entities are also exploring this space. So are start-up robotics companies. And Silicon Valley companies. Multiple industries are converging. We have to go beyond a competitive mindset and be open to collaboration. Who can we partner with? How do we operate at scale? How do we not just sell these products but also service them?
 
The opportunities are incredibly exciting. I often catch myself thinking, “I’m getting to do something that is truly amazing.” It’s a bit like being alive in the 15th century when Gutenberg invented the printing press. Think about the huge transformation that technology ushered in.
 
In the same way I wonder: With my skill set and experience, what can I do to impact the world? I feel very fortunate to be here.
 
By Dan Miller