Clarkson professors receive robot to continue human-robot handover interaction research

Natasha and Sean Banerjee (at right), assistant professors of computer science at Clarkson University, are conducting research on human-robot interaction with students including (from left) Yijun Jiang, a computer science graduate student, and Elim Schenck, a Clarkson student double-majoring in computer science and computer engineering. (PHOTO CREDIT: CLARKSON UNIVERSITY WEBSITE)


POTSDAM — Two assistant professors at Clarkson University are using a new mobile grasp robot to continue their research related to intuitive human-robot handover interactions.

Natasha and Sean Banerjee, assistant professors in computer science, acquired the robot through a joint Facebook and Carnegie Mellon grant, the university announced on its website.

The Banerjees' proposal was one of 30 selected to receive a robot.

The device will help the professors continue their research on making robots “human-aware” by using “deep learning to automatically detect where humans prefer to hold objects and provide assistance with human awareness built-in.”

“The driving force behind this research was that we are very rapidly moving toward a world where robots are going to be a part of our daily interactions, so it is really important for those robots to collaborate and cooperate with humans because it does not make sense for them to just be independent,” Natasha Banerjee said in a statement. “We are spurring a new area of research on creating artificial intelligence algorithms for robots that are human-aware. There is a pretty broad research area on human-robot interaction or HRI, but a lot of this research has focused on experimental or toy problems. My research makes novel contributions to HRI by assessing how to ensure a robot hands over an object to a human such that a human is comfortable holding it.”

Banerjee said she recently presented work on detecting where humans prefer to hold cups, research that can help determine where robots should grip objects to best interact with humans.

“Let’s say you have an elderly individual and they want assistance. A cup is at a height where they are not able to get it. If you had an assistive robot that had a gripper arm, then the robot should hold the cup around the body so the person can hold it around the handle, especially if there are hot contents. A robot’s gripper is able to handle that heat better than a human hand,” Banerjee said.

Banerjee noted that her research is beginning to differ in its data-driven perspective, which no one else is using; most other researchers have examined only one object at a time, such as a bottle or a screwdriver.

“If you want these robots to be universally acceptable, they have to be able to understand any object in your environment and predict where a human is likely to hold an object,” she noted.

Making such predictions requires machine learning. Banerjee said she and her research team are using a “special brand of computational neural networks” that predict a distribution map indicating where humans are more likely to hold an object.
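In broad strokes, that kind of prediction is often framed as an image-to-heatmap problem. The sketch below is a minimal, hypothetical PyTorch illustration of the idea, not the Banerjees' actual model; the network name, layer sizes, and architecture are all assumptions.

```python
# Minimal, hypothetical PyTorch sketch of a fully convolutional network
# that maps an input image to a same-sized "hold preference" heatmap.
# Layer sizes and the class name are illustrative assumptions, not the
# research team's actual architecture.
import torch
import torch.nn as nn

class HoldHeatmapNet(nn.Module):
    def __init__(self, in_channels=3):
        super().__init__()
        # Encoder: downsample while extracting features.
        self.encoder = nn.Sequential(
            nn.Conv2d(in_channels, 32, kernel_size=3, stride=2, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1),
            nn.ReLU(inplace=True),
        )
        # Decoder: upsample back to input resolution, one score per pixel.
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, kernel_size=4, stride=2, padding=1),
            nn.ReLU(inplace=True),
            nn.ConvTranspose2d(32, 1, kernel_size=4, stride=2, padding=1),
        )

    def forward(self, x):
        logits = self.decoder(self.encoder(x))
        # Sigmoid maps each pixel's score to [0, 1]: the estimated
        # likelihood that a human would hold the object at that pixel.
        return torch.sigmoid(logits)

model = HoldHeatmapNet()
image = torch.randn(1, 3, 128, 128)  # dummy color image, batch of 1
heatmap = model(image)               # shape: (1, 1, 128, 128)
```

Training such a network would require images annotated with where people actually held each object, which is the kind of data-driven supervision Banerjee describes.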

The robot is equipped with a camera that captures images combining color and depth, which tell the robot where it should prefer to hold an object based on where a human would hold it. The robot grasps the object in places where a human would tend not to hold it, and the method can generate predictions for any everyday object.
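As a hedged continuation of the sketch above, the following hypothetical code stacks color and depth into a four-channel input (reusing the illustrative HoldHeatmapNet, built with in_channels=4) and has the robot grip where the predicted human preference is lowest, leaving the human-preferred region free for the handover. The function name and selection rule are assumptions for illustration.

```python
# Hypothetical sketch: combine color and depth into an RGB-D input and
# pick the robot's grasp point where humans are LEAST likely to hold,
# so the human-preferred region (e.g., a cup's handle) stays free.
import torch

def choose_robot_grasp(model, rgb, depth):
    """rgb: (3, H, W) tensor; depth: (H, W) tensor. Returns (row, col)."""
    rgbd = torch.cat([rgb, depth.unsqueeze(0)], dim=0)  # (4, H, W)
    with torch.no_grad():
        pref = model(rgbd.unsqueeze(0))[0, 0]           # (H, W) heatmap
    # Lowest predicted human preference -> best spot for the gripper.
    flat_idx = torch.argmin(pref).item()
    return divmod(flat_idx, pref.shape[1])

model = HoldHeatmapNet(in_channels=4)  # illustrative network from above
rgb = torch.rand(3, 128, 128)          # dummy color channels
depth = torch.rand(128, 128)           # dummy depth channel
print(choose_robot_grasp(model, rgb, depth))
```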

The Banerjees work with three students on the project. Yijun Jiang, a computer science graduate student, conducts research and develops algorithms to support the work. Elim Schenck, a double major in computer science and computer engineering, supports Jiang's work by helping develop algorithms and has been in charge of learning the controls for the robot. In addition, electrical engineering student Jack Lamuraglia has been in charge of assembling the robotic platform and getting it running.

Eric Reinhardt