Yasemin Bekiroglu: Towards Robust and Goal-oriented Robotic Grasping and Manipulation

Abstract

Robots are envisioned as capable machines that can easily navigate and interact in a world built for humans. Looking around us, however, we see robots mainly confined to factories, performing repetitive tasks in environments designed to circumvent their limitations. The central question of my research is how we can create robots capable of adapting so that they can co-inhabit our world. This means designing systems that can function in unstructured, continuously changing environments with unlimited combinations of object shapes, sizes, appearances, and positions; that can understand, adapt to, and learn from humans; and, importantly, that can do so from small amounts of data. Specifically, my work focuses on grasping and manipulation, which are fundamental to enabling a robot to interact with humans in our environment, along with dexterity (e.g. to use objects/tools successfully) and high-level reasoning (e.g. to decide which object/tool to use). Despite decades of research, robust autonomous grasping and manipulation remains an elusive goal. The difficulty lies in dealing with the inevitable uncertainties in how a robot perceives the world: our environment is dynamic and has a complex structure, and sensory measurements are noisy and carry a large degree of uncertainty. In real-world settings, these issues can lead to grasp failures with serious consequences. In my research I have developed methodologies that enable a robot to interact with natural objects and to learn about object properties and about the relations between tasks and sensory streams. I have also developed tools that allow a robot to use multiple streams of sensory data in a complementary fashion. In this talk I will specifically address how a robot can use vision and touch to answer grasp-related questions, e.g. estimating unknown object properties or predicting grasp success before manipulating an object, predictions that can be used to trigger plan corrections.

Date
February 27, 2020, 15:30–16:00
Event
SML Seminar
Location
AI Centre, UCL
Marc Deisenroth
DeepMind Chair in Artificial Intelligence