Robot Visual-Tactile Intelligence

Seminar
This event has now finished.
  • Date and time: Wednesday 5 March 2025, 2pm to 3pm
  • Location: In-person and online
    ISA-135 Seminar Room, Institute for Safe Autonomy
  • Admission: Free admission, booking not required

Event details

Digital images are composed of pixels. Recently, optical tactile sensors have emerged that use cameras to capture contact information. These sensors offer a versatile approach to robot tactile sensing, employing tactile pixels (taxels) to detect contact, force, pressure, and object properties. This talk explores the important role of tactile pixels in enabling robots to interact with the physical world. I will first introduce early attempts to interpret tactile array data as tactile images and to leverage computer vision methods for contact extraction (sketched below). I will then delve into our developments in optical tactile sensors over the past years, from GelTip to TouchRoller, GelFinger and RoTipBot, which offers omnidirectional tactile sensing and high dexterity. Pixels also facilitate simulating the physical world and sensors; I will showcase our pioneering work on using depth sensors and Tacchi for tactile skin deformation simulation, along with the recent FOTS simulator for acceleration. Finally, I will introduce our work on extracting information from tactile images and on integrating the visual and tactile sensory modalities to enhance perception for dexterous manipulation.
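
As a rough illustration of the idea mentioned above, the following minimal sketch treats a tactile array's pressure readings as a grayscale "tactile image" and applies standard computer-vision operations (thresholding and connected-component analysis) to extract contact regions. It is not the speaker's specific method; the function name, array size and threshold value are illustrative assumptions.

```python
# Minimal sketch (not the speaker's method): interpret tactile array data as
# an image and use basic computer-vision operations for contact extraction.
import numpy as np
from scipy import ndimage

def extract_contacts(taxel_array: np.ndarray, threshold: float = 0.1):
    """Return a labelled map of contact regions and their centroids.

    taxel_array: 2D array of normalised pressure readings (one value per taxel).
    threshold:   pressure level above which a taxel counts as 'in contact'
                 (an assumed, sensor-dependent parameter).
    """
    contact_mask = taxel_array > threshold             # binarise the tactile image
    labels, num_regions = ndimage.label(contact_mask)  # connected-component analysis
    centroids = ndimage.center_of_mass(contact_mask, labels,
                                       range(1, num_regions + 1))
    return labels, centroids

# Example: a 16x16 tactile array with one simulated press near a corner
readings = np.zeros((16, 16))
readings[3:6, 3:6] = 0.8
labels, centroids = extract_contacts(readings)
print(f"{labels.max()} contact region(s), centroids: {centroids}")
```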

About the speaker

Dr Shan Luo

Dr Shan Luo is a Reader (Associate Professor) in Robotics and AI at King's College London, where he leads the Robot Perception Lab. After earning his PhD in Robotics from King's College London in 2016, he furthered his expertise as a Visiting Scientist at MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) under the mentorship of Prof. Edward Adelson. He also worked as a Postdoctoral Research Fellow at the University of Leeds and at Harvard University, advancing research in robot perception. In 2018, Shan joined the University of Liverpool as a Lecturer and served as Director of the smARTLab in the Department of Computer Science until he returned to King's in 2021. Throughout his career, Shan's pioneering research has been dedicated to endowing robots with human-like perception capabilities, focusing particularly on the vision and touch modalities. His team developed GelTip, the first full-finger optical tactile sensor, and pioneered simulators for optical tactile sensors. His team has also created a series of algorithms for tactile perception and for multimodal and cross-modal visual-tactile perception, with applications in robot dexterous manipulation. His research has been published in high-impact robotics journals and international conferences, including T-RO, T-Mech, TIP, ICRA, IROS, RSS and NeurIPS. He has served as a Guest Editor for prestigious journals including T-RO and RAM, and as an Associate Editor for ICRA and IROS. His research has received funding and industrial support from EPSRC, AHRC, Innovate UK, the Royal Society and Unilever, with over £1.5 million awarded to him as Principal Investigator. Shan's accolades include the EPSRC New Investigator Award, the UK-RAS Early Career Award, a CASE Best Student Paper Award finalist position, and the 2024 UK Manipulation Workshop Best Poster Award.

Online Zoom Link: https://york-ac-uk.zoom.us/j/91719238351?pwd=ykow2C7hmKa4pKy8TICTOWVo8LwzfG.1

Venue details

  • Wheelchair accessible