Use this URL to cite or link to this record in EThOS: https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.737957
Title: Transfer of tool affordances in computer vision for robotics
Author: Abelha Ferreira, Paulo
ISNI: 0000 0004 7226 075X
Awarding Body: University of Aberdeen
Current Institution: University of Aberdeen
Date of Award: 2018
Availability of Full Text:
Access from EThOS:
Full text unavailable from EThOS. Please try the link below.
Access from Institution:
Abstract:
Robots working in constrained industrial environments have achieved great success at a variety of tasks. Future service robots working in unconstrained domains (e.g. the home or a hospital) will have to cope with unforeseen circumstances, such as not having the usual tool to perform a known task. They will have to assess the affordances of candidate substitute tools and also how best to grasp and orient a tool (its tool-pose) for a given task. Everyday tasks in the home often involve using a tool in non-canonical ways, e.g. the handle of a spoon oriented the right way to retrieve something from a gap, or a bottle of wine used as a rolling pin to roll dough. A robot can exploit these similarities between different tools and their tool-poses if it can learn by trying different tool-poses and transfer what it has learned to assess candidate substitute tools. Learning to deal with substitute tools comes naturally to humans and is already present in toddlers and in some animals. Research in cognitive science provides insight into a mechanism that may play an important role in human concept adaptability: projection. Here we apply this idea from cognitive science to the real-world domain of computer vision for service robotics. We show both that projection can be made to work in a real-world domain and that our approach achieves better results than the closest one in the literature. The two main contributions of this dissertation are:
1. A first approach to bringing the idea of projection from cognitive science into a real-world 3D computer vision domain. Instead of a one-pass assessment from sensor data to abstraction and then to a score, we combine a bottom-up exploration from sensor data to representation with a top-down selection of the best alternatives.
2. A semi-automatic framework for assessing tool affordances and tool-poses, starting from unsegmented point clouds and including segmentation, simulation, learning and flexible assessment.
These contributions enable us to achieve 69% overall accuracy on five different everyday tasks, whereas our closest competitor in the literature achieves only 32% on the same tasks. These results hold when (a) it is possible to create a simulation for the task and (b) it is possible to pre-train the system on 5,000 different tools. This dissertation demonstrates that the projection idea can be brought into a real-world domain, and that combining top-down pressure with bottom-up search and a flexible representation improves accuracy when assessing tool affordances for service robotics.
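The bottom-up/top-down assessment described in the abstract can be sketched in miniature. This is not the dissertation's code: the `affordance_score` function, its toy score table, and the tool/grasp/orientation vocabulary below are all hypothetical stand-ins, used only to make the generate-then-select pattern concrete.

```python
# Illustrative sketch of bottom-up candidate generation followed by
# top-down selection of the best tool-pose per candidate tool.
# All names and scores here are invented for the example.
from itertools import product

def affordance_score(tool, grasp, orientation):
    """Hypothetical task-fit score in [0, 1] for a tool-pose.

    A real system would obtain this from simulation or a learned
    model; a toy lookup keeps the example runnable.
    """
    toy_scores = {
        ("spoon", "handle", "inverted"): 0.9,   # handle probes a gap
        ("spoon", "bowl", "canonical"): 0.3,
        ("bottle", "body", "sideways"): 0.8,    # bottle as rolling pin
    }
    return toy_scores.get((tool, grasp, orientation), 0.1)

def assess_substitutes(tools, grasps, orientations):
    # Bottom-up: enumerate and score every candidate tool-pose.
    candidates = [
        (affordance_score(t, g, o), t, g, o)
        for t, g, o in product(tools, grasps, orientations)
    ]
    # Top-down: keep only the best-scoring pose for each tool.
    best = {}
    for score, t, g, o in candidates:
        if t not in best or score > best[t][0]:
            best[t] = (score, g, o)
    return best

best = assess_substitutes(
    tools=["spoon", "bottle"],
    grasps=["handle", "bowl", "body"],
    orientations=["canonical", "inverted", "sideways"],
)
print(best["spoon"])   # best tool-pose found for the spoon
```

The point of the sketch is the separation of concerns: exhaustive bottom-up exploration produces a pool of scored alternatives, and a separate top-down pass selects among them, rather than a single one-pass mapping from sensor data to a score.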
Supervisor: Not available
Sponsor: Brazilian Government (CAPES Foundation)
Qualification Name: Thesis (Ph.D.)
Qualification Level: Doctoral
EThOS ID: uk.bl.ethos.737957
DOI: Not available
Keywords: Robots ; Tools ; Implements, utensils, etc.