Object classification for robotic platforms

Brandenburg, S., Machado, P. ORCID: 0000-0003-1760-3871, Shinde, P., Ferreira, J.F. ORCID: 0000-0002-2510-2412 and McGinnity, T.M. ORCID: 0000-0002-9897-4748, 2019. Object classification for robotic platforms. In: ROBOT 2019: Fourth Iberian Robotics Conference, Porto, Portugal, 20-22 November 2019.

14786_Machado.pdf - Pre-print


Abstract

Computer vision has been revolutionised in recent years by increased research into convolutional neural networks (CNNs); however, many challenges remain to be addressed in order to ensure fast and accurate image processing when applying these techniques to robotics. These challenges include handling extreme changes in scale, illumination, noise, and viewing angle of a moving object. The project's main contribution is to provide insight into how to properly train a CNN, a specific type of deep neural network (DNN), for object tracking in the context of industrial robotics. The proposed solution combines documented approaches to replicate a pick-and-place task with an industrial robot, using a computer vision pipeline that feeds a YOLOv3 CNN. Experimental tests, designed to investigate the requirements of training the CNN in this context, were performed using a variety of objects that differed in shape and size in a controlled environment. The general focus was to detect the objects based on their shape so that a suitable and secure grasp could be selected by the robot. The findings in this article reflect the challenges of training the CNN through brute force; the article also highlights different methods of annotating images and the results obtained after training the neural network.
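
As context for the image-annotation methods mentioned in the abstract, the following is a minimal sketch (not taken from the paper) of how a pixel-space bounding box is typically converted into the Darknet/YOLO text annotation format used when training YOLOv3. The function name and the example image size, box coordinates, and class id are illustrative assumptions, not values from the study.

# Minimal sketch (illustrative only): convert a pixel-space bounding box into
# the Darknet/YOLO annotation format commonly used for YOLOv3 training.
# Each object is one line: "<class_id> <x_center> <y_center> <width> <height>",
# with all coordinates normalised to [0, 1] relative to the image dimensions.

def to_yolo_annotation(class_id, x_min, y_min, x_max, y_max, img_w, img_h):
    """Return one YOLO-format annotation line for a single bounding box."""
    x_center = (x_min + x_max) / 2.0 / img_w
    y_center = (y_min + y_max) / 2.0 / img_h
    width = (x_max - x_min) / img_w
    height = (y_max - y_min) / img_h
    return f"{class_id} {x_center:.6f} {y_center:.6f} {width:.6f} {height:.6f}"


if __name__ == "__main__":
    # Hypothetical example: a 200x150 px box with top-left corner (100, 50)
    # in a 640x480 image, labelled with class 0.
    print(to_yolo_annotation(0, 100, 50, 300, 200, 640, 480))
    # -> "0 0.312500 0.260417 0.312500 0.312500"

One annotation file per training image, containing one such line per labelled object, is the convention assumed here; the paper's own annotation tooling and label set may differ.
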

Item Type: Conference contribution
Creators: Brandenburg, S., Machado, P., Shinde, P., Ferreira, J.F. and McGinnity, T.M.
Date: November 2019
Divisions: Schools > School of Science and Technology
Record created by: Linda Sullivan
Date Added: 09 Sep 2019 13:09
Last Modified: 21 Jul 2021 16:15
URI: https://irep.ntu.ac.uk/id/eprint/37616
