
Training Dense Object Nets: a Novel Approach

EasyChair Preprint no. 11422

6 pages · Date: November 29, 2023

Abstract

Our work proposes a novel framework that addresses the computational limitations of training Dense Object Nets (DON) while achieving robust and dense visual object descriptors. DON’s descriptors are known for their robustness to viewpoint and configuration changes, but training them requires image pairs with computationally expensive correspondence mapping. This limitation constrains descriptor dimensionality and robustness, thereby restricting object generalization. To overcome this, we introduce a data generation procedure based on synthetic augmentation and a novel deep-learning architecture that produces denser visual descriptors at reduced computational cost. Notably, our framework eliminates the need for image-pair correspondence mapping, and we showcase its application in a robotic grasping pipeline. Experimental results demonstrate that our approach yields descriptors as robust as those generated by DON.
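The central idea of the abstract, replacing expensive image-pair correspondence mapping with synthetic augmentation where correspondences are known by construction, can be illustrated with a short sketch. The following is a hypothetical PyTorch illustration, not the authors' code: the rotation warp, the descriptor-map shapes, and the simple L2 matching loss are assumptions made only to show how an augmentation grid provides pixel correspondences for free.

# Hypothetical sketch, not the paper's implementation: a known synthetic warp
# (here a rotation) yields pixel correspondences analytically, so no explicit
# correspondence mapping between real image pairs is needed.
import math
import torch
import torch.nn.functional as F

def make_synthetic_pair(image, angle_deg=15.0):
    """Warp `image` (C, H, W) by a known rotation; return the warped image and
    the sampling grid, which maps every warped pixel back to its source location."""
    c, h, w = image.shape
    cos = math.cos(math.radians(angle_deg))
    sin = math.sin(math.radians(angle_deg))
    affine = torch.tensor([[cos, -sin, 0.0],
                           [sin,  cos, 0.0]])          # 2x3 affine matrix
    grid = F.affine_grid(affine.unsqueeze(0), (1, c, h, w), align_corners=False)
    warped = F.grid_sample(image.unsqueeze(0), grid, align_corners=False)
    return warped.squeeze(0), grid.squeeze(0)

def pixelwise_match_loss(desc_src, desc_warped, grid):
    """Pull descriptors of corresponding pixels together: sample the source
    descriptor map (D, H, W) at the known source location of every warped
    pixel and compare with the warped image's descriptors."""
    resampled = F.grid_sample(desc_src.unsqueeze(0), grid.unsqueeze(0),
                              align_corners=False).squeeze(0)
    return F.mse_loss(resampled, desc_warped)

# Example usage with random tensors standing in for an image and the outputs
# of a descriptor network (in practice both descriptor maps come from the net):
image = torch.rand(3, 128, 128)
warped, grid = make_synthetic_pair(image)
desc_src, desc_warped = torch.rand(16, 128, 128), torch.rand(16, 128, 128)
loss = pixelwise_match_loss(desc_src, desc_warped, grid)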

Keyphrases: Dense Object Nets, generalized object representation, reduced computation costs, robot grasping

BibTeX entry
BibTeX does not have the right entry for preprints. This is a hack for producing the correct reference:
@Booklet{EasyChair:11422,
  author = {Kanishk Navale and Ralf Gulde and Marc Tuscher and Oliver Riedel},
  title = {Training Dense Object Nets: a Novel Approach},
  howpublished = {EasyChair Preprint no. 11422},
  year = {EasyChair, 2023}}