Print: ISSN 0914-4935
Online: ISSN 2435-0869
Sensors and Materials is an international peer-reviewed open access journal that provides a forum for researchers working in multidisciplinary fields of sensing technology. Sensors and Materials is covered by Science Citation Index Expanded (Clarivate Analytics), Scopus (Elsevier), and other databases.

Publisher
 MYU K.K.
 Sensors and Materials
 1-23-3-303 Sendagi,
 Bunkyo-ku, Tokyo 113-0022, Japan
 Tel: 81-3-3827-8549
 Fax: 81-3-3827-8547

Sensors and Materials, Volume 32, Number 4(1) (2020)
Copyright (C) MYU K.K.
pp. 1245-1259
S&M2174 Research Paper of Special Issue
https://doi.org/10.18494/SAM.2020.2549
Published in advance: February 5, 2020
Published: April 10, 2020

Image-similarity-based Convolutional Neural Network for Robot Visual Relocalization

Li Wang, Ruifeng Li, Jingwen Sun, Hock Soon Seah, Chee Kwang Quah, Lijun Zhao, and Budianto Tandianus

(Received July 31, 2019; Accepted December 2, 2019)

Keywords: visual relocalization, CNN, image similarity

Convolutional neural network (CNN)-based methods, which train an end-to-end model to regress the six-degree-of-freedom (DoF) pose of a robot from a single red–green–blue (RGB) image, have recently been developed to overcome the poor robustness of robot visual relocalization. However, pose precision drops when the test image is dissimilar to the training images. In this paper, we propose a novel method, named image-similarity-based CNN, which considers the similarity of an input image during CNN training. The higher the similarity of the input image, the higher the precision we can achieve. We therefore crop the input image into several small image blocks and measure the similarity between each cropped block and the training dataset images using a feature vector from a fully connected CNN layer. Finally, the most similar image is selected to regress the pose. A genetic algorithm is used to determine the cropping positions. Experiments are conducted on both the open-source 7-Scenes dataset and two actual indoor environments. The results show that, compared with existing solutions, the proposed algorithm achieves better results and effectively reduces large regression errors.
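As a rough, purely illustrative sketch of the selection step outlined in the abstract, the Python code below matches cropped image blocks against training images by cosine similarity of CNN feature vectors. The backbone (a pretrained torchvision ResNet-18 used as a stand-in feature extractor), the preprocessing, the candidate crop boxes, and the cosine-similarity measure are assumptions introduced here for illustration; the paper trains its own CNN and uses a genetic algorithm to choose crop positions, neither of which is reproduced in this sketch.

```python
# Illustrative sketch only; NOT the authors' implementation.
import torch
import torch.nn.functional as F
from torchvision import models, transforms

# Stand-in feature extractor: a pretrained ResNet-18 with its final FC layer
# replaced by an identity, so its output plays the role of the
# "fully connected layer feature vector" mentioned in the abstract.
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()
backbone.eval()

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

@torch.no_grad()
def feature(img):
    """Return an L2-normalized feature vector for a PIL image."""
    x = preprocess(img).unsqueeze(0)
    return F.normalize(backbone(x), dim=1)  # shape: (1, 512)

def most_similar_crop(query_img, train_features, crop_boxes):
    """For each candidate crop box (left, upper, right, lower), compare its
    feature against all training-image features (an (N, 512) tensor of
    normalized vectors) and return the best (similarity, box, train index)."""
    best = (-1.0, None, None)
    for box in crop_boxes:
        f = feature(query_img.crop(box))           # feature of the cropped block
        sims = (train_features @ f.T).squeeze(1)   # cosine similarities, shape (N,)
        score, idx = sims.max(dim=0)
        if score.item() > best[0]:
            best = (score.item(), box, idx.item())
    return best
```

In the paper's pipeline, the pose of the query image would then be regressed by the CNN associated with the selected (most similar) training image; that regression stage is not shown here.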

Corresponding authors: Ruifeng Li and Lijun Zhao


Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.

Cite this article
Li Wang, Ruifeng Li, Jingwen Sun, Hock Soon Seah, Chee Kwang Quah, Lijun Zhao, and Budianto Tandianus, Image-similarity-based Convolutional Neural Network for Robot Visual Relocalization, Sens. Mater., Vol. 32, No. 4, 2020, pp. 1245-1259.



Forthcoming Special Issues

Special Issue on Applications of Novel Sensors and Related Technologies for Internet of Things
Guest editors: Teen-Hang Meen (National Formosa University), Wenbing Zhao (Cleveland State University), and Cheng-Fu Yang (National University of Kaohsiung)
Call for papers


Special Issue on Advanced Data Sensing and Processing Technologies for Smart Community and Smart Life
Guest editor: Tatsuya Yamazaki (Niigata University)
Call for papers


Special Issue on Advanced Sensing Technologies and Their Applications in Human/Animal Activity Recognition and Behavior Understanding
Guest editor: Kaori Fujinami (Tokyo University of Agriculture and Technology)
Call for papers


Special Issue on International Conference on Biosensors, Bioelectronics, Biomedical Devices, BioMEMS/NEMS and Applications 2023 (Bio4Apps 2023)
Guest editors: Dzung Viet Dao (Griffith University) and Cong Thanh Nguyen (Griffith University)
Call for papers


Special Issue on Piezoelectric Thin Films and Piezoelectric MEMS
Guest editor: Isaku Kanno (Kobe University)
Call for papers


Special Issue on Advanced Micro/Nanomaterials for Various Sensor Applications (Selected Papers from ICASI 2023)
Guest editor: Sheng-Joue Young (National United University)
Call for papers


Copyright (C) MYU K.K. All Rights Reserved.