
Full metadata record

DC Field Value Language
dc.contributor.author Prada, John David Prieto -
dc.contributor.author Lee, Myeongho -
dc.contributor.author Song, Cheol -
dc.date.accessioned 2023-12-26T18:14:23Z -
dc.date.available 2023-12-26T18:14:23Z -
dc.date.created 2022-02-07 -
dc.date.issued 2022-01-23 -
dc.identifier.isbn 9781510647329 -
dc.identifier.issn 1605-7422 -
dc.identifier.uri http://hdl.handle.net/20.500.11750/46872 -
dc.description.abstract Augmented reality environments allow users to interact naturally with 3D objects, including robots. Many industrial robots perform painting, picking, packing, and palletizing tasks in automated settings. The Baxter robot is an industrial robot well suited to research and education, and it offers several advantages over conventional industrial robots. In this study, we designed an augmented reality system that lets users interact intuitively with 3D environments, using a Leap Motion controller as a hand tracker to perform a basic human-robot coordination task with the Baxter robot. The Baxter robot, equipped with a stereo camera, was connected to a Linux computer programmed in Python, and the augmented reality world was built in the Unity software. The human-robot coordination task consisted of an augmented reality alignment: the subject wore a head-mounted display and moved their hands, and every hand motion was translated into the robot's limb motion in real time. The subject and robot had to align two augmented reality markers under two experimental conditions: visual information activated and deactivated. The subject performed three trials under each condition. The experimental results showed that the visual-information-activated mode improved the average time by 70.63 %. © 2022 SPIE -
dc.language English -
dc.publisher SPIE -
dc.title A study on human-robot coordination in augmented reality assisted hand tracking system -
dc.type Conference Paper -
dc.identifier.doi 10.1117/12.2614986 -
dc.identifier.scopusid 2-s2.0-85131226014 -
dc.identifier.bibliographicCitation SPIE AR | VR | MR -
dc.identifier.url https://spie.org/ar-vr-mr/presentation/A-study-on-human-robot-coordination-in-augmented-reality-assisted/11931-49?enableBackToBrowse=true -
dc.citation.conferencePlace US -
dc.citation.conferencePlace San Francisco -
dc.citation.title SPIE AR | VR | MR -
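The abstract states that every hand motion was translated into the robot's limb motion in real time. A common way to realize such a mapping is a linear rescaling from the hand tracker's workspace to the robot's workspace. The sketch below is a hypothetical illustration in Python (the language the abstract says the system was programmed in); the coordinate bounds, units, function name, and frame conventions are assumptions for illustration, not details taken from the paper:

```python
# Hypothetical sketch: map a tracked hand position (e.g. from a Leap
# Motion controller, millimetres) into a robot end-effector target
# (metres). All workspace bounds below are illustrative assumptions.

# Assumed hand-tracker workspace bounds, (x, y, z) in millimetres.
HAND_MIN = (-200.0, 80.0, -200.0)
HAND_MAX = (200.0, 400.0, 200.0)

# Assumed robot end-effector workspace bounds, (x, y, z) in metres.
ROBOT_MIN = (0.4, -0.3, 0.0)
ROBOT_MAX = (0.9, 0.3, 0.5)

def hand_to_robot(hand_pos):
    """Linearly rescale a hand position into the robot workspace,
    clamping each axis so the target never leaves the bounds."""
    target = []
    for p, h_lo, h_hi, r_lo, r_hi in zip(
            hand_pos, HAND_MIN, HAND_MAX, ROBOT_MIN, ROBOT_MAX):
        t = (p - h_lo) / (h_hi - h_lo)   # normalise reading to [0, 1]
        t = min(max(t, 0.0), 1.0)        # clamp out-of-range readings
        target.append(r_lo + t * (r_hi - r_lo))
    return tuple(target)

# A hand at the centre of the tracker's workspace maps to the centre
# of the robot workspace.
print(hand_to_robot((0.0, 240.0, 0.0)))
```

In a live system this function would run once per tracker frame, with the returned target fed to the robot's limb controller; the clamping step keeps a momentary tracking glitch from commanding a pose outside the safe workspace.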
Files in This Item:

There are no files associated with this item.

Appears in Collections:
Department of Robotics and Mechatronics Engineering Intelligent Bio-OptoMechatronics Lab 2. Conference Papers

