<?xml version="1.0" encoding="UTF-8"?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns="http://purl.org/rss/1.0/" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel rdf:about="https://scholar.dgist.ac.kr/handle/20.500.11750/56663">
    <title>Repository Community: null</title>
    <link>https://scholar.dgist.ac.kr/handle/20.500.11750/56663</link>
    <description />
    <items>
      <rdf:Seq>
        <rdf:li rdf:resource="https://scholar.dgist.ac.kr/handle/20.500.11750/60312" />
        <rdf:li rdf:resource="https://scholar.dgist.ac.kr/handle/20.500.11750/60099" />
        <rdf:li rdf:resource="https://scholar.dgist.ac.kr/handle/20.500.11750/60098" />
        <rdf:li rdf:resource="https://scholar.dgist.ac.kr/handle/20.500.11750/60052" />
      </rdf:Seq>
    </items>
    <dc:date>2026-05-09T16:16:46Z</dc:date>
  </channel>
  <item rdf:about="https://scholar.dgist.ac.kr/handle/20.500.11750/60312">
    <title>PAPRLE: Plug-And-Play Robotic Limb Environment: A Modular Ecosystem for Robotic Limbs</title>
    <link>https://scholar.dgist.ac.kr/handle/20.500.11750/60312</link>
    <description>Title: PAPRLE: Plug-And-Play Robotic Limb Environment: A Modular Ecosystem for Robotic Limbs
Author(s): Kwon, Obin; Yamsani, Sankalp; Myers, Noboru; Taylor, Sean; Hong, Jooyoung; Park, Kyungseo; Alspach, Alex; Kim, Joohyung
Abstract: We introduce PAPRLE (plug-and-play robotic limb environment), a modular ecosystem that enables flexible placement and control of robotic limbs. With PAPRLE, a user can change the arrangement of the robotic limbs and control them using a variety of input devices, including puppeteers, gaming controllers, and virtual reality (VR) devices. This versatility supports a wide range of teleoperation scenarios and promotes adaptability to different task requirements. We also introduce a pluggable puppeteer device that can be easily mounted and adapted to match the target robot configuration. PAPRLE supports bilateral teleoperation through these puppeteer devices, agnostic to the type or configuration of the follower robot. The modular design of PAPRLE facilitates novel spatial arrangements of the limbs and enables scalable data collection, thereby advancing research in embodied artificial intelligence (AI) and learning-based control. We validate PAPRLE in various real-world settings, demonstrating its versatility across diverse combinations of leader devices and follower robots. The system will be released as open source, including both hardware and software components, to support broader adoption and extension.</description>
    <dc:date>2025-12-31T15:00:00Z</dc:date>
  </item>
  <item rdf:about="https://scholar.dgist.ac.kr/handle/20.500.11750/60099">
    <title>Fully 3D printable Robot Hand and Soft Tactile Sensor based on Air-pressure and Capacitive Proximity Sensing</title>
    <link>https://scholar.dgist.ac.kr/handle/20.500.11750/60099</link>
    <description>Title: Fully 3D printable Robot Hand and Soft Tactile Sensor based on Air-pressure and Capacitive Proximity Sensing
Author(s): Taylor, Sean; Park, Kyungseo; Yamsani, Sankalp; Kim, Joohyung
Abstract: Soft tactile sensors can enable robots to grasp objects easily and stably by simultaneously providing tactile data and mechanical compliance to robotic hands. Low-cost, easy-to-build robotic hands equipped with soft tactile sensors would be highly accessible and would facilitate many robotics projects. To this end, we propose an accessible robot hand capable of tactile sensing, which can be produced through digital fabrication. We made the robot hand using commercial servo motors as well as components 3D printed from PETG, TPU, and conductive TPU. These materials give the robot hand a soft, durable, and even functional structure. Specifically, the soft fingertip was crafted from TPU and conductive TPU, whose mechanical and electrical properties enable easy implementation of tactile sensing capabilities, such as force and capacitive touch, simply by adding off-the-shelf sensors (air-pressure and capacitance). The proposed robot hand could effectively sense interaction forces and proximity to conductive objects, and its utilization in various tasks was also demonstrated successfully.</description>
    <dc:date>2024-05-15T15:00:00Z</dc:date>
  </item>
  <item rdf:about="https://scholar.dgist.ac.kr/handle/20.500.11750/60098">
    <title>MelumiTac: Vision-based Tactile Sensor Using Mechanoluminescence for Dynamic Tactile and Nociceptive Perception</title>
    <link>https://scholar.dgist.ac.kr/handle/20.500.11750/60098</link>
    <description>Title: MelumiTac: Vision-based Tactile Sensor Using Mechanoluminescence for Dynamic Tactile and Nociceptive Perception
Author(s): Bae, Sunggyu; Song, Seongkyu; Jeong, Soon Moon; Park, Kyungseo
Abstract: This paper presents MelumiTac, a vision-based tactile (ViTac) sensor enhanced with mechanoluminescent (ML) materials that emit green light under dynamic tactile stimuli. The integration of an ML elastomer generates self-illumination in response to dynamic tactile stimuli, enabling direct visualization of both dynamic tactile events and nociceptive responses while simultaneously tracking deformation in real time. Experimental evaluations involving cyclic loading, in-plane motion, and piercing reveal a strong correlation between ML emission, stress rate, and localized deformation, thereby validating its multi-modal tactile sensing capabilities. Additionally, frame-by-frame analysis offers rich insights into the contact dynamics during physical interactions. These improvements, implemented within the small form factor of a conventional ViTac sensor, render the approach highly accessible. Thus, we expect that the proposed solution will offer practical and unique advantages to engineers developing and applying vision-based multi-modal tactile sensors.</description>
    <dc:date>2025-10-21T15:00:00Z</dc:date>
  </item>
  <item rdf:about="https://scholar.dgist.ac.kr/handle/20.500.11750/60052">
    <title>Spatial Sensitivity Equalization of ERT-Based Robotic Skin Through Gauge Factor Distribution Optimization</title>
    <link>https://scholar.dgist.ac.kr/handle/20.500.11750/60052</link>
    <description>Title: Spatial Sensitivity Equalization of ERT-Based Robotic Skin Through Gauge Factor Distribution Optimization
Author(s): Cho, Junhwi; Chung, Hyunjo; Park, Kyungseo; Kim, Jung
Abstract: Electrical Resistance Tomography (ERT) has emerged as a promising technology for large-area robotic skin due to its ability to reconstruct pressure distribution over extensive regions using a few sparsely distributed electrodes. Despite ERT's potential to reconstruct the external forces applied on 3D surfaces, the uneven distribution of spatial sensitivity leads to significant errors in identifying the physical quantities of contacts, preventing this technique from serving as an effective tactile sensor. To address this issue, this paper proposes a method to equalize the spatial sensitivity by modulating the conductivity of ERT sensors through topology optimization. In a simulation environment, the sensor's conductive domain was converted into a binary image and optimized to equalize spatial sensitivity and reduce disparities between low- and high-sensitivity areas. Additionally, we present a sensor fabrication method that realizes the complex optimized conductive patch pattern from simulation by applying screen-printing techniques. The effectiveness of the implemented spatial sensitivity equalization was validated by comparing it to a conventional ERT sensor in both simulations and real-world environments. The proposed sensitivity optimization method expands the use of ERT-based sensors for distributed tactile sensing in physical human-robot interaction scenarios.</description>
    <dc:date>2025-05-18T15:00:00Z</dc:date>
  </item>
</rdf:RDF>

