<?xml version="1.0" encoding="UTF-8"?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns="http://purl.org/rss/1.0/" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel rdf:about="https://scholar.dgist.ac.kr/handle/20.500.11750/9932">
    <title>Repository Community: null</title>
    <link>https://scholar.dgist.ac.kr/handle/20.500.11750/9932</link>
    <description />
    <items>
      <rdf:Seq>
        <rdf:li rdf:resource="https://scholar.dgist.ac.kr/handle/20.500.11750/60227" />
        <rdf:li rdf:resource="https://scholar.dgist.ac.kr/handle/20.500.11750/59889" />
        <rdf:li rdf:resource="https://scholar.dgist.ac.kr/handle/20.500.11750/59853" />
        <rdf:li rdf:resource="https://scholar.dgist.ac.kr/handle/20.500.11750/59332" />
      </rdf:Seq>
    </items>
    <dc:date>2026-04-24T14:16:59Z</dc:date>
  </channel>
  <item rdf:about="https://scholar.dgist.ac.kr/handle/20.500.11750/60227">
    <title>Toward virtual bladder: real-time bladder volume monitoring with flexible AuCNT strain sensors</title>
    <link>https://scholar.dgist.ac.kr/handle/20.500.11750/60227</link>
    <description>Title: Toward virtual bladder: real-time bladder volume monitoring with flexible AuCNT strain sensors
Author(s): Cho, Youngjun; Jo, Yujin; Kang, Minseok; Shin, Heejae; Cho, Jeongmok; Jeong, Hyunghwa; Suh, Hyunsuk Peter; Pak, Changsik John; Park, Jeonhyeong; Kwon, Soonchul; Choi, Hongsoo; Yu, Jaesok; Kim, Hoe Joon; Lee, Sanghoon
Abstract: Digital twin technology holds considerable potential for personalized diagnostics and treatment of bladder dysfunction, particularly neurogenic conditions such as underactive bladder (UAB). In this study, to address the need for precise monitoring, we introduce a flexible, stretchable strain sensor composed of gold-coated carbon nanotubes (AuCNTs) embedded in Ecoflex. We specifically designed a three-channel configuration to capture anisotropic expansion and evaluated the sensor's performance using both two-dimensional balloon models and ex vivo three-dimensional porcine bladder models. As a result, the AuCNT sensor demonstrated high sensitivity, and the three-channel design significantly enhanced spatial accuracy compared to single-channel approaches. Based on these measurements, we created a preliminary &quot;Virtual Bladder&quot; model that provides dynamic, real-time visualization of bladder volume changes. While our current model requires further development to incorporate multimodal data and anatomical variability, it serves as a foundational step towards developing advanced digital twin frameworks and closed-loop neuromodulation systems for bladder dysfunction.</description>
    <dc:date>2025-12-31T15:00:00Z</dc:date>
  </item>
  <item rdf:about="https://scholar.dgist.ac.kr/handle/20.500.11750/59889">
    <title>HOLLOW MICRONEEDLE AND MANUFACTURING METHOD THEREFOR</title>
    <link>https://scholar.dgist.ac.kr/handle/20.500.11750/59889</link>
    <description>Title: HOLLOW MICRONEEDLE AND MANUFACTURING METHOD THEREFOR
Author(s): 정진웅; 이상훈
Abstract: The present disclosure relates to a hollow microneedle, a hollow microneedle electrode, and a manufacturing method therefor, wherein a microneedle array comprises a base part and one or a plurality of microneedles protruding from the base part, the microneedle array includes a shape memory polymer, and the microneedles are shaped to be hollow.</description>
  </item>
  <item rdf:about="https://scholar.dgist.ac.kr/handle/20.500.11750/59853">
    <title>Body-attachable electromyography sensor</title>
    <link>https://scholar.dgist.ac.kr/handle/20.500.11750/59853</link>
    <description>Title: Body-attachable electromyography sensor
Author(s): 정진웅; 이상훈; 박재우
Abstract: The present invention relates to an electromyography (EMG) sensor that detects muscle signals and indicates the degree of muscle contraction and relaxation. More specifically, it relates to a body-attachable EMG sensor for use in the socket that physically connects an amputee's robotic leg: the sensor is formed very thin so that it attaches naturally inside the socket to improve wearing comfort, is stretchable so that it flexes naturally with muscle movement, and is made of a breathable material that effectively discharges secretions such as sweat, enabling repeated wear and long-term attachment.</description>
  </item>
  <item rdf:about="https://scholar.dgist.ac.kr/handle/20.500.11750/59332">
    <title>Contrast agent-free 3D ultrasound deep-depth vascular imaging with a 2D row-column addressed array: In vivo human clinical feasibility study</title>
    <link>https://scholar.dgist.ac.kr/handle/20.500.11750/59332</link>
    <description>Title: Contrast agent-free 3D ultrasound deep-depth vascular imaging with a 2D row-column addressed array: In vivo human clinical feasibility study
Author(s): Guezzi, Nizar; Lee, Sangheon; Nam, Sangwoo; Jung, Dongkyu; Noman, Muhammad; Seong, Hyojin; Lee, Sanghoon; Kim, Hoe Joon; Yu, Jaesok
Abstract: Three-dimensional (3D) imaging of vascular networks is essential for accurately diagnosing deep organ diseases. However, current ultrasound imaging methods are primarily limited to visualizing 2D cross-sections, which restricts the ability to evaluate the full structure of vascular networks. Although several 3D ultrasound techniques have been proposed to overcome this limitation, most struggle to achieve deep penetration and a wide field of view due to their high resource requirements. Row-column addressed arrays (RCAs) have emerged as a promising solution, enabling 3D imaging with significantly reduced hardware complexity. Nevertheless, the limited image quality achievable with RCAs has hindered their broader application. In this study, we propose a coded plane-wave-based, contrast-free 3D imaging system using RCAs for in vivo imaging of deep human vasculature. To validate the method, we imaged the liver and spleen of two healthy adult volunteers and successfully visualized vascular structures without contrast agent injection. Flow dynamics were captured at a frame rate of 27 Hz. Additionally, we demonstrated contrast-to-noise ratio (CNR) improvements of approximately 9 dB and 10 dB in the z-y and z-x planes, respectively, compared to non-coded excitation. This approach offers strong potential for in vivo 3D visualization and assessment of complex, deeply located vascular networks.</description>
    <dc:date>2026-01-31T15:00:00Z</dc:date>
  </item>
</rdf:RDF>

