For delicate micromanipulation, subjects must perform tasks while effectively minimizing their hand tremor. Recently, there has been growing interest in virtual reality (VR) microsurgery systems that provide enhanced visual feedback for precise manipulation tasks as well as a training environment. Accurate micro-scale localization requires sensor fusion, that is, combining data from multiple sensors through algorithms to refine information quality or derive additional information from it. To enable precise location tracking of surgical tools (e.g., forceps), we propose a VR microsurgical environment featuring a sensor fusion algorithm for the location tracking system with an additional inertial measurement unit (IMU). The system reconstructs an image of the surgical instruments (i.e., forceps) together with 2D figures showing a zoomed view of them, helping surgeons obtain more precise visual feedback. We hypothesize that merging two different sensors and matching them in a Kalman filter algorithm makes the micro-localization more accurate. We develop and test two configurations: (1) location tracking without the Kalman filter, and (2) location tracking with the Kalman filter activated. A performance evaluation of location tracking error was conducted in both real and virtual experiments. The results show that the condition with sensor fusion activated achieves more precise location tracking than the condition without the Kalman filter.
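The abstract does not give implementation details of the fusion step; the following is a minimal 1D sketch of the kind of Kalman-filter fusion described, assuming the IMU acceleration drives the prediction step and an optical position measurement drives the correction step. All matrices, noise values, and the sampling period are illustrative assumptions, not values from the paper.

```python
import numpy as np

dt = 0.01                                  # sample period (s), assumed

F = np.array([[1.0, dt],                   # state transition for [pos, vel]
              [0.0, 1.0]])
B = np.array([[0.5 * dt**2],               # control model: IMU acceleration input
              [dt]])
H = np.array([[1.0, 0.0]])                 # only position is measured optically

Q = np.diag([1e-6, 1e-4])                  # process noise covariance (assumed)
R = np.array([[1e-4]])                     # optical measurement noise (assumed)

x = np.zeros((2, 1))                       # initial state estimate [pos, vel]
P = np.eye(2)                              # initial estimate covariance

def kalman_step(x, P, accel_imu, pos_optical):
    """One predict/update cycle: predict with the IMU, correct with optics."""
    # Predict: propagate the state using the IMU acceleration as control input.
    x = F @ x + B * accel_imu
    P = F @ P @ F.T + Q
    # Update: correct the prediction with the optical position measurement.
    y = np.array([[pos_optical]]) - H @ x  # innovation
    S = H @ P @ H.T + R                    # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Toy usage: a slowly accelerating tool tip observed by two noisy sensors.
rng = np.random.default_rng(0)
true_pos, true_vel = 0.0, 0.0
for _ in range(100):
    accel = 0.1                            # true acceleration (m/s^2)
    true_vel += accel * dt
    true_pos += true_vel * dt
    accel_meas = accel + rng.normal(0, 0.05)   # noisy IMU reading
    pos_meas = true_pos + rng.normal(0, 0.01)  # noisy optical reading
    x, P = kalman_step(x, P, accel_meas, pos_meas)

print(f"true pos {true_pos:.4f} m, fused estimate {x[0, 0]:.4f} m")
```

In this sketch the IMU's high-rate but drift-prone signal is blended with the absolute but noisier optical position, which is the usual rationale for Kalman-filter fusion in tracking systems; the paper's actual state model and tuning may differ.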