University of South Florida Virtually Transparent Epidermal Imagery
(This material is based upon work supported by the National Science Foundation under Grant No. 1035594)


Figure 1: Effect illustration
Figure 3: System Setup

The objective of this research is to develop a cyber-physical system that displays the inside of a patient on the skin through a 3D projector array and a micro-camera cluster, giving the appearance of "transparent skin" and enabling single-incision surgery with the visual benefits of open-cavity surgery.

The major difficulty in minimally invasive surgery is the loss of natural visual perception and hand-eye coordination, which raises the skill requirement and lengthens both training and operating time. This system will give surgeons an "X-ray vision" experience, since they see directly through the skin, and will remove the spatial bottleneck and additional scarring caused by laparoscopes.

The broad challenges addressed in this project are reducing the skill required to perform minimally invasive surgery (MIS); reducing the invasiveness, cost, and duration of MIS; and improving the efficiency of surgical training. The expected outcomes are improved dexterity for MIS surgeons and significant economic growth in MIS and other healthcare-related fields, with broad benefits for the nation at large.

We are developing a set of micro-cameras that occupy none of the space needed by surgical tools, produce no additional scarring, and transmit high-definition video wirelessly. Our research will create a virtual-view generation system that mosaics the video from all cameras into a panoramic 3D view and projects it onto the correct spot on the patient's body with geometric and color distortion compensation. A surgeon-camera interaction system is under development to let surgeons control the viewpoint through gesture recognition and finger tracking.
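As an informal illustration of the mosaicing step, the Python/OpenCV sketch below matches ORB features between two micro-camera frames, estimates a homography with RANSAC, and warps one frame into the other's coordinate system. The file names, feature detector, and simple overlay blending are illustrative assumptions, not the project's released implementation (see Software Release below).

import cv2
import numpy as np

def mosaic_pair(img_a, img_b, min_matches=10):
    """Warp img_b into img_a's frame via a RANSAC homography and overlay them."""
    orb = cv2.ORB_create(2000)
    kp_a, des_a = orb.detectAndCompute(img_a, None)
    kp_b, des_b = orb.detectAndCompute(img_b, None)

    # Brute-force Hamming matching of binary ORB descriptors.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_b, des_a), key=lambda m: m.distance)
    if len(matches) < min_matches:
        raise RuntimeError("not enough feature matches to build a mosaic")

    src = np.float32([kp_b[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_a[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)

    h, w = img_a.shape[:2]
    canvas = cv2.warpPerspective(img_b, H, (2 * w, h))  # warp B into A's frame
    canvas[0:h, 0:w] = img_a                            # overlay A (no blending)
    return canvas

if __name__ == "__main__":
    # Placeholder file names for two overlapping micro-camera frames.
    left = cv2.imread("cam_left.png")
    right = cv2.imread("cam_right.png")
    cv2.imwrite("mosaic.png", mosaic_pair(left, right))

In practice the system also compensates for geometric and color distortion before projecting onto the body; the sketch covers only the frame-to-frame alignment.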

Vessel feature detection and tracking process
Vessel features detected in a Hamlyn video
Vessel features detected in another laparoscopy video
Figure 4: Mosaiced images from different cameras
Figure 5: Projection on abdomen

This project benefits the millions of abdominal surgeries that can be performed through a single incision by giving surgeons virtually transparent skin: all the visual benefits of open-cavity surgery without the associated risks to the patient. The goals of this research are highly "hands-on" and immediately applicable to outreach activities that can excite youth, minority students, and others about careers in science, medicine, and engineering.


3D Surface Reconstruction:


Immersive Learning:

Figure 6: Picture taken at 2012 Engineering EXPO.
Figure 7: SAGE platform.
On-Body Immersive Learning System Demo


Participants

Faculty:
Yu Sun (PI)
Adam Anderson
Rich Gitlin

Students:
Bingxiong Lin (Ph.D. student)
Adrian Johnson (Ph.D. student)
Cristian Castro (Graduated)
Justin Fouts (REU)

Collaborators

Jaime Sanchez, M.D. (USF Health and Tampa General Hospital)



Publications:
Johnson, S., Sanchez, J., French, A., and Sun, Y. (2014) Unobtrusive Augmentation of Critical Hidden Structures in Laparoscopy, MMVR, pp. 1-4 (in press).

Johnson, A. and Sun, Y. (2013) Spatial Augmented Reality on Person: Exploring the Most Personal Medium, VAMR/HCII, Part I, LNCS 8021, pp. 169-174.

Lin, B., Sun, Y., Sanchez, J., and Qian, X. (2013) Vesselness Based Feature Extraction for Endoscopic Image Analysis, ISBI (in press). Data Sets

Lin, B., Johnson, A., Qian, X., Sanchez, J., and Sun, Y. (2013) Simultaneous Tracking, 3D Reconstruction and Deforming Point Detection for Stereoscope Guided Surgery, Augmented Reality Environments for Medical Imaging and Computer-Assisted Interventions, pp. 35-44. (pdf)

Johnson, A. S. and Sun, Y. (2013) Exploration of Spatial Augmented Reality on Person, IEEE Virtual Reality (VR), pp. 59-60.

Lin, B., Sun, Y., and Qian, X. (2013) Dense Surface Reconstruction with Shadows in MIS, IEEE Transactions on Biomedical Engineering, pp. 1-10 (accepted). (Video from left camera, video from right camera, shadow casting process video)

Anderson, A., Lin, B., and Sun, Y. (2013) Virtually Transparent Epidermal Imagery (VTEI): On New Approaches to In Vivo Wireless High-Definition Video and Image Processing, IEEE Transactions on Biomedical Circuits and Systems, pp. 1-9 (in press).

Lin, B., Sun, Y., and Qian, X. (2013) Thin Plate Spline Feature Point Matching for Organ Surfaces in Minimally Invasive Surgery Imaging, SPIE Medical Imaging, pp. 1-6 (accepted, oral presentation).

Sun, Y., Anderson, A., Castro, C., Lin, B., and Gitlin, R. (2011) Virtually Transparent Epidermal Imagery for Laparo-Endoscopic Single-Site Surgery, International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC'11), pp. 2107-2110, Boston, MA, USA, August 30 - September 3, 2011. (pdf)


Dataset:
In-Vivo Dataset for Vessel Feature Detection Evaluation



Software Release:

1. Multi-micro-camera video mosaicing: Data, Code, Result1, Result2.
2. In-Vivo Vessel Feature Detector: Matlab code for our 2014 ISBI paper.
3. Matlab source code for our TBME paper, Efficient Vessel Feature Detection for Endoscopic Image Analysis (a simplified vesselness sketch follows this list).
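To give a feel for the vesselness idea behind items 2 and 3, the Python/NumPy sketch below applies a single-scale, Frangi-style filter to a grayscale image: eigenvalues of the Gaussian-smoothed Hessian separate elongated, tube-like structures from blobs and flat background. The scale, weights, and dark-vessel assumption are illustrative only; the released Matlab code above is the reference implementation of our detector.

import numpy as np
from scipy.ndimage import gaussian_filter

def vesselness_2d(image, sigma=2.0, beta=0.5, c=15.0):
    """Single-scale, Frangi-style vesselness response for a 2D grayscale image."""
    img = image.astype(np.float64)

    # Scale-normalized second-order Gaussian derivatives (Hessian entries).
    hxx = gaussian_filter(img, sigma, order=(0, 2)) * sigma ** 2
    hyy = gaussian_filter(img, sigma, order=(2, 0)) * sigma ** 2
    hxy = gaussian_filter(img, sigma, order=(1, 1)) * sigma ** 2

    # Eigenvalues of the 2x2 Hessian at every pixel, ordered so |l1| <= |l2|.
    tmp = np.sqrt((hxx - hyy) ** 2 + 4.0 * hxy ** 2)
    l1 = 0.5 * (hxx + hyy + tmp)
    l2 = 0.5 * (hxx + hyy - tmp)
    swap = np.abs(l1) > np.abs(l2)
    l1[swap], l2[swap] = l2[swap], l1[swap]

    rb = np.abs(l1) / (np.abs(l2) + 1e-10)  # blobness: low for elongated structures
    s = np.sqrt(l1 ** 2 + l2 ** 2)          # overall second-order strength
    v = np.exp(-rb ** 2 / (2 * beta ** 2)) * (1.0 - np.exp(-s ** 2 / (2 * c ** 2)))
    v[l2 < 0] = 0.0  # keep dark, tube-like structures (dark vessels on brighter tissue)
    return v

if __name__ == "__main__":
    # Synthetic check: a dark horizontal band on a bright background should respond.
    frame = np.full((64, 64), 200.0)
    frame[30:33, :] = 50.0
    response = vesselness_2d(frame)
    print("line response exceeds background:", response[31, 32] > response[5, 5])

A full detector would typically run this over multiple scales and select salient points on the enhanced vessel map; refer to the released Matlab code for the project's actual method.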



Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.


Department of Computer Science and Engineering • 4202 E. Fowler Ave • Tampa, FL 33620 • (813)974-7508
Created by: Emmanuel Stinson - Send comments to estinson@mail.usf.edu