MSc Medical Visualisation & Human Anatomy
School of Simulation & Visualisation

Kylie Seidner (She/Her)

Hello, I’m Kylie Seidner! I am currently working as an anatomy demonstrator at the University of Glasgow, with the aim of becoming a forensic pathologist. My background is a BSc (Hons) in Biomedical Science, and my love of combining art with science led me to pursue this MSc in Medical Visualisation and Human Anatomy. My thesis focused on creating an augmented reality application for Android devices to provide a new way to help students grasp the difficult concepts in the dissection guides provided by the Royal College of Pathologists. I believe that every student deserves to learn in the way that best suits them, and for some of us that means having a visual representation of what is being asked. I hope to bring my new digital illustration and 3D modelling skills with me into the medical community and to continue furthering my knowledge and passion for anatomy and education.

 

Contact
kylies2000@gmail.com
k.seidner1@student.gsa.ac.uk
Digital Portfolio
LinkedIn
Instagram
Projects
NephrectomyAR
Volumetric Visualisation

NephrectomyAR

Using Augmented Reality to Enhance Biomedical Pathology Dissection Guides

In recent years, researchers have begun to explore extended reality’s use in education. Several universities have implemented visualization tablets and virtual reality applications for teaching human anatomy, and researchers have found that extended reality has a greater impact on spatial learning than traditional 2D textbooks and videos. Although there is plenty of research involving medical students, there is a distinct lack of high-quality research on augmented reality for teaching biomedical and pathology technician students. Several projects have already used augmented reality to enhance printed books. Taking this approach, the aim of this thesis was to develop an application to augment the Royal College of Pathologists dissection guide on nephrectomies.

 

A series of 3D kidney models was segmented in 3D Slicer from a computed tomography (CT) scan and then imported into 3ds Max and ZBrush for retopology, texturing, and animation. The application combined augmented reality image targets with a set of model controls, allowing the user to animate the models and access information in a clear, intuitive format while following along with the dissection guide. Following development, user testing was carried out using a standardized survey, the System Usability Scale (SUS). This provided valuable user insight and feedback, demonstrating that the application was fully functional and easy to use for a novice.
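For readers unfamiliar with the System Usability Scale, it is scored with a fixed formula that maps ten 1–5 responses onto a 0–100 scale: odd-numbered (positive) items contribute (response − 1), even-numbered (negative) items contribute (5 − response), and the sum is multiplied by 2.5. The short Python sketch below illustrates that standard scoring; it is an illustrative reconstruction of the published SUS formula, not code from the project, and the example responses are hypothetical.

def sus_score(responses):
    # Standard SUS scoring: odd (positive) items score (response - 1),
    # even (negative) items score (5 - response); the sum is scaled by 2.5.
    if len(responses) != 10:
        raise ValueError("SUS requires exactly ten item responses")
    total = 0
    for item, response in enumerate(responses, start=1):
        total += (response - 1) if item % 2 == 1 else (5 - response)
    return total * 2.5

# Hypothetical participant: a fairly positive response set scoring 82.5
print(sus_score([5, 2, 4, 1, 4, 2, 5, 2, 4, 2]))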

Storyboard & Moodboard

The storyboard and color scheme used to develop the final look of the application

Application Structure

Structural overview of the final application. Blue boxes indicate scenes, yellow boxes indicate buttons with functions, and grey boxes are informational content boxes with no functions.

Color Palette

Final color scheme utilized throughout the application (made on Coolors.co)

Simple Nephrectomy

A whole-kidney model representing a simple nephrectomy, created from CT scan data.

Simple Nephrectomy Controls

Screenshot of the application showing the controls available to the user

Simple Nephrectomy Macroscopic Evaluation

This panel demonstrates the anatomy to be visualized on each model. The selected button shows the user where the renal vein is by highlighting it.

Initial Incision Controls

The controls the user can access for the model representing the initial incision

Initial Incision

A screenshot from the initial incision animation in the application.

Initial Incision Macroscopic Evaluation

This panel allows the user to revise the internal anatomy of the kidney. The button clicked was for the renal pyramids, which are highlighted.

NephrectomyAR: Application Walkthrough

This is a short video which walks you through the application to show how it is intended to work.

Volumetric Visualisation

The process of creating 3-dimensional (3D) renders from traditional 2-dimensional (2D) data sets such as CT and MRI scans is called volumetric visualisation. During this module I aligned several datasets and segmented different pathologies across a variety of scans. I produced several high-quality pieces of work utilising direct and indirect rendering techniques to show the volume, location, size and density of different tumours and bone fractures. This module was challenging, but I thoroughly enjoyed it. That enjoyment was reflected in my final project, which received full marks (100%), and I went on to use these skills in my thesis to create NephrectomyAR, my augmented reality dissection guide application.
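For readers curious about what direct volume rendering involves in practice, the minimal Python/VTK sketch below maps CT intensities to colour and opacity and ray-casts the volume. The filename and transfer-function values are illustrative assumptions, not the settings used in my coursework, which was produced with the module’s own tools.

import vtk

# Load a CT volume (hypothetical filename; a DICOM series could be read with vtkDICOMImageReader instead)
reader = vtk.vtkNIFTIImageReader()
reader.SetFileName("ct_volume.nii.gz")
reader.Update()

# The GPU ray-cast mapper performs the direct volume rendering
mapper = vtk.vtkGPUVolumeRayCastMapper()
mapper.SetInputConnection(reader.GetOutputPort())

# Transfer functions: map intensity (roughly Hounsfield units) to colour and opacity
colour = vtk.vtkColorTransferFunction()
colour.AddRGBPoint(-1000, 0.0, 0.0, 0.0)   # air -> black
colour.AddRGBPoint(40, 0.8, 0.4, 0.3)      # soft tissue -> reddish
colour.AddRGBPoint(400, 0.95, 0.95, 0.85)  # bone -> off-white

opacity = vtk.vtkPiecewiseFunction()
opacity.AddPoint(-1000, 0.0)
opacity.AddPoint(40, 0.05)
opacity.AddPoint(400, 0.85)

volume_property = vtk.vtkVolumeProperty()
volume_property.SetColor(colour)
volume_property.SetScalarOpacity(opacity)
volume_property.SetInterpolationTypeToLinear()
volume_property.ShadeOn()

volume = vtk.vtkVolume()
volume.SetMapper(mapper)
volume.SetProperty(volume_property)

renderer = vtk.vtkRenderer()
renderer.AddVolume(volume)
renderer.SetBackground(0.1, 0.1, 0.1)

window = vtk.vtkRenderWindow()
window.AddRenderer(renderer)
window.SetSize(800, 600)

interactor = vtk.vtkRenderWindowInteractor()
interactor.SetRenderWindow(window)
interactor.Initialize()
window.Render()
interactor.Start()

Indirect rendering, by contrast, first extracts a polygonal surface from the voxel data (for example with vtkMarchingCubes) and renders that mesh, which suits discrete structures such as fracture fragments.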


Hip Fracture

Top images: indirect render of the hip fracture prior to surgery. Bottom images: a post-surgery direct volume render detailing the titanium screw and plate locations.

Indirect Render of Brain Tumour

Left images (top and bottom) visualise the tumour (blue/blue-green) located inside the skull and brain. Right images (top and bottom) demonstrate the location of the same tumour within the skull. The bottom middle image shows the tumour’s location in the brain only.