MSc Medical Visualisation & Human Anatomy School of Simulation & Visualisation

Nicole Gourlay (She/Her)

I am Nicole Gourlay, an aspiring professional at the intersection of design and medical science. My journey in this field began with a Bachelor of Science in Interaction Design, completed in 2022. Building on this foundation, I recently completed a Master of Science in Medical Visualisation and Human Anatomy.

My academic endeavours have equipped me with a unique and dynamic skill set that bridges the realms of creativity, technology, and artistry. With a passion for innovation, my goal is to revolutionise the way we perceive and engage with medical knowledge.

My thesis project, developed in collaboration with the Royal Hospital for Children, Glasgow, resulted in an augmented reality (AR) app called “VitaSight AR Surgical”. The app was designed to enhance surgical precision by using advanced depth perception technology to align anatomical models, segmented directly from CT scan data, with AR markers.

I am dedicated to pushing the boundaries of design and medical science further, leveraging my expertise in interaction design and medical visualisation to create innovative solutions that improve healthcare outcomes and enhance our understanding of the human body. With a commitment to ongoing learning and collaboration, I look forward to contributing to the evolution of medical technology and education in the years to come.

VitaSight AR Surgical
Volumetric Visualisation

Collaborative Work
Application Development

VitaSight AR Surgical

VitaSight AR Surgical is the result of a collaborative thesis project between the Glasgow School of Art, the University of Glasgow, and the Royal Hospital for Children, Glasgow.

This project explores the integration of Augmented Reality (AR) into surgery, with a specific focus on user-friendliness and its potential impact on surgical precision and outcomes. The core objective was to craft a user-centric model employing AR technology on a tablet device. Uniquely, this model leverages cutting-edge depth perception technology to align anatomical models precisely beneath the AR marker decal. Unlike prevalent AR approaches that merely overlay digital content, this approach perceives depth, offering a far more faithful visual representation.
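As a rough illustration of the alignment idea (not code from the app): a detected marker pose can be combined with a depth offset so that the anatomical model is anchored beneath the marker plane rather than overlaid on top of it. All function names and values below are hypothetical.

```python
import numpy as np

def marker_to_model_transform(translation, rotation, depth_offset):
    """Build a 4x4 transform placing a model `depth_offset` metres
    beneath a detected AR marker (hypothetical sketch, not app code)."""
    # Marker pose in world space as a homogeneous transform
    pose = np.eye(4)
    pose[:3, :3] = rotation
    pose[:3, 3] = translation
    # Push the model along the marker's local -Z axis, i.e. "into" the body
    offset = np.eye(4)
    offset[2, 3] = -depth_offset
    return pose @ offset

# Identity-rotation marker at the origin, model anchored 5 cm below it
transform = marker_to_model_transform(np.zeros(3), np.eye(3), 0.05)
model_origin = transform @ np.array([0.0, 0.0, 0.0, 1.0])
print(model_origin[:3])  # [ 0.    0.   -0.05]
```

In practice the marker pose would come from the AR framework's tracking, and the depth offset from the segmented CT data; this sketch only shows the geometric composition.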

The application was well-received by a diverse group of participants from the medical field, surpassing usability benchmarks and earning high user satisfaction. Notably, participants saw great potential in using the application to explain complex medical concepts to patients and parents, improving communication in healthcare.

The potential for future development is substantial: the model’s depth perception enables pre-operative imaging to be translated to intra-operative scenarios with anatomical precision, suggesting a meaningful improvement in surgical accuracy. This advancement stands poised to minimise inadvertent harm to intricate structures and significantly improve surgical outcomes.

Yet the impact extends beyond the surgical site. By providing trainees with immersive, depth-aware views of patient anatomy, the model could also usher in a new era of surgical education and enrich learning experiences.

This project’s innovative fusion of depth-aware AR technology with paediatric surgery holds immense transformative promise. Through its ability to faithfully represent anatomical structures, it stands to advance surgical practice, enhance patient care, and reshape the experience of surgical training.

VitaSight AR: Logo

VitaSight AR: Project Board

VitaSight AR: Main Menu

VitaSight AR: 3D Mode

VitaSight AR: 3D Anatomical Model Development

VitaSight AR: 3D Model Texturing - Pancreas

VitaSight AR: 3D Model Texturing - Kidney

VitaSight AR: Custom Decal

VitaSight AR: Scroll Menu Development

VitaSight AR: Bespoke Graphics Pack

Volumetric Visualisation

In modern healthcare, the ability to interpret and communicate complex medical data is paramount. Software such as 3D Slicer and MITK (the Medical Imaging Interaction Toolkit) makes it possible to transform raw medical datasets into interpretable representations that can guide patients, clinicians, researchers and educators.

Using volume and surface visualisation techniques, two-dimensional medical scans from MRI, CT, and ultrasound can be transformed into immersive 3D reconstructions. This three-dimensional perspective enhances our ability to decipher complex anatomical structures and anomalies. With software such as 3D Slicer and MITK, we can navigate through layers of data and segment structures of interest into interactive models, ultimately helping to elevate the standard of medical practice.
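As a minimal sketch of the segmentation idea (illustrative only, not tied to Slicer or MITK internals): a threshold segmentation selects the voxels of a volume whose intensities fall inside a chosen Hounsfield-unit window, producing a binary mask that can then be reconstructed as a 3D surface. The volume and thresholds below are synthetic.

```python
import numpy as np

# Synthetic 3D "CT" volume: background of -1000 HU (air) containing a
# bright cube of 300 HU standing in for bone (illustrative values only)
volume = np.full((32, 32, 32), -1000.0)
volume[8:24, 8:24, 8:24] = 300.0

def threshold_segment(vol, lower, upper):
    """Binary mask of voxels within [lower, upper] HU — the same basic
    idea as a threshold segmentation step in 3D Slicer or MITK."""
    return (vol >= lower) & (vol <= upper)

mask = threshold_segment(volume, 150, 1500)
print(mask.sum())  # 4096 voxels (the 16x16x16 cube) are segmented
```

Real workflows refine such masks with region growing, island removal, and manual editing before generating a surface mesh; the thresholding above is only the first step.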

The works below are examples of these visualisations. Among them, one piece showcases a fractured pelvis before and after surgery. Through the 3D visualisations presented, viewers can understand the fracture itself and see how the post-surgery placement of metal pins has effectively addressed and mended it, offering a visual narrative of the healing process.

Pre-Op Pelvic Fracture

Post-Op Pelvic Fracture

Volume Render: Tooth

Lung Segmentation with Tumour

Abdomen Segmentation

Lung Segmentation with Tumour

Lung Segmentation with Tumour

Application Development