
[Demo video: an augmented reality app with a petrous temporal bone model and a textbook illustration of the cochlea]

Visuo-Haptic Learning of the Inner Ear:

Using Physical Optical Glyphs with Augmented Reality


Andréa Zariwny
University of Toronto
Toronto, Canada

Abstract— We present a novel augmented reality (AR) teaching tool that demonstrates the complex structure of the human cochlea to medical students.
The cochlea is a small but intricate anatomical structure, often represented as a snail shell-like object. It is more accurately defined as a spiral negative space within the temporal bone, but this is difficult to convey with traditional teaching tools such as prosections and illustrations. Using a handheld tablet equipped with an integrated camera, digitally rendered 3D models of this structure can be visually superimposed over illustrations of the cochlea and/or physical models of the petrous temporal bone, thus highlighting the negative space.

Keywords- augmented reality; 3D printing; visuo-haptic; anatomy; cochlea; pedagogy; mobile app

Learning about the cochlea is a significant challenge for medical students: it has a complex shape, it is small (< 4 mm), and it is buried in the dense bone of the petrous temporal region. The cochlea is a spiral tunnel of negative space which, together with the semicircular canals, forms the bony labyrinth. This is challenging to illustrate, and the cochlea is often depicted, and consequently thought of, as a structure similar to a snail shell. This approach is misleading because the walls of the cochlea are not surrounded externally by space, but by solid bone.
The development of augmented reality (AR) technology provides an opportunity to create a digital graphic representation of hidden, or otherwise unavailable, information and to incorporate it, by means of a digital device, into the surrounding environment. Using this technology to combine a functioning real-time digital animation of the spiral ganglion with an accurate anatomical physical model of the cochlea will facilitate visuo-haptic learning and provide a supplementary means of obtaining structural and functional information on this difficult subject matter.
Two different apps guide the student through the micro- and macro-anatomy: (1) an AR mobile app that uses an illustration of the petrous temporal bone in cross section as an optical glyph to trigger a 3D graphic of the cochlear nerve, spiral ganglion, radial fibres, and fluid-filled parts of the cochlea; and (2) an AR mobile app that uses a physical scale model of the petrous temporal bone as an optical glyph to trigger a 3D graphic of the cochlear negative space within that bone.
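Glyph-based AR of this kind typically works by detecting the glyph's four corners in the camera frame and estimating the planar homography that registers the 3D overlay to the printed image. As an illustration only (the project's actual tracking is handled by the AR authoring environment), a minimal pure-Python sketch of the four-point direct linear transform is:

```python
def homography(src, dst):
    """Estimate the 3x3 homography mapping src points to dst points
    from four correspondences (DLT with h33 fixed to 1)."""
    # Build the 8x8 linear system A h = b from u = (h1.p)/(h3.p), v = (h2.p)/(h3.p).
    rows, rhs = [], []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); rhs.append(u)
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y]); rhs.append(v)
    # Gaussian elimination with partial pivoting.
    n = 8
    M = [row + [b] for row, b in zip(rows, rhs)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    h = [0.0] * n
    for r in reversed(range(n)):
        h[r] = (M[r][n] - sum(M[r][c] * h[c] for c in range(r + 1, n))) / M[r][r]
    return [h[0:3], h[3:6], h[6:8] + [1.0]]

def project(H, x, y):
    """Map a glyph-frame point through H into image coordinates."""
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return ((H[0][0] * x + H[0][1] * y + H[0][2]) / w,
            (H[1][0] * x + H[1][1] * y + H[1][2]) / w)

# Glyph corners in their own coordinate frame, and as detected in a
# (hypothetical) camera image; H then anchors the rendered overlay.
glyph = [(0, 0), (1, 0), (1, 1), (0, 1)]
seen = [(10, 12), (90, 15), (95, 88), (8, 85)]
H = homography(glyph, seen)
```

In practice an AR SDK computes the full 6-DOF camera pose rather than a plane-to-plane homography, but the registration principle is the same.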


A high-resolution CT scan provides the digital morphometric data of the bony labyrinth. These data are used to develop accurate illustrations and 3D digital models of the structure. An interactive AR module is then developed to display these models when the device camera recognizes the appropriate visual triggers (illustrations or glyphs). An extension of this module tests the recognition of physical models as triggers.
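The first step in such a pipeline is usually to segment the negative space by thresholding the CT intensities, after which a surface mesher (e.g. marching cubes) triangulates the air/bone boundary. A minimal sketch on a synthetic volume (the grid and the Hounsfield-style threshold are illustrative assumptions, not values from this project):

```python
BONE_HU = 700  # assumed intensity at or above which a voxel counts as bone

def segment_negative_space(volume, threshold=BONE_HU):
    """Return a boolean mask: True where a voxel is air enclosed in the scan."""
    return [[[v < threshold for v in row] for row in slab] for slab in volume]

def boundary_faces(mask):
    """Count voxel faces between air and bone (or the volume edge) --
    exactly the faces a surface mesher would triangulate."""
    nz, ny, nx = len(mask), len(mask[0]), len(mask[0][0])
    faces = 0
    for z in range(nz):
        for y in range(ny):
            for x in range(nx):
                if not mask[z][y][x]:
                    continue
                for dz, dy, dx in ((1, 0, 0), (-1, 0, 0), (0, 1, 0),
                                   (0, -1, 0), (0, 0, 1), (0, 0, -1)):
                    zz, yy, xx = z + dz, y + dy, x + dx
                    inside = 0 <= zz < nz and 0 <= yy < ny and 0 <= xx < nx
                    if not inside or not mask[zz][yy][xx]:
                        faces += 1
    return faces

# A tiny 3x3x3 "scan": solid bone (1000 HU) with one air voxel (0 HU)
# at the centre -- the simplest possible enclosed negative space.
volume = [[[1000] * 3 for _ in range(3)] for _ in range(3)]
volume[1][1][1] = 0
mask = segment_negative_space(volume)
print(boundary_faces(mask))  # a lone cubic cavity has 6 boundary faces
```

Real anatomical data would of course use a library such as scikit-image for marching cubes rather than face counting, but the segmentation-by-threshold step is the same idea.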

A. Target Audience
Undergraduate medical and paramedical students, especially those with an interest in neurosurgery; ear, nose, and throat surgery; and audiology.

B. Measurement of Goals
This work is supported by an Alan W. Cole research grant provided by the Vesalius Trust. The project will be evaluated by peer review and by a team of anatomists and first-year medical students from the University of Toronto in a time-restricted usability test.


This is a proof of concept in technology-driven pedagogical design research, so the methods and results are tightly integrated. To date, results suggest that the illustrated cochlea is a highly suitable trackable object, effective in triggering a 3D representation. Experiments incorporating an optical glyph with a physical model are underway.

The challenges associated with using AR to communicate the intricate structure of the cochlea include: (1) Rendering high polygon count 3D models within the authoring environment; (2) Providing adequate contrast levels in trackable 2D imagery without sacrificing detail; and (3) Managing the quality of physical models produced through rapid prototyping.

Thanks to the Vesalius Trust, AMI (the Association of Medical Illustrators), ISTAS, IAMSE, Gabby Resch, Steve Cory from Objex Unlimited for the 3D printing, Anton Semechko, Dr. Patricia Stewart, Marc Dryer, the Biomedical Communications program, STTARR, and the University of Toronto.


