University of Southampton Institutional Repository

DeepNav: joint view learning for direct optimal path perception in cochlear surgical platform navigation


Zamani, Majid and Demosthenous, Andreas (2023) DeepNav: joint view learning for direct optimal path perception in cochlear surgical platform navigation. IEEE Access, 11, 120593-120602. (doi:10.1109/ACCESS.2023.3320557).

Record type: Article

Abstract

Although much research has been conducted in the field of automated cochlear implant navigation, the problem remains challenging. Deep learning techniques have recently achieved impressive results in a variety of computer vision problems, raising expectations that they might be applied in other domains, such as identifying the optimal navigation zone (OPZ) in the cochlea. In this paper, a 2.5D joint-view convolutional neural network (2.5D CNN) is proposed and evaluated for the identification of the OPZ in the cochlear segments. The proposed network consists of two complementary sagittal and bird-view (or top-view) networks for 3D OPZ recognition, each using a ResNet-8 architecture of five convolutional layers with rectified linear unit (ReLU) activations, followed by average pooling with a size equal to that of the final feature maps. The last fully connected layer of each network outputs four indicators, corresponding to the classes considered: the distances to the adjacent left and right walls, the collision probability and the heading angle. To demonstrate the approach, the 2.5D CNN was trained using a parametric data generation model and then evaluated on anatomically constructed cochlea models built from micro-CT images of different cases. Prediction of the indicators demonstrates the effectiveness of the 2.5D CNN; for example, the heading angle is predicted with less than 1° error and a computation delay of under 1 ms.
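
The architecture described in the abstract can be made concrete with a short sketch. The following is a minimal, hypothetical PyTorch illustration of a two-branch (sagittal and top-view) network in which each branch has five convolutional layers with ReLU activations, a residual shortcut, average pooling over the final feature maps, and a fully connected head producing the four indicators. The layer widths, shortcut placement, input size, and the averaging fusion of the two branch heads are assumptions for illustration only, not details taken from the paper.

# Minimal sketch of the described 2.5D joint-view network, assuming PyTorch.
# Only the overall structure (two view branches, five conv layers each, ReLU,
# average pooling over the final feature maps, a 4-output head) follows the
# abstract; widths, shortcut layout and head fusion are assumptions.
import torch
import torch.nn as nn


class ViewBranch(nn.Module):
    """One view branch (sagittal or top view)."""

    def __init__(self, in_channels: int = 1, width: int = 32):
        super().__init__()
        self.stem = nn.Sequential(
            nn.Conv2d(in_channels, width, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(width, width, 3, padding=1), nn.ReLU(inplace=True),
        )
        self.block = nn.Sequential(
            nn.Conv2d(width, width, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(width, width, 3, padding=1),
        )
        self.final = nn.Sequential(
            nn.ReLU(inplace=True),
            nn.Conv2d(width, width, 3, padding=1), nn.ReLU(inplace=True),
        )
        # Average pooling whose output covers the whole final feature map.
        self.pool = nn.AdaptiveAvgPool2d(1)
        # Four indicators: left-wall distance, right-wall distance,
        # collision probability (as a logit) and heading angle.
        self.head = nn.Linear(width, 4)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.stem(x)
        x = x + self.block(x)   # ResNet-style residual shortcut (assumed placement)
        x = self.final(x)
        x = self.pool(x).flatten(1)
        return self.head(x)


class JointViewNet(nn.Module):
    """Two complementary branches; averaging their heads is an assumption."""

    def __init__(self):
        super().__init__()
        self.sagittal = ViewBranch()
        self.top_view = ViewBranch()

    def forward(self, sagittal_img: torch.Tensor, top_img: torch.Tensor) -> torch.Tensor:
        return 0.5 * (self.sagittal(sagittal_img) + self.top_view(top_img))


# Example forward pass on dummy 64x64 single-channel view images (assumed size).
if __name__ == "__main__":
    net = JointViewNet()
    out = net(torch.randn(1, 1, 64, 64), torch.randn(1, 1, 64, 64))
    print(out.shape)  # torch.Size([1, 4])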

Text
DeepNav_Joint_View_Learning_for_Direct_Optimal_Path_Perception_in_Cochlear_Surgical_Platform_Navigation - Version of Record
Available under License Creative Commons Attribution.

More information

Accepted/In Press date: 13 September 2023
e-pub ahead of print date: 28 September 2023
Keywords: Automated insertion, cochlear implant, convolutional neural network, low-cost navigation, real-time systems, robust centerline tracing, virtual surgery

Identifiers

Local EPrints ID: 489230
URI: http://eprints.soton.ac.uk/id/eprint/489230
ISSN: 2169-3536
PURE UUID: 4909dbaa-90fb-4375-8447-200a537ca8db
ORCID for Majid Zamani: orcid.org/0009-0007-0844-473X

Catalogue record

Date deposited: 18 Apr 2024 16:36
Last modified: 06 Jun 2024 02:19

Contributors

Author: Majid Zamani
Author: Andreas Demosthenous

