
"Chatelain Y" Authored Publications:

1. Open-source platforms to investigate analytical flexibility in neuroimaging
   Authors: Sanz-Robinson J; Wang M; McPherson B; Chatelain Y; Kennedy D; Glatard T; Poline JB
   PubMed ID: 40800896 | Dept: ENCS

2. An analysis of performance bottlenecks in MRI preprocessing
   Authors: Dugré M; Chatelain Y; Glatard T
   PubMed ID: 40072903 | Dept: ENCS

3. Longitudinal brain structure changes in Parkinson's disease: A replication study
   Authors: Sokolowski A; Bhagwat N; Chatelain Y; Dugré M; Hanganu A; Monchi O; McPherson B; Wang M; Poline JB; Sharp M; Glatard T
   PubMed ID: 38295031 | Dept: ENCS

4. Numerical stability of DeepGOPlus inference
   Authors: Gonzalez Pepe I; Chatelain Y; Kiar G; Glatard T
   PubMed ID: 38285635 | Dept: ENCS

5. Numerical uncertainty in analytical pipelines lead to impactful variability in brain networks
   Authors: Kiar G; Chatelain Y; de Oliveira Castro P; Petit E; Rokem A; Varoquaux G; Misic B; Evans AC; Glatard T
   PubMed ID: 34724000 | Dept: ENCS

 

Title: Numerical stability of DeepGOPlus inference
Authors: Gonzalez Pepe I; Chatelain Y; Kiar G; Glatard T
Link: https://pubmed.ncbi.nlm.nih.gov/38285635/
DOI: 10.1371/journal.pone.0296725
Publication: PLoS ONE
PMID: 38285635
Date Added: 2024-01-29
Dept Affiliation: ENCS
Affiliations:
1 Department of Computer Science and Software Engineering, Concordia University, Montreal, QC, Canada.
2 Computational Neuroimaging Laboratory, Child Mind Institute, New York, NY, United States of America.

Description:

Convolutional neural networks (CNNs) are currently among the most widely used deep neural network (DNN) architectures available and achieve state-of-the-art performance for many problems. Originally applied to computer vision tasks, CNNs work well with any data that has a spatial relationship, beyond images, and have been applied to many fields. However, recent works have highlighted numerical stability challenges in DNNs, which also relate to their known sensitivity to noise injection. These challenges can jeopardise their performance and reliability. This paper investigates DeepGOPlus, a CNN that predicts protein function. DeepGOPlus has achieved state-of-the-art performance and can successfully take advantage of and annotate the abundant protein sequences emerging in proteomics. We determine the numerical stability of the model's inference stage by quantifying the numerical uncertainty resulting from perturbations of the underlying floating-point data. In addition, we explore the opportunity to use reduced-precision floating-point formats for DeepGOPlus inference, to reduce memory consumption and latency. This is achieved by instrumenting DeepGOPlus's execution using Monte Carlo Arithmetic, a technique that experimentally quantifies floating-point operation errors, and VPREC, a tool that emulates results with customizable floating-point precision formats. Focus is placed on the inference stage as it is the primary deliverable of the DeepGOPlus model, widely applicable across different environments. All in all, our results show that although the DeepGOPlus CNN is very stable numerically, it can only be selectively implemented with lower-precision floating-point formats. We conclude that predictions obtained from the pre-trained DeepGOPlus model are numerically very reliable and use existing floating-point formats efficiently.
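To illustrate the idea behind Monte Carlo Arithmetic as described in the abstract, here is a minimal, hypothetical sketch: every floating-point operation result is perturbed by random relative noise at a chosen "virtual precision", and the computation is repeated many times to estimate how many significant digits survive. The paper itself instruments the model with dedicated tooling (Monte Carlo Arithmetic and VPREC); the function names below (`mca_perturb`, `noisy_dot`) and the uniform noise model are illustrative assumptions, not the authors' implementation.

```python
import math
import random
import statistics

def mca_perturb(x, t=24):
    """Add uniform random relative noise of magnitude 2**-t to x.

    t is the virtual precision in bits; t=24 loosely mimics the
    rounding error of single-precision (float32) arithmetic.
    """
    if x == 0.0:
        return 0.0
    return x * (1.0 + random.uniform(-1.0, 1.0) * 2.0 ** -t)

def noisy_dot(a, b, t=24):
    """Dot product in which every multiply and add is perturbed."""
    acc = 0.0
    for ai, bi in zip(a, b):
        acc = mca_perturb(acc + mca_perturb(ai * bi, t), t)
    return acc

# Repeat the computation with independent noise draws, then estimate
# the number of significant decimal digits as s = -log10(sigma/|mu|).
random.seed(0)
a = [0.1 * i for i in range(100)]
b = [0.2 * (100 - i) for i in range(100)]
samples = [noisy_dot(a, b) for _ in range(30)]
mean = statistics.mean(samples)
std = statistics.stdev(samples)
sig_digits = -math.log10(std / abs(mean)) if std > 0 else 17.0
```

A numerically stable computation keeps `sig_digits` close to the virtual precision; a large drop signals that rounding errors are amplified, which is the kind of diagnostic the paper applies to DeepGOPlus inference.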





BookR developed by Sriram Narayanan
for the Concordia University School of Health
Copyright © 2011-2026
Concordia University