Conference proceeding
A Low-cost Fault Corrector for Deep Neural Networks through Range Restriction
2021 51st Annual IEEE/IFIP International Conference on Dependable Systems and Networks (DSN), pp.1-13
06/2021
DOI: 10.1109/DSN48987.2021.00018
Abstract
The adoption of deep neural networks (DNNs) in safety-critical domains has raised serious reliability concerns. A prominent example is hardware transient faults, which are growing in frequency due to progressive technology scaling and can lead to failures in DNNs. This work proposes Ranger, a low-cost fault corrector that directly rectifies faulty outputs caused by transient faults, without recomputation. DNNs are inherently resilient to benign faults (which do not corrupt the output) but not to critical faults (which can result in erroneous output). Ranger is an automated transformation that selectively restricts the value ranges in DNNs, reducing the large deviations caused by critical faults and transforming them into benign faults that the DNNs' inherent resilience can tolerate. Our evaluation on 8 DNNs demonstrates that Ranger significantly increases the error resilience of the DNNs (by 3x to 50x), with no loss in accuracy and negligible overheads.
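The core idea sketched in the abstract, restricting a layer's activations to a profiled value range so that a fault-corrupted value cannot propagate a large deviation, can be illustrated with a minimal NumPy sketch. The bounds, layer, and function names here are hypothetical illustrations, not the paper's actual implementation:

```python
import numpy as np

def restrict_range(activations, low, high):
    """Clamp activation values to the profiled [low, high] interval.

    Values corrupted by a transient fault to a large magnitude are
    reduced to the bound, turning a critical fault into a benign one.
    """
    return np.clip(activations, low, high)

# Hypothetical bounds profiled from fault-free runs of a ReLU-style layer.
profiled_low, profiled_high = 0.0, 6.0

# Simulated activations where a transient bit flip has corrupted one
# value to a huge magnitude.
acts = np.array([0.5, 1.2, 3.7, 1e30, 2.1])
safe = restrict_range(acts, profiled_low, profiled_high)
```

After clipping, the corrupted value is capped at the profiled upper bound while in-range activations pass through unchanged.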
Details
- Title
- A Low-cost Fault Corrector for Deep Neural Networks through Range Restriction
- Creators
- Zitao Chen - University of British Columbia
- Guanpeng Li - University of Iowa
- Karthik Pattabiraman - University of British Columbia
- Resource Type
- Conference proceeding
- Publication Details
- 2021 51st Annual IEEE/IFIP International Conference on Dependable Systems and Networks (DSN), pp.1-13
- DOI
- 10.1109/DSN48987.2021.00018
- eISSN
- 2158-3927
- Publisher
- IEEE
- Grant note
- Natural Sciences and Engineering Research Council of Canada (10.13039/501100000038)
- Language
- English
- Date published
- 06/2021
- Academic Unit
- Computer Science
- Record Identifier
- 9984259434302771