Publication | Closed Access
SiRNN: A Math Library for Secure RNN Inference
Citations: 82
References: 83
Year: 2021
Venue: unknown
Topics: Artificial Intelligence, Convolutional Neural Network, Engineering, Machine Learning, Neural Networks (Machine Learning), Verification, Computer-aided Verification, Formal Verification, Recurrent Neural Network, Social Sciences, Data Science, Adversarial Machine Learning, Computing Systems, Embedded Machine Learning, Math Libraries, Math Library, Square Root, Computer Science, Neural Networks (Computational Neuroscience), Deep Learning, Model Compression, Data Security, Deep Neural Networks, Automated Reasoning, Formal Methods, Complex Machine Learning
Complex machine learning (ML) inference algorithms like recurrent neural networks (RNNs) use standard functions from math libraries, such as exponentiation, sigmoid, tanh, and reciprocal of square root. Although prior work on secure 2-party inference provides specialized protocols for convolutional neural networks (CNNs), existing secure implementations of these math operators rely on generic 2-party computation (2PC) protocols that suffer from high communication. We provide new specialized 2PC protocols for math functions that crucially rely on lookup tables and mixed bitwidths to address this performance overhead; our protocols for math functions communicate up to 423× less data than prior work. Furthermore, our math implementations are numerically precise, which ensures that the secure implementations preserve the model accuracy of cleartext inference. We build on top of our novel protocols to build SiRNN, a library for end-to-end secure 2-party DNN inference that provides the first secure implementations of an RNN operating on time series sensor data, an RNN operating on speech data, and a state-of-the-art ML architecture that combines CNNs and RNNs for identifying all heads present in images. Our evaluation shows that SiRNN achieves up to three orders of magnitude of performance improvement when compared to inference of these models using an existing state-of-the-art 2PC framework.
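To make the lookup-table idea in the abstract concrete, the sketch below shows a cleartext fixed-point sigmoid evaluated via a precomputed table. This is an illustration only, not SiRNN's protocol: all names (`sigmoid_lut`, `scale`, `lut_bits`) are hypothetical, the input range [-8, 8) is an assumed clipping interval, and a real 2PC implementation would index the table obliviously and vary bitwidths across protocol steps to reduce communication.

```python
import math

def sigmoid_lut(x_fixed, scale=12, lut_bits=8):
    """Cleartext sketch of a lookup-table-based fixed-point sigmoid.

    x_fixed : input in fixed-point representation with `scale` fractional bits.
    The input is clipped to an assumed range [-8, 8); the top `lut_bits`
    bits of that range index a precomputed table of sigmoid values.
    """
    lo, hi = -8.0, 8.0
    # Precompute 2**lut_bits table entries, each holding the sigmoid of
    # the midpoint of its sub-interval of [lo, hi).
    step = (hi - lo) / (1 << lut_bits)
    table = [1.0 / (1.0 + math.exp(-(lo + (i + 0.5) * step)))
             for i in range(1 << lut_bits)]
    x = x_fixed / (1 << scale)           # decode fixed-point to a real value
    x = min(max(x, lo), hi - 1e-9)       # clip into the table's range
    idx = int((x - lo) / step)           # pick the matching table entry
    return round(table[idx] * (1 << scale))  # re-encode in the same scale
```

With `scale=12` (4096 units per 1.0), `sigmoid_lut(0)` returns a value near 2048 (i.e. 0.5), and the output saturates toward 4096 or 0 for large positive or negative inputs; the table resolution (`lut_bits`) directly trades table size against numerical precision, which is the accuracy concern the abstract highlights.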