"An Ultra-low-power Multi-core Engine for Inference on Encrypted DNNs," a Presentation from Xperi

Petronel Bigioi, CTO for Imaging at Xperi, presents the "An Ultra-low-power Multi-core Engine for Inference on Encrypted DNNs" tutorial at the May 2019 Embedded Vision Summit.

Neural network encryption is a useful method to secure a company's IP. This presentation focuses on the design details of an ultra-low-power, scalable neural network core capable of performing inference on encrypted neural networks. Decryption of the neural network weights and topology takes place inside the core, so decrypted networks never need to be present in main memory. Bigioi also discusses solutions that cluster multiple neural network cores together to meet the neural inference processing requirements of a target SoC platform.
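The key idea above — decrypting weights only inside the compute core so plaintext never lands in main memory — can be sketched in software. The snippet below is purely illustrative and not Xperi's design: the cipher (a SHA-256-based keystream standing in for a hardware stream cipher such as AES-CTR), the function names, and the toy single-layer compute are all assumptions for demonstration.

```python
import hashlib

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Illustrative keystream built from SHA-256 in counter mode; a real
    # core would use a dedicated hardware cipher (e.g. AES-CTR).
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def infer_encrypted(encrypted_weights: bytes, key: bytes,
                    nonce: bytes, activations: list) -> list:
    # Weights are decrypted only within this function (the "core"):
    # main memory only ever holds the encrypted blob.
    ks = keystream(key, nonce, len(encrypted_weights))
    plain = bytes(c ^ k for c, k in zip(encrypted_weights, ks))
    # Interpret each byte as a signed weight (offset-binary, toy format).
    weights = [b - 128 for b in plain]
    # A single dot product stands in for a layer's MAC operations.
    return [sum(w * a for w, a in zip(weights, activations))]
```

In hardware, the analogous dataflow streams encrypted weight blocks from memory, decrypts them in a small on-core buffer, and feeds them straight to the MAC units, so the cleartext model is never addressable by other bus masters.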