Generate confidential inferences without compromising user data


The ever-expanding field of confidential computing essentially boils down to two main properties:

  1. Confidentiality: How do we perform computational operations on sensitive user data (such as health records and credit histories) without ever “seeing” the data in its unencrypted form?
  2. Integrity: If we are able to pull off (1), how do we prove to the user that their privacy was preserved?

AWS thinks it has the answer in its Nitro Enclaves offering, which leverages the Trusted Execution Environments (TEEs) on modern processors to create attestable applications that are isolated from tampering by external networks or even the host OS. TEEs provide confidentiality and integrity, but they are notoriously difficult to use. In this tutorial, we will use the Nitro Enclaves platform for a relatively complex task (by confidential computing standards): secure evaluation of a convolutional neural network on a user-supplied image.

“Lift and Shift”

AWS touts Nitro Enclaves as a “lift and shift” solution, but TEEs introduce some idiosyncrasies that we will have to address in our application. The good news is that standard Docker images can be converted into the Enclave Image File (.eif) format that the enclave expects. We can build relatively rich programs without being constrained to a particular language or OS, and we can allocate almost all of the host’s memory and CPU resources to the TEE. Using the Nitro CLI, many of the low-level operations are abstracted into a handful of simple commands.
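To make that concrete, here is a minimal sketch of driving the two most common Nitro CLI commands (build-enclave and run-enclave) from Python via subprocess. The image name, output path, memory and CPU allocations, and the JSON keys being read are illustrative assumptions, not values taken from this tutorial.

```python
import json
import subprocess


def build_enclave_image(docker_uri: str, eif_path: str) -> dict:
    """Convert a local Docker image into an Enclave Image File (.eif)."""
    result = subprocess.run(
        ["nitro-cli", "build-enclave",
         "--docker-uri", docker_uri,
         "--output-file", eif_path],
        capture_output=True, text=True, check=True,
    )
    # build-enclave reports the enclave's PCR measurements as JSON
    return json.loads(result.stdout)


def run_enclave(eif_path: str, memory_mib: int = 4096, cpu_count: int = 2) -> dict:
    """Launch the enclave with a slice of the host's memory and vCPUs."""
    result = subprocess.run(
        ["nitro-cli", "run-enclave",
         "--eif-path", eif_path,
         "--memory", str(memory_mib),
         "--cpu-count", str(cpu_count)],
        capture_output=True, text=True, check=True,
    )
    return json.loads(result.stdout)


if __name__ == "__main__":
    # Hypothetical image/file names for illustration only
    measurements = build_enclave_image("cnn-inference:latest", "cnn-inference.eif")
    print("PCR0:", measurements["Measurements"]["PCR0"])  # used later for attestation
    enclave = run_enclave("cnn-inference.eif")
    print("Enclave CID:", enclave["EnclaveCID"])
```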

While these features drastically lower the learning curve for developing confidential applications (especially compared to legacy platforms), they do not mean you can drop an unmodified Docker image into an enclave and expect it to work. The most significant difference is the networking interface: a Nitro Enclave is completely isolated from all external connections, except for a local socket that allows it to communicate with the host instance. This makes for a tamper-proof computing environment, but it also means we will have to fall back on low-level primitives (e.g. Python’s socket library) to pipe data to and from the enclave.

Nitro Enclaves can only communicate with the parent instance through a secure virtual socket (image by author)
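To show what that socket plumbing looks like, here is a minimal sketch using Python’s built-in socket module with the AF_VSOCK address family. The CID, port number, and echo-style handler are placeholders, not the article’s actual inference code.

```python
import socket

ENCLAVE_CID = 16   # hypothetical; the real CID is assigned when the enclave starts
PORT = 5005        # any port both sides agree on


def enclave_server() -> None:
    """Runs inside the enclave: listen on the vsock for incoming requests."""
    server = socket.socket(socket.AF_VSOCK, socket.SOCK_STREAM)
    server.bind((socket.VMADDR_CID_ANY, PORT))
    server.listen(1)
    while True:
        conn, _ = server.accept()
        payload = conn.recv(4096)                     # e.g. image bytes from the host
        conn.sendall(b"prediction: " + payload[:8])   # placeholder response
        conn.close()


def send_to_enclave(data: bytes) -> bytes:
    """Runs on the parent instance: forward data to the enclave and return its reply."""
    client = socket.socket(socket.AF_VSOCK, socket.SOCK_STREAM)
    client.connect((ENCLAVE_CID, PORT))
    client.sendall(data)
    response = client.recv(4096)
    client.close()
    return response
```

On the parent side, the enclave’s CID can be read from the output of nitro-cli run-enclave (or nitro-cli describe-enclaves) rather than hard-coded as above.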

Another problematic byproduct of enclaves’ isolation is that they have difficulty generating entropy….

Continue reading: https://towardsdatascience.com/privacy-preserving-deep-learning-with-aws-nitro-enclaves-74c72a17f857

Source: towardsdatascience.com