
Théo Ryffel

At the crossroads of Machine Learning and Cryptography


PySyft + Opacus: Federated Learning with Differential Privacy
by Théo Ryffel on September 30th, 2020

Deep Learning · Federated Learning · Differential Privacy

We present a very simple example of combining Federated Learning (FL) with Differential Privacy (DP), which can be an interesting baseline for experimenting with these great technologies. More specifically, we show how Opacus, the differential privacy library released by the PyTorch team, can be used in PySyft FL workflows with very little overhead.

Encrypted inference with ResNet-18 using PyTorch + PySyft on ants & bees images
by Théo Ryffel on September 15th, 2020

Function Secret Sharing · Secure Multi-Party Computation · Encrypted Computation · Deep Learning

We label encrypted images with an encrypted ResNet-18 using PySyft and Function Secret Sharing.

Encrypted training with PyTorch + PySyft
by Théo Ryffel on August 5th, 2019

Deep Learning · Private AI · Secure Multi-Party Computation · Encrypted Computation

We use the PySyft library to encrypt a neural network and privately classify MNIST images using Secure Multi-Party Computation (SMPC). We achieve classification in under 33ms with over 98% accuracy over local (virtualized) computation.


Encrypted Deep Learning Classification with PyTorch & PySyft
by Théo Ryffel on April 16th, 2019

Deep Learning · Private AI · Secure Multi-Party Computation · Encrypted Computation

We use the PySyft library to train a PyTorch neural network on MNIST using Secure Multi-Party Computation (SMPC). We combine PyTorch nets, SMPC & Autograd in a single demo.


Deep Learning & Federated Learning in 10 Lines of PyTorch + PySyft
by Théo Ryffel on March 1st, 2019

Deep Learning · Federated Learning · PyTorch

We show how to do Federated Learning with PySyft in just a few lines of code, by training a Convolutional Neural Network over a distributed version of the MNIST dataset.

Federated Learning
Federated Learning is an exciting and fast-growing Machine Learning technique for learning on decentralized data. The core idea is that the training data can remain in the hands of its producers (also known as _workers_), which helps improve privacy and data ownership, while the model is shared between workers. One popular application of Federated Learning is training the next-word prediction model on your mobile phone when you write SMS messages: you don't want the data used to train that predictor (i.e. your text messages) to be sent to a central server.
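
To make this concrete, here is a minimal sketch of a federated training loop: the data is split across two (virtual) workers and never leaves them, while the model travels to each worker, trains on its share, and comes back. It assumes the PySyft 0.2.x API (TorchHook, VirtualWorker, .send() / .get()); the toy dataset, worker names and hyperparameters are illustrative, and names may differ in later PySyft versions.

```python
import torch
import torch.nn as nn
import torch.optim as optim
import syft as sy  # assumes PySyft 0.2.x

hook = sy.TorchHook(torch)              # add .send()/.get() to torch tensors
bob = sy.VirtualWorker(hook, id="bob")  # two simulated remote workers
alice = sy.VirtualWorker(hook, id="alice")

# A toy dataset, split between the workers: the raw data stays with them,
# we only keep pointers to it.
data = torch.tensor([[0., 0.], [0., 1.], [1., 0.], [1., 1.]], requires_grad=True)
target = torch.tensor([[0.], [0.], [1.], [1.]], requires_grad=True)
datasets = [
    (data[:2].send(bob), target[:2].send(bob)),
    (data[2:].send(alice), target[2:].send(alice)),
]

model = nn.Linear(2, 1)
opt = optim.SGD(model.parameters(), lr=0.1)

for _ in range(10):
    for x, y in datasets:
        model.send(x.location)          # ship the model to the worker holding x
        opt.zero_grad()
        loss = ((model(x) - y) ** 2).sum()
        loss.backward()
        opt.step()
        model.get()                     # retrieve the updated model; the data never moves
        print(loss.get())               # the loss is also a pointer, fetch it to print it
```

Note that only pointers to the remote tensors are manipulated locally: the model is the only thing that travels back and forth, which is exactly the privacy property Federated Learning is after.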