Yes, we have developed modular and efficient secure aggregation and differential privacy solutions that help people dial in the amount of protection they need. We have documented an early version of the secure aggregation here: https://flower.dev/docs/secagg.html — documentation and updates on both methods will be released soon.
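To give a flavor of what secure aggregation does, here is a minimal sketch of the classic pairwise-masking idea (this is an illustration of the general technique, not Flower's actual implementation): each pair of clients agrees on a shared random mask, one adds it and the other subtracts it, so the server only sees masked individual updates while the masks cancel exactly in the sum.

```python
import random

def pairwise_masks(num_clients, seed=0):
    """One shared random mask per client pair (i, j), i < j."""
    rng = random.Random(seed)
    return {(i, j): rng.uniform(-100, 100)
            for i in range(num_clients) for j in range(i + 1, num_clients)}

def mask_update(client_id, update, masks, num_clients):
    """Add masks shared with higher-id peers, subtract those with lower-id peers."""
    masked = update
    for j in range(num_clients):
        if j == client_id:
            continue
        pair = (min(client_id, j), max(client_id, j))
        masked += masks[pair] if client_id < j else -masks[pair]
    return masked

updates = [1.0, 2.0, 3.0]  # private per-client values
masks = pairwise_masks(len(updates))
masked = [mask_update(i, u, masks, len(updates)) for i, u in enumerate(updates)]
total = sum(masked)  # masks cancel pairwise, so this equals sum(updates)
```

A production protocol additionally has to handle client dropouts and derive the pairwise masks from key exchange rather than a shared seed, which is where most of the engineering complexity lives.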
Thanks for the question, it's a very natural one to ask. We are also fans of PySyft. It offers support for a very wide range of privacy-enhancing machine learning tools. Where Flower and PySyft differ is in focus. Federated learning is difficult and requires many technical moving parts all working together (e.g., secure aggregation, differential privacy, scalable simulation, device deployments, integration with conventional ML frameworks, etc.). All of these need to be tightly integrated, and in a manner that performs federated learning efficiently. This is where Flower currently excels. It offers comprehensive, extensible and, most importantly, easy-to-use construction of federations that bring these different parts together. We believe it offers the best user experience for federated learning currently out there. We hope that in the future many tool suites that offer private machine learning (like PySyft and others) will adopt Flower components so we can all work better together.
I appreciate you taking the time to break this down. I've spent a decent chunk of time having to roll my own stuff, so when PyGrid/PySyft came along it was just easier. I will say the Flower components look interesting and I'll give it a shot.
Thanks for adding this here! We added these DP wrappers, and we're working on something similar for secure aggregation, but I must admit that we have to document them better to make using them easier for everyone.
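For anyone curious what a DP wrapper does conceptually, here is a hedged sketch of the usual pattern: clip each client update to a fixed L2 norm, average, then add Gaussian noise calibrated to the clip norm. All names below are illustrative, this is not Flower's actual API.

```python
import math
import random

def clip(update, clip_norm):
    """Scale the update down so its L2 norm is at most clip_norm."""
    norm = math.sqrt(sum(x * x for x in update))
    scale = min(1.0, clip_norm / norm) if norm > 0 else 1.0
    return [x * scale for x in update]

def dp_aggregate(updates, clip_norm=1.0, noise_multiplier=1.0, seed=0):
    """Average clipped client updates, then add Gaussian noise per coordinate."""
    rng = random.Random(seed)
    clipped = [clip(u, clip_norm) for u in updates]
    n = len(updates)
    sigma = noise_multiplier * clip_norm / n  # noise scaled to the clipping bound
    return [sum(col) / n + rng.gauss(0.0, sigma)
            for col in zip(*clipped)]

noisy_avg = dp_aggregate([[0.5, 2.0], [3.0, -1.0]], clip_norm=1.0)
```

The wrapper shape is the key design choice: clipping and noising sit around an existing aggregation step, so the underlying training strategy doesn't need to know DP is being applied.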