Oasis Labs announced a partnership with Meta and the launch of a platform to evaluate the fairness of Meta products while protecting people’s privacy.
As Meta’s technology partner, Oasis Labs built the platform, which uses Secure Multi-Party Computation (SMPC) to protect user information as Meta invites users on Instagram to take a survey and voluntarily share their race or ethnicity.
The project will advance the measurement of fairness in AI models, work that can positively impact the lives of individuals around the world and benefit society as a whole.
This first-of-its-kind platform will play a major role in the initiative, an important step toward determining whether AI models are fair and toward enabling appropriate mitigation when they are not.
How the platform will assess the fairness of AI models
Meta’s Responsible AI, Instagram Equity and Civil Rights teams are rolling out an off-platform survey to Instagram users. Users will be asked to voluntarily share their race and/or ethnicity.
The data collected by the third-party survey provider will be split into encrypted shares and distributed confidentially among third-party facilitators, so that neither any individual facilitator nor Meta can learn a user’s survey responses.
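Meta and Oasis Labs have not published the platform’s exact protocol, but the splitting step described above can be illustrated with additive secret sharing, a standard SMPC building block. Everything below (the prime modulus, the number of shares, the integer encoding of a response) is an illustrative assumption, not the actual scheme.

```python
import secrets

# Illustrative parameter only; the real protocol's parameters are not public.
PRIME = 2**61 - 1

def split_into_shares(value: int, n_shares: int = 2) -> list[int]:
    """Additively secret-share `value`: the shares are uniformly random
    individually, but sum to `value` modulo PRIME when combined."""
    shares = [secrets.randbelow(PRIME) for _ in range(n_shares - 1)]
    last = (value - sum(shares)) % PRIME
    return shares + [last]

# A survey response encoded as an integer (hypothetical encoding).
response = 3
shares = split_into_shares(response)

# Each share alone is statistically independent of the response;
# only the combination recovers it.
assert sum(shares) % PRIME == response
```

Because each share is uniformly random on its own, a facilitator holding one share learns nothing about the underlying response.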
The facilitators then compute measurements using encrypted prediction data from the AI model, which Meta shares cryptographically; the combined, de-identified results from each facilitator are reconstituted by Meta into an overall fairness measurement.
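The compute-then-reconstitute step can be sketched with the same additive-sharing idea: each facilitator aggregates only its own shares locally, and only the combined partial totals reveal an aggregate statistic. The two-facilitator setup, the prime modulus, and the simple group-count statistic below are all hypothetical simplifications of the kind of fairness measurement the article describes.

```python
import secrets

PRIME = 2**61 - 1  # illustrative parameter only

def split(value: int, n_shares: int = 2) -> list[int]:
    """Additively secret-share `value` modulo PRIME."""
    shares = [secrets.randbelow(PRIME) for _ in range(n_shares - 1)]
    return shares + [(value - sum(shares)) % PRIME]

# Secret per-user values: 1 if the user self-reported membership in some
# demographic group, 0 otherwise (hypothetical encoding).
memberships = [1, 0, 1, 1, 0]

# The survey provider sends one share of each response to each facilitator.
facilitator_a, facilitator_b = [], []
for m in memberships:
    a, b = split(m)
    facilitator_a.append(a)
    facilitator_b.append(b)

# Each facilitator sums its own shares locally; no individual value is exposed.
partial_a = sum(facilitator_a) % PRIME
partial_b = sum(facilitator_b) % PRIME

# Only the combined, de-identified partial sums reconstitute the aggregate.
group_count = (partial_a + partial_b) % PRIME
print(group_count)  # 3
```

The same pattern extends to richer statistics (e.g. error rates per group, using shared model predictions), which is how an overall fairness measure can be computed without any party seeing individual responses.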
The cryptography used by the platform enables Meta to measure bias and fairness while providing a high level of privacy protection for individuals who share sensitive demographic information.
To learn more about the platform, its goals and release, please visit here.
Keep in touch
If you want to keep up with the latest developments at Oasis Labs, please join:
Discord | Twitter | Subscribe to Newsletter
All information contained on our website is published in good faith and for general information purposes only. Any action that readers take with respect to the information on our site is entirely at their own risk.