On Wednesday, Facebook announced that it will start investing in privacy-enhancing technologies (PETs), which minimise the amount of personal data processed in order to help protect people's information.
Facebook to invest in PETs
Facebook says it will start working with academics, global organisations, and developers to build solutions and best practices. The tech giant believes that PETs will underpin the next generation of digital advertising.
“PETs can be used in many different contexts, like COVID-19 contact tracing, identifying city relocation trends and sending electronic payments,” Facebook says.
Facebook wants to be able to combine one person's data with other people's data without revealing anyone's identity. The company believes this is a fair and safe approach for advertisers to take.
What are PETs?
The main goal of privacy-enhancing technologies is to control how much data is sent to third parties. They draw on cutting-edge techniques from cryptography and statistics. Most importantly, these techniques help minimise the data that is processed while preserving critical functionality like ad measurement and personalisation.
Facebook will be considering these three types of PETs: secure multi-party computation (MPC), on-device learning, and differential privacy.
Secure multi-party computation allows two or more organisations to work together while limiting the information that any single party can learn. End-to-end encryption is used to ensure that the parties don't see each other's data. While enhancing privacy, MPC can also combine inputs from multiple parties to calculate a joint outcome, such as reporting the results of an ad campaign.
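To see the idea behind MPC, here is a minimal sketch of additive secret sharing, one common building block of such systems. This is an illustrative toy, not Facebook's actual protocol: each party splits its private count into random shares, and the servers only ever add shares, so no single party sees a raw value.

```python
# Toy additive secret sharing: two parties jointly compute a sum
# without either revealing its private input. Illustrative only.
import random

MOD = 2**61 - 1  # all arithmetic is done modulo a large prime


def share(secret, n_parties=2):
    """Split a secret into random shares that sum to it mod MOD."""
    shares = [random.randrange(MOD) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % MOD)
    return shares


def reconstruct(shares):
    """Recombine shares to recover the underlying value."""
    return sum(shares) % MOD


# Two advertisers each hold a private conversion count.
a_shares = share(118)  # party A's private count, split into shares
b_shares = share(42)   # party B's private count, split into shares

# Each computing server adds only the shares it holds; an individual
# share is uniformly random and reveals nothing about the raw count.
sum_shares = [(a + b) % MOD for a, b in zip(a_shares, b_shares)]

total = reconstruct(sum_shares)
print(total)  # 160: the joint result, without exposing 118 or 42
```

Real deployments add authenticated channels and protection against dishonest parties, but the core trick is the same: computation happens on shares, not on the data itself.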
Facebook has already started putting MPC to the test. Last year, it began testing a solution called Private Lift Measurement, which uses MPC to help advertisers understand how their campaigns perform.
On-device learning trains an algorithm from insights processed right on your device, without sending individual data, such as an item you purchased or your email address, to a remote server or cloud.
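The shape of this approach can be sketched in a few lines. In the hypothetical example below (not Facebook's actual pipeline), each device fits a tiny "model" to its own purchase history and only the fitted parameter leaves the device; the server averages parameters without ever seeing an individual purchase.

```python
# Minimal sketch of on-device learning with hypothetical data: raw
# records stay local, and only model parameters are shared upstream.

def local_mean(purchases):
    """'Train' on device: here the model is just the mean basket size."""
    return sum(purchases) / len(purchases)


# Raw purchase histories never leave each device.
device_data = [
    [2, 3, 5],     # device 1
    [1, 1, 2, 4],  # device 2
    [6],           # device 3
]

# Only the per-device parameters are sent to the server.
local_models = [local_mean(d) for d in device_data]

# The server aggregates parameters, not the underlying purchases.
global_model = sum(local_models) / len(local_models)
print(round(global_model, 2))  # 3.78
```

Production systems (federated learning) do the same thing with neural-network weight updates instead of a single mean, often combined with secure aggregation so the server cannot inspect any one device's update.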
Lastly, differential privacy can be used on its own or layered on top of other privacy-enhancing technologies to protect individuals from being re-identified in a dataset.
“Differential privacy works by adding carefully calculated ‘noise’ to a dataset. For example, if 118 people bought a product after clicking on an ad, a differentially private system would add or subtract a random amount from that number. Instead of 118, someone using that system would see a number like 120 or 114,” Facebook explains.
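The standard way to generate that "carefully calculated noise" is the Laplace mechanism: noise drawn from a Laplace distribution whose scale depends on the query's sensitivity and a privacy budget epsilon. The sketch below illustrates the technique on the article's example count; the parameter values are illustrative, not anything Facebook has published.

```python
# Minimal sketch of the Laplace mechanism used in differential privacy.
import random


def private_count(true_count, epsilon=1.0, sensitivity=1):
    """Release a count with Laplace noise of scale sensitivity/epsilon.

    A Laplace sample is generated as the difference of two exponential
    samples with the same rate, a standard identity.
    """
    rate = epsilon / sensitivity
    noise = random.expovariate(rate) - random.expovariate(rate)
    return round(true_count + noise)


# 118 people actually converted; each release returns a slightly
# different noisy value, e.g. 120 or 114.
print(private_count(118))
```

Smaller epsilon means more noise and stronger privacy; the reported number stays close enough to the truth for aggregate ad measurement while masking any one individual's contribution.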