Implement differential privacy to power up data sharing and cooperation

Maxime Agostini

Maxime Agostini is the co-founder and CEO of Sarus, a privacy company supported by Y Combinator that lets organizations leverage confidential data for analytics and machine learning.

Traditionally, companies have relied upon data masking, sometimes called de-identification, to protect data privacy. The basic idea is to remove all personally identifiable information (PII) from each record. However, a number of high-profile incidents have shown that even supposedly de-identified data can leak consumer privacy.

In 1996, an MIT researcher identified the then-governor of Massachusetts’ health records in a supposedly masked dataset by matching health records with public voter registration data. In 2006, UT Austin researchers re-identified movies watched by thousands of individuals in a supposedly anonymous dataset that Netflix had made public by combining it with data from IMDb.

In a 2022 Nature article, researchers used AI to fingerprint and re-identify more than half of the mobile phone records in a supposedly anonymous dataset. These examples all highlight how “side” information can be leveraged by attackers to re-identify supposedly masked data.

These failures led to differential privacy. Instead of sharing data, companies would share data processing results combined with random noise. The noise level is set so that the output does not tell a would-be attacker anything statistically significant about a target: The same output could have come from a database with the target or from the exact same database but without the target. The shared data processing results do not disclose information about anybody, hence preserving privacy for everybody.
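In formal terms, a randomized mechanism M is ε-differentially private if, for any two databases D and D′ that differ in a single individual’s record and for any set of possible outputs S:

Pr[M(D) ∈ S] ≤ e^ε · Pr[M(D′) ∈ S]

The smaller ε is, the less any one person’s data can shift the distribution of outputs, and the stronger the privacy guarantee. This is the standard textbook definition rather than any vendor-specific formulation.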

Operationalizing differential privacy was a significant challenge in the early days. The first applications were primarily the province of organizations with large data science and engineering teams like Apple, Google or Microsoft. As the technology matures and its cost decreases, how can all organizations with modern data infrastructures leverage differential privacy in real-life applications?

Differential privacy applies to both aggregates and row-level data

When the analyst cannot access the data directly, it is common to expose it only through differentially private aggregates. The sensitive data is accessible through an API that only outputs privacy-preserving noisy results. This API may perform aggregations on the whole dataset, from simple SQL queries to complex machine learning training tasks.

A typical setup for leveraging personal data with differential privacy guarantees. Image Credits: Sarus
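To make the idea concrete, here is a minimal sketch of the kind of mechanism such an API might apply to a simple count query: the Laplace mechanism, with hypothetical data and function names. A production system should rely on a vetted library rather than hand-rolled noise.

```python
import numpy as np

def dp_count(records, predicate, epsilon):
    """Differentially private count via the Laplace mechanism.

    A counting query has sensitivity 1 (adding or removing one person
    changes the true count by at most 1), so the noise scale is 1/epsilon.
    """
    true_count = sum(1 for r in records if predicate(r))
    noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

# Hypothetical usage: how many users are older than 40, with epsilon = 0.5
ages = [23, 45, 31, 67, 52, 38, 44]
print(dp_count(ages, lambda age: age > 40, epsilon=0.5))
```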

One of the disadvantages of this setup is that, unlike data masking techniques, analysts no longer see individual records to “get a feel for the data.” One way to mitigate this limitation is to provide differentially private synthetic data where the data owner produces fake data that mimics the statistical properties of the original dataset.

The generation process itself is done with differential privacy to guarantee user privacy. Insights computed on synthetic data are generally noisier than differentially private aggregate queries, so it is not advisable to perform aggregations on synthetic data; when possible, differentially private aggregation on the raw data is preferable.

However, synthetic data does provide analysts with the look and feel to which they are accustomed: Data professionals can have their row-level data and eat their differential privacy cake, too.
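As a toy illustration of how differentially private synthetic data can be produced, the sketch below uses the simplest possible approach, a noisy histogram that is then resampled. Real synthetic data generators (typically generative models trained with differential privacy) are far more sophisticated, and the names and data here are hypothetical.

```python
import numpy as np

def dp_synthetic_values(values, bins, epsilon, n_samples):
    """Toy DP synthetic data: perturb a histogram, then sample fake records.

    Adding or removing one record changes exactly one bin count by 1, so
    Laplace noise with scale 1/epsilon per bin satisfies epsilon-DP.
    """
    counts, edges = np.histogram(values, bins=bins)
    noisy = counts + np.random.laplace(0.0, 1.0 / epsilon, size=counts.shape)
    noisy = np.clip(noisy, 0, None)            # counts cannot be negative
    if noisy.sum() == 0:                       # degenerate case: uniform fallback
        noisy = np.ones_like(noisy)
    probs = noisy / noisy.sum()                # turn counts into a distribution
    chosen = np.random.choice(len(probs), size=n_samples, p=probs)
    # Draw a uniform value inside each chosen bin to create fake records
    return np.random.uniform(edges[chosen], edges[chosen + 1])

synthetic_ages = dp_synthetic_values(
    [23, 45, 31, 67, 52, 38, 44], bins=5, epsilon=1.0, n_samples=100
)
```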

Choosing the right architecture

The way data is separated from the analyst depends on where the data is stored. The two main architectures are global differential privacy and local differential privacy.

In the global setup, a central party aggregates the data from many individuals and processes it with differential privacy. This is what the U.S. Census Bureau did when it released 2020 census data for the entire country’s population.

In the local setup, the data remains on the user’s device, and queries from data practitioners are processed on the device with differential privacy. Google uses this in Chrome and Apple uses it on the iPhone, for instance.

While more secure, the local approach requires significantly more noise per user, making it unsuitable for small user bases. It is also much harder to deploy since every computation needs to be orchestrated across thousands or millions of independent user devices.

Most enterprises have already collected raw user data on their servers where data can be processed. For them, global differential privacy is the right approach.

In local DP, each device processes its own data and adds noise locally; the analyst eventually aggregates all noisy outputs to reconstruct the desired insight. Image Credits: Sarus
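For intuition, here is a minimal sketch of randomized response, the simplest local DP mechanism for a yes/no question; the deployments mentioned above (Chrome, the iPhone) use considerably more elaborate protocols. It also shows why small user bases are a problem: the debiasing step only averages out the noise when there are many devices.

```python
import math
import random

def local_report(true_bit, epsilon):
    """Runs on each device: answer truthfully with probability
    e^eps / (1 + e^eps), otherwise report the opposite bit."""
    p_truth = math.exp(epsilon) / (1.0 + math.exp(epsilon))
    return true_bit if random.random() < p_truth else 1 - true_bit

def estimate_rate(noisy_reports, epsilon):
    """Runs on the analyst side: aggregate the noisy reports and
    correct for the known bias of the mechanism."""
    p = math.exp(epsilon) / (1.0 + math.exp(epsilon))
    observed = sum(noisy_reports) / len(noisy_reports)
    # E[report] = p * rate + (1 - p) * (1 - rate); invert for the true rate
    return (observed - (1.0 - p)) / (2.0 * p - 1.0)

# Hypothetical usage: 100,000 devices, 30% of which hold a "1"
reports = [local_report(1 if random.random() < 0.3 else 0, epsilon=1.0)
           for _ in range(100_000)]
print(estimate_rate(reports, epsilon=1.0))  # close to 0.3 with many users
```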

Data sharing with differential privacy

Differential privacy can also be used for data sharing. Obviously, one should not expect to share a differentially private dataset and still be able to match individual records; that would be a blatant breach of the promise of privacy (indeed, differential privacy guarantees it is never possible). This leaves the practitioner with two options.

The first option is to share differentially private synthetic data. However, while synthetic datasets can be trained to be accurate on a predetermined set of queries or metrics, they come with no accuracy guarantees for new queries outside of the predetermined set. For this reason, insights built on synthetic data can be risky to use for decision-making. Plus, matching users is clearly impossible on a purely synthetic dataset.

The second option is to share access to the confidential data via a differentially private aggregation mechanism instead of the data itself. The insights are typically much more accurate than if estimated from differentially private synthetic data. Even better, matching individuals remains possible.

The analyst may submit a query that first joins their new outside user-level data with the confidential user-level data, runs the computation on the joined data with differential privacy, and derives truly anonymous insights from it. This approach achieves the best of both worlds: optimal privacy while preserving the full granularity of each record.

It makes one assumption, though: While the original confidential information remains confidential, the new outside data must travel to a third-party system during computation. This approach has been popularized by Google and Facebook under the name of data clean rooms.
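As a rough illustration of what such a query might look like, here is a toy pandas sketch: the analyst’s outside data is joined to the confidential data, and only differentially private aggregates leave the system. The column names, the even epsilon split and the clipping bound are all hypothetical; real clean rooms handle privacy accounting far more carefully.

```python
import numpy as np
import pandas as pd

def dp_mean_by_segment(confidential_df, outside_df, epsilon, clip=500.0):
    """Join outside user-level data with confidential user-level data,
    then release only noisy per-segment aggregates.

    Purchase amounts are clipped to `clip` to bound the sensitivity of each
    sum; the privacy budget is split evenly between the sum and the count.
    """
    joined = outside_df.merge(confidential_df, on="user_id", how="inner")
    joined["purchase"] = joined["purchase"].clip(0, clip)
    results = {}
    for segment, group in joined.groupby("segment"):
        noisy_sum = group["purchase"].sum() + np.random.laplace(0, clip / (epsilon / 2))
        noisy_count = len(group) + np.random.laplace(0, 1 / (epsilon / 2))
        results[segment] = noisy_sum / max(noisy_count, 1.0)
    return results
```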

There are variations that rely on cryptographic techniques to protect the confidentiality of both datasets, but they are beyond the scope of this article.

Differential-privacy-based systems allow knowledge sharing on matched users without breaking the promise of privacy. Image Credits: Sarus

Proprietary and open source software

To implement differential privacy, one should not start from scratch, as any implementation mistake could be catastrophic for the privacy guarantees. On the open source side, there are plenty of libraries that provide the basic differential privacy building blocks.

The main ones are:

  • OpenDP (SmartNoise Core and SmartNoise SDK): an initiative led by Harvard University providing differentially private primitives and tools to transform any SQL query into a differentially private one.
  • Google DP and TensorFlow Privacy: a Google open source project providing primitives for analytics and machine learning.
  • OpenMined PyDP: a Python library built on top of Google DP.
  • OpenMined PipelineDP: also built on top of Google DP’s Privacy on Beam.
  • PyTorch Opacus: a Facebook project for differentially private deep learning in PyTorch (see the training sketch after this list).
  • PyVacy: a differential privacy library for PyTorch from Chris Waites.
  • IBM Diffprivlib: an IBM initiative with differential privacy primitives and machine learning models.
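As an example of what the machine learning side looks like in practice, below is a hedged sketch of differentially private training (DP-SGD) with Opacus; the exact API can vary across Opacus versions, and the model and data here are placeholders rather than anything from a real deployment.

```python
import torch
from torch import nn, optim
from torch.utils.data import DataLoader, TensorDataset
from opacus import PrivacyEngine

# Placeholder data and model; the point is the PrivacyEngine wrapping.
X, y = torch.randn(1000, 20), torch.randint(0, 2, (1000,))
train_loader = DataLoader(TensorDataset(X, y), batch_size=64)
model = nn.Sequential(nn.Linear(20, 16), nn.ReLU(), nn.Linear(16, 2))
optimizer = optim.SGD(model.parameters(), lr=0.05)
criterion = nn.CrossEntropyLoss()

privacy_engine = PrivacyEngine()
model, optimizer, train_loader = privacy_engine.make_private(
    module=model,
    optimizer=optimizer,
    data_loader=train_loader,
    noise_multiplier=1.1,  # Gaussian noise added to clipped per-sample gradients
    max_grad_norm=1.0,     # per-sample gradient clipping bound
)

for epoch in range(3):
    for batch_x, batch_y in train_loader:
        optimizer.zero_grad()
        criterion(model(batch_x), batch_y).backward()
        optimizer.step()

print("epsilon spent:", privacy_engine.get_epsilon(delta=1e-5))
```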

However, just as having access to a secure cryptographic algorithm is not sufficient to build a secure application, having access to a secure implementation of differential privacy is not sufficient to power a data lake with privacy guarantees.

Some startups are bridging this gap by proposing proprietary solutions available off the shelf.

LeapYear has raised $53 million for differentially private processing of databases. Tonic has raised $45 million to let developers build and test software with fake data generated with the same mathematical protection.

Gretel, with $68 million in the bank, uses differential privacy to offer privacy engineering as a service while open sourcing some of its synthetic data generation models. Privitar, a data privacy solution that has raised $150 million so far, now features differential privacy among other privacy techniques.

DataFleets was developing distributed data analysis powered by differential privacy before being acquired by LiveRamp for over $68 million. Sarus, where the author is a co-founder, recently joined Y Combinator and offers to power data warehouses with differential privacy without changing existing workflows.

In conclusion

Differential privacy doesn’t just better protect privacy — it can also power data sharing solutions to facilitate cooperation across departments or companies. Its potential comes from the ability to provide guarantees automatically without lengthy, burdensome privacy risk assessments.

Projects used to require a custom risk analysis and a custom data masking strategy to implement in the data pipeline. Differential privacy takes the onus off compliance teams, letting math and computers algorithmically determine how to protect user privacy cheaply, quickly and reliably.

The shift to differential privacy will be critical for companies looking to stay nimble as they capture part of what McKinsey estimates is $3 trillion of value generated by data collaboration.
