Improving privacy for digital public goods

Researchers with the Upanzi Network develop recommendations for the Digital Public Goods Alliance

Sarah Maenner

Jan 8, 2025

Recent CMU alumnae led a research group looking to transform the way digital privacy is evaluated. Privacy engineering graduates Geetika Gopi (SCS '23) and Aadyaa Maddi (SCS '23) worked with advisor Giulia Fanti, professor of electrical and computer engineering, to develop recommendations for the Digital Public Goods Alliance (DPGA) about how open source digital goods could be vetted to better protect user data. Along with CMU-Africa's Assane Gueye, Fanti is the co-director of CyLab-Africa and the Upanzi Network, and is also part of a UN-led working group for the DPI Safeguards initiative.

Geetika Gopi (SCS '23) and Aadyaa Maddi (SCS '23)

Digital public goods (DPGs) and digital public infrastructure (DPI) are catch-all terms for open source software meant to serve the public. "Digital public goods are open source software, data, AI models, and standards that are intended to contribute to sustainable digital development," Gopi said. DPGs may be developed by individuals, teams, or whole organizations, and many are used by governments across the world to host or facilitate digital civil infrastructure, such as healthcare platforms or identity systems. DPGs are vetted by the DPGA, which was formed under a United Nations mandate to ensure these goods are safe and of acceptable quality. To assess DPGs, the DPGA built the DPG Standard: organizations submit a form and share documentation with the DPGA, and if their projects pass the standard's requirements, the DPGA adds them to a registry from which prospective users can choose.

"For this study, we wanted to know if the standard is truly effective at evaluating DPG for potential privacy issues," Gopi said, so they evaluated the DPG Standard by focusing on the privacy-related sections of the questionnaire.

The project had three parts. To begin, the researchers analyzed the responses that successful DPG candidates had submitted on the form. They found that most DPGs answered the privacy question in terms of data security, a concept related to, but distinct from, privacy: "Security refers to protecting data from unauthorized access, while privacy is about controlling and managing how personal information is collected, used, and shared," explained Gopi. Many DPGs also included few details on how they would protect data, or they made end users entirely responsible for their own privacy. "We were shocked to see that these kinds of responses were passing the privacy bar and being admitted to the DPG registry," Gopi said.

Next, they conducted in-depth case studies of DPGs with at least one million users, drawn from sectors including digital ID, news and media, and public health. They identified several ways in which these DPGs could better protect users' privacy.

Geetika Gopi presenting at CSAW 2024. (Source: CSAW 2024)

As the third component of the project, the researchers developed recommendations for the DPGA.

They recommended that the privacy standard rely on existing, third-party tools and organizations to assess privacy, rather than having the DPGA build its own custom assessment. Under this model, DPG applicants would be required to submit documentation showing that they had undergone a Privacy Impact Assessment (PIA) from an approved provider. The DPGA could then publicly share those reports so that downstream users can determine whether a DPG satisfies their privacy needs.

The team presented their results at the 2024 Symposium on Usable Privacy and Security (SOUPS), held in Philadelphia from August 11–13. They felt that SOUPS, as a conference dedicated to usable privacy and security, was the right venue to receive feedback from experts and to introduce the recommendations to a wider audience. Gopi reflected on the experience: "A huge piece of changing the way the DPGA conducts privacy reviews is to balance manageable reporting requirements for open-source projects with meaningful privacy protections, which is why we said the SOUPS community is the right one to help us out here."

The team's recommendations are published in the SOUPS conference proceedings as "Privacy Requirements and Realities of Digital Public Goods." The work was also a finalist in the CSAW 2024 Applied Research Competition (North America), where it won the competition's social impact category.

"I initially viewed privacy as more of a policy or regulatory requirement," Gopi said. "But I've come to appreciate the technical nuances and the need to engineer effective and usable privacy protections. What really drives me is the opportunity to empower stakeholders and organizations, and to safeguard end users—that mission-oriented focus drew me to privacy engineering, and I haven't looked back since."