Differential privacy (DP) has become the gold standard in privacy-preserving data analysis, effectively balancing data utility with robust privacy protections. However, analysts often face strict accuracy requirements, which traditional DP mechanisms struggle to meet, especially under tight privacy constraints. In our recent research at TikTok, we introduced an innovative privacy boosting framework that addresses this challenge by dynamically adapting noise distributions to meet specific utility requirements.
Our approach reshapes the noise added by DP mechanisms to significantly increase the probability that outputs land within a desired accuracy range, which we call a preferred region, while still strictly maintaining DP guarantees. Unlike the traditional DP workflow, which fixes a privacy budget first and analyzes utility afterward, our framework starts from the utility requirement and adapts the noise distribution to meet it. A simplified sketch of the idea follows.
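To make this concrete, here is a minimal Python sketch of one way a density-boosted mechanism could look, assuming a Laplace kernel and a symmetric preferred region [-beta, beta] around the true answer. The function name `boosted_laplace_noise`, the `boost` parameter, and the mixture construction are our illustrative assumptions, not the paper's exact mechanism, and the privacy loss of the reshaped distribution has to be accounted for with the paper's analysis rather than the kernel's plain ε.

```python
import numpy as np

def boosted_laplace_noise(sensitivity, epsilon, beta, boost, rng=None):
    """Draw one noise sample whose density inside the preferred region
    [-beta, beta] is `boost` times that of a plain Laplace kernel.

    Illustrative sketch only: `beta` and `boost` are assumed parameters,
    and the privacy loss of the reshaped distribution must be tracked
    with the paper's analysis, not the kernel's original epsilon.
    """
    rng = rng if rng is not None else np.random.default_rng()
    b = sensitivity / epsilon                   # Laplace kernel scale
    p_in = 1.0 - np.exp(-beta / b)              # kernel mass in [-beta, beta]
    assert boost * p_in <= 1.0, "boost too large for this region"
    # Mixture weight q chosen so the density inside the region scales by
    # `boost`:  q * f(x) / p_in + (1 - q) * f(x) = boost * f(x), |x| <= beta.
    q = (boost - 1.0) * p_in / (1.0 - p_in)
    if rng.random() < q:
        # Inverse-CDF sample of the Laplace kernel truncated to the region.
        u = rng.uniform(0.5 * np.exp(-beta / b), 1.0 - 0.5 * np.exp(-beta / b))
        return b * np.log(2.0 * u) if u < 0.5 else -b * np.log(2.0 * (1.0 - u))
    return rng.laplace(scale=b)                 # otherwise, the plain kernel
```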
Key highlights of our framework include:
- Adaptable Utility Constraints: Supports complex, data-dependent or data-independent utility requirements, accommodating scenarios where absolute, relative, or deterministic error bounds are critical.
- Reduced Privacy Leakage: Strategically concentrates probability density within preferred regions, achieving lower overall privacy loss than standard DP mechanisms (checked numerically after this list).
- Comprehensive Privacy Analysis: Provides detailed characterizations of privacy loss distributions, including rigorous formulations for (ε, δ)-DP and Rényi DP (RDP), ensuring transparency and trust in privacy guarantees.
- Efficiency and Flexibility: Utilizes versatile kernel mechanisms (e.g., Gaussian or Laplace noise) and efficiently calculates optimal parameters, enabling seamless integration into diverse real-world applications.
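As a quick sanity check on the density boost, the hypothetical sketch above can be exercised with a Monte Carlo estimate of how often the noise lands in the preferred region, with and without boosting (the parameter values here are arbitrary):

```python
# Reuses boosted_laplace_noise and numpy from the sketch above.
rng = np.random.default_rng(0)
sensitivity, epsilon, beta, boost = 1.0, 0.5, 1.0, 1.5
plain = rng.laplace(scale=sensitivity / epsilon, size=100_000)
boosted = np.array([
    boosted_laplace_noise(sensitivity, epsilon, beta, boost, rng)
    for _ in range(100_000)
])
print((np.abs(plain) <= beta).mean())    # ~0.39: plain Laplace kernel
print((np.abs(boosted) <= beta).mean())  # ~0.59: roughly boost * 0.39
```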
We validated our framework through extensive experiments, showing notable improvements over existing DP methods. Our results indicate significant reductions in privacy loss without sacrificing the required utility standards, and the approach proves particularly effective for queries with high sensitivity.
This work not only advances differential privacy theory but also offers practical tools for organizations aiming to maintain strong privacy while meeting precise utility demands.
For a deeper look at our methodology and results, read the full paper here: https://arxiv.org/abs/2406.02463