Interval Privacy: A Data Privacy Framework
A novel data privacy framework that enables easy data sharing for machine learning and other applications without compromising privacy.
- Data Privacy
- Machine Learning & AI
Data privacy has become an increasingly important concern across engineering and social science fields, including communications, networking, and epidemiology, where sensitive data are often exposed to potential adversaries. Prof. Jie Ding has developed a novel data privacy framework, named Interval Privacy, for protecting data that are collected for subsequent learning. The framework allows data collectors to infer population-wide information while prohibiting them from inferring specific information about any individual. The key idea is to transform each data point into a random interval containing it, obscuring the value from the data collector to a designable extent while still permitting accurate population-wide statistical inference. The framework includes novel notions and design methods for practical implementation. Its unique advantages are:
1) it is insensitive to extreme values, and
2) its privacy leakage is easily interpretable, whereas the leakage quantified under differential privacy is purely mathematical and harder to interpret.
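The interval mechanism described above can be illustrated with a minimal sketch. This is a hypothetical simplification for intuition, not the framework's actual design: each value in [0, 1] is compared against a uniformly random cut point, and only the resulting sub-interval is reported. Because P(x > c) for a uniform cut c equals x itself, the fraction of right-side reports is an unbiased estimate of the population mean, even though no individual value is revealed.

```python
import random

def privatize(x, rng):
    """Map a private value x in [0, 1] to a random interval containing it.
    A uniform cut point c is drawn; the respondent reports only which side
    of c the value falls on, i.e. the interval [0, c] or (c, 1]."""
    c = rng.random()
    return (0.0, c) if x <= c else (c, 1.0)

def estimate_mean(intervals):
    """For c ~ Uniform(0, 1), P(x > c) = E[x], so the fraction of
    right-side intervals estimates the population mean."""
    right = sum(1 for lo, hi in intervals if hi == 1.0 and lo > 0.0)
    return right / len(intervals)

# Usage: the collector sees only intervals, yet recovers the mean.
rng = random.Random(0)
data = [rng.betavariate(2, 5) for _ in range(100_000)]  # true mean = 2/7
intervals = [privatize(x, rng) for x in data]
print(round(estimate_mean(intervals), 3))  # close to 2/7 ≈ 0.286
```

Note the interpretability property claimed above: each respondent can see exactly which interval was reported, so the leaked information is plain to inspect.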
Phase of Development: TRL 3
Algorithms have been developed for applying interval privacy to a wide range of machine learning domains, covering supervised learning, unsupervised learning, and reinforcement learning.
Desired Partnerships
This technology is now available for:
- Sponsored research
Please contact our office to share your business’ needs and learn more.
- Jie Ding, PhD, Assistant Professor, Statistics