Differentially Private Federated Learning with Time-Adaptive Privacy Spending
In this talk, I presented my two Ph.D. projects, which explore privacy-preserving mechanisms for more equitable and private collaborative learning. In the first project, we introduced a framework for learning with group identities, allowing individuals to share data within specific groups, such as business data within a company group or personal genomic data within their family group. We modeled and controlled how privacy leakage propagates across potentially overlapping group structures (see the first sketch below). In the second project, we proposed a novel time-adaptive privacy spending mechanism that lets participants spend their privacy budget non-uniformly across training rounds, preserving more privacy in the rounds where they choose to (see the second sketch below). Together, these works offer new perspectives on how trust and privacy can be formalized and quantified in federated learning systems.
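To make the first project's accounting problem concrete, here is a minimal sketch of the kind of question it addresses. This is my own illustration built only on basic sequential composition of differential privacy, not the framework from the paper; the group names and epsilon budgets are hypothetical.

```python
# Hypothetical illustration: bounding per-individual privacy leakage when
# data is shared within overlapping groups. Group names and epsilon budgets
# are made up; only basic sequential composition of DP is assumed.
groups = {
    "company": {"alice", "bob"},    # e.g. shared business data
    "family":  {"alice", "carol"},  # e.g. shared genomic data
}
eps_per_group = {"company": 1.0, "family": 0.5}

# Basic composition: an individual's total leakage is bounded by the sum of
# the budgets of every group their data participates in. "alice" belongs to
# both (overlapping) groups, so leakage propagates through her membership.
individuals = set().union(*groups.values())
eps_per_individual = {
    person: sum(eps for name, eps in eps_per_group.items()
                if person in groups[name])
    for person in individuals
}
print(eps_per_individual)  # alice: 1.5, bob: 1.0, carol: 0.5
```

The point of the overlap is visible in the output: alice's total leakage exceeds either single group's budget, which is exactly the propagation effect the framework models and controls.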
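For the second project, here is a minimal sketch of time-adaptive privacy spending under the same basic-composition assumption, with noise calibrated by the classic Gaussian mechanism (valid for per-round epsilon at most 1). The schedule, weights, and function names are my own illustration rather than the paper's mechanism, which would likely rely on a tighter accountant.

```python
import numpy as np

def gaussian_sigma(eps, delta, sensitivity=1.0):
    # Classic Gaussian-mechanism calibration:
    # sigma >= sensitivity * sqrt(2 ln(1.25/delta)) / eps,
    # which yields (eps, delta)-DP for eps <= 1.
    return sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / eps

def time_adaptive_schedule(total_eps, weights):
    # Split a participant's total budget across rounds in proportion to
    # `weights` (a hypothetical helper): a smaller weight means a smaller
    # per-round eps, hence more noise and more privacy preserved in that
    # round. Under basic sequential composition the per-round budgets sum
    # back to `total_eps`.
    weights = np.asarray(weights, dtype=float)
    return total_eps * weights / weights.sum()

# Example: preserve more privacy in the early rounds, spend more later.
eps_per_round = time_adaptive_schedule(total_eps=1.0, weights=[1, 1, 2, 3, 3])
delta = 1e-5
for eps_t in eps_per_round:
    sigma_t = gaussian_sigma(eps_t, delta)
    # noisy_update = clipped_update + np.random.normal(0.0, sigma_t, shape)
```

In practice, a Rényi-DP or moments accountant gives a tighter total guarantee than the naive sum used here, but the sketch captures the core idea: the budget is a fixed resource that participants may allocate unevenly over training rounds.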