Biomedical data are often collected and hosted by different institutions in a distributed manner. Developing distributed data analysis methods that build a global model is of interest to data owners, but the privacy of the participating individuals must be protected. Differential privacy can protect the model parameters learned by logistic regression. In a distributed setting, the necessary noise can be decomposed into per-party perturbations of the local objectives so that the final learned model is differentially private, while the intermediary statistics must also be well protected. We propose a hybrid mechanism combining homomorphic encryption with a distributed Laplace perturbation algorithm (DLPA) to obtain a secure and efficient solution.
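The noise decomposition underlying DLPA rests on a standard distributional fact: a Laplace(0, b) variate equals the difference of two Exponential(b) variates, and each exponential can be split into n i.i.d. Gamma(1/n, b) shares. Each party can therefore add a small gamma-difference share locally, and only the aggregate noise is Laplace-distributed. The sketch below (an illustration of this decomposition only; the paper's homomorphic-encryption layer and function/variable names here are our own, not from the source) checks that the summed shares have the variance 2b² expected of Laplace noise.

```python
import numpy as np

def partial_noise(n_parties, scale, size, rng):
    # One party's noise share: the difference of two Gamma(1/n, scale)
    # variates. Summed over n parties, each gamma sum is Exponential(scale),
    # and the difference of two exponentials is Laplace(0, scale).
    return (rng.gamma(1.0 / n_parties, scale, size)
            - rng.gamma(1.0 / n_parties, scale, size))

rng = np.random.default_rng(0)
n_parties, b = 5, 2.0          # illustrative values, not from the paper
samples = 200_000

# Aggregate the shares from all parties (in the real protocol this sum
# happens under encryption, so no single party sees the others' shares).
total = sum(partial_noise(n_parties, b, samples, rng) for _ in range(n_parties))

# Laplace(0, b) has variance 2*b^2 = 8.0; the empirical variance should agree.
print(round(float(total.var()), 1))
```

Because each share has shape parameter 1/n, no individual contribution is Laplace-distributed on its own, which is what lets honest parties jointly reach the required noise level without any one of them knowing the full perturbation.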