University of Groningen

Privacy-preserving machine learning over distributed data

PhD ceremony: Mr A.R. (Ali Reza) Ghavamipour
When: December 03, 2024
Start: 14:30
Supervisors: Prof. F. (Fatih) Turkmen, Prof. D. (Dimka) Karastoyanova
Where: Academy building RUG / Student Information & Administration
Faculty: Science and Engineering

Machine learning is widely used both inside and outside academia. A computer 'learns' from a dataset, and to do this well it needs large amounts of data. But what about privacy and integrity? Ali Reza Ghavamipour explores methods that preserve the scalability and accuracy of learning algorithms across different environments while complying with privacy regulations and making the best possible use of machine learning technology. The central research question asks how machine learning models can be trained over distributed data in a privacy-preserving manner; it is broken down into sub-questions that each address a different facet of this challenge.

Ghavamipour first examined the feasibility of performing machine learning operations on encrypted data using multi-party computation. This foundational work led to further investigations into federated learning and synthetic data generation. He introduced a novel framework that combines homomorphic encryption with differential privacy, offering a practical approach to synthetic data generation in distributed settings. In addition, the thesis proposes secure protocols for efficiently aggregating information from distributed sources, providing security against both semi-honest and Byzantine adversaries.
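The secure aggregation mentioned above can be illustrated with additive secret sharing, a standard building block of such protocols. The sketch below is not the protocol from the thesis: the field size, the fixed-point encoding, and the three-aggregator setup are assumptions chosen purely for illustration. It only shows how a group of servers can recover the sum (and hence the average) of client model updates without any single party seeing an individual update.

```python
import numpy as np

# Illustrative sketch (not the thesis protocol): additive secret sharing lets
# aggregators reconstruct only the SUM of client model updates, never any
# individual update. Values live in a prime field so single shares look random.

PRIME = 2**61 - 1      # field modulus (chosen here only for illustration)
SCALE = 10**6          # fixed-point scaling so float updates fit in the field
rng = np.random.default_rng(seed=0)

def encode(update):
    """Fixed-point encode a float vector into the field."""
    return np.round(update * SCALE).astype(np.int64) % PRIME

def decode(total, n_clients):
    """Decode an aggregated field vector back to the average float update."""
    centered = np.where(total > PRIME // 2, total - PRIME, total)
    return centered / SCALE / n_clients

def make_shares(secret, n_shares):
    """Split a secret vector into n additive shares that sum to it mod PRIME."""
    shares = [rng.integers(0, PRIME, size=secret.shape) for _ in range(n_shares - 1)]
    shares.append((secret - sum(shares)) % PRIME)
    return shares

# Toy run: 3 clients, each holding a private 4-dimensional model update.
updates = [rng.normal(size=4) for _ in range(3)]

# Each client splits its encoded update into one share per aggregator.
shares = [make_shares(encode(u), n_shares=3) for u in updates]

# Each aggregator adds up the shares it received; none sees a raw update.
partial_sums = [sum(shares[c][a] for c in range(3)) % PRIME for a in range(3)]

# Combining the aggregators' partial sums reveals only the average update.
total = sum(partial_sums) % PRIME
print("securely aggregated average:", decode(total, n_clients=3))
print("plain average              :", np.mean(updates, axis=0))
```

In a federated learning round, the decoded vector would be the averaged model update applied by the server; defending the aggregation against Byzantine (actively malicious) participants requires additional robustness checks on top of the masking shown here.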

These methods were evaluated on a range of datasets and scenarios, demonstrating their practical effectiveness at strengthening privacy and security in distributed learning environments. The resulting protocols integrate cryptographic techniques with machine learning algorithms to preserve privacy, and the findings lay a solid foundation for future research and applications that seek to balance the use of large-scale data for machine learning with stringent data privacy standards. The thesis thus takes a significant step towards making privacy-preserving machine learning practical.
