Oracles for Privacy-Preserving Machine Learning
dc.contributor.advisor | Baldimtsi, Foteini | |
dc.creator | Do, Minh Quan | |
dc.date | 2022-10-28 | |
dc.date.accessioned | 2023-06-13T13:30:16Z | |
dc.date.available | 2023-06-13T13:30:16Z | |
dc.description.abstract | Deploying machine learning models in production can leak information about the model, such as its parameters. This leakage is problematic because it opens the door to a plethora of attacks that can compromise the privacy of the data used to train the model. In this thesis, we introduce definitions for new primitives specifically designed for deploying machine learning models into production in a way that guarantees the privacy of the model’s parameters and the underlying dataset. We also provide security definitions, propose a scheme for deploying a model into production, and informally argue the security of our scheme. | |
dc.format.medium | masters theses | |
dc.identifier.uri | https://hdl.handle.net/1920/13295 | |
dc.language.iso | en | |
dc.rights | Copyright 2022 Minh Quan Do | |
dc.rights.uri | https://rightsstatements.org/vocab/InC/1.0 | |
dc.subject.keywords | Machine learning | |
dc.subject.keywords | Security | |
dc.subject.keywords | Privacy preserving | |
dc.subject.keywords | Applied cryptography | |
dc.title | Oracles for Privacy-Preserving Machine Learning | |
dc.type | Text | |
thesis.degree.discipline | Computer Science | |
thesis.degree.grantor | George Mason University | |
thesis.degree.level | Master's | |
thesis.degree.name | Master of Science in Computer Science |