Structural Differential Privacy in Graph Neural Networks

Date

2025-07-23

Publisher

Indian Statistical Institute, Kolkata

Abstract

Graph Neural Networks (GNNs) have demonstrated impressive performance across a range of graph-based learning tasks. However, their application to domains with sensitive relational data raises serious privacy concerns, as the graph structure itself may leak confidential information. This thesis investigates a decentralized framework for enforcing edge-level local differential privacy (LDP) in graph-structured data. We introduce two mechanisms that perturb a node's neighborhood in a privacy-preserving yet utility-aware manner. The first approach replaces randomly selected neighbors with feature-similar nodes from the 2-hop neighborhood, ensuring structural realism while preserving degree. The second approach eliminates the need for explicit 2-hop propagation and dummy vectors, instead relying on randomized feature queries to identify plausible substitutes. Both approaches are evaluated on benchmark graph datasets such as Cora, PubMed, and LastFM using GNN architectures like GCN, GraphSAGE, and GAT. Experimental results show that our methods achieve a favorable trade-off between structure privacy and learning utility, while avoiding the overhead and privacy leakage risks of centralized or semi-local protocols.
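The first mechanism described above (degree-preserving replacement of randomly selected neighbors with feature-similar 2-hop nodes) can be illustrated with a minimal sketch. This is not the thesis's exact algorithm: the flip probability here is the standard randomized-response choice p = 1/(1 + e^ε), and the `perturb_neighborhood` and `cosine` helpers are hypothetical names introduced for illustration only.

```python
import math
import random

def cosine(a, b):
    # Cosine similarity between two feature vectors (pure Python).
    num = sum(x * y for x, y in zip(a, b))
    den = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return num / den if den else 0.0

def perturb_neighborhood(adj, feats, node, epsilon, rng=random):
    """Illustrative sketch (assumed, not the thesis's algorithm): with
    probability p = 1 / (1 + e^epsilon), swap each neighbor of `node`
    for the most feature-similar node in the 2-hop neighborhood.
    Degree is preserved because every swap is one-for-one."""
    p_swap = 1.0 / (1.0 + math.exp(epsilon))  # randomized-response flip prob
    neighbors = list(adj[node])
    # Candidate substitutes: nodes exactly two hops away.
    two_hop = {w for v in neighbors for w in adj[v]} - {node} - set(neighbors)
    new_neighbors = []
    for v in neighbors:
        if two_hop and rng.random() < p_swap:
            # Replace v with the 2-hop candidate most similar in features.
            best = max(two_hop, key=lambda w: cosine(feats[v], feats[w]))
            two_hop.discard(best)  # avoid duplicate edges
            new_neighbors.append(best)
        else:
            new_neighbors.append(v)
    return new_neighbors
```

Because each neighbor slot is either kept or replaced one-for-one, the perturbed list always has the same length as the original, which is the degree-preservation property the abstract highlights.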

Description

Dissertation under the supervision of Prof. Subhankar Mishra & Prof. Debrup Chakraborty

Keywords

Differential Privacy, Local Differential Privacy, Graph Neural Networks, Privacy-Utility Trade-off

Citation

27p.
