Demystifying Uneven Vulnerability of Link Stealing Attacks against Graph Neural Networks

Date

2023

Authors

Zhang, H.
Wu, B.
Wang, S.
Yang, X.
Xue, M.
Pan, S.
Yuan, X.

Type

Conference paper

Citation

Proceedings of the 40th International Conference on Machine Learning (ICML'23), as published in Proceedings of Machine Learning Research, 2023, vol.202, pp.41737-41752

Statement of Responsibility

He Zhang, Bang Wu, Shuo Wang, Xiangwen Yang, Minhui Xue, Shirui Pan, Xingliang Yuan

Conference Name

40th International Conference on Machine Learning (ICML) (23 Jul 2023 - 29 Jul 2023 : Honolulu, Hawaii, USA)

Abstract

While graph neural networks (GNNs) dominate the state-of-the-art for exploring graphs in real-world applications, they have been shown to be vulnerable to a growing number of privacy attacks. For instance, link stealing is a well-known membership inference attack (MIA) on edges that infers the presence of an edge in a GNN’s training graph. Recent studies on independent and identically distributed data (e.g., images) have empirically demonstrated that individuals from different groups suffer different levels of privacy risk from MIAs, i.e., uneven vulnerability. However, theoretical evidence for such uneven vulnerability has been missing. In this paper, we first present theoretical evidence of the uneven vulnerability of GNNs to link stealing attacks, which lays the foundation for demystifying such uneven risks among different groups of edges. We further demonstrate a group-based attack paradigm to expose the practical privacy harm to GNN users arising from the uneven vulnerability of edges. Finally, we empirically validate the existence of pronounced uneven vulnerability on ten real-world datasets (e.g., about a 25% AUC difference between groups in the Credit graph). Compared with existing methods, the superior performance of our group-based attack paradigm confirms that customising different strategies for different groups results in more effective privacy attacks.
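To make the attack setting concrete: a classic unsupervised link stealing baseline (not the group-based paradigm proposed in this paper) scores a candidate edge by the similarity of the GNN's posterior (class-probability) outputs for its two endpoints, since a GNN's message passing tends to make connected nodes' predictions similar. A minimal sketch, assuming the attacker has black-box access to per-node posteriors; the function name and threshold-free scoring are illustrative, not from the paper:

```python
import numpy as np

def link_stealing_scores(posteriors, candidate_edges):
    """Score candidate node pairs by cosine similarity of their
    GNN posterior vectors; a higher score suggests the pair is
    more likely to be an edge in the training graph.

    posteriors: (num_nodes, num_classes) array of class probabilities.
    candidate_edges: iterable of (u, v) node-index pairs.
    """
    p = np.asarray(posteriors, dtype=float)
    # Normalise each posterior vector to unit length (guard against zeros).
    norms = np.linalg.norm(p, axis=1, keepdims=True)
    unit = p / np.clip(norms, 1e-12, None)
    # Cosine similarity = dot product of unit vectors.
    return np.array([unit[u] @ unit[v] for u, v in candidate_edges])
```

The attacker would then rank candidate pairs by score (or threshold them) to predict membership; the paper's point is that the effectiveness of such attacks is provably uneven across groups of edges, so a single global strategy is suboptimal.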

Rights

© Copyright 2023 by the author(s).
