I understand from your excellent article that you have reproduced many noise-based adversarial attack methods. Could any of these attacks be carried out without labels? That is, my dataset has no class labels, only ground-truth (GT) graphs. Do you know of any attack algorithms suited to this kind of dataset, or is there a way to adapt the reproduced attack methods to it?
Thank you.
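For context on what I have tried: one common label-free workaround I am aware of (not something stated in the article, just my assumption) is to use the model's own predictions as pseudo-labels and then run an ordinary label-dependent attack against them. A minimal sketch with FGSM in PyTorch, where the model and data are placeholders:

```python
# Sketch: running a label-dependent attack (FGSM) without ground-truth labels
# by using the model's own predictions as pseudo-labels.
# The model and inputs below are illustrative stand-ins, not code from the article.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Placeholder classifier; in practice this would be the trained target model.
model = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 3))
model.eval()

x = torch.randn(4, 8)  # unlabeled inputs

# Step 1: pseudo-labels = the model's current predictions (no GT labels needed).
with torch.no_grad():
    pseudo_labels = model(x).argmax(dim=1)

# Step 2: one FGSM step against the pseudo-labels.
eps = 0.1
x_adv = x.clone().requires_grad_(True)
loss = nn.functional.cross_entropy(model(x_adv), pseudo_labels)
loss.backward()
x_adv = (x_adv + eps * x_adv.grad.sign()).detach()

print((x_adv - x).abs().max())  # per-component perturbation is bounded by eps
```

The caveat is that attacking pseudo-labels only pushes inputs away from the model's current decision, so it says nothing about the true classes; I am wondering whether something similar would work for your reproduced methods.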