
Towards Data-Free Model Stealing in a Hard Label Setting

CVPR 2022

Sunandini Sanyal, Sravanti Addepalli, R. Venkatesh Babu

Video Analytics Lab, Indian Institute of Science, Bengaluru

Approach

[Approach diagram]

Set up the requirements

The following versions of PyTorch and TensorFlow are needed to run the code:

PyTorch 1.9.1

TensorFlow 2.6.0
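
A minimal installation sketch, assuming a pip-based environment; the exact package builds (for example, CUDA-specific wheels) are an assumption and may differ on your system:

```bash
# Install the framework versions listed above (pip package names assumed)
pip install torch==1.9.1 tensorflow==2.6.0
```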

Run the Model Stealing Attack

The folder contains the code and script files to run the attack with different settings of proxy data. Command to run the attack with 10 random classes of CIFAR-100 as proxy data, AlexNet as the victim model, and AlexNet-half as the clone model:

```bash
./run_cifar10_rand_class_alexnet.sh
```
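
If the script is not marked executable after cloning (a common situation, not something stated in this repository), the standard fix is to grant execute permission before running it:

```bash
# Make the script executable, then launch the attack
chmod +x run_cifar10_rand_class_alexnet.sh
./run_cifar10_rand_class_alexnet.sh
```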
