
Adversarial Machine Learning Mitigation

Author:

Project Description

In this project we will implement two CNNs, Inception V3 and ResNet-18. These models will be trained on the ISIC2018 and ISIC2019 datasets. Furthermore, we will implement two types of adversarial attacks that perturb input images to try to fool the models. Once this milestone is finished, we will extract kernels and other parameters from the models to train an adversarial detector, which is trained on both perturbed and unperturbed images.
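The two attack types are not named in the description; the Fast Gradient Sign Method (FGSM) is a common first choice for perturbing input images, and a minimal PyTorch sketch (an illustration, not the repository's actual implementation) looks like this:

```python
import torch
import torch.nn as nn

def fgsm_attack(model, images, labels, epsilon=0.03):
    """Fast Gradient Sign Method (hypothetical sketch): nudge each pixel
    in the direction that increases the classification loss."""
    images = images.clone().detach().requires_grad_(True)
    loss = nn.CrossEntropyLoss()(model(images), labels)
    loss.backward()
    # Step along the sign of the gradient, then clamp to the valid image range
    adversarial = images + epsilon * images.grad.sign()
    return adversarial.clamp(0, 1).detach()
```

The same function would work unchanged with either Inception V3 or ResNet-18, since it only relies on the model being differentiable.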

Project Structure

  • data/ - Directory for storing data files
  • augmented_data/ - Directory for storing augmented data files
  • models/ - Default directory for trained and tested models
  • run_* - Files used to run Python scripts
  • *_helper - Files containing helper functions
    • train_model_helper.py: Helper functions related to the run_train_model.py file.
      • train_model()
      • test_model()
      • get_category_counts()
      • random_split()
      • get_data_loaders()
    • data_exploration_helper.py: Helper functions related to EDA (Exploratory Data Analysis)
      • dataset_overview()
      • perform_eda()
    • misc_helper.py: A collection of miscellaneous helper functions that do not have a specific category assignment
      • truncated_uuid4()
      • get_trained_or_default_model()
      • save_model_and_parameters_to_file()
      • load_model_from_file()
      • file_exists()
      • folder_exists()
    • adversarial_attacks_helper.py: Helper functions related to adversarial attacks
  • customDataset.py: Custom dataset which inherits from torch.utils.data.Dataset. Loads the dataset selected by the value in the config.py parameter file.
  • README.md - This file, containing information about the project.
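The customDataset.py file is described as a torch.utils.data.Dataset subclass that picks its dataset from a config.py value; a minimal sketch of that pattern (class name, field names, and placeholder data are hypothetical, not the repository's code):

```python
import torch
from torch.utils.data import Dataset

# Hypothetical stand-in for the value read from config.py
DATASET_NAME = "ISIC2018"

class CustomDataset(Dataset):
    """Sketch of a dataset selected by a config parameter."""

    def __init__(self, dataset_name=DATASET_NAME):
        if dataset_name not in ("ISIC2018", "ISIC2019"):
            raise ValueError(f"Unknown dataset: {dataset_name}")
        self.dataset_name = dataset_name
        # Placeholder tensors; the real class would load images from data/
        self.images = torch.rand(8, 3, 299, 299)
        self.labels = torch.randint(0, 7, (8,))

    def __len__(self):
        return len(self.images)

    def __getitem__(self, idx):
        return self.images[idx], self.labels[idx]
```

A Dataset subclass only needs __len__ and __getitem__; PyTorch's DataLoader handles batching and shuffling on top of it.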

Getting Started

  1. Clone the repo and enter directory adversarial_mitigation
    git clone https://github.com/StormFlaate/adversarial_mitigation.git
  2. Create environment
    conda create -n adv_mit python=3.10
  3. Activate environment
    conda activate adv_mit
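After setup, training would go through run_train_model.py and the train_model() helper listed above. The helper's body is not shown in this README; a hypothetical sketch of a standard supervised loop it might implement:

```python
import torch
import torch.nn as nn

def train_model(model, loader, epochs=1, lr=1e-3):
    """Hypothetical sketch of the train_model() helper: a plain
    supervised training loop over (images, labels) batches."""
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    criterion = nn.CrossEntropyLoss()
    model.train()
    for _ in range(epochs):
        for images, labels in loader:
            optimizer.zero_grad()
            loss = criterion(model(images), labels)
            loss.backward()
            optimizer.step()
    return model
```

The loader here would come from the get_data_loaders() helper, and the trained model would be persisted with save_model_and_parameters_to_file().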
