FSL-SAGE: Federated Split Learning via Smashed Activation Gradient Estimation

Introduction

Our Federated Split Learning (FSL) algorithm reduces the communication overhead of traditional Split Learning by directly estimating the server-returned gradients at each client using auxiliary models. These auxiliary models are much smaller than the server model and are explicitly trained to estimate the gradients the server model would return for the client's local input.

The algorithm is summarized in the following schematic:

FSL-SAGE schematic
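
To make the idea concrete, the minimal PyTorch sketch below shows an alignment step, in which a client queries the server once and fits its small auxiliary model to the true gradient of the loss with respect to the smashed activations, followed by a local step that back-propagates the estimated gradient with no server round-trip. This is an illustrative sketch only, not the repository's implementation; the layer sizes, the label-conditioning of the auxiliary model, and the MSE alignment loss are all assumptions.

```python
# Hypothetical sketch of auxiliary gradient estimation (not the repository's code).
# Layer sizes, label-conditioning of the auxiliary model, and the MSE alignment
# loss are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

client_model = nn.Sequential(nn.Linear(32, 16), nn.ReLU())  # client-side split
server_model = nn.Linear(16, 10)                            # server-side split
aux_model = nn.Linear(16 + 10, 16)                          # small auxiliary gradient estimator

x, y = torch.randn(8, 32), torch.randint(0, 10, (8,))

# Alignment: query the server once to obtain the true gradient of the loss w.r.t.
# the smashed activations, then fit the auxiliary model to it.
smashed = client_model(x)
smashed.retain_grad()
F.cross_entropy(server_model(smashed), y).backward()
true_grad = smashed.grad.detach()

aux_in = torch.cat([smashed.detach(), F.one_hot(y, 10).float()], dim=1)
align_loss = F.mse_loss(aux_model(aux_in), true_grad)
align_loss.backward()  # an optimizer step on aux_model's parameters would follow

# Local update: back-propagate the *estimated* gradient through the client model,
# avoiding the server round-trip entirely.
client_model.zero_grad()
smashed = client_model(x)
aux_in = torch.cat([smashed.detach(), F.one_hot(y, 10).float()], dim=1)
smashed.backward(aux_model(aux_in).detach())  # client parameter grads are now populated
```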

Requirements

The project requirements can be installed using the environment config file conda_env.yaml as follows:

conda env create -f conda_env.yaml

Configuration

This project is powered by Hydra, which enables hierarchical configuration and makes it easy to run multiple ML experiments. The Hydra config files are located in the folder hydra_config.

The setup is highly customizable; datasets, models, and FL algorithms can all be plugged in via configs.
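
For orientation, the sketch below shows the typical shape of a Hydra entry point that composes its configuration from hydra_config. This is not the repository's actual main.py; the root config name ("config") and the printed output are assumptions.

```python
# Hypothetical sketch of a Hydra entry point that composes configs from hydra_config/.
# The root config_name "config" and the printed output are assumptions, not repo code.
import hydra
from omegaconf import DictConfig, OmegaConf

@hydra.main(version_base=None, config_path="hydra_config", config_name="config")
def main(cfg: DictConfig) -> None:
    # cfg is assembled from the config's defaults list plus any command-line
    # overrides, e.g. `python main.py model=resnet18 algorithm=cse_fsl`.
    print(OmegaConf.to_yaml(cfg))

if __name__ == "__main__":
    main()
```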

Running

To run FSL-SAGE with the defaults from hydra_config, simply run:

python main.py

To choose a specific model or algorithm, use Hydra's command-line override functionality:

python main.py model=resnet18 algorithm=cse_fsl

We also support running multiruns in parallel using the hydra-joblib-launcher. This makes it possible to run multiple experiments over different combinations of hyperparameters, models, datasets, or algorithms:

python main.py -m model=resnet18,simple_conv algorithm=fed_avg,sl_single_server,sl_multi_server,cse_fsl,fsl_sage

The above creates parallel jobs that run main.py on all combinations of the specified options.
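
If the joblib launcher is not already set as the default in hydra_config, it can usually be selected per run with Hydra's standard launcher override (the hydra-joblib-launcher plugin registers itself under hydra/launcher=joblib); whether the repository's configs already do this is an assumption here:

python main.py -m hydra/launcher=joblib model=resnet18,simple_conv algorithm=fed_avg,fsl_sage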
