Tags · MisaOgura/flashtorch · GitHub

Tags

v0.1.3

Verified

This commit was created on GitHub.com and signed with GitHub’s verified signature. The key has expired.
IMPROVEMENT: handle models with grayscale input (#31)

* Handle models with grayscale input

* Fix spacing

* Fix linting

* Bump version
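The v0.1.3 change above is about accepting models whose first conv layer expects a single input channel. As an illustration only (this helper is hypothetical, not flashtorch's actual code), channel handling can be sketched like this: inspect how many channels the model wants and coerce the image to match.

```python
import numpy as np

def match_channels(img, in_channels):
    """Hypothetical helper: coerce an image of shape (H, W) or (H, W, C)
    to the channel count a model's first conv layer expects
    (1 for grayscale models, 3 for RGB)."""
    if img.ndim == 2:                      # (H, W) -> (H, W, 1)
        img = img[..., np.newaxis]
    c = img.shape[-1]
    if c == in_channels:
        return img
    if in_channels == 1 and c == 3:        # RGB -> grayscale (luma weights)
        return (img @ np.array([0.299, 0.587, 0.114]))[..., np.newaxis]
    if in_channels == 3 and c == 1:        # grayscale -> RGB by repetition
        return np.repeat(img, 3, axis=-1)
    raise ValueError(f"cannot map {c} channels to {in_channels}")

gray = np.random.rand(8, 8)
print(match_channels(gray, 3).shape)   # (8, 8, 3)
print(match_channels(gray, 1).shape)   # (8, 8, 1)
```

In flashtorch itself the adaptation happens around the input tensor fed to `Backprop`; the sketch just shows the shape bookkeeping such a fix involves.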

v0.1.2

Backprop.visualize passes use_gpu flag (#25)

* Backprop.visualize passes use_gpu flag

* Bump version
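The v0.1.2 fix threads a `use_gpu` flag through `Backprop.visualize`. The underlying pattern, sketched here with hypothetical names rather than flashtorch's API, is to honour the flag only when a GPU is actually available, so the code degrades to CPU instead of raising.

```python
def pick_device(use_gpu, cuda_available):
    """Hypothetical device-selection logic: use the GPU only when the
    caller asked for it AND one is present; otherwise fall back to CPU.
    In real PyTorch code, cuda_available would come from
    torch.cuda.is_available() and tensors would be moved with .to(device)."""
    return "cuda" if (use_gpu and cuda_available) else "cpu"

print(pick_device(True, True))    # cuda
print(pick_device(True, False))   # cpu  (flag set, but no GPU present)
print(pick_device(False, True))   # cpu
```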

v0.1.1

Remove README dependency from setup.py (#15)

v0.1.0

Implement activation maximization (#7)

* Break from the iteration as soon as the first conv layer is found

* Extract common utils

* Implement gradient ascent and optimization with adam

* Iterate only twice for faster test speed

* Test registration of the hooks to the right layers

* Create method for visualisation

* Test visualisation of one filter

* Make the method exclusive for visualising one filter

* Change the default learning rate for Adam to 0.01

* Accommodate other int types

* Create method for plotting random filters from a layer

* Enable setting of lr and weight decay for Adam

* Set img_size as the attribute of the class

* Set with_adam as the attribute of the class

* Test when num_subplots > total_num_filters

* Create a single api entry point for plotting

* Pass in a conv layer to visualise rather than layer idx

* Reorganise public and private interfaces

* Remove unnecessary class inheritance

* Enable custom adjustment of color saturation and brightness

* Remove gradient ascent with adam

* Make sure to remove existing hooks before optimizing

* Return every iteration of the optimization

* Update docstrings and comments

* Add docstrings

* Make visualize function as a method of the Backprop class

* Standardize the use of language

* Set the title for each subplot

* Create demo notebooks for activemax

* Allow users to set plot titles

* Enable the use of gpu

* Use gpu in colab

* Rename module

* Use the new module name

* Test execution

* Use z...

* Bump version

* Specify version in __init__.py

* Add brief explanation and set kernel to python3

* Install flashtorch

* Use gpu

* Add some comments on filter visualization

* Add deepdream api

* Add demo for deepdream

* Don't pass in the lr to optimize, use the class attribute

* Update docstring

* Update example notebooks

* Add explanation for deepdream in the example notebooks

* Update README

* Update an image

* Update README

* Fix wrong link
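The commits above implement activation maximization via gradient ascent. The core loop can be sketched in a few lines; note this is a toy stand-in (a quadratic that peaks at x = 3 replaces a real conv filter's activation), not flashtorch's implementation, which ascends on an input image against a chosen filter.

```python
import numpy as np

# Toy stand-in for a filter's mean activation: maximal at x == 3.
def activation(x):
    return -np.sum((x - 3.0) ** 2)

def grad_activation(x):
    return -2.0 * (x - 3.0)

def gradient_ascent(x, lr=0.1, steps=100):
    """Minimal gradient-ascent loop behind activation maximization:
    repeatedly nudge the input in the direction that INCREASES the
    activation (plus sign, unlike gradient descent). As the commits
    note, every iteration is recorded and returned."""
    history = []
    for _ in range(steps):
        x = x + lr * grad_activation(x)   # ascend, not descend
        history.append(x.copy())
    return x, history

x_final, history = gradient_ascent(np.zeros(4))
print(np.round(x_final, 3))   # converges to [3. 3. 3. 3.]
```

The "remove gradient ascent with adam" commit suggests the released version settled on this plain ascent step rather than an Adam-driven update.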

v0.0.8

Fix/cuda transfer error (#3)

* Allow explicit setting of a device

* Test using gpu if available

* Indicate supported python versions

* Bump version

v0.0.7

Bump version

v0.0.6

Include ImageNet json in the package

v0.0.5

Bump version

v0.0.4

Add epsilon only when std is zero
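The v0.0.4 fix concerns standardizing an image to zero mean and unit variance: adding epsilon to the denominator unconditionally slightly skews every normal input, so it should be added only when the std is actually zero (a constant image). A hedged sketch of that logic, not flashtorch's exact code:

```python
import numpy as np

def standardize(x, eps=1e-7):
    """Standardize to zero mean / unit variance, adding a small epsilon
    to the denominator ONLY when the std is zero, so constant inputs
    don't divide by zero and ordinary inputs are left exact."""
    std = x.std()
    if std == 0:
        std = eps          # guard against division by zero
    return (x - x.mean()) / std

flat = np.full(4, 5.0)           # constant image: std == 0
print(standardize(flat))         # all zeros, no NaNs
varied = np.array([1.0, 3.0])
print(standardize(varied))       # [-1.  1.]  (untouched by epsilon)
```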

v0.0.3

Bump version
