One-cut-conditional-gradient


Authors: Giacomo Cristinelli, José A. Iglesias, Daniel Walter

This module solves the control problem with Total Variation regularization

$$\min_{u\in \text{BV}(\Omega)} \frac{1}{2\alpha} \|Ku-y_d\|^2 + \text{TV}(u,\Omega)$$

where $K$ is an operator associated with a linear PDE, $y_d$ is the given target observation, and $\alpha > 0$ is a regularization parameter.
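
For orientation, here is a minimal sketch of evaluating the smooth part of this objective in FEniCS/Dolfin, assuming $K$ is the solution operator of a Poisson problem with homogeneous Dirichlet conditions; the PDE, the target `y_d`, and the weight `alpha` below are illustrative placeholders, not the configuration used in this repository.

```python
# A minimal sketch, assuming K solves -Δy = u with y = 0 on the boundary;
# the PDE, y_d, and alpha are illustrative and not taken from this repository.
from dolfin import (UnitSquareMesh, FunctionSpace, TrialFunction, TestFunction,
                    Function, Constant, Expression, DirichletBC, solve,
                    assemble, grad, inner, dx)

mesh = UnitSquareMesh(32, 32)
V = FunctionSpace(mesh, "CG", 1)

def apply_K(u):
    """Apply the forward operator: solve -Δy = u, y = 0 on the boundary."""
    y, v, w = Function(V), TestFunction(V), TrialFunction(V)
    a = inner(grad(w), grad(v)) * dx
    L = u * v * dx
    solve(a == L, y, DirichletBC(V, Constant(0.0), "on_boundary"))
    return y

alpha = 1e-2                                              # illustrative weight
y_d = Expression("sin(pi*x[0])*sin(pi*x[1])", degree=2)   # illustrative target
u = Function(V)                                           # current control iterate

y = apply_K(u)
misfit = 0.5 / alpha * assemble((y - y_d) ** 2 * dx)      # (1/(2*alpha)) * ||Ku - y_d||^2
print("misfit:", misfit)
```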

It employs the one-cut conditional gradient method described in the paper "Linear convergence of a one-cut conditional gradient method for total variation regularization".
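
In methods of this kind, the insertion step typically produces indicator functions of candidate sets by solving a minimum-cut problem (which is presumably why Maxflow and NetworkX appear among the dependencies). The sketch below computes such a cut with PyMaxflow on a pixel grid; the weight field `w`, the perimeter weight `lam`, and the grid itself are illustrative assumptions, and the actual implementation works on finite element meshes.

```python
# A minimal sketch of a graph-cut subproblem:
# find a set E minimizing  lam * Per(E) + sum_{i in E} w_i  on a pixel grid.
# The grid, w, and lam are illustrative; this is not the repository's own routine.
import numpy as np
import maxflow

w = np.random.randn(64, 64)      # placeholder dual/weight field
lam = 1.0                        # perimeter weight

g = maxflow.Graph[float]()
nodeids = g.add_grid_nodes(w.shape)
g.add_grid_edges(nodeids, lam)                                   # 4-neighbour perimeter term
g.add_grid_tedges(nodeids, np.maximum(-w, 0), np.maximum(w, 0))  # linear term, split by sign

g.maxflow()
segments = g.get_grid_segments(nodeids)              # True for nodes on the sink side
indicator = np.logical_not(segments).astype(float)   # 1 on the candidate set E
print("area of E:", indicator.sum())
```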

Important libraries (a quick import check is sketched after this list):

- FEniCS (Dolfin), version 2019.1.0 (https://fenicsproject.org/)
- Maxflow (PyMaxflow) (http://pmneila.github.io/PyMaxflow/maxflow.html)
- NetworkX, version 3.0 (https://networkx.org)
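
The dependencies can be verified with a short import script (a sketch; installation itself should follow each project's own instructions, e.g. FEniCS 2019.1.0 via conda or a system package, PyMaxflow and NetworkX via pip):

```python
# A minimal sketch: check that the required libraries are importable.
import dolfin
import maxflow
import networkx

print("FEniCS/Dolfin:", dolfin.__version__)   # expected 2019.1.0
print("NetworkX:", networkx.__version__)      # expected 3.0
print("PyMaxflow imported successfully")
```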
