How can I add custom gradients? · Issue #77 · ekmett/ad · GitHub
How can I add custom gradients? #77
Open
@kai-qu


If I want to supply a custom or already-known gradient for a function, how can I do that with this library? (I don't want to autodifferentiate through that function.) I am using the grad function.

If the library doesn't provide this feature, is there some way I can easily implement this functionality myself, perhaps by changing the definitions of leaf nodes or by editing the dual numbers that presumably carry the numerical gradients?
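
To make that second idea concrete, here is a toy one-variable forward-mode sketch I put together from scratch. It is not ad's internal representation, just an illustration of a primitive that carries a hand-written derivative instead of being differentiated through (all names and bodies here are made up):

```haskell
-- Toy forward-mode dual numbers, written from scratch for illustration only.
data Dual = Dual { primal :: Double, tangent :: Double }
  deriving Show

instance Num Dual where
  Dual x dx + Dual y dy = Dual (x + y) (dx + dy)
  Dual x dx - Dual y dy = Dual (x - y) (dx - dy)
  Dual x dx * Dual y dy = Dual (x * y) (dx * y + x * dy)
  abs (Dual x dx)       = Dual (abs x) (signum x * dx)
  signum (Dual x _)     = Dual (signum x) 0
  fromInteger n         = Dual (fromInteger n) 0

-- A primitive whose value comes from some complicated code (stand-in: x^3),
-- but whose derivative (3x^2) is supplied by hand rather than obtained by
-- differentiating through the body.
customPrim :: Dual -> Dual
customPrim (Dual x dx) = Dual (complicatedBody x) (handWrittenDeriv x * dx)
  where
    complicatedBody  t = t ^ 3      -- pretend this is a big pile of code
    handWrittenDeriv t = 3 * t ^ 2  -- my analytic derivative

-- Derivative of a one-variable function at a point, seeded with tangent 1.
derivAt :: (Dual -> Dual) -> Double -> Double
derivAt h x = tangent (h (Dual x 1))

main :: IO ()
main = print (derivAt (\x -> x ^ 2 + 3 * customPrim x ^ 2) 2.0)  -- 580.0
```

What I'm asking is whether something equivalent can be hooked into this library's grad (or its reverse mode) without forking the internals.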

Here's a concrete example of what I mean:

Say I have a function I want to take the gradient of, f(x, y) = x^2 + 3 * g(x, y)^2, where g(x, y) is a function whose definition is complicated and involves lots of Haskell code, but whose gradient I've already calculated analytically and which is quite simple. When I take grad f and evaluate it at a point (x, y), I'd like to plug in my custom gradient for g instead of autodiffing through it: something like my_nice_grad_of_g (x, y).
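
In code, what I have in mind is roughly the following (the bodies of g and my_nice_grad_of_g are made-up stand-ins, just so the snippet compiles):

```haskell
import Numeric.AD (grad)

-- Stand-in for the real g, whose actual definition is a large amount of
-- Haskell code; the body here is invented to keep the snippet self-contained.
g :: Floating a => a -> a -> a
g x y = sin (x * y) + x

-- The gradient of g that I have already worked out analytically.
-- Currently unused: this is what I would like grad to call at g's node.
my_nice_grad_of_g :: Floating a => a -> a -> (a, a)
my_nice_grad_of_g x y = (y * cos (x * y) + 1, x * cos (x * y))

-- f(x, y) = x^2 + 3 * g(x, y)^2
f :: Floating a => [a] -> a
f [x, y] = x ^ 2 + 3 * g x y ^ 2
f _      = error "f expects exactly two inputs"

main :: IO ()
main = print (grad f [1, 2 :: Double])
-- As it stands, grad differentiates straight through g's body; I'd like a
-- way to substitute my_nice_grad_of_g for that subcomputation instead.
```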

Other autodiff libraries do provide this feature; for example, Stan and TensorFlow both allow users to define custom gradients for a function.

Thanks!
