
Gradient computation tutorial #42

@yoavram

Hi
I think the docs are missing an example of how to use sunode to compute gradients with respect to model parameters using the adjoint solver.
I'm not even sure which gradients are computed - gradients of the solution? SUNDIALS talks about a "derived function", which I take to be, for example, a loss function, but I'm not sure how this applies to sunode.
I understand I should use solve_backward, but the function is not documented. I'm especially confused about what the "grads" argument is, and what the output variables are - "grad_out" vs "lamda_out".
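
To make the question concrete, here is roughly the call sequence I have pieced together from the README and the source. This is only my guess at how the low-level `AdjointSolver` API is meant to be used; in particular the buffers returned by `make_output_buffers` and the meaning I have assigned to `grads`, `grad_out` and `lamda_out` may well be wrong, so please correct whatever doesn't match the intended usage.

```python
import numpy as np
import sunode


def lotka_volterra(t, y, p):
    """Right-hand side of the Lotka-Volterra equations."""
    return {
        'hares': p.alpha * y.hares - p.beta * y.lynx * y.hares,
        'lynx': p.delta * y.hares * y.lynx - p.gamma * y.lynx,
    }


problem = sunode.symode.SympyProblem(
    params={'alpha': (), 'beta': (), 'gamma': (), 'delta': ()},
    states={'hares': (), 'lynx': ()},
    rhs_sympy=lotka_volterra,
    derivative_params=[('alpha',), ('beta',), ('gamma',), ('delta',)],
)

solver = sunode.solver.AdjointSolver(problem, solver='BDF')
solver.set_params_dict({'alpha': 0.1, 'beta': 0.2, 'gamma': 0.3, 'delta': 0.4})

tvals = np.linspace(0, 10, 21)
y0 = np.zeros((), dtype=problem.state_dtype)
y0['hares'] = 1.0
y0['lynx'] = 0.1

# Forward pass: store the trajectory so the backward (adjoint) pass can use it.
y_out, grad_out, lamda_out = solver.make_output_buffers(tvals)
solver.solve_forward(t0=0, tvals=tvals, y0=y0, y_out=y_out)

# Say the "derived function" is a sum-of-squares loss against some data.
data = np.ones((len(tvals), 2))
residuals = y_out - data
loss = 0.5 * (residuals ** 2).sum()

# My guess: `grads` is dloss/dy evaluated at every requested time point,
# i.e. an array with the same shape as y_out.
grads = residuals

# Backward pass, integrating the adjoint ODE from the last time point back to t0.
solver.solve_backward(
    t0=tvals[-1], tend=tvals[0], tvals=tvals,
    grads=grads, grad_out=grad_out, lamda_out=lamda_out,
)

# My guess at the outputs:
#   grad_out  -> dloss/dparams for the parameters listed in derivative_params
#   lamda_out -> the adjoint state at t0, i.e. dloss/dy0
print(loss, grad_out, lamda_out)
```

Is that roughly what the arguments and outputs mean, or am I misreading the code?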
If you explain this, I would be happy to share a notebook that uses sunode to fit an ODE; it could be useful for the documentation.
Thanks!
