docs(implicit_diff): implicit differentiation integration #73


Merged · 25 commits · Sep 22, 2022
Changes from 1 commit

Commits (25):
14dbb31
docs: init implicit differentiation integration
Benjamin-eecs Sep 9, 2022
0d76585
fix: linear solve docs error remains
Benjamin-eecs Sep 9, 2022
a9cc777
feat(tutorials): add implicit differentiation
Benjamin-eecs Sep 10, 2022
672005a
fix(tutorials): update torchopt import
Benjamin-eecs Sep 11, 2022
6cdd1f0
docs: pass api docs
Benjamin-eecs Sep 11, 2022
9a261f9
docs: pass api docs
Benjamin-eecs Sep 11, 2022
e915753
docs: pass api docs
Benjamin-eecs Sep 11, 2022
d5564b7
fix(implicit): remove argument
JieRen98 Sep 11, 2022
a47beb0
docs: update `custom_root` docstring
XuehaiPan Sep 13, 2022
e4f512f
Merge branch 'main' into docs/implicit_gradient
XuehaiPan Sep 13, 2022
5cf9018
docs: update colab links
Benjamin-eecs Sep 15, 2022
37298d5
Merge branch 'main' into docs/implicit_gradient
Benjamin-eecs Sep 22, 2022
4c0b69b
Merge branch 'main' into docs/implicit_gradient
XuehaiPan Sep 22, 2022
ae89467
docs(implicit): update docstrings for `custom_root`
XuehaiPan Sep 22, 2022
4a36212
docs(CHANGELOG): update CHANGELOG.md
XuehaiPan Sep 22, 2022
623324b
docs(CHANGELOG): update CHANGELOG.md
XuehaiPan Sep 22, 2022
059fc79
docs(implicit): update tutorial
XuehaiPan Sep 22, 2022
84e06b2
docs(implicit): update docstrings
XuehaiPan Sep 22, 2022
df764cf
docs(README): update future plan
XuehaiPan Sep 22, 2022
8b6a945
chore: update gitignore
Benjamin-eecs Sep 22, 2022
a043f7b
chore: update makefile
Benjamin-eecs Sep 22, 2022
956a780
docs: update dictionary
XuehaiPan Sep 22, 2022
504d699
Merge branch 'main' into docs/implicit_gradient
XuehaiPan Sep 22, 2022
d881334
docs(implicit): update docstrings
XuehaiPan Sep 22, 2022
37ea557
fix(implicit): fix has_aux when result is single tensor
XuehaiPan Sep 22, 2022
docs(implicit): update docstrings for custom_root
XuehaiPan committed Sep 22, 2022
commit ae89467eb4411aaf94300924ebcc3f06a973edc1
19 changes: 18 additions & 1 deletion torchopt/_src/implicit_diff.py
@@ -349,14 +349,31 @@ def custom_root(
) -> Callable[[Callable], Callable]:
"""Decorator for adding implicit differentiation to a root solver.

This wrapper should be used as a decorator:

.. code-block:: python

def optimality_fun(params, ...):
...

@custom_root(optimality_fun, argnums=argnums)
def solver_fun(params, arg1, arg2, ...):
...
return optimal_params

The first argument to ``optimality_fun`` and ``solver_fun`` is preserved as ``params``.
The ``argnums`` argument refers to the indices of the variables in ``solver_fun``'s signature.
For example, setting ``argnums=(1, 2)`` will compute the gradient of ``optimal_params`` with
respect to ``arg1`` and ``arg2`` in the above example.

Args:
optimality_fun: (callable)
An equation function, ``optimality_fun(params, *args)``. The invariant is
``optimality_fun(sol, *args) == 0`` at the solution / root of ``sol``.
argnums: (int or tuple of int, default: :const:`0`)
Specifies which arguments to compute gradients with respect to. The ``argnums`` can be an
integer or a tuple of integers that refer to the zero-based indices of the arguments
- of the ``optimality_fun(params, *args)`` function. The argument ``params`` is included
+ of the ``solver_fun(params, *args)`` function. The argument ``params`` is included
in the count and is indexed as ``argnums=0``.
has_aux: (default: :data:`False`)
Whether the decorated solver function returns auxiliary data.
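The decorator described above wires the implicit function theorem into the solver's backward pass: if ``optimality_fun(sol, *args) == 0`` at the solution, then the gradient of the solution with respect to the other arguments can be recovered without differentiating through the solver's iterations. The following is a minimal, dependency-free sketch of that idea in plain Python; the function names and the quadratic example are invented for illustration and are not part of the torchopt API.

```python
# Sketch of implicit differentiation for a scalar root-finding problem.
# Optimality condition: optimality_fun(x, theta) = x**2 - theta = 0,
# whose root is x*(theta) = sqrt(theta). The implicit function theorem gives
#   dx*/dtheta = -(dF/dx)^(-1) * (dF/dtheta),
# independent of how the solver found the root.

def optimality_fun(x, theta):
    """Residual that is zero at the solution."""
    return x * x - theta

def solver_fun(theta, tol=1e-12):
    """Newton iteration for the root of optimality_fun in x (theta > 0)."""
    x = max(theta, 1.0)
    while abs(optimality_fun(x, theta)) > tol:
        x -= optimality_fun(x, theta) / (2.0 * x)  # F'(x) = 2x
    return x

def implicit_grad(theta):
    """Gradient of the solution w.r.t. theta via the implicit function theorem."""
    x = solver_fun(theta)
    dF_dx = 2.0 * x       # partial of optimality_fun w.r.t. x at the solution
    dF_dtheta = -1.0      # partial of optimality_fun w.r.t. theta
    return -dF_dtheta / dF_dx

print(implicit_grad(4.0))  # analytic answer is 1 / (2 * sqrt(4)) = 0.25
```

``custom_root`` automates exactly this bookkeeping for PyTorch solvers: it differentiates ``optimality_fun`` at the returned solution and solves the resulting linear system, so the Newton loop itself never appears in the autograd graph.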