function transforms (aka torch.func, functorch)

Page Maintainers: @zou3519

Scope

  • understand what composable function transforms are and their most common use cases
  • understand what DynamicLayerStack is and how it is used to implement composition of function transforms

Learn about function transforms

Exercise

The advanced autodiff tutorial explains how to compute Jacobians via a composition of vmap and vjp.
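As a reference point, here is a minimal sketch of that vmap-and-vjp composition for a function of a single Tensor (the name jacobian_rev is illustrative, not a torch.func API):

import torch
from torch.func import vjp, vmap

def jacobian_rev(f, x):
    # vjp returns f(x) along with a function that computes v -> v @ J.
    out, vjp_fn = vjp(f, x)
    # Push every standard basis vector of the output space through vjp_fn
    # in a single vmap'd call; each result is one row of the Jacobian.
    basis = torch.eye(out.numel(), dtype=out.dtype, device=out.device)
    basis = basis.reshape(out.numel(), *out.shape)
    (jac,) = vmap(vjp_fn)(basis)
    return jac  # shape: (out.numel(), *x.shape)

The exercise below asks you to build the forward-mode analogue of this.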

  1. Without looking at the source code for jacfwd or torch.autograd.functional.jacobian, write a function that computes the Jacobian using forward-mode AD and a for-loop. Note that forward-mode AD (jvp) computes Jacobian-vector products, while reverse-mode AD (vjp, grad) computes vector-Jacobian products.
  2. Write a function to compute the Jacobian by composing vmap and jvp (sketches of both parts follow the signature below).

The APIs should have the following signature:

def jacobian(f, *args):
    pass

You can assume that f accepts multiple Tensor arguments and returns a single Tensor.
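If you get stuck on part 1, here is a minimal sketch that differentiates with respect to the first argument (matching jacfwd's default); the name jacobian_fwd_loop and this argument convention are illustrative choices, not requirements:

import torch
from torch.func import jvp

def jacobian_fwd_loop(f, *args):
    # Differentiate with respect to args[0]; treat the rest as constants.
    primal, rest = args[0], args[1:]
    basis = torch.eye(primal.numel(), dtype=primal.dtype, device=primal.device)
    cols = []
    for v in basis:
        # jvp computes J @ v, i.e. one column of the Jacobian per call.
        _, col = jvp(lambda x: f(x, *rest), (primal,), (v.reshape(primal.shape),))
        cols.append(col.reshape(-1))
    # Stack the columns into a (out.numel(), primal.numel()) matrix.
    return torch.stack(cols, dim=1)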
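Part 2 replaces the Python loop with vmap over the basis vectors, which is essentially how jacfwd itself is structured:

import torch
from torch.func import jvp, vmap

def jacobian_fwd_vmap(f, *args):
    primal, rest = args[0], args[1:]
    basis = torch.eye(primal.numel(), dtype=primal.dtype, device=primal.device)

    def one_column(v):
        # The same J @ v computation as the loop body above.
        _, col = jvp(lambda x: f(x, *rest), (primal,), (v.reshape(primal.shape),))
        return col.reshape(-1)

    # vmap computes every column in one batched call; transpose so rows
    # index outputs and columns index inputs.
    return vmap(one_column)(basis).T

Both sketches return a flat (output numel, input numel) matrix; torch.func.jacfwd additionally reshapes the result to output.shape + input.shape.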

Understand how PyTorch implements composable function transforms

Read through this gdoc.
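The gdoc is the authoritative reference; as a mental model while reading it, the toy sketch below shows why a stack of interpreters composes transforms correctly. All names here are illustrative, and this is not PyTorch's actual implementation:

interpreter_stack = []

class LoggingInterpreter:
    # Stands in for a transform's interpreter (e.g. a vmap or grad layer).
    def __init__(self, name):
        self.name = name

    def process(self, op, args):
        # A real layer would unwrap its tensors and apply a batching or
        # autograd rule here before re-dispatching to the layers below it.
        print(f"{self.name} handles {op.__name__}{args}")
        return dispatch(op, args)

def dispatch(op, args):
    # An op is handled by the topmost interpreter; popping it while it runs
    # makes its own re-dispatch fall through to the next layer down.
    if interpreter_stack:
        top = interpreter_stack.pop()
        try:
            return top.process(op, args)
        finally:
            interpreter_stack.append(top)
    return op(*args)  # no transforms active: run the op for real

def mul(a, b):
    return a * b

# Entering vmap(grad(f)) pushes the vmap layer first and the grad layer on
# top of it, so the inner transform handles each op first:
interpreter_stack.append(LoggingInterpreter("vmap"))
interpreter_stack.append(LoggingInterpreter("grad"))
print(dispatch(mul, (2, 3)))  # grad handles mul, then vmap, then 2 * 3 runs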

Next

Back to the Core Frontend Onboarding
