diff --git a/beginner_source/basics/autogradqs_tutorial.py b/beginner_source/basics/autogradqs_tutorial.py
index 8eff127dde..2753103eaa 100644
--- a/beginner_source/basics/autogradqs_tutorial.py
+++ b/beginner_source/basics/autogradqs_tutorial.py
@@ -133,7 +133,8 @@
 # - To mark some parameters in your neural network as **frozen parameters**.
 # - To **speed up computations** when you are only doing forward pass, because computations on tensors that do
 #   not track gradients would be more efficient.
-
+# See this `note `_
+# for additional reference.
 
 ######################################################################
 
@@ -160,6 +161,16 @@
 # - accumulates them in the respective tensor’s ``.grad`` attribute
 # - using the chain rule, propagates all the way to the leaf tensors.
 #
+# To get a sense of what this computational graph looks like, we can use the following tools:
+#
+# 1. ``torchviz`` is a package to visualize computational graphs.
+#    See the repository here: `https://github.com/szagoruyko/pytorchviz <https://github.com/szagoruyko/pytorchviz>`_
+#
+# 2. Setting ``TORCH_LOGS="+autograd"`` enables logging for the backward pass. See details in this
+#    discussion: `https://dev-discuss.pytorch.org/t/highlighting-a-few-recent-autograd-features-h2-2023/1787 <https://dev-discuss.pytorch.org/t/highlighting-a-few-recent-autograd-features-h2-2023/1787>`_
+#
+#
+#
 # .. note::
 #   **DAGs are dynamic in PyTorch**
 #   An important thing to note is that the graph is recreated from scratch; after each
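
The two behaviors this patch documents can be checked directly. Below is a minimal sketch (assuming a standard PyTorch install; the tensor values are illustrative, not from the tutorial): a forward pass under ``torch.no_grad()`` records no graph, while an ordinary backward pass accumulates gradients into the leaf tensor's ``.grad`` attribute.

```python
import torch

# A leaf tensor that tracks gradients.
x = torch.ones(3, requires_grad=True)

# Inside no_grad, no computational graph is recorded: the result is
# detached, so it cannot be backpropagated through.
with torch.no_grad():
    y = x * 2
assert not y.requires_grad

# Outside no_grad, autograd records the graph; backward() propagates
# via the chain rule and accumulates into x.grad.
z = (x * 3).sum()
z.backward()
print(x.grad)  # tensor([3., 3., 3.])
```

A graph produced this way is what ``torchviz.make_dot`` would render, and running the same script with ``TORCH_LOGS="+autograd"`` set in the environment surfaces the corresponding backward-pass logging mentioned in the patch.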
