torch.nn.attention

This module contains functions and classes that alter the behavior of torch.nn.functional.scaled_dot_product_attention.

Utils

sdpa_kernel

Context manager to select which backend to use for scaled dot product attention.
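A minimal usage sketch (the tensor shapes, dtype, and CUDA device here are illustrative assumptions; the flash-attention backend requires a supported GPU):

```python
import torch
import torch.nn.functional as F
from torch.nn.attention import SDPBackend, sdpa_kernel

# Illustrative inputs: (batch, heads, seq_len, head_dim) in half precision.
query = torch.rand(2, 8, 128, 64, device="cuda", dtype=torch.float16)
key = torch.rand(2, 8, 128, 64, device="cuda", dtype=torch.float16)
value = torch.rand(2, 8, 128, 64, device="cuda", dtype=torch.float16)

# Inside the context manager, SDPA is restricted to the listed backend;
# if it cannot handle the inputs, the call raises an error.
with sdpa_kernel(SDPBackend.FLASH_ATTENTION):
    out = F.scaled_dot_product_attention(query, key, value)
```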

SDPBackend

An enum-like class that contains the different backends for scaled dot product attention.
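A sketch of selecting among several backends (passing a list to sdpa_kernel is supported in recent PyTorch releases; shapes and device are again illustrative):

```python
import torch
import torch.nn.functional as F
from torch.nn.attention import SDPBackend, sdpa_kernel

# Members include MATH (the reference composite implementation),
# FLASH_ATTENTION, EFFICIENT_ATTENTION, and CUDNN_ATTENTION.
q = k = v = torch.rand(2, 8, 128, 64, device="cuda", dtype=torch.float16)

# A list means "any of these": SDPA dispatches to the first listed
# backend that supports the given inputs.
with sdpa_kernel([SDPBackend.FLASH_ATTENTION, SDPBackend.EFFICIENT_ATTENTION]):
    out = F.scaled_dot_product_attention(q, k, v)
```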

Submodules

flex_attention

This module implements the user-facing API for flex_attention in PyTorch.
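A brief sketch of the score_mod hook (the relative-position modification below is an illustrative choice, not the only one; in practice flex_attention is usually wrapped in torch.compile for performance):

```python
import torch
from torch.nn.attention.flex_attention import flex_attention

q = torch.randn(1, 4, 256, 64)
k = torch.randn(1, 4, 256, 64)
v = torch.randn(1, 4, 256, 64)

# score_mod receives each raw attention score plus the batch, head,
# query-index, and key/value-index, and returns a modified score.
def relative_position_bias(score, b, h, q_idx, kv_idx):
    return score + (q_idx - kv_idx)

out = flex_attention(q, k, v, score_mod=relative_position_bias)
```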

bias

Defines bias subclasses that work with scaled_dot_product_attention.
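A minimal sketch passing one of these biases as the attn_mask argument (shapes are illustrative assumptions; causal_lower_right anchors the causal mask at the bottom-right corner, the common choice when the queries are a suffix of the keys):

```python
import torch
import torch.nn.functional as F
from torch.nn.attention.bias import causal_lower_right

q = torch.randn(1, 8, 4, 64)   # 4 query positions
k = torch.randn(1, 8, 6, 64)   # 6 key/value positions
v = torch.randn(1, 8, 6, 64)

# The bias object encodes causal masking lazily, so SDPA can dispatch
# to fused kernels instead of materializing a dense mask tensor.
attn_bias = causal_lower_right(4, 6)
out = F.scaled_dot_product_attention(q, k, v, attn_mask=attn_bias)
```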

experimental
