feat(zero_order): implemented the zero order feature #93


Merged: 14 commits, Oct 11, 2022
fix(zero_grad): fix bug
JieRen98 committed Oct 10, 2022
commit 54e2dd29c47d31be45e1b6501e690bb668f762c1
4 changes: 2 additions & 2 deletions torchopt/zero_grad_diff.py

@@ -277,8 +277,8 @@ def get_loss(transform_fn):

         return fn(*args)

-    loss = get_loss(lambda tensor, noise: torch.add(noise, alpha=sigma)) - get_loss(
-        lambda tensor, noise: torch.sub(noise, alpha=sigma)
+    loss = get_loss(lambda tensor, noise: tensor.add(noise, alpha=sigma)) - get_loss(
+        lambda tensor, noise: tensor.sub(noise, alpha=sigma)
     )
     weighted_grad = grad_outputs[0].mul(loss).mul_(0.5 / sigma)
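The fix above replaces `torch.add(noise, alpha=sigma)` (which omits the tensor being perturbed) with `tensor.add(noise, alpha=sigma)`, i.e. `tensor + sigma * noise`, so the two `get_loss` calls evaluate the objective at antithetic perturbations `theta + sigma*z` and `theta - sigma*z`, and the loss difference scaled by `0.5 / sigma` yields a two-point zero-order gradient estimate. A minimal standalone sketch of that estimator (the helper `two_point_estimate` is illustrative, not TorchOpt's actual API):

```python
import torch


def two_point_estimate(fn, tensor, sigma=0.01):
    """Antithetic (two-point) zero-order gradient estimate of `fn` at `tensor`.

    Mirrors the patched lines: evaluate fn at tensor + sigma*noise and
    tensor - sigma*noise via Tensor.add / Tensor.sub with alpha=sigma,
    then scale the loss difference by 0.5 / sigma along the noise direction.
    """
    noise = torch.randn(tensor.shape)
    loss = fn(tensor.add(noise, alpha=sigma)) - fn(tensor.sub(noise, alpha=sigma))
    return noise.mul(loss).mul_(0.5 / sigma)


# Usage: estimate the gradient of f(x) = sum(x**2); the true gradient is 2*x.
x = torch.tensor([1.0, -2.0, 3.0])
grads = [two_point_estimate(lambda t: (t**2).sum(), x, sigma=1e-3) for _ in range(10000)]
est = torch.stack(grads).mean(0)  # averages toward the true gradient 2*x
```

Each single sample is an unbiased but noisy estimate; averaging many samples (as above) recovers the gradient, which is why the backward pass weights the estimate by `grad_outputs[0]` rather than using it raw.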