
RuntimeError: Function FusedLeakyReLUFunctionBackward #18

Open

Ashly555 opened this issue Apr 9, 2024 · 3 comments
Ashly555 commented Apr 9, 2024

I really like your work. I'm using DeltaEdit in a different domain, but during training I ran into the following error:

Traceback (most recent call last):
  File "scripts/train.py", line 63, in <module>
    main(opts)
  File "scripts/train.py", line 52, in main
    loss.backward()
  File "/root/autodl-tmp/conda/envs/Del/lib/python3.8/site-packages/torch/tensor.py", line 221, in backward
    torch.autograd.backward(self, gradient, retain_graph, create_graph)
  File "/root/autodl-tmp/conda/envs/Del/lib/python3.8/site-packages/torch/autograd/__init__.py", line 130, in backward
    Variable._execution_engine.run_backward(
RuntimeError: Function FusedLeakyReLUFunctionBackward returned an invalid gradient at index 1 - got [4] but expected shape compatible with [512]

Is there any solution to this problem?
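For context on what the message means: FusedLeakyReLU is the fused bias-plus-LeakyReLU activation used in StyleGAN2-style code, and its bias is a per-channel parameter (length 512 here). The error says that, during the backward pass, autograd received a gradient of shape [4] for that bias instead of one compatible with [512], which usually points to a tensor whose feature dimension is not 512 reaching a layer that uses this activation. Below is a rough plain-PyTorch sketch of the same shape contract; the sizes 4 and 512 are taken from the error message, the variable names are illustrative only, and the real CUDA-fused op apparently defers the shape check until backward, whereas this unfused version already fails in the forward add:

    import torch
    import torch.nn.functional as F

    # Per-channel bias, length 512 as in "expected shape compatible with [512]"
    bias = torch.zeros(512, requires_grad=True)

    # Correct case: the feature (last) dimension of the input matches the bias
    x_ok = torch.randn(4, 512, requires_grad=True)
    y_ok = F.leaky_relu(x_ok + bias, negative_slope=0.2) * (2 ** 0.5)
    y_ok.sum().backward()
    print(bias.grad.shape)  # torch.Size([512]) -- bias gradient matches the bias

    # Mismatch case: the feature dimension is 4 instead of 512, so the bias can
    # no longer be added per channel; the unfused version fails right here
    x_bad = torch.randn(512, 4, requires_grad=True)
    try:
        y_bad = F.leaky_relu(x_bad + bias, negative_slope=0.2) * (2 ** 0.5)
        y_bad.sum().backward()
    except RuntimeError as err:
        print(err)

So a first thing to check is the shape of the latent/feature tensors fed into the network during training, and whether anything with a dimension of length 4 ends up where a 512-dimensional feature vector is expected.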


br0kyy commented Jul 21, 2024

Hi, have you solved the problem? I have encountered the same problem.

visa980911 commented

Hi, have you solved the problem? I have encountered the same problem as well.
