
[Fixed] RuntimeError: CUDA out of memory. Tried to allocate

Today we are going to solve RuntimeError: CUDA out of memory. Tried to allocate in Python. PyTorch raises this error when it tries to allocate more GPU memory than is available. Here we will discuss all possible solutions and how this error occurs, so let's get started with this article.

How to Fix RuntimeError: CUDA out of memory. Tried to allocate Error?

  1. How to Fix RuntimeError: CUDA out of memory. Tried to allocate Error?

    To fix the RuntimeError: CUDA out of memory. Tried to allocate error, inspect what is using your GPU memory by running torch.cuda.memory_summary(device=None, abbreviated=False).

  2. RuntimeError: CUDA out of memory. Tried to allocate

    To fix the RuntimeError: CUDA out of memory. Tried to allocate error, import torch and run torch.cuda.empty_cache() to release unused memory held by PyTorch's caching allocator.

Solution 1 : Check memory usage with torch.cuda.memory_summary()

Run the command below to print a detailed report of how GPU memory is currently allocated. This helps you identify which tensors or cached blocks are consuming memory before you try to free anything.

import torch
print(torch.cuda.memory_summary(device=None, abbreviated=False))
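For context, here is a minimal sketch of where you might call this command in practice: catching the failed allocation and printing the report. The model, batch, and their sizes are placeholders and not part of the original article; the sketch assumes a CUDA-capable GPU is available.

import torch

device = "cuda"  # assumes a CUDA-capable GPU is present

try:
    # Placeholder workload: a linear layer and a batch that may not fit on the GPU
    model = torch.nn.Linear(4096, 4096).to(device)
    batch = torch.randn(8, 4096, device=device)
    output = model(batch)
except RuntimeError:
    # When the allocation fails, print the memory report to see what is using the GPU
    print(torch.cuda.memory_summary(device=None, abbreviated=False))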

Solution 2 : Clear the cache with torch.cuda.empty_cache()

Import torch and run the command below to release memory that PyTorch's caching allocator is holding but no longer using.

import torch

# Release unused cached memory held by PyTorch's caching allocator
torch.cuda.empty_cache()
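As a rough illustration, empty_cache() can only return memory that is no longer referenced, so it is usually paired with deleting the tensors you no longer need. The variable name big_tensor below is just a placeholder, and the sketch assumes a CUDA-capable GPU.

import gc
import torch

big_tensor = torch.randn(4096, 4096, device="cuda")  # placeholder allocation

del big_tensor            # drop the Python reference so the memory can be freed
gc.collect()              # collect the now-unreferenced tensor
torch.cuda.empty_cache()  # return the cached blocks to the GPU driver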

Conclusion

So these were all the possible solutions to this error. I hope this article has helped you solve it. Tell us in the comments which solution worked for you. If you liked this article, please share it on your social media and leave your suggestions in a comment. Thank you.

