
torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 86.00 MiB (GPU 0; 31.74 GiB total


torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 86.00 MiB (GPU 0; 31.74 GiB total capacity; 6.39 GiB already allocated; 78.38 MiB free; 6.47 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF

Solution
Gradually lower the max_split_size_mb value in PYTORCH_CUDA_ALLOC_CONF until the allocation succeeds; smaller split sizes reduce fragmentation but cost some allocation performance.
Reference: https://blog.csdn.net/MirageTanker/article/details/127998036

set PYTORCH_CUDA_ALLOC_CONF=max_split_size_mb:16
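The set command above is Windows cmd syntax; on Linux/macOS the equivalent is export PYTORCH_CUDA_ALLOC_CONF=max_split_size_mb:16. The option can also be set from inside the script, as long as it happens before the first CUDA allocation. A minimal sketch (the 16 MiB value is just the starting point used above, not a universally correct setting):

import os

# Must be set before PyTorch makes its first CUDA allocation, otherwise it is ignored.
os.environ.setdefault("PYTORCH_CUDA_ALLOC_CONF", "max_split_size_mb:16")

import torch

# Optional check: if reserved memory is much larger than allocated memory,
# fragmentation is the likely cause and a smaller max_split_size_mb may help.
if torch.cuda.is_available():
    print(torch.cuda.memory_summary(device=0, abbreviated=True))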