
RuntimeError: CUDA out of memory. If reserved memory is >> allocated memory, try setting max_split_size_mb to avoid fragmentation

The error message is as follows:

RuntimeError: CUDA out of memory. Tried to allocate 50.00 MiB (GPU 0; 5.80 GiB total capacity; 4.39 GiB already allocated; 35.94 MiB free; 4.46 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation.  See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF 
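The message distinguishes memory "already allocated" by live tensors from memory "reserved in total by PyTorch" (live tensors plus cached blocks held by the caching allocator). To see these numbers for your own process, PyTorch exposes the allocator's counters directly; the sketch below only reads statistics and assumes a CUDA device is present (device index 0, matching GPU 0 above).

import torch

# Bytes currently held by live tensors on GPU 0
allocated = torch.cuda.memory_allocated(0)
# Bytes reserved by the caching allocator (live tensors + cached blocks)
reserved = torch.cuda.memory_reserved(0)

print(f"allocated: {allocated / 1024**2:.2f} MiB")
print(f"reserved:  {reserved / 1024**2:.2f} MiB")

# Detailed per-pool breakdown, useful for spotting fragmentation
print(torch.cuda.memory_summary(device=0, abbreviated=True))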

Asking ChatGPT produced the answer below. Note that two of the calls it suggested, torch.cuda.set_per_process_memory_growth and torch.cuda.set_max_split_size, do not exist in PyTorch; the corrected version configures the split size through the PYTORCH_CUDA_ALLOC_CONF environment variable mentioned in the error message instead:

 

import os
# Set before any CUDA allocation: capping block splits reduces fragmentation
# (adjust 1024 as needed).
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "max_split_size_mb:1024"

import torch
# Limit this process to 90% of GPU 0's total memory (adjust 0.9 as needed).
torch.cuda.set_per_process_memory_fraction(0.9, 0)

 
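The allocator settings above only change how existing memory is managed; with only about 36 MiB free on a 5.80 GiB card, reducing what the process keeps on the GPU is usually what actually clears the error. The following is a generic sketch, not the original script: the model and input tensor are placeholders standing in for whatever the failing code uses, and a smaller batch size (optionally with gradient accumulation) remains the simplest fix for training-time OOM.

import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"

# Placeholder model and batch standing in for the real workload.
model = nn.Linear(1024, 1024).to(device)
inputs = torch.randn(64, 1024, device=device)

# Inference under no_grad avoids keeping activations for backward.
with torch.no_grad():
    outputs = model(inputs)

# Drop references you no longer need, then hand cached (reserved but
# unused) blocks back to the CUDA driver.
del outputs
torch.cuda.empty_cache()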
