
Fixing the frequent 443 error when downloading Hugging Face models (Caused by ProxyError('Unable to connect to proxy', ...))


Notes: picking compatible library versions and handling the proxy.

BUG:

requests.exceptions.ProxyError: (MaxRetryError("HTTPSConnectionPool(host='huggingface.co', port=443): Max retries exceeded with url: /sentence-transformers/all-MiniLM-L6-v2/resolve/main/config.json (Caused by ProxyError('Unable to connect to proxy', SSLError(SSLZeroReturnError(6, 'TLS/SSL connection has been closed (EOF)

This happens while downloading a model, for example:

from sentence_transformers import SentenceTransformer

model = SentenceTransformer('all-MiniLM-L6-v2')

which then fails with ProxyError: (MaxRetryError("HTTPSConnectionPool(host='huggingface.co', port=443)

Attempted fix:

1. Downgrade requests and urllib3 (pinned versions below).

2. Keep the proxy (yiyuanfeiji) switched on; a sketch of the combined setup follows the pip commands.

  pip install requests==2.27.1
  pip install urllib3==1.25.11
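
A minimal sketch of that setup, assuming the proxy client listens locally on 127.0.0.1:7890 (a common default, but not stated in the post; adjust to your own proxy port). It routes traffic through the proxy via the standard HTTP_PROXY/HTTPS_PROXY environment variables, prints the installed requests/urllib3 versions to confirm the downgrade took effect, and then retries the download that previously failed:

import os
import requests, urllib3
from sentence_transformers import SentenceTransformer

# Route traffic through the local proxy; the address and port are an assumption,
# change them to whatever your proxy client actually listens on.
os.environ['HTTP_PROXY'] = 'http://127.0.0.1:7890'
os.environ['HTTPS_PROXY'] = 'http://127.0.0.1:7890'

# Confirm the downgrade took effect (expecting 2.27.1 and 1.25.11).
print('requests', requests.__version__)
print('urllib3', urllib3.__version__)

# Retry the download that previously raised the ProxyError.
model = SentenceTransformer('all-MiniLM-L6-v2')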

Download succeeded!

  config_sentence_transformers.json: 100%|██████████| 116/116 [00:00<00:00, 58.2kB/s]
  README.md: 100%|██████████| 10.6k/10.6k [00:00<00:00, 3.55MB/s]
  sentence_bert_config.json: 100%|██████████| 53.0/53.0 [00:00<00:00, 17.7kB/s]
  config.json: 100%|██████████| 612/612 [00:00<00:00, 205kB/s]
  pytorch_model.bin: 100%|██████████| 90.9M/90.9M [00:15<00:00, 5.69MB/s]
  tokenizer_config.json: 100%|██████████| 350/350 [00:00<00:00, 117kB/s]
  vocab.txt: 100%|██████████| 232k/232k [00:00<00:00, 1.05MB/s]
  tokenizer.json: 100%|██████████| 466k/466k [00:00<00:00, 3.30MB/s]
  special_tokens_map.json: 100%|██████████| 112/112 [00:00<00:00, 37.5kB/s]
  1_Pooling/config.json: 100%|██████████| 190/190 [00:00<00:00, 63.6kB/s]
  2023-12-25 18:08:57 - Use pytorch device_name: cpu
  Batches: 100%|██████████| 1/1 [00:00<00:00,  2.68it/s]
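
The last two log lines show the model loaded on CPU and a single encode batch running. As a hedged usage sketch (the sentences below are made up for illustration, not from the post), something like the following produces that "Batches" progress bar:

from sentence_transformers import SentenceTransformer

model = SentenceTransformer('all-MiniLM-L6-v2')

# Any small list of sentences is encoded in one batch, matching the log above.
sentences = ['Downloading models through a proxy.', 'The 443 error is gone.']
embeddings = model.encode(sentences, show_progress_bar=True)
print(embeddings.shape)  # (2, 384) for all-MiniLM-L6-v2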
