Replies: 1 comment
- Found the cause: the PyTorch version was not installed correctly. Now the program runs, but it reports CUDA out of memory. Is there somewhere a maximum memory value needs to be set?
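For anyone hitting the same out-of-memory error, here is a minimal sketch of checking how much GPU memory is actually free and loading the LLM in half precision to shrink its footprint. The `transformers` calls and the `THUDM/chatglm-6b` model id are assumptions for illustration, not configuration taken from this project:

```python
import torch
from transformers import AutoModel, AutoTokenizer

# How much memory does the card have, and how much is already in use?
props = torch.cuda.get_device_properties(0)
print(f"GPU: {props.name}, total: {props.total_memory / 1024**3:.1f} GiB")
print(f"allocated: {torch.cuda.memory_allocated(0) / 1024**3:.2f} GiB")

# Loading in half precision roughly halves the model's memory footprint.
# "THUDM/chatglm-6b" is only an illustrative model id (assumption).
model_id = "THUDM/chatglm-6b"
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModel.from_pretrained(model_id, trust_remote_code=True).half().cuda()
model = model.eval()
```

If half precision still exceeds the 10 GiB on a 3080, the usual next step is an int8/int4 quantized checkpoint or offloading part of the model to CPU memory, rather than only raising a memory limit.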
- RTX 3080 here. With the default settings everything runs on the CPU. After manually changing EMBEDDING_DEVICE and LLM_DEVICE to cuda and setting max_gpu_memory="10GiB", the program reports "Torch not compiled with CUDA enabled".
  How can this be configured to run on the GPU?
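The "Torch not compiled with CUDA enabled" error means the installed PyTorch wheel is a CPU-only build, which matches the reply above about the PyTorch version being installed incorrectly. A quick check, as a sketch (the cu118 index URL is an example; pick the one matching the locally installed CUDA driver):

```python
import torch

# A CPU-only wheel typically reports a version like "2.1.0+cpu",
# torch.version.cuda is None, and cuda.is_available() returns False.
print(torch.__version__)
print(torch.version.cuda)
print(torch.cuda.is_available())

# Reinstalling from the official CUDA wheel index fixes this, for example:
#   pip uninstall torch
#   pip install torch --index-url https://download.pytorch.org/whl/cu118
```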