Help: no torch build matches the CUDA version the author recommends #1889
Replies: 1 comment
-
  I see in the configuration wiki that the author recommends CUDA 12.2, but when I checked https://pytorch.org/get-started/locally/ the highest version offered is CUDA 12.1… How did you all solve this? Total newbie confusion…

  Sure enough, when I run it the LLM ends up on the CPU, and I can only reach the API server; the WEBUI Server flat-out refuses my connection…
-
  Never mind, it also runs smoothly with CUDA 12.1…
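For anyone hitting the same issue: a torch wheel built for CUDA 12.1 generally runs fine on a machine whose driver reports CUDA 12.2, since CUDA 12.x releases are minor-version compatible at the driver level (which matches the reply above). A minimal sketch to check what your install actually supports; it assumes nothing beyond an ordinary Python environment and degrades gracefully if torch is missing:

```python
# Check which CUDA version the installed torch wheel was built against,
# and whether it can actually see a GPU. If torch is not installed,
# report that instead of crashing.
try:
    import torch
    build_cuda = torch.version.cuda  # e.g. "12.1"; None for CPU-only wheels
    gpu_ok = torch.cuda.is_available()
    if gpu_ok:
        status = "cuda"
    elif build_cuda is None:
        status = "cpu-wheel"  # wheel has no CUDA support at all
    else:
        status = "cpu"  # CUDA wheel, but no usable GPU/driver
    print(f"torch {torch.__version__}, built for CUDA {build_cuda}, "
          f"GPU visible: {gpu_ok}")
except ImportError:
    status = "no-torch"
    print("torch is not installed in this environment")
```

If this reports a CPU-only wheel, reinstalling from the cu121 index shown on https://pytorch.org/get-started/locally/ (`pip install torch --index-url https://download.pytorch.org/whl/cu121`) should get the LLM off the CPU.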