Commit 224c265

Update transformer to avoid triton errors.
1 parent f02e1dc commit 224c265

1 file changed: +1 -5 lines changed

articles/gpt-oss/run-transformers.md

Lines changed: 1 addition & 5 deletions
@@ -29,11 +29,7 @@ If you use `bfloat16` instead of MXFP4, memory consumption will be larger (\~48
 It’s recommended to create a fresh Python environment. Install transformers, accelerate, as well as the Triton kernels for MXFP4 compatibility:
 
 ```bash
-pip install -U transformers accelerate torch triton kernels
-```
-
-```bash
-pip install git+https://github.com/triton-lang/triton.git@main#subdirectory=python/triton_kernels
+pip install -U transformers accelerate torch triton==3.4 kernels
 ```
 
 2. **(Optional) Enable multi-GPU**
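For readers following the article rather than the diff: after the pinned install above, the guide goes on to load the MXFP4-quantized checkpoint through the standard transformers API. The snippet below is only an illustrative sketch of that next step; the `openai/gpt-oss-20b` model id, the prompt, and the generation settings are assumptions made for the example, not part of this commit or the patched file.

```python
# Illustrative sketch only (not part of this commit): loading a gpt-oss
# checkpoint once the pinned packages above are installed.
# Assumption: the "openai/gpt-oss-20b" model id and a GPU with enough memory.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "openai/gpt-oss-20b"  # assumed checkpoint name for the example

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",  # follow the checkpoint's configured dtype instead of defaulting to float32
    device_map="auto",   # let accelerate place layers on the available GPU(s)
)

prompt = "Explain MXFP4 quantization in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```

`device_map="auto"` is just one common way to spread a large model across whatever GPUs are present; the article's "(Optional) Enable multi-GPU" step may describe a different mechanism, so treat this as a placeholder rather than that step's actual content.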
