
vllm-project/recipes


This repo hosts community-maintained recipes for running vLLM, answering the question: how do I run model X on hardware Y for task Z?
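For instance, a recipe typically centers on a vllm serve invocation plus hardware-specific flags; the model name and parallelism setting below are purely illustrative:

# Illustrative only: serve a model with tensor parallelism across 2 GPUs
vllm serve Qwen/Qwen3-8B --tensor-parallel-size 2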

Guides

DeepSeek

GLM

InternLM

Llama

OpenAI

Qwen

Seed

Contributing

Please feel free to contribute by adding a new recipe or improving an existing one; just send us a PR!

While the repo is designed to be viewable directly on GitHub (Markdown files are first-class citizens), you can also build the docs as web pages locally:

uv venv                              # create a virtual environment in .venv
source .venv/bin/activate            # activate it
uv pip install -r requirements.txt   # install the docs dependencies
uv run mkdocs serve                  # build and serve the docs locally
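By default, mkdocs serve hosts the site at http://127.0.0.1:8000 and rebuilds automatically when files change.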

License

This project is licensed under the Apache License 2.0 - see the LICENSE file for details.
