
GitHub - vllm-project/vllm: A high-throughput and memory …
vLLM is a fast and easy-to-use library for LLM inference and serving. Originally developed in the Sky Computing Lab at UC Berkeley, vLLM has evolved into a community-driven project with contributions from both academia and industry.
GitHub - vllm-project/media-kit: vLLM Logo Assets
This repository contains official logo assets for vLLM in various formats and styles.
[Misc]: Brand guidelines around vLLM logo; is there a media
Dec 10, 2024 · I am working on a website about an AI model that uses vLLM as a runtime and wanted to use the logo in a diagram. I was also looking for an SVG of this logo with a transparent background, in both light and dark variants, and could not find one. What are the guidelines regarding the logo, and can it be used in this scenario? Thanks
vllm/README.md at main · vllm-project/vllm - GitHub
- If you wish to use vLLM's logo, please refer to [our media kit repo](https://github.com/vllm-project/media-kit).
GitHub - HabanaAI/vllm-fork: A high-throughput and memory …