vLLM
vLLM is a high-throughput and memory-efficient inference and serving engine for large language models (LLMs).
Contribute
Become a financial contributor.
Financial Contributions
vLLM is all of us
Our 18 contributors
Thank you for supporting vLLM.
Simon Mo
Woosuk Kwon
Zhuohan Li
Sequoia: $100,000 USD
Guest: $100,000 USD
SKYWORK AI: $100,000 USD
Dropbox Inc.: $10,000 USD
GitHub Sponsors: $854 USD
Yotta Labs: $200 USD
Tianle Cai: $100 USD
Rymon Yu: $100 USD
Marut Pandya: $100 USD
Budget
Transparent and open finances.
Credit from GitHub Sponsors to vLLM
Credit from Tun Jian, Tan to vLLM
Today's balance: $277,977.69 USD
Total raised: $279,998.80 USD
Total disbursed: $2,021.11 USD
Estimated annual budget: $311,464.13 USD