vLLM
Fiscal Host: Open Source Collective
vLLM is a high-throughput and memory-efficient inference and serving engine for large language models (LLMs).
Contribute
Become a financial contributor.
Financial Contributions
Top financial contributors
Individuals
1. Guest • $100,000 USD since Jun 2024
2. Dropbox Inc. • $10,000 USD since May 2024
3. Tianle Cai • $100 USD since May 2024
4. Rymon • $100 USD since Jun 2024
5. Marut Pandya • $100 USD since Jun 2024

Organizations
1. Sequoia • $100,000 USD since Jun 2024
vLLM is all of us
Our contributors: 10
Thank you for supporting vLLM.

- Simon Mo • Admin
- Woosuk Kwon • Admin
- Zhuohan Li • Admin
- Sequoia • $100,000 USD
- Guest • $100,000 USD
- Dropbox Inc. • $10,000 USD
- Tianle Cai • $100 USD
- Rymon • $100 USD
- Marut Pandya • $100 USD
Budget
Transparent and open finances.
Credit from Marut Pandya to vLLM • +$100.00 USD • Completed
Contribution #771734: logo sticker printing for Berkeley LLM meetup
Category: Marketing, Design, & PR

Reimbursement #207569 from Kaichao You to vLLM • -$94.93 USD • Paid
meetup

Reimbursement #207261 • -$23.36 USD • Paid
meetup
travel
Today’s balance: $188,794.29 USD
Total raised: $188,965.21 USD
Total disbursed: $170.92 USD
Estimated annual budget: $210,300.00 USD