vLLM

Open Source Collective · github · python

vLLM is a high-throughput and memory-efficient inference and serving engine for large language models (LLMs).

Contribute

Become a financial contributor.

Financial Contributions

Custom contribution (Donation): make a custom one-time or recurring contribution.


Top financial contributors

Individuals

1. Guest - $100,000 USD since Jun 2024
2. SKYWORK AI - $100,000 USD since Aug 2024
3. Dropbox Inc. - $10,000 USD since May 2024
4. Kindroid - $7,500 USD since Dec 2024
5. Aman Bhargava - $760 USD since Mar 2025
6. Ekagra Ranjan - $151 USD since Jan 2026
7. WorldSeek AI - $150 USD since May 2025
8. Simin Zhou - $111 USD since Jun 2025
9. Tianle Cai - $100 USD since May 2024
10. Rymon Yu - $100 USD since Jun 2024
11. Marut Pandya - $100 USD since Jun 2024
12. Lai Wei - $100 USD since Apr 2025
13. Jamin Ball - $100 USD since Oct 2025
14. Hui Liu - $30 USD since Feb 2025
15. Calvin Zhou - $25 USD since Jul 2024

Organizations

1. Sequoia - $100,000 USD since Jun 2024
2. The House Fund - $20,000 USD since Oct 2025
3. GitHub Sponsors - $3,018.92 USD since Jul 2024
4. WorldSeek AI - $300 USD since May 2025
5. Yotta Labs - $200 USD since Aug 2024

vLLM is all of us

Our contributors (32)

Thank you for supporting vLLM.

Kaichao You (Admin) - $20 USD - "vLLM is a great community :)"
Sequoia - $100,000 USD
Guest - $100,000 USD
SKYWORK AI - $100,000 USD
The House Fund - $20,000 USD
Dropbox Inc. - $10,000 USD
Kindroid - $7,500 USD
GitHub Sponsors - $3,019 USD

Budget

Transparent and open finances.

Contributions to the vLLM project (Maintenance and Development) - from guoty to vLLM - -¥8,000.00 CNY (~$1,165.56 USD) - Paid

Contributions to the vLLM project (Travel) - from princepride to vLLM - -$1,500.00 SGD (~$1,177.84 USD) - Approved

DISCOURSE.ORG (Hosting & Subscriptions) - from Discourse to vLLM - -$100.00 USD - Paid
Today’s balance: $216,074.09 USD
Total raised: $307,633.07 USD
Total disbursed: $91,558.98 USD
Estimated annual budget: $22,924.77 USD

About

Our team

Kaichao You (Admin) - "vLLM is a great community :)"