
Releases: BerriAI/litellm

v1.61.11-nightly

20 Feb 06:30
cc77138

What's Changed

Full Changelog: v1.61.9-nightly...v1.61.11-nightly

## Docker Run LiteLLM Proxy

```
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.61.11-nightly
```

Don't want to maintain your internal proxy? get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
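
Once the container is up, you can sanity-check the proxy by hitting its OpenAI-compatible `/chat/completions` route (the same endpoint exercised in the load test below). A minimal sketch, assuming you have at least one model configured on the proxy; `gpt-3.5-turbo` and `sk-1234` are placeholders, not values from this release:

```
# Hypothetical smoke test against the locally running proxy.
# Replace "gpt-3.5-turbo" with a model you have actually configured,
# and "sk-1234" with your own LITELLM_MASTER_KEY if the proxy requires auth.
curl http://localhost:4000/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer sk-1234" \
  -d '{
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "Hello from the LiteLLM proxy"}]
  }'
```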

## Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Failed ❌ | 120.0 | 146.33082595240526 | 6.457801208431416 | 6.457801208431416 | 1933 | 1933 | 97.35924100004922 | 4080.5825460000165 |
| Aggregated | Failed ❌ | 120.0 | 146.33082595240526 | 6.457801208431416 | 6.457801208431416 | 1933 | 1933 | 97.35924100004922 | 4080.5825460000165 |

v1.61.9.dev1

20 Feb 04:41

What's Changed

Full Changelog: v1.61.9-nightly...v1.61.9.dev1

## Docker Run LiteLLM Proxy

```
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.61.9.dev1
```

Don't want to maintain your internal proxy? get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

## Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Failed ❌ | 180.0 | 209.72659395983104 | 6.321588488030633 | 6.321588488030633 | 1892 | 1892 | 147.1097109999846 | 3268.0857999999944 |
| Aggregated | Failed ❌ | 180.0 | 209.72659395983104 | 6.321588488030633 | 6.321588488030633 | 1892 | 1892 | 147.1097109999846 | 3268.0857999999944 |

v1.61.9-nightly

19 Feb 08:03

What's Changed

New Contributors

Full Changelog: v1.61.8-nightly...v1.61.9-nightly

## Docker Run LiteLLM Proxy

```
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.61.9-nightly
```

Don't want to maintain your internal proxy? get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

## Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Failed ❌ | 180.0 | 203.54644847482734 | 6.3054769799102575 | 6.3054769799102575 | 1887 | 1887 | 146.3379119999786 | 3805.3281139999626 |
| Aggregated | Failed ❌ | 180.0 | 203.54644847482734 | 6.3054769799102575 | 6.3054769799102575 | 1887 | 1887 | 146.3379119999786 | 3805.3281139999626 |

v1.61.8-nightly

18 Feb 07:07

What's Changed

Full Changelog: v1.61.7...v1.61.8-nightly

## Docker Run LiteLLM Proxy

```
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.61.8-nightly
```

Don't want to maintain your internal proxy? get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

## Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Failed ❌ | 120.0 | 129.19425708375965 | 6.54112229454407 | 6.54112229454407 | 1958 | 1958 | 94.39574200001744 | 2020.834275000027 |
| Aggregated | Failed ❌ | 120.0 | 129.19425708375965 | 6.54112229454407 | 6.54112229454407 | 1958 | 1958 | 94.39574200001744 | 2020.834275000027 |

v1.61.7.dev1

18 Feb 07:07

What's Changed

Full Changelog: v1.61.7...v1.61.7.dev1

v1.61.7-nightly

18 Feb 06:49

What's Changed

New Contributors

Full Changelog: v1.61.6-nightly...v1.61.7-nightly

v1.61.7

18 Feb 06:57

What's Changed

New Contributors

Full Changelog: v1.61.3...v1.61.7

## Docker Run LiteLLM Proxy

```
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.61.7
```

Don't want to maintain your internal proxy? get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

## Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Failed ❌ | 180.0 | 206.98769618433857 | 6.145029010811349 | 6.145029010811349 | 1839 | 1839 | 146.21495699998377 | 3174.8161250000067 |
| Aggregated | Failed ❌ | 180.0 | 206.98769618433857 | 6.145029010811349 | 6.145029010811349 | 1839 | 1839 | 146.21495699998377 | 3174.8161250000067 |

v1.61.6.dev1

18 Feb 04:04

What's Changed

New Contributors

Full Changelog: v1.61.6-nightly...v1.61.6.dev1

## Docker Run LiteLLM Proxy

```
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.61.6.dev1
```

Don't want to maintain your internal proxy? get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

## Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Failed ❌ | 170.0 | 197.04136517618934 | 6.316924319787487 | 6.316924319787487 | 1890 | 1890 | 142.7094059999945 | 2646.323271999961 |
| Aggregated | Failed ❌ | 170.0 | 197.04136517618934 | 6.316924319787487 | 6.316924319787487 | 1890 | 1890 | 142.7094059999945 | 2646.323271999961 |

v1.61.6-nightly

16 Feb 09:30

What's Changed

Full Changelog: v1.61.5-nightly...v1.61.6-nightly

## Docker Run LiteLLM Proxy

```
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.61.6-nightly
```

Don't want to maintain your internal proxy? get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

## Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Failed ❌ | 170.0 | 197.37858561234376 | 6.172709160882249 | 6.172709160882249 | 1847 | 1847 | 139.8097940000298 | 3194.1706680000266 |
| Aggregated | Failed ❌ | 170.0 | 197.37858561234376 | 6.172709160882249 | 6.172709160882249 | 1847 | 1847 | 139.8097940000298 | 3194.1706680000266 |

v1.61.5-nightly

16 Feb 02:50

What's Changed

New Contributors

Full Changelog: v1.61.3.dev1...v1.61.5-nightly

## Docker Run LiteLLM Proxy

```
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.61.5-nightly
```

Don't want to maintain your internal proxy? get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

## Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| /chat/completions | Failed ❌ | 150.0 | 169.92952748954406 | 6.233287189548679 | 6.233287189548679 | 1865 | 1865 | 130.2254270000276 | 1515.568768999998 |
| Aggregated | Failed ❌ | 150.0 | 169.92952748954406 | 6.233287189548679 | 6.233287189548679 | 1865 | 1865 | 130.2254270000276 | 1515.568768999998 |