Update README.md [skip ci]
Written in golang, it is very easy to install (a single binary with no dependencies) and configure (a single YAML file). To get started, download a pre-built binary, use one of the provided Docker images, or install it via Homebrew.
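As a rough illustration of the single-YAML-file setup, a minimal config might look like the sketch below. The model name, GGUF path, and port are placeholders, not values from this commit; the `models`/`cmd`/`proxy` keys follow llama-swap's documented configuration format.

```yaml
# Minimal illustrative config.yaml for llama-swap.
# Model name, file path, and port are placeholders.
models:
  "llama3":
    # Command llama-swap runs when this model is requested.
    cmd: llama-server --port 9001 -m /models/llama3.gguf
    # Upstream address llama-swap proxies requests to.
    proxy: http://127.0.0.1:9001
```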
> [!NOTE]
> ⭐️ Star this project to help others discover it! Every star helps boost visibility in searches and showcases community support.
## Features:
- ✅ Easy to deploy: single binary with no dependencies
@@ -210,4 +207,7 @@ For Python based inference servers like vllm or tabbyAPI it is recommended to ru
## Star History
> [!NOTE]
> ⭐️ Star this project to help others discover it!

[Star History Chart](https://www.star-history.com/#mostlygeek/llama-swap&Date)