andreas/llama-swap
llama-swap/proxy at commit 5c97299e7b2ff5d9f40e72c1a6054d128610a04f
Latest commit: 5c97299e7b by Benson Wong, 2025-03-14 21:07:52 -07:00
Add support for sending a custom model name to upstream (#69) (#71)
* add test for splitRequestedModel()
* Add `useModelName` parameter to model configuration
* add docs to README
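The commit above describes a `useModelName` setting that makes the proxy send a different model name to the upstream server than the one the client requested. Below is a minimal Go sketch of that idea: rewriting the "model" field of an OpenAI-style JSON request body before forwarding. The helper name rewriteModelName and the example model names are illustrative assumptions, not llama-swap's actual code.

package main

import (
	"encoding/json"
	"fmt"
)

// rewriteModelName replaces the "model" field in an OpenAI-style JSON request
// body with a configured upstream name. Hypothetical illustration only; this
// is not the implementation in proxymanager.go.
func rewriteModelName(body []byte, upstreamName string) ([]byte, error) {
	var req map[string]any
	if err := json.Unmarshal(body, &req); err != nil {
		return nil, err
	}
	req["model"] = upstreamName
	return json.Marshal(req)
}

func main() {
	// Client asks for an alias; the proxy forwards the configured upstream name.
	in := []byte(`{"model":"my-alias","messages":[{"role":"user","content":"hi"}]}`)
	out, err := rewriteModelName(in, "qwen2.5-coder") // assumed upstream model name
	if err != nil {
		panic(err)
	}
	fmt.Println(string(out))
}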
File | Last commit | Date
html | Refactor UI (#33) | 2024-12-23 19:48:59 -08:00
config_test.go | remove cmd_stop configuration and functionality from PR #40 (#44) | 2025-01-31 12:42:44 -08:00
config.go | Add support for sending a custom model name to upstream (#69) (#71) | 2025-03-14 21:07:52 -07:00
helpers_test.go | Improve Concurrency and Parallel Request Handling (#19) | 2024-11-30 15:24:42 -08:00
html_files.go | Refactor UI (#33) | 2024-12-23 19:48:59 -08:00
logMonitor_test.go | Improve LogMonitor to handle empty writes and ensure buffer immutability | 2024-11-02 10:41:23 -07:00
logMonitor.go | fix bad logging due to wrong []byte used #28 | 2024-12-16 16:22:14 -08:00
process_test.go | Improve Continuous integration and fix concurrency bugs (#66) | 2025-03-11 10:39:14 -07:00
process.go | fix some potentially confusing Process.start() comment | 2025-03-11 11:00:45 -07:00
proxymanager_loghandlers.go | Refactor UI (#33) | 2024-12-23 19:48:59 -08:00
proxymanager_test.go | Add support for sending a custom model name to upstream (#69) (#71) | 2025-03-14 21:07:52 -07:00
proxymanager.go | Add support for sending a custom model name to upstream (#69) (#71) | 2025-03-14 21:07:52 -07:00