Implement Multi-Process Handling (#7)
Refactor code to support starting multiple back end llama.cpp servers. This functionality is exposed as `profiles` to create a simple configuration format (sketched below). Changes:

* refactor proxy tests to get ready for multi-process support
* update proxy/ProxyManager to support multiple processes (#7)
* add support for Groups in configuration
* improve handling of Model alias configs
* implement multi-model swapping
* improve code clarity for swapModel
* improve docs, rename groups to profiles in config
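As a rough illustration of the configuration shape this commit enables, here is a minimal Go sketch of a profiles-style config, where each profile groups model names that should run together. All field and key names (`Models`, `Profiles`, `Cmd`, `Aliases`) are assumptions for illustration, not the project's actual schema:

```go
// Hypothetical sketch of a profiles-style config. Field names and the
// YAML layout are illustrative assumptions, not the real config format.
package config

// ModelConfig describes how to launch one back end llama.cpp server.
type ModelConfig struct {
	Cmd     string   `yaml:"cmd"`     // command line used to start the server
	Aliases []string `yaml:"aliases"` // alternate names resolving to this model
}

// Config maps model names to their launch settings, and profiles to the
// set of models that should be started and swapped together, e.g.:
//
//   profiles:
//     coding:
//       - qwen-coder
//       - embedding-model
type Config struct {
	Models   map[string]ModelConfig `yaml:"models"`
	Profiles map[string][]string    `yaml:"profiles"`
}
```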
```diff
@@ -113,7 +113,7 @@ func (p *Process) Stop() {
 	p.Lock()
 	defer p.Unlock()
 
-	if !p.isRunning {
+	if !p.isRunning || p.cmd == nil || p.cmd.Process == nil {
 		return
 	}
 
```
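The added nil checks guard against `Stop()` being called on a `Process` that was never started, or whose start failed: in either case `p.cmd` or `p.cmd.Process` would still be nil and dereferencing them would panic. A minimal sketch of the surrounding type, assuming field names beyond those visible in the diff and an assumed SIGTERM-based shutdown path:

```go
// Minimal sketch of the Process type around this change. Only Lock/Unlock,
// isRunning, and cmd appear in the diff; everything else here is assumed.
package proxy

import (
	"os/exec"
	"sync"
	"syscall"
)

type Process struct {
	sync.Mutex
	isRunning bool
	cmd       *exec.Cmd
}

func (p *Process) Stop() {
	p.Lock()
	defer p.Unlock()

	// Without the nil checks, stopping a Process that never started
	// (cmd == nil) or whose start failed (cmd.Process == nil) would
	// panic on the Signal call below.
	if !p.isRunning || p.cmd == nil || p.cmd.Process == nil {
		return
	}

	// Assumed shutdown path: ask the back end server to exit gracefully.
	p.cmd.Process.Signal(syscall.SIGTERM)
	p.isRunning = false
}
```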