Local Models (Ollama, vLLM)
Koreshield isn't just for cloud APIs. You can protect your local inference endpoints too.
Documentation is in progress; the full guide will cover configuring the proxy to point at localhost:11434 (Ollama's default port) or localhost:8000 (vLLM's default OpenAI-compatible server).
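Until that guide lands, here is a minimal sketch of the intended flow, with clearly labeled assumptions: the proxy address (localhost:8080) and the idea that Koreshield forwards OpenAI-compatible traffic to a configured local upstream are assumptions, not documented options. Both backends already speak the OpenAI wire format, Ollama at localhost:11434/v1 and vLLM at localhost:8000/v1, so the client side looks the same either way.

```python
# Sketch only: route an OpenAI-compatible client through a Koreshield
# proxy that forwards to a local backend. The proxy URL below is an
# assumption; point Koreshield's upstream at your Ollama or vLLM server.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8080/v1",  # assumed Koreshield proxy address
    api_key="local",  # local backends ignore the key, but the client requires one
)

# Upstream would be http://localhost:11434/v1 (Ollama)
# or http://localhost:8000/v1 (vLLM's OpenAI-compatible server).
response = client.chat.completions.create(
    model="llama3",  # any model name your local backend actually serves
    messages=[{"role": "user", "content": "Hello from behind the shield."}],
)
print(response.choices[0].message.content)
```

Because both backends expose the same API surface, switching between Ollama and vLLM should only require changing the proxy's upstream address, not the client code.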