# LangChain Integration

Use LangChain normally, but route model calls through the KoreShield proxy.
## Runtime Contract

- Chat route: `POST /v1/chat/completions`
- Optional RAG pre-scan: `POST /v1/rag/scan`
- Auth: Bearer JWT, `X-API-Key`, or `ks_access_token` cookie
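For clients that call the proxy directly, the request shape can be sketched as below. The payload format assumes the proxy is OpenAI-compatible (as the `/v1/chat/completions` path suggests); `KORESHIELD_JWT` is the same illustrative environment variable used later in this page.

```python
import json
import os

# Build an OpenAI-style chat request for the KoreShield proxy.
# Bearer JWT is used here; X-API-Key or the ks_access_token cookie
# are the alternative auth mechanisms listed in the runtime contract.
def build_chat_request(prompt: str) -> tuple[dict, dict]:
    headers = {
        "Authorization": f"Bearer {os.environ.get('KORESHIELD_JWT', '')}",
        "Content-Type": "application/json",
    }
    payload = {
        "model": "deepseek-chat",
        "messages": [{"role": "user", "content": prompt}],
    }
    return headers, payload

headers, payload = build_chat_request("Summarize this safely")
print(json.dumps(payload))
```

The same headers work for `/v1/rag/scan`; only the payload differs.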
## Python Example

```python
import os

from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    model="deepseek-chat",
    base_url="http://localhost:8000/v1",
    # Placeholder only: the proxy authenticates via the JWT header below.
    api_key="unused-by-koreshield-provider-client",
    default_headers={
        "Authorization": f"Bearer {os.environ['KORESHIELD_JWT']}"
    },
)

result = llm.invoke("Summarize this safely")
print(result.content)
```
## RAG Pattern

- Retrieve documents from your vector store.
- Call `/v1/rag/scan` with `user_query` + documents.
- Filter/block on unsafe result.
- Send safe context to `/v1/chat/completions`.
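The filter/block step can be sketched as pure logic. The response field names used here (`results`, `unsafe`) are assumptions, not the documented schema; adjust them to the actual `/v1/rag/scan` response.

```python
# Filter retrieved documents using a /v1/rag/scan response.
# Hypothetical schema: one verdict per document, each with an
# "unsafe" boolean. Adapt field names to the real response.
def filter_safe_documents(documents: list[str], scan_response: dict) -> list[str]:
    verdicts = scan_response.get("results", [])
    return [
        doc
        for doc, verdict in zip(documents, verdicts)
        if not verdict.get("unsafe", True)  # fail closed: block when unsure
    ]

docs = ["benign context", "prompt-injection attempt"]
scan = {"results": [{"unsafe": False}, {"unsafe": True}]}
print(filter_safe_documents(docs, scan))  # only the benign doc survives
```

Failing closed (treating missing verdicts as unsafe) keeps unscanned text out of the final prompt.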
## Notes

If a third-party LangChain utility expects older scan-style SDK methods, adapt it to call the two server endpoints above.
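One way to write such an adapter is a thin wrapper that exposes a legacy-style `scan` method and forwards to the server endpoint. The class name, method name, and response handling are illustrative; the HTTP transport is injected so it can be swapped for `requests.post` or mocked in tests.

```python
from typing import Callable

# Hypothetical adapter: maps an older scan-style method onto the
# /v1/rag/scan endpoint. `post` is any callable(url, json=..., headers=...)
# returning the decoded JSON response as a dict.
class KoreShieldScanAdapter:
    def __init__(self, base_url: str, token: str, post: Callable[..., dict]):
        self.base_url = base_url.rstrip("/")
        self.headers = {"Authorization": f"Bearer {token}"}
        self.post = post

    def scan(self, user_query: str, documents: list[str]) -> dict:
        return self.post(
            f"{self.base_url}/v1/rag/scan",
            json={"user_query": user_query, "documents": documents},
            headers=self.headers,
        )

# Usage with a fake transport (no server needed):
fake_post = lambda url, json, headers: {"url": url, "n_docs": len(json["documents"])}
adapter = KoreShieldScanAdapter("http://localhost:8000", "jwt", fake_post)
result = adapter.scan("q", ["d1", "d2"])
print(result)
```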