```python
from ai_infra.llm import LLM
```

Bases: `BaseLLM`

Direct model convenience interface (no agent graph). The `LLM` class provides a simple API for chat-based interactions with language models. Use it when you don't need tool calling.

Example - Basic usage:

```python
llm = LLM()
response = llm.chat("What is the capital of France?")
print(response.content)  # "Paris is the capital of France."
```

Example - With structured output:
```python
from pydantic import BaseModel

class Answer(BaseModel):
    city: str
    country: str

llm = LLM()
result = llm.chat(
    "What is the capital of France?",
    output_schema=Answer,
)
print(result.city)  # "Paris"
```

Example - Streaming tokens:
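Structured output of this kind is typically implemented by having the model emit JSON that is then validated against the Pydantic schema. A minimal sketch of that validation step using plain Pydantic (independent of `ai_infra`; the raw JSON string here is a hypothetical model reply):

```python
from pydantic import BaseModel

class Answer(BaseModel):
    city: str
    country: str

# Hypothetical raw JSON reply from the model, validated into the schema.
raw = '{"city": "Paris", "country": "France"}'
answer = Answer.model_validate_json(raw)
print(answer.city)  # "Paris"
```

Validation raises `pydantic.ValidationError` if the reply does not match the schema, which is why a typed `result.city` is safe to access afterwards.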
```python
llm = LLM()
async for token, meta in llm.stream_tokens("Tell me a story"):
    print(token, end="", flush=True)
```
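Because `stream_tokens` is an async generator, it must be consumed inside a coroutine and driven by an event loop. A minimal runnable sketch of that pattern using a stub generator (the stub stands in for the real `llm.stream_tokens`, which requires `ai_infra`):

```python
import asyncio

# Stub async generator standing in for llm.stream_tokens(prompt);
# it yields (token, metadata) pairs just like the documented API.
async def stream_tokens(prompt):
    for token in ["Once", " upon", " a", " time"]:
        yield token, {"prompt": prompt}

async def main():
    chunks = []
    async for token, meta in stream_tokens("Tell me a story"):
        chunks.append(token)
    return "".join(chunks)

print(asyncio.run(main()))  # "Once upon a time"
```

In a script, wrap the `async for` loop in a coroutine and call `asyncio.run(...)` as above; in an already-async context (e.g. a web handler), just `await` the loop directly.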