
LM Studio + Pydantic AI: A Quick Example

This example demonstrates how to use LM Studio with Pydantic AI through its OpenAI-compatible interface, showcasing a simple query for city location information. It highlights the agent's ability to extract structured data from text.

Author: James Lau, Indie App Developer

LM Studio + Pydantic AI

This blog post illustrates a quick example of integrating LM Studio with Pydantic AI, using Pydantic AI's OpenAI-compatible client to extract structured data from natural language prompts.

Setting up the Environment

The code snippet below utilizes pydantic_ai to interact with an LLM (in this case, gemma-3-4b-it-qat served locally by LM Studio). It leverages Pydantic data models so that Pydantic AI can extract and validate structured information from the LLM's output.

from pydantic import BaseModel

from pydantic_ai import Agent
from pydantic_ai.models.openai import OpenAIModel
from pydantic_ai.providers.openai import OpenAIProvider

# The structured output we want the agent to return.
class CityLocation(BaseModel):
    city: str
    country: str

# Point Pydantic AI's OpenAI-compatible client at LM Studio's local server
# (LM Studio serves an OpenAI-compatible API at http://127.0.0.1:1234/v1 by default).
lm_studio_model = OpenAIModel(
    model_name='gemma-3-4b-it-qat',
    provider=OpenAIProvider(base_url='http://127.0.0.1:1234/v1'),
)
agent = Agent(lm_studio_model, output_type=CityLocation)

# run_sync() is the synchronous entry point; use `await agent.run(...)` inside async code.
result = agent.run_sync('Where were the olympics held in 2012?')
print(result.output)
#> city='London' country='United Kingdom'
print(result.usage())
#> Usage(requests=1, request_tokens=57, response_tokens=8, total_tokens=65)
# The usage report shows the tokens consumed by the prompt and the LLM's response.

Explanation

The code defines a CityLocation Pydantic model with two fields: city and country. The Agent is initialized with lm_studio_model (gemma-3-4b-it-qat served by LM Studio) and declares CityLocation as its output type. The agent.run_sync() method sends a prompt to the LLM asking where the 2012 Olympics were held.

Pydantic AI validates the LLM's response against the CityLocation model and places the resulting instance in result.output; printing it shows city='London' country='United Kingdom'.
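
Because result.output is an ordinary Pydantic model instance rather than a string, its fields can be accessed directly. A minimal sketch, assuming Pydantic v2 (model_dump and model_dump_json are standard Pydantic serialization methods):

# Continuing from the example above: result.output is a CityLocation instance.
location = result.output
print(location.city)       # London
print(location.country)    # United Kingdom

# Standard Pydantic serialization also works on the extracted result.
print(location.model_dump())       # {'city': 'London', 'country': 'United Kingdom'}
print(location.model_dump_json())  # {"city":"London","country":"United Kingdom"}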

Key Concepts

  • Pydantic: A Python library for data validation and parsing based on type-annotated models (see the sketch after this list).
  • LM Studio: A desktop application for downloading and running LLMs locally; it exposes an OpenAI-compatible API server, by default at http://127.0.0.1:1234/v1.
  • Pydantic AI: An agent framework from the Pydantic team that uses Pydantic models to obtain structured, validated output from LLMs.
  • OpenAIProvider: Points Pydantic AI's OpenAI-compatible client at any server that speaks the OpenAI API; in this example, LM Studio's local server.
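
Because CityLocation is an ordinary Pydantic model, the validation the agent relies on can also be exercised directly, without an LLM in the loop. A minimal sketch, assuming Pydantic v2:

from pydantic import BaseModel, ValidationError

class CityLocation(BaseModel):
    city: str
    country: str

# Well-formed data parses into a model instance.
print(CityLocation.model_validate({'city': 'London', 'country': 'United Kingdom'}))
#> city='London' country='United Kingdom'

# Malformed data raises a ValidationError instead of passing through silently.
try:
    CityLocation.model_validate({'city': 'London'})
except ValidationError as exc:
    print(exc.error_count(), 'validation error')  # the 'country' field is missing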

This example demonstrates a simple but powerful way to leverage LLMs for extracting specific information and converting it into a structured format using Pydantic.
