Python Examples

Complete runnable Python examples for Blazen

Five complete, runnable examples that demonstrate core Blazen workflow patterns in Python using the subclass event model.


Basic Workflow

A three-step sequential pipeline: `StartEvent` → `GreetEvent` → `FormattedEvent` → `StopEvent`.

```python
from blazen import Context, Event, StartEvent, StopEvent, step

class GreetEvent(Event):
    name: str

@step
async def parse_input(ctx: Context, ev: StartEvent):
    return GreetEvent(name=ev.name)

@step
async def greet(ctx: Context, ev: GreetEvent):
    return StopEvent(result={"greeting": f"Hello, {ev.name}!"})
```

Run it with:

```shell
python examples/basic_workflow.py
```
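The same pattern can be sketched without the framework: each handler consumes one typed event and returns the next one in the chain. The dataclasses and `run` driver below are plain-Python stand-ins for Blazen's event types and workflow runner, not its actual API.

```python
import asyncio
from dataclasses import dataclass

# Plain dataclasses standing in for Blazen's typed events.
@dataclass
class StartEvent:
    name: str

@dataclass
class GreetEvent:
    name: str

@dataclass
class StopEvent:
    result: dict

async def parse_input(ev: StartEvent) -> GreetEvent:
    return GreetEvent(name=ev.name)

async def greet(ev: GreetEvent) -> StopEvent:
    return StopEvent(result={"greeting": f"Hello, {ev.name}!"})

async def run(name: str) -> dict:
    # Drive the pipeline by hand: StartEvent -> GreetEvent -> StopEvent.
    ev = await parse_input(StartEvent(name=name))
    stop = await greet(ev)
    return stop.result

result = asyncio.run(run("Ada"))
print(result)  # {'greeting': 'Hello, Ada!'}
```

The framework's value is exactly what this sketch does by hand: routing each returned event to the step typed to receive it.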

Streaming Workflow

Publishes typed progress events while processing via ctx.write_event_to_stream().

```python
class ProgressEvent(Event):
    step: int
```

Inside a step, publish progress:

```python
ctx.write_event_to_stream(ProgressEvent(step=i))
```

On the caller side, consume the stream:

```python
async for event in handler.stream_events():
    print(event.event_type)
```

Run it with:

```shell
python examples/streaming_workflow.py
```
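Under the hood this is a producer/consumer handoff. A minimal sketch with an `asyncio.Queue` standing in for `ctx.write_event_to_stream()` and `handler.stream_events()` (the queue and sentinel are illustrative, not Blazen internals):

```python
import asyncio
from dataclasses import dataclass

@dataclass
class ProgressEvent:
    step: int

async def worker(queue: asyncio.Queue) -> None:
    for i in range(3):
        await queue.put(ProgressEvent(step=i))  # stands in for write_event_to_stream
    await queue.put(None)                       # sentinel: stream finished

async def consume() -> list:
    queue: asyncio.Queue = asyncio.Queue()
    task = asyncio.create_task(worker(queue))
    seen = []
    # Stands in for `async for event in handler.stream_events():`
    while (event := await queue.get()) is not None:
        seen.append(event.step)
    await task
    return seen

steps = asyncio.run(consume())
print(steps)  # [0, 1, 2]
```

Because the consumer awaits the queue, progress events arrive as they are produced rather than after the workflow completes.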

Branching Workflow

Conditional fan-out by returning a list of typed events.

```python
class PositiveEvent(Event):
    text: str

class NegativeEvent(Event):
    text: str
```

A step fans out by returning a list of events, one per branch:

```python
return [PositiveEvent(text=text), NegativeEvent(text=text)]
```

Run it with:

```shell
python examples/branching_workflow.py
```
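The fan-out mechanics can be sketched with a plain type-to-handler dispatch table. The `HANDLERS` dict and `run` driver are hypothetical stand-ins for Blazen's dispatcher, not its real internals:

```python
import asyncio
from dataclasses import dataclass

@dataclass
class PositiveEvent:
    text: str

@dataclass
class NegativeEvent:
    text: str

async def classify(text: str) -> list:
    # Fan-out: both branches receive the text.
    return [PositiveEvent(text=text), NegativeEvent(text=text)]

async def handle_positive(ev: PositiveEvent) -> str:
    return f"+ {ev.text}"

async def handle_negative(ev: NegativeEvent) -> str:
    return f"- {ev.text}"

# Each event type is routed to the handler registered for it.
HANDLERS = {PositiveEvent: handle_positive, NegativeEvent: handle_negative}

async def run(text: str) -> list:
    events = await classify(text)
    return [await HANDLERS[type(ev)](ev) for ev in events]

results = asyncio.run(run("great product"))
print(results)  # ['+ great product', '- great product']
```

Conditional branching is then just returning only the events for the branches that should run.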

LLM RAG Workflow

Multi-step RAG pipeline with context sharing between steps. Uses typed ChatMessage and CompletionResponse:

```python
import os

from blazen import ChatMessage, CompletionModel, CompletionResponse

model = CompletionModel.openai(os.environ["OPENAI_API_KEY"])
response: CompletionResponse = await model.complete([
    ChatMessage.system("Answer based on the provided documents."),
    ChatMessage.user(query),
])
print(response.content)  # typed attribute access
print(response.usage)    # TokenUsage with .prompt_tokens, .completion_tokens, .total_tokens
```

Steps share retrieved documents through the workflow context:

```python
ctx.set("documents", docs)
```

Run it with:

```shell
python examples/llm_rag_workflow.py
```
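The context-sharing half of the pipeline can be sketched with a dict-backed context object that mimics `ctx.set()`/`ctx.get()`. The `Context` class, `retrieve`, and `answer` steps here are illustrative stand-ins, not Blazen's implementation:

```python
import asyncio

class Context:
    """Minimal stand-in for a workflow context shared across steps."""
    def __init__(self):
        self._store = {}
    def set(self, key, value):
        self._store[key] = value
    def get(self, key, default=None):
        return self._store.get(key, default)

async def retrieve(ctx: Context, query: str) -> None:
    docs = [f"doc about {query}"]  # pretend retrieval result
    ctx.set("documents", docs)     # share with downstream steps

async def answer(ctx: Context, query: str) -> str:
    docs = ctx.get("documents", [])
    return f"Answer to {query!r} using {len(docs)} document(s)"

async def run(query: str) -> str:
    ctx = Context()
    await retrieve(ctx, query)
    return await answer(ctx, query)

answer_text = asyncio.run(run("pricing"))
print(answer_text)  # Answer to 'pricing' using 1 document(s)
```

The key property is that the retrieval step and the answering step never call each other directly; the context is the only channel between them.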

Human-in-the-Loop

Side-effect steps that pause for external input with typed review events.

```python
class ReviewComplete(Event):
    pass
```

When the external review finishes, signal the waiting workflow; a side-effect step that only signals returns `None`:

```python
ctx.send_event(ReviewComplete())
return None
```

Run it with:

```shell
python examples/human_in_the_loop.py
```
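The pause-and-resume shape can be sketched with an `asyncio.Event` standing in for the external review signal that `ctx.send_event(ReviewComplete())` would deliver. Everything here is illustrative, not the Blazen API:

```python
import asyncio

async def workflow(review_done: asyncio.Event, log: list) -> None:
    log.append("drafted")
    await review_done.wait()  # pause until a human signals completion
    log.append("published")

async def reviewer(review_done: asyncio.Event, log: list) -> None:
    await asyncio.sleep(0)    # simulate the out-of-band human action
    log.append("reviewed")
    review_done.set()         # the equivalent of sending ReviewComplete

async def run() -> list:
    review_done = asyncio.Event()
    log: list = []
    await asyncio.gather(
        workflow(review_done, log),
        reviewer(review_done, log),
    )
    return log

events_log = asyncio.run(run())
print(events_log)  # ['drafted', 'reviewed', 'published']
```

The workflow task suspends at `wait()` without blocking the event loop, so the review can arrive from any other task, callback, or server handler.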