Open Source · AGPL-3.0 License

Blazen

Event-driven AI workflow engine with first-class LLM integration, written in Rust.

Type-safe events · 15+ LLM providers · Multi-workflow pipelines · Python & TypeScript bindings

Everything you need to build AI workflows

Blazen gives you the primitives to build complex, production-ready AI pipelines with minimal boilerplate.

Event-Driven Architecture

Type-safe events with zero boilerplate. Derive macros generate all the wiring so you focus on logic, not plumbing.
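Conceptually, the derive macro stands in for trait boilerplate you would otherwise write by hand. A plain-Rust sketch of that idea (illustrative only; Blazen's generated wiring is richer than this):

```rust
// What #[derive(Event)] conceptually saves you from writing: a trait
// implementation that gives each event a stable name for routing.
trait Event {
    fn event_name(&self) -> &'static str;
}

struct GreetEvent {
    name: String,
}

// The hand-written boilerplate a derive macro would generate for you.
impl Event for GreetEvent {
    fn event_name(&self) -> &'static str {
        "GreetEvent"
    }
}

fn main() {
    let e = GreetEvent { name: "Blazen".to_string() };
    println!("{} -> {}", e.event_name(), e.name); // GreetEvent -> Blazen
}
```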

🔀 Multi-Workflow Pipelines

Compose sequential and parallel stages into pipelines. Pause, resume, and stream results across stages.
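The two composition modes can be sketched in a few lines of std-only Rust (a conceptual illustration, not Blazen's pipeline API): sequential stages fold each output into the next stage's input, while a parallel stage fans the same input out to worker threads.

```rust
use std::thread;

// Sequential composition: each stage consumes the previous stage's output.
fn sequential(input: i64, stages: &[fn(i64) -> i64]) -> i64 {
    stages.iter().fold(input, |acc, stage| stage(acc))
}

// Parallel fan-out: every stage receives the same input concurrently.
fn parallel(input: i64, stages: &[fn(i64) -> i64]) -> Vec<i64> {
    let handles: Vec<_> = stages
        .iter()
        .map(|&stage| thread::spawn(move || stage(input)))
        .collect();
    handles.into_iter().map(|h| h.join().unwrap()).collect()
}

fn main() {
    println!("{}", sequential(5, &[|n| n * 2, |n| n + 1])); // 11
    println!("{:?}", parallel(5, &[|n| n * 2, |n| n + 1])); // [10, 6]
}
```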

🤖 15+ LLM Providers

OpenAI, Claude, Gemini, Azure, fal.ai, OpenRouter, Groq, and more. Swap providers without changing workflow logic.
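Provider swapping of this kind usually rests on a trait boundary. A minimal sketch of the pattern (plain Rust with a hypothetical `LlmProvider` trait and stand-in provider, not Blazen's actual abstraction):

```rust
// Workflow logic depends only on the trait, so changing providers is a
// one-line change at construction time.
trait LlmProvider {
    fn complete(&self, prompt: &str) -> String;
}

// Stand-in for a real backend such as OpenAI, Gemini, or Groq.
struct EchoProvider;

impl LlmProvider for EchoProvider {
    fn complete(&self, prompt: &str) -> String {
        format!("echo: {}", prompt)
    }
}

// A step never names a concrete provider.
fn run_step(provider: &dyn LlmProvider, prompt: &str) -> String {
    provider.complete(prompt)
}

fn main() {
    let provider = EchoProvider;
    println!("{}", run_step(&provider, "hello")); // echo: hello
}
```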

📝 Prompt Management

Versioned prompt templates with {{variable}} interpolation. Load from YAML or JSON registries.
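The interpolation step itself can be illustrated in a few lines of std-only Rust (a conceptual sketch, not Blazen's template engine):

```rust
use std::collections::HashMap;

/// Replace every `{{variable}}` placeholder in `template` with the
/// matching value from `vars`; unknown placeholders are left intact.
fn render(template: &str, vars: &HashMap<&str, &str>) -> String {
    let mut out = template.to_string();
    for (key, value) in vars {
        out = out.replace(&format!("{{{{{}}}}}", key), value);
    }
    out
}

fn main() {
    let mut vars = HashMap::new();
    vars.insert("name", "Blazen");
    vars.insert("tone", "friendly");
    let prompt = render("Write a {{tone}} greeting for {{name}}.", &vars);
    println!("{}", prompt); // Write a friendly greeting for Blazen.
}
```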

🌐 Branching & Streaming

Conditional branching, parallel fan-out, and real-time streaming. Build complex DAGs with simple step definitions.
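Event-routed branching reduces to a small dispatch loop: each step consumes one event and emits the next, and the runtime routes on the event type until a stop event appears. A self-contained sketch of that idea (not Blazen's run loop):

```rust
// Events double as edges in the DAG: which variant a step returns
// decides which step runs next.
enum Event {
    Start(i64),
    Small(i64),
    Large(i64),
    Stop(String),
}

// Branch point: route small inputs one way, large inputs another.
fn classify(n: i64) -> Event {
    if n < 100 { Event::Small(n) } else { Event::Large(n) }
}

fn run(start: Event) -> String {
    let mut event = start;
    loop {
        event = match event {
            Event::Start(n) => classify(n),
            Event::Small(n) => Event::Stop(format!("fast path: {}", n)),
            Event::Large(n) => Event::Stop(format!("batch path: {}", n)),
            Event::Stop(result) => return result,
        };
    }
}

fn main() {
    println!("{}", run(Event::Start(7)));   // fast path: 7
    println!("{}", run(Event::Start(500))); // batch path: 500
}
```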

💾 Persistence

Embedded redb for local storage, Redis/Valkey for distributed. MessagePack serialization for compact, fast state.

🐍 Polyglot

Native Rust performance with Python (PyO3) and Node.js/TypeScript (napi-rs) bindings. Use your preferred language.

🛠️ CLI Scaffolding

blazen-cli scaffolds workflows, pipelines, and prompt registries. Get productive in seconds, not hours.

👤 Human-in-the-Loop

Steps can pause for human feedback via callbacks. Build durable approval workflows with first-class support.

Write workflows in your language

Native bindings for Rust, Python, and TypeScript. Same concepts, idiomatic APIs.

Basic Workflow

use blazen::prelude::*;

#[derive(Debug, Clone, Serialize, Deserialize, Event)]
struct GreetEvent {
    name: String,
}

#[step]
async fn parse_input(event: StartEvent, _ctx: Context) -> Result<GreetEvent, WorkflowError> {
    let name = event.data["name"].as_str().unwrap_or("World").to_string();
    Ok(GreetEvent { name })
}

#[step]
async fn greet(event: GreetEvent, _ctx: Context) -> Result<StopEvent, WorkflowError> {
    Ok(StopEvent {
        result: serde_json::json!({ "greeting": format!("Hello, {}!", event.name) }),
    })
}

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    let workflow = WorkflowBuilder::new("greeter")
        .step(parse_input_registration())
        .step(greet_registration())
        .build()?;

    let handle = workflow.run(serde_json::json!({ "name": "Blazen" })).await?;
    let result = handle.result().await?;
    println!("{}", result.to_json());
    Ok(())
}

Human-in-the-Loop

use std::sync::Arc;

let workflow = WorkflowBuilder::new("assistant")
    .step(ai_step_registration())
    .step(review_step_registration())
    .input_handler(Arc::new(|request| Box::pin(async move {
        println!("AI asks: {}", request.prompt);
        let answer = get_user_input().await; // your own prompt/UI function
        Ok(InputResponseEvent {
            request_id: request.request_id,
            response: serde_json::json!(answer),
        })
    })))
    .build()?;

let result = workflow.run(input).await?.result().await?;

Get started in seconds

Install Blazen for your language of choice.

Rust
cargo add blazen
Python
pip install blazen
Node.js
pnpm add blazen
CLI
cargo install blazen-cli

Modular crate architecture

Use only what you need. Each crate is independently versioned and focused on a single responsibility.

Crate             Description
blazen            Umbrella crate with prelude re-exports
blazen-events     Event trait, StartEvent, StopEvent, derive macro support
blazen-macros     Procedural macros: #[step], #[derive(Event)]
blazen-core       Workflow runtime, context, event loop, step registry
blazen-llm        LLM provider abstraction, 15+ integrations
blazen-pipeline   Multi-workflow pipelines, sequential/parallel stages
blazen-prompts    Prompt template engine, variable interpolation, registries
blazen-persist    Persistence layer: redb, Redis/Valkey, MessagePack
blazen-py         Python bindings via PyO3
blazen-node       Node.js/TypeScript bindings via napi-rs
blazen-cli        CLI tool for scaffolding workflows and pipelines