Event-driven AI workflow engine with first-class LLM integration, written in Rust.
Type-safe events · 15+ LLM providers · Multi-workflow pipelines · Python & TypeScript bindings
Blazen gives you the primitives to build complex, production-ready AI pipelines with minimal boilerplate.
- **Type-safe events.** Zero boilerplate: derive macros generate all the wiring so you focus on logic, not plumbing.
- **Multi-workflow pipelines.** Compose sequential and parallel stages into pipelines. Pause, resume, and stream results across stages.
- **15+ LLM providers.** OpenAI, Claude, Gemini, Azure, fal.ai, OpenRouter, Groq, and more. Swap providers without changing workflow logic.
- **Prompt management.** Versioned prompt templates with `{{variable}}` interpolation. Load from YAML or JSON registries.
- **Flexible orchestration.** Conditional branching, parallel fan-out, and real-time streaming. Build complex DAGs with simple step definitions.
- **Pluggable persistence.** Embedded redb for local storage, Redis/ValKey for distributed deployments. MessagePack serialization for compact, fast state.
- **Polyglot bindings.** Native Rust performance with Python (PyO3) and Node.js/TypeScript (napi-rs) bindings. Use your preferred language.
- **Scaffolding CLI.** `blazen-cli` scaffolds workflows, pipelines, and prompt registries. Get productive in seconds, not hours.
- **Human-in-the-loop.** Steps can pause for human feedback via callbacks. Build durable approval workflows with first-class support.
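The `{{variable}}` interpolation mentioned above is, at its core, a substitution of named placeholders. As an illustrative sketch only (this is not Blazen's actual implementation), a minimal interpolator in plain Rust could look like:

```rust
use std::collections::HashMap;

/// Replace every `{{name}}` placeholder in `template` with the matching
/// value from `vars`. Unknown placeholders are left untouched.
fn interpolate(template: &str, vars: &HashMap<&str, &str>) -> String {
    let mut out = template.to_string();
    for (key, value) in vars {
        // `{{{{{}}}}}` renders as the literal `{{key}}`.
        out = out.replace(&format!("{{{{{}}}}}", key), value);
    }
    out
}

fn main() {
    let mut vars = HashMap::new();
    vars.insert("name", "Blazen");
    vars.insert("tone", "friendly");
    let prompt = interpolate("Greet {{name}} in a {{tone}} tone.", &vars);
    println!("{}", prompt); // Greet Blazen in a friendly tone.
}
```

Blazen's prompt registries layer versioning and YAML/JSON loading on top of this basic substitution model.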
Native bindings for Rust, Python, and TypeScript. Same concepts, idiomatic APIs.
```rust
use blazen::prelude::*;

#[derive(Debug, Clone, Serialize, Deserialize, Event)]
struct GreetEvent {
    name: String,
}

#[step]
async fn parse_input(event: StartEvent, _ctx: Context) -> Result<GreetEvent, WorkflowError> {
    let name = event.data["name"].as_str().unwrap_or("World").to_string();
    Ok(GreetEvent { name })
}

#[step]
async fn greet(event: GreetEvent, _ctx: Context) -> Result<StopEvent, WorkflowError> {
    Ok(StopEvent {
        result: serde_json::json!({ "greeting": format!("Hello, {}!", event.name) }),
    })
}

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    let workflow = WorkflowBuilder::new("greeter")
        .step(parse_input_registration())
        .step(greet_registration())
        .build()?;

    let result = workflow
        .run(serde_json::json!({ "name": "Blazen" }))
        .await?
        .result()
        .await?;
    println!("{}", result.to_json());
    Ok(())
}
```

Workflows can pause for human input by registering an `input_handler`:

```rust
let workflow = WorkflowBuilder::new("assistant")
    .step(ai_step_registration())
    .step(review_step_registration())
    .input_handler(Arc::new(|request| Box::pin(async move {
        println!("AI asks: {}", request.prompt);
        let answer = get_user_input().await;
        Ok(InputResponseEvent {
            request_id: request.request_id,
            response: serde_json::json!(answer),
        })
    })))
    .build()?;

let result = workflow.run(input).await?.result().await?;
```

Install Blazen for your language of choice.
```shell
# Rust
cargo add blazen

# Python
pip install blazen

# Node.js / TypeScript
pnpm add blazen

# CLI
cargo install blazen-cli
```

Use only what you need. Each crate is independently versioned and focused on a single responsibility.
| Crate | Description |
|---|---|
| `blazen` | Umbrella crate with prelude re-exports |
| `blazen-events` | Event trait, StartEvent, StopEvent, derive macro support |
| `blazen-macros` | Procedural macros: `#[step]`, `#[derive(Event)]` |
| `blazen-core` | Workflow runtime, context, event loop, step registry |
| `blazen-llm` | LLM provider abstraction, 15+ integrations |
| `blazen-pipeline` | Multi-workflow pipelines, sequential/parallel stages |
| `blazen-prompts` | Prompt template engine, variable interpolation, registries |
| `blazen-persist` | Persistence layer: redb, Redis/ValKey, MessagePack |
| `blazen-py` | Python bindings via PyO3 |
| `blazen-node` | Node.js/TypeScript bindings via napi-rs |
| `blazen-cli` | CLI tool for scaffolding workflows and pipelines |
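For example, a project that only needs the runtime and event machinery, without LLM providers, pipelines, or persistence, can depend on just the relevant crates. A hypothetical `Cargo.toml` fragment (versions here are placeholders; check crates.io for current releases):

```toml
[dependencies]
blazen-core = "0.x"    # workflow runtime, context, event loop
blazen-events = "0.x"  # Event trait, StartEvent, StopEvent
blazen-macros = "0.x"  # #[step] and #[derive(Event)] macros
```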