
Build reliable AI workflows
with humans in the loop

Build versioned durable AI workflows that pause for human input for minutes or days, and resume from where they left off.

Read and write from your own APIs and databases with programming primitives you're already familiar with.


Production-ready LLM primitives

All the boring things you have to build, but don't want to

import { z } from "zod";
import { Inferable } from "inferable";

const inferable = new Inferable({
  apiSecret: require("./cluster.json").apiKey,
});

const workflow = inferable.workflows.create({
  name: "customerDataProcessor",
  inputSchema: z.object({
    executionId: z.string(),
    customerId: z.string(),
  }),
});

// Initial version of the workflow
workflow.version(1).define(async (ctx, input) => {
  const customerData = await fetchCustomerData(input.customerId);

  // Process the data with a simple analysis
  const analysis = await ctx.llm.structured({
    input: JSON.stringify(customerData),
    schema: z.object({
      riskLevel: z.enum(["low", "medium", "high"]),
      summary: z.string(),
    }),
  });

  return { analysis };
});

// Enhanced version with more detailed analysis
workflow.version(2).define(async (ctx, input) => {
  const customerData = await fetchCustomerData(input.customerId);
  const transactionHistory =
    await fetchTransactionHistory(input.customerId);

  // Process the data with more advanced analysis
  const analysis = await ctx.llm.structured({
    input: JSON.stringify({ customerData, transactionHistory }),
    schema: z.object({
      riskLevel: z.enum(["low", "medium", "high"]),
      summary: z.string(),
      recommendations: z.array(z.string()),
      factors: z.array(z.object({
        name: z.string(),
        impact: z.enum(["positive", "negative", "neutral"]),
        weight: z.number(),
      })),
    }),
  });

  return {
    analysis,
    version: 2,
    processedAt: new Date().toISOString(),
  };
});

Workflow Versioning

Evolve your long-running workflows over time in a backwards-compatible way with versioning. Define multiple versions of the same workflow as your requirements change.

Workflow executions maintain version affinity: if a workflow version is updated while an execution is in progress, the execution continues using the original version until it completes.

Each version can add new features, improve processing logic, or fix bugs without disrupting existing executions.
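
Version affinity can be sketched in plain TypeScript. This is a simplified model, not the Inferable implementation: each execution records the version that was current when it started, and resumes with that handler even after newer versions are registered.

```typescript
// Minimal sketch of version affinity (not the Inferable SDK):
// executions are pinned to the workflow version active at start time.
type Handler = (input: string) => string;

class VersionedWorkflow {
  private versions = new Map<number, Handler>();
  private executions = new Map<string, number>();

  define(version: number, handler: Handler): void {
    this.versions.set(version, handler);
  }

  latestVersion(): number {
    return Math.max(...Array.from(this.versions.keys()));
  }

  start(executionId: string): void {
    // Pin the execution to the latest version at start time.
    this.executions.set(executionId, this.latestVersion());
  }

  resume(executionId: string, input: string): string {
    // Resume with the pinned version, even if newer versions exist now.
    const pinned = this.executions.get(executionId);
    if (pinned === undefined) throw new Error("unknown execution");
    return this.versions.get(pinned)!(input);
  }
}
```

An execution started on version 1 keeps running version 1 after version 2 is defined; only executions started afterwards pick up the new version.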

Managed State

Inferable handles all the state required for durable workflows. No need to provision and manage databases. Everything is API-driven.

Observability

Get end-to-end observability with the developer console. Plug in your existing observability stack.

On-premise Execution

Your workflows run on your own infrastructure, with no deployment step required.

No Inbound Connections

Enhanced security with outbound-only connections. Your infrastructure remains secure with no need to open inbound ports or expose internal services.
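
The outbound-only pattern can be sketched as a polling worker loop. This is an illustrative model, not Inferable's actual protocol: `pollJobs` stands in for an outbound HTTPS request to the control plane, so the worker initiates every connection and never listens on a port.

```typescript
// Hedged sketch of an outbound-only worker: it repeatedly asks the
// control plane for pending jobs; nothing ever dials into the worker.
type Job = { id: string; input: string };

async function runWorker(
  pollJobs: () => Promise<Job[]>,       // stand-in for an outbound HTTPS poll
  handle: (job: Job) => Promise<string>, // runs the job on local infrastructure
  maxIterations: number,
): Promise<string[]> {
  const results: string[] = [];
  for (let i = 0; i < maxIterations; i++) {
    // Outbound request: the worker pulls work rather than receiving it.
    const jobs = await pollJobs();
    for (const job of jobs) {
      results.push(await handle(job));
    }
  }
  return results;
}
```

Because the worker only ever opens outbound connections, it can sit behind a firewall or NAT with no inbound rules at all.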

Open Source

Inferable is completely open source, giving you full transparency and control over the codebase.

Self-Hostable

Deploy Inferable on your own infrastructure for complete control over your data and compute.

No Frameworks to Learn

Inferable doesn't invert your programming model. It works with the control-flow primitives you already use.
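
As an illustration of that point, workflow logic stays ordinary TypeScript: loops, conditionals, and early returns, with no DSL to learn. Here `classify` is a hypothetical stand-in for an LLM call.

```typescript
// Plain control flow driving a workflow step: no framework constructs,
// just a loop and a conditional. `classify` stands in for an LLM call.
function triage(
  tickets: string[],
  classify: (ticket: string) => "urgent" | "routine",
): string[] {
  const escalated: string[] = [];
  for (const ticket of tickets) {
    if (classify(ticket) === "urgent") {
      escalated.push(ticket);
    }
  }
  return escalated;
}
```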

Talk to a founder

We've run thousands of workflows in production for startups and scale-ups. Talk with us about your specific use cases, our roadmap, and how we can help you build with Inferable.

  • Production experience with real-world AI implementations
  • Technical guidance for your specific use cases
  • Skip the sales pitch and talk directly with builders

Inferable is completely open source and can be self-hosted on your own infrastructure for complete control over your data and compute.