Add the LangGraph Adapter

FleetForge ships a LangGraph adapter so existing graphs can run under the same deterministic runtime, budgets, and guardrails. This guide shows how to wire it into your project.

Prerequisites

  • FleetForge runtime running locally (just demo) or in your target cluster.
  • Python SDK installed: pip install -e sdk/python.
  • LangGraph project that you want to orchestrate.

1. Install the adapter package

pip install -e sdk/python

The adapter lives under sdk/python/fleetforge/langgraph_adapter/. For a runnable example with no extra dependencies, run examples/adapters/langgraph/app.py.

2. Wrap your graph

Expose a callable that returns your LangGraph app, e.g.:

# my_project/graph.py
from my_project import build_workflow

def create() -> object:
    return build_workflow()

Invoke the FleetForge runner module with the entrypoint you just exposed:

python -m fleetforge.langgraph_adapter.runner --entrypoint my_project.graph:create
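Under the hood, a module:attr entrypoint string like the one above is conventionally resolved with importlib. The helper below is a minimal sketch of that pattern, not the actual SDK implementation; the function name load_entrypoint is illustrative.

```python
import importlib


def load_entrypoint(spec: str):
    """Resolve a "module:attr" entrypoint string to the named attribute.

    Illustrative only -- mirrors the convention implied by --entrypoint,
    not the FleetForge runner's real code.
    """
    module_name, _, attr = spec.partition(":")
    if not module_name or not attr:
        raise ValueError(f"expected 'module:attr', got {spec!r}")
    module = importlib.import_module(module_name)
    return getattr(module, attr)
```

With this convention, --entrypoint my_project.graph:create imports my_project.graph and calls its create() to obtain the compiled graph.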

Pipe the initial state JSON (the adapter expects an initial_state key, falling back to state for legacy payloads):

printf '{"initial_state": {"task": "Launch the determinism dashboard"}}' \
| python -m fleetforge.langgraph_adapter.runner --entrypoint my_project.graph:create
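The key-fallback behaviour described above can be sketched as follows. This is an illustration of the documented contract, not the adapter's actual code, and the helper name is made up for the example.

```python
def extract_initial_state(payload: dict) -> dict:
    """Pick the graph input from an adapter payload.

    Documented behaviour: prefer "initial_state", fall back to
    "state" for legacy payloads. Illustrative helper only.
    """
    if "initial_state" in payload:
        return payload["initial_state"]
    if "state" in payload:
        return payload["state"]
    raise KeyError("payload must contain 'initial_state' or 'state'")
```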

Prefer calling it directly from Python? Import the helper and await it (wrap with asyncio.run if you are in a synchronous script):

import asyncio

from fleetforge.langgraph_adapter.runner import run_graph


async def main() -> None:
    result = await run_graph(
        "my_project.graph:create",
        initial_state={"brief": "Launch the determinism dashboard"},
    )
    print(result["output"])


if __name__ == "__main__":
    asyncio.run(main())

The runner loads runtime settings from the environment:

  • FLEETFORGE_API_HTTP
  • FLEETFORGE_API_TOKEN (reader or writer depending on the action)
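Reading those variables amounts to something like the sketch below. The variable names come from this guide; the helper, its failure mode, and the placeholder default URL are assumptions, not the runner's real implementation.

```python
import os


def load_runtime_settings() -> dict:
    """Read the runner's connection settings from the environment.

    FLEETFORGE_API_HTTP and FLEETFORGE_API_TOKEN are the documented
    variables; the default URL below is a placeholder for the sketch.
    """
    api_http = os.environ.get("FLEETFORGE_API_HTTP", "http://localhost:8080")
    token = os.environ.get("FLEETFORGE_API_TOKEN")
    if token is None:
        raise RuntimeError("FLEETFORGE_API_TOKEN must be set")
    return {"api_http": api_http, "token": token}
```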

3. Submit to the runtime

  1. Export the CLI token: export FLEETFORGE_API_TOKEN=demo-writer.
  2. Run the runner command from above. The adapter serialises each step into the canonical api/schemas/agent_run.json contract, preserving tool calls and human-in-the-loop triggers.
  3. Watch the run in the UI or via fleetforge-ctl runs get.

4. Add guardrails and budgets

  • Annotate steps with policy.guardrails (see the guardrail matrix).
  • Provide budget hints so the scheduler enforces token or cost ceilings.
  • Use replay to verify the orchestration is deterministic before promoting to production.
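As a purely hypothetical illustration, a step annotation carrying guardrails and budget hints might look like the dict below. Every field name here is invented for the example; consult api/schemas/agent_run.json and the guardrail matrix for the real contract.

```python
# Hypothetical step annotation -- field names are illustrative,
# NOT the FleetForge schema. Check api/schemas/agent_run.json
# for the canonical contract.
step_policy = {
    "policy": {
        "guardrails": ["pii-redaction", "tool-allowlist"],
    },
    "budget": {
        "max_tokens": 50_000,   # token ceiling for the scheduler
        "max_cost_usd": 2.50,   # cost ceiling for the scheduler
    },
}
```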

5. Promote through ChangeOps

Before shipping new graphs:

  • Capture a replay demonstrating deterministic output.
  • Run eval packs covering safety, budget, and novelty.
  • Attach both artifacts to the ChangeOps gate (fleetforge-ctl gates check).