LangChain / LangGraph Integration

Use keep as reflective memory within LangChain and LangGraph applications. The integration provides a LangGraph BaseStore, LangChain tools, a retriever, and middleware that auto-injects memory context into every LLM call.

Installation

pip install keep-skill[langchain]

Or install the discovery shim langchain-keep (pulls in everything):

pip install langchain-keep

You still need an embedding provider configured — see CLI Quick Start.

KeepStore — LangGraph BaseStore

Maps LangGraph's namespace/key model to Keep's document model:

from keep.langchain import KeepStore

store = KeepStore()                    # default store (~/.keep)
store = KeepStore(store="~/.keep")     # explicit path
store = KeepStore(keeper=my_keeper)    # existing Keeper instance

Use with LangGraph:

from langgraph.graph import StateGraph

graph = StateGraph(...)
graph.compile(store=store)
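KeepStore implements LangGraph's BaseStore contract, which addresses items by a namespace tuple plus a key. As a rough mental model only (an illustrative in-memory stand-in, not the actual implementation, which persists items as Keep documents), the contract looks like:

```python
class InMemoryStoreSketch:
    """Minimal stand-in for the namespace/key store contract KeepStore fills."""

    def __init__(self):
        self._items = {}

    def put(self, namespace: tuple, key: str, value: dict) -> None:
        # KeepStore additionally maps the namespace tuple to Keep tags
        self._items[(namespace, key)] = value

    def get(self, namespace: tuple, key: str):
        # Returns None for unknown namespace/key pairs
        return self._items.get((namespace, key))


store = InMemoryStoreSketch()
store.put(("memories", "alice"), "pref-1", {"text": "prefers dark mode"})
```

Anything a graph writes through the store ends up as an ordinary Keep document, which is why the CLI commands below can see it.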

Use with langmem:

from langmem import create_manage_memory_tool, create_search_memory_tool

tools = [
    create_manage_memory_tool(namespace=("memories", "{user_id}")),
    create_search_memory_tool(namespace=("memories", "{user_id}")),
]

Namespace-to-tag mapping

LangGraph uses hierarchical namespace tuples like ("memories", "alice"). Keep uses flat key-value tags. The namespace_keys setting bridges these:

# Default: namespace_keys=["user"]
# ("alice",) → {"user": "alice"}

# Custom mapping:
store = KeepStore(namespace_keys=["category", "user"])
# ("memories", "alice") → {"category": "memories", "user": "alice"}
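The mapping is positional: namespace elements pair with namespace_keys in order. A one-line sketch of that behavior (an illustration, not the library's code):

```python
def namespace_to_tags(namespace: tuple, namespace_keys=("user",)) -> dict:
    """Pair namespace elements with configured tag keys, in order."""
    return dict(zip(namespace_keys, namespace))


namespace_to_tags(("alice",))
# → {"user": "alice"}

namespace_to_tags(("memories", "alice"), ("category", "user"))
# → {"category": "memories", "user": "alice"}
```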

These become regular Keep tags — visible to CLI, searchable, filterable:

keep list --tag user=alice            # Find LangGraph-managed items
keep find "auth" -t category=memories # Search within a namespace

Or configure in keep.toml:

[tags]
namespace_keys = ["category", "user"]


KeepNotesToolkit — LangChain Tools

Four curated tools for LangChain agents:

from keep.langchain import KeepNotesToolkit

toolkit = KeepNotesToolkit(user_id="alice")
tools = toolkit.get_tools()
# → [remember, recall, get_context, update_context]
Tool            Description
remember        Store a fact, preference, decision, or note
recall          Semantic search over stored notes
get_context     Get current working intentions (keep now)
update_context  Update current working intentions

KeepNotesRetriever — BaseRetriever

For RAG chains with optional now-context injection:

from keep.langchain import KeepNotesRetriever

retriever = KeepNotesRetriever(user_id="alice", include_now=True)
docs = retriever.invoke("authentication patterns")

KeepNotesMiddleware — LCEL Runnable

Auto-injects memory context into every LLM call. On each invocation it reads now context, searches memory using the last human message, and prepends a system message with results:

from keep.langchain import KeepNotesMiddleware
from langchain_core.messages import HumanMessage
from langchain_openai import ChatOpenAI

middleware = KeepNotesMiddleware(user_id="alice")
chain = middleware.as_runnable() | ChatOpenAI(model="gpt-4o")
response = chain.invoke([HumanMessage(content="What's my schedule?")])

The middleware is fail-open by default: if the keep store is slow or unavailable, it logs a warning and passes the messages through unchanged. Memory enhances; it never blocks.
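The fail-open pattern amounts to a try/except around the memory lookup. A simplified sketch with hypothetical names (not the middleware's actual code):

```python
import logging

logger = logging.getLogger("keep.middleware")


def inject_memory(messages: list, search) -> list:
    """Prepend retrieved context as a system message; pass through on failure."""
    try:
        context = search(messages[-1]["content"])
        return [{"role": "system", "content": context}] + messages
    except Exception as exc:
        logger.warning("memory lookup failed, passing through: %s", exc)
        return messages  # fail-open: the conversation proceeds without memory


msgs = [{"role": "user", "content": "What's my schedule?"}]
ok = inject_memory(msgs, search=lambda q: f"Relevant notes for: {q}")
unchanged = inject_memory(msgs, search=lambda q: 1 / 0)  # lookup raises → pass-through
```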

Multi-user scoping

For multi-user applications, use user_id to scope all operations:

store = KeepStore(user_id="alice")
# All put/search/list operations auto-filter by user=alice

Combined with required_tags in config, this enforces per-user isolation:

[tags]
required = ["user"]

With this config, put() calls without a user tag raise ValueError. Scoped set_now(scope="alice") auto-tags user=alice, satisfying the requirement.
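The enforcement described above boils down to validating tags before a write. An illustrative sketch (not Keep's implementation):

```python
def validate_tags(tags: dict, required=("user",)) -> dict:
    """Reject writes that are missing a required tag key."""
    missing = [key for key in required if key not in tags]
    if missing:
        raise ValueError(f"missing required tags: {missing}")
    return tags


validate_tags({"user": "alice", "topic": "auth"})  # accepted
try:
    validate_tags({"topic": "auth"})
except ValueError:
    pass  # a put() without a user tag is rejected
```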

Limitations

See also