Memory-Pi

A local vector memory layer for an ecosystem of AI-assisted services running close to the edge.

Domain

Local AI infrastructure / semantic memory

Role / ownership

Architecture, memory model, service orchestration, local AI integration, and the UI that developers actually use to interact with it.

Stack

Node.js · Fastify · TypeScript · sqlite-vec · Ollama · OpenRouter · Vite UI

01 — Challenge

A useful local AI ecosystem needs memory, retrieval, ingestion, and operational simplicity. Most off-the-shelf AI demos do not solve that in a local-first way.

The system had to be practical, multimodal, and lightweight enough to support a broader Pi ecosystem.

02 — Solution & architecture

A local semantic memory service: sqlite-vec for vector search, local embeddings, structured ingestion, and a UI to capture and audit what the system remembers.
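The core of that retrieval step can be sketched in plain TypeScript. In the real service, sqlite-vec does this ranking inside SQLite and the embeddings come from a local Ollama model; the names here (`MemoryRecord`, `topK`) are illustrative, not the project's actual API.

```typescript
// Simplified stand-in for the service's vector search:
// rank stored memories against a query embedding by cosine similarity.
interface MemoryRecord {
  id: string;
  text: string;
  embedding: number[]; // produced by a local embedding model in the real system
}

function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Return the k records most similar to the query embedding, best first.
function topK(records: MemoryRecord[], query: number[], k: number): MemoryRecord[] {
  return records
    .map((r) => ({ r, score: cosineSimilarity(r.embedding, query) }))
    .sort((a, b) => b.score - a.score)
    .slice(0, k)
    .map((x) => x.r);
}
```

sqlite-vec expresses the same ranking as a SQL query over a virtual table, which is what keeps the service small enough for a Pi: no separate vector database process, just SQLite plus one extension.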

Its role inside the wider Pi ecosystem makes it especially strong: it is infrastructure, not just an isolated experiment.

03 — Tradeoffs

Built for practical retrieval and ingestion, not sci-fi agent demos. Less viral on Twitter, more useful at 2 AM when something breaks.

Kept the stack lightweight so it runs on a Pi without melting. That meant saying no to heavier vector databases and cloud dependencies.

04 — Outcome & proof

Documented API shape

The README exposes storage, search, and management flows clearly enough to prove this is a working system, not just a concept.
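Those three flows reduce to a small surface. The class below is a minimal in-memory stand-in for them, with substring matching in place of vector ranking; all names are illustrative, and the real field names and routes live in the project's README.

```typescript
// Hypothetical sketch of the documented flows: store, search, manage.
// The real service persists to SQLite and ranks with sqlite-vec.
class MemoryStore {
  private items = new Map<string, string>();
  private nextId = 1;

  // Storage flow: keep a memory, return its id.
  store(text: string): string {
    const id = String(this.nextId++);
    this.items.set(id, text);
    return id;
  }

  // Search flow: naive substring match standing in for semantic ranking.
  search(query: string): { id: string; text: string }[] {
    return [...this.items.entries()]
      .filter(([, text]) => text.includes(query))
      .map(([id, text]) => ({ id, text }));
  }

  // Management flow: forget a memory by id.
  remove(id: string): boolean {
    return this.items.delete(id);
  }
}
```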

Ecosystem role

It is the memory layer the rest of the Pi services rely on. Without it, the other tools have no context.

05 — What it proves

Pietro builds infrastructure, not just front ends.

He thinks in systems, lifecycles, and real-world use — not just surface features.