100% local · No cloud · No telemetry

Stop re-explaining
yourself to AI.

lurk silently captures your work context across VS Code, Chrome, Slack, and 30+ apps — so every AI conversation starts with full context, not from zero.


Every AI conversation starts from zero

You've been deep in a problem for two hours. Then you open an AI chat and have to reconstruct all of that from memory.

Without lurk
  • Re-explain your project every time
  • Forget half the context
  • Get generic, unhelpful answers
  • Waste 5 min before the real question
  • Repeat the same corrections
With lurk
  • Full context auto-assembled
  • Decisions, people, files all tracked
  • AI understands your work instantly
  • Paste and ask — zero preamble
  • Context improves over time

How It Works

From screen to context in seconds

lurk watches your desktop and builds a running understanding of what you're doing — automatically.

01

Observe

A native macOS daemon watches your desktop every 3 seconds — active app, window title, screen content via OCR.

02

Enrich

Raw events are parsed into structured context — file names, projects, languages, conversation participants, decisions.

03

Cluster

Activity is grouped into workstreams by a local LLM — work spread across different apps is connected when it's about the same thing.

04

Synthesize

When you ask for context, lurk generates a natural language prompt from the active workstream — adapted to your role.
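The observe → enrich → cluster → synthesize flow above can be sketched as a pipeline. Everything in this sketch — the event shapes, function names, and the naive topic-based clustering standing in for the local LLM — is a hypothetical illustration, not lurk's actual internals.

```python
from dataclasses import dataclass

# Hypothetical shapes for illustration only -- not lurk's real schema.
@dataclass
class RawEvent:
    app: str        # active application (step 01: observe)
    window: str     # window title
    ocr_text: str   # screen content captured via OCR

@dataclass
class Enriched:
    app: str
    topic: str      # step 02: parsed project/topic
    details: str

def enrich(ev: RawEvent) -> Enriched:
    # Step 02: parse raw events into structured context. Real enrichment
    # would extract files, people, and decisions; for illustration we take
    # the first window-title token as the topic.
    return Enriched(app=ev.app, topic=ev.window.split(" ")[0].lower(),
                    details=ev.ocr_text)

def cluster(events: list[Enriched]) -> dict[str, list[Enriched]]:
    # Step 03: group activity into workstreams. lurk uses a local LLM;
    # here we cluster naively by topic key.
    streams: dict[str, list[Enriched]] = {}
    for ev in events:
        streams.setdefault(ev.topic, []).append(ev)
    return streams

def synthesize(stream: list[Enriched]) -> str:
    # Step 04: turn the active workstream into a natural-language prompt.
    apps = sorted({ev.app for ev in stream})
    lines = [f"I'm working on {stream[0].topic} (seen in {', '.join(apps)})."]
    lines += [f"- {ev.details}" for ev in stream]
    return "\n".join(lines)

events = [
    RawEvent("VS Code", "auth middleware.ts", "def validate_token(...)"),
    RawEvent("Slack", "auth channel", "Sarah: approved 1h access tokens"),
]
streams = cluster([enrich(e) for e in events])
print(synthesize(streams["auth"]))
```

The point of the sketch: both events land in the same `auth` workstream even though they came from different apps, and the synthesized prompt merges them.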

Features

Context that works for everyone

Whether you're a developer, PM, designer, or marketer — lurk adapts to what you're doing.

Cold-Start Elimination

lurk context gives you a prompt with everything an AI needs — what you're working on, key decisions, who's involved, what's changed.

Workstream Awareness

lurk doesn't just know you're in VS Code. It knows you're implementing JWT auth, decided on RS256, and Sarah approved the token policy.

Cross-App Intelligence

A Slack conversation, a Google Doc, a code editor, and a browser tab — if they're about the same thing, lurk connects them.

Works for Any Role

Developers get file paths and diffs. PMs get stakeholder names and decision logs. Designers get Figma references. No configuration needed.

MCP Integration

Connect directly to Claude Code, Cursor, and other AI tools. Agents get context automatically — and can feed decisions back.
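Wiring lurk into an MCP client follows the standard server-registration shape used by tools like Claude Code and Cursor. The config below is a sketch: the `lurk mcp` subcommand is an assumption (the exact command that starts lurk's MCP server isn't stated here — check `lurk --help`), but the surrounding JSON structure is the standard MCP client format.

```json
{
  "mcpServers": {
    "lurk": {
      "command": "lurk",
      "args": ["mcp"]
    }
  }
}
```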

100% Local & Private

Everything runs on your machine. No cloud, no accounts, no telemetry. Your data never leaves your disk.

See It In Action

One command. Full context.

Terminal — zsh

$ lurk context -p

I'm working on implementing JWT authentication for the API.

Current state: The middleware is partially complete — token validation works but the refresh endpoint isn't written yet.

Key context:
- Using JWT with RS256 over session auth because the API is stateless
- Token policy: 1-hour access tokens, 30-day refresh (agreed with Sarah in Slack)
- Key files: auth/middleware.ts, server/http.ts
- Asked Claude Code to scaffold the middleware — it built token validation but skipped refresh
- Discussed token rotation approach with Gemini, decided on sliding window

People involved: Sarah Chen (API design), Mike (will review PR)

[What I need help with: ]

Copy to clipboard with lurk context -c — paste into any AI chat.

Privacy

Your data stays yours

lurk is 100% local. Everything runs on your machine. Nothing is sent anywhere. There is no “other copy.”

100% Local

SQLite at ~/.lurk/store.db — on your disk, nowhere else.

Zero Network

Zero outbound connections. No cloud services, no APIs called.

No Telemetry

No analytics, no crash reports, no usage tracking. Ever.

No Accounts

No sign-up, no login, no cloud dashboard. Just install and run.

Localhost Only

MCP and HTTP servers run on localhost — not exposed to the network.

Full Control

Pause, inspect, export, or delete everything. lurk purge wipes it all.

Get Started

Two commands. That's it.

One install, one start. lurk handles the rest — daemon, observers, everything.

# install lurk
$ pip install 'lurk[all]'

# start everything
$ lurk

# get your context — paste into any AI
$ lurk context -c

View on GitHub

Requires macOS 13+ and Python 3.11+