# Find starred repos by memory, not by name

GitStarRecall is a local-first web app that turns your GitHub stars into a searchable memory system. Ask it the way you’d ask a human:

- “I starred a GraphQL security testing repo months ago, what was it?”
- “Show me TypeScript auth projects with clean architecture vibes.”
- “Recommend the best-fit repos from my stars for my use case.”
This project exists because starred repos are great until your brain says, “I know what it does, but not what it is called.”
## Why GitStarRecall?
People star a lot of useful repos. Later, they remember functionality, not names. GitHub search is good, but semantic memory search is better for this exact problem.

### Privacy-First Storage
SQLite WASM + OPFS keeps everything on-device. Your data stays in the browser unless you explicitly opt into remote LLM usage.
### Security-Aligned
OAuth PKCE flow, token isolation, strict CSP, and explicit LLM consent controls protect your data.
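The PKCE piece of that flow pairs a high-entropy code verifier with a SHA-256 challenge, per RFC 7636. A minimal sketch of verifier/challenge generation (Node's `crypto` module is used here for illustration; in the browser the same derivation goes through `crypto.subtle`, and these function names are not the app's actual API):

```typescript
import { createHash, randomBytes } from "crypto";

// Generate a high-entropy code verifier; 32 random bytes base64url-encode
// to 43 characters, within RFC 7636's 43-128 character range.
function makeCodeVerifier(): string {
  return randomBytes(32).toString("base64url");
}

// Derive the S256 code challenge sent in the authorization request.
// The verifier itself is only revealed later, in the token exchange.
function makeCodeChallenge(verifier: string): string {
  return createHash("sha256").update(verifier).digest("base64url");
}
```

Because the challenge is a one-way hash, an attacker who intercepts the authorization response still cannot redeem the code without the original verifier.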
### Incremental Sync
Checksum-based diff sync handles 1k+ stars efficiently without full re-indexing.
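The diff step above can be sketched as comparing per-repo checksums between the stored snapshot and a fresh fetch, so only added or changed repos re-enter the ingestion pipeline. The names (`RepoRecord`, `diffStars`) and checksum inputs are illustrative, not the app's actual schema:

```typescript
interface RepoRecord {
  id: number;        // GitHub repo id
  checksum: string;  // e.g. a hash over pushed_at, description, topics
}

interface StarDiff {
  added: RepoRecord[];    // newly starred: full ingestion needed
  changed: RepoRecord[];  // starred before, but the checksum moved
  removedIds: number[];   // unstarred: can be purged from the index
}

// Compare the locally stored snapshot with a freshly fetched star list,
// so only added/changed repos go through the expensive embedding pipeline.
function diffStars(local: RepoRecord[], remote: RepoRecord[]): StarDiff {
  const localById = new Map(local.map(r => [r.id, r]));
  const remoteIds = new Set(remote.map(r => r.id));

  const added: RepoRecord[] = [];
  const changed: RepoRecord[] = [];
  for (const r of remote) {
    const prev = localById.get(r.id);
    if (!prev) added.push(r);
    else if (prev.checksum !== r.checksum) changed.push(r);
  }
  const removedIds = local.filter(r => !remoteIds.has(r.id)).map(r => r.id);
  return { added, changed, removedIds };
}
```

With 1k+ stars, a sync after a quiet week typically touches only a handful of repos instead of re-indexing everything.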
### Sessioned Recall
Each query becomes a chat session, so you can refine and revisit ideas over time.
## How It Works
GitStarRecall builds a local semantic index of your starred repositories:

### Fetch Your Stars
Syncs your starred repositories, including private stars (if your token scope allows). Manual sync via the Fetch Stars button gives you control.

### Pull README Content
Fetches README content and metadata from each repository through an adaptive, batched ingestion pipeline.
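"Adaptive, batched" here can be sketched as two small pieces: a splitter that turns the repo list into fixed-size batches (so throttling and retries happen per batch, not per repo), and a policy that shrinks the batch size after a rate-limit response and grows it again while requests succeed. Both function names and the specific numbers are illustrative assumptions, not the app's actual tuning:

```typescript
// Split work items into fixed-size batches so README fetches can be
// throttled and retried per batch rather than per repo.
function toBatches<T>(items: T[], batchSize: number): T[][] {
  const batches: T[][] = [];
  for (let i = 0; i < items.length; i += batchSize) {
    batches.push(items.slice(i, i + batchSize));
  }
  return batches;
}

// Adapt the batch size to observed API behaviour: back off sharply after
// a rate-limit response, grow again slowly (up to a cap) on success.
function nextBatchSize(current: number, rateLimited: boolean, max = 50): number {
  if (rateLimited) return Math.max(1, Math.floor(current / 2));
  return Math.min(max, current + 5);
}
```

Multiplicative decrease with additive increase is a common shape for this kind of backoff; it recovers quickly from a rate limit without oscillating.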
### Generate Embeddings Locally
Chunks and embeds content locally in browser workers, with WebGPU acceleration and a WASM fallback. An optional Ollama backend is available for faster processing.
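The chunking step can be sketched as splitting README text into overlapping windows, so that each embedding sees enough surrounding context and sentences on a chunk boundary appear in both neighbours. The sizes below are illustrative defaults, not the app's actual parameters:

```typescript
// Split text into overlapping chunks. Each chunk starts (chunkSize - overlap)
// characters after the previous one, so adjacent chunks share `overlap` chars.
function chunkText(text: string, chunkSize = 512, overlap = 64): string[] {
  const chunks: string[] = [];
  const step = chunkSize - overlap;
  for (let start = 0; start < text.length; start += step) {
    chunks.push(text.slice(start, start + chunkSize));
    if (start + chunkSize >= text.length) break; // last window reached the end
  }
  return chunks;
}
```

A character-based splitter like this is the simplest variant; token-aware or markdown-aware splitting refines it but follows the same windowing idea.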
### Search in Natural Language
Query your stars using natural language. Vector search runs entirely on local embeddings.
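Local vector search at this scale can be as simple as a brute-force cosine-similarity scan: embed the query, score it against every stored chunk vector, and return the top k. A minimal sketch (the index shape and function names are assumptions for illustration):

```typescript
// Cosine similarity between a query embedding and a stored embedding.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb) || 1); // guard zero vectors
}

// Brute-force top-k search over local embeddings; at a few thousand
// starred repos a linear scan is fast enough to stay in the browser.
function topK(
  query: number[],
  index: { repo: string; vec: number[] }[],
  k = 5,
): { repo: string; score: number }[] {
  return index
    .map(e => ({ repo: e.repo, score: cosine(query, e.vec) }))
    .sort((x, y) => y.score - x.score)
    .slice(0, k);
}
```

An approximate-nearest-neighbour index only pays off at much larger corpus sizes; for a personal star collection, exhaustive search keeps results exact and the code trivial.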
## Core Principles
- **Local-first by default** - data stays in the browser unless you opt into remote providers
- **Security before convenience** - OAuth PKCE, CSP hardening, and threat-model-driven design
- **Explainability over magic** - clear diagnostics and transparent provider selection
- **Practical performance** - optimized for real star counts (1k+ repos)
## What Stays Local vs. Remote
By default, everything stays local. Remote LLM usage is opt-in only.
### Local by Default
- GitHub star metadata
- README content
- Chunks and embeddings
- Chat sessions and message history
- Vector search results
### Remote (Opt-in Only)
- Prompt context sent to remote LLM provider when you enable it
- OAuth token exchange through backend endpoint (keeps client secret secure)