Overview
Each search creates a persistent chat session that maintains context across multiple questions and answers.
Session Lifecycle
Session Storage
Chat sessions are stored in SQLite with automatic backups:
Session Schema
Sessions are backed up to localStorage for recovery after browser data clearing.
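A minimal sketch of the backup round trip, assuming a simple session shape (the field names `id`, `query`, and `messages` are illustrative, not the real schema):

```typescript
// Sketch: serialize a chat session into the localStorage backup copy and
// read it back. The session shape here is an assumption for illustration.
interface SessionBackup {
  id: string;
  query: string;
  messages: { role: string; content: string }[];
}

function toBackup(session: SessionBackup): string {
  return JSON.stringify(session); // stored under a backup key in localStorage
}

function fromBackup(raw: string): SessionBackup {
  return JSON.parse(raw) as SessionBackup;
}
```

Because the payload is plain JSON, a backup written by one version of the app can be read back after a refresh or crash without any SQLite state.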
Message Ordering
Messages are ordered using a combination of the created_at timestamp and a sequence number:
Message Sorting
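The ordering rule described above can be sketched as a comparator: sort by created_at first, and break ties with the per-session sequence number (field names here are assumptions):

```typescript
// Sketch of the ordering rule: created_at first, sequence number as the
// tie-breaker for messages created in the same millisecond.
interface OrderedMessage {
  created_at: number; // Unix ms timestamp
  sequence: number;   // monotonically increasing within a session
  content: string;
}

function byCreation(a: OrderedMessage, b: OrderedMessage): number {
  return a.created_at - b.created_at || a.sequence - b.sequence;
}
```

Usage: `messages.sort(byCreation)` yields a stable display order even when several messages share a timestamp.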
Context Management
Chat responses are grounded in search results to prevent hallucination:
Context Construction
Building Context
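One way to build that grounding context is to fold the current search results into a system prompt; the field names and prompt wording below are assumptions, not the actual implementation:

```typescript
// Sketch: turn the current search results into a grounding prompt so the
// LLM answers from these repositories rather than from memory.
interface RepoResult {
  fullName: string;
  description: string;
}

function buildContext(results: RepoResult[]): string {
  const snippets = results
    .map((r, i) => `${i + 1}. ${r.fullName}: ${r.description}`)
    .join("\n");
  return `Answer using only these repositories:\n${snippets}`;
}
```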
Message Types
Chat sessions support three message roles:
User Messages
Questions and prompts from you
- Displayed with right alignment
- Plain text rendering
- Submitted via Enter key or send button
Assistant Messages
Responses from the LLM
- Displayed with left alignment
- Markdown rendering with syntax highlighting
- Streamed token-by-token as they’re generated
System Messages
Internal context instructions (not displayed)
- Contains repository context
- Defines assistant behavior
- Not shown in the UI
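The three roles map naturally onto a discriminated type, with system messages filtered out before rendering, matching the behavior described above (the type names are illustrative):

```typescript
// Sketch: the three message roles; system messages carry context but are
// stripped before the transcript is rendered.
type Role = "user" | "assistant" | "system";

interface ChatMessage {
  role: Role;
  content: string;
}

function visibleMessages(messages: ChatMessage[]): ChatMessage[] {
  return messages.filter((m) => m.role !== "system");
}
```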
Streaming Responses
Assistant responses stream in real-time as the LLM generates tokens:
Streaming Implementation
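A sketch of one step in the streaming path: extracting token strings from a raw streamed chunk. This assumes OpenAI-style server-sent-event lines with a `token` field; the real wire format may differ:

```typescript
// Sketch: pull token strings out of one raw SSE chunk. Assumes lines of the
// form `data: {"token":"..."}` terminated by `data: [DONE]` (an assumption).
function parseSseChunk(chunk: string): string[] {
  return chunk
    .split("\n")
    .filter((line) => line.startsWith("data: ") && !line.includes("[DONE]"))
    .map((line) => (JSON.parse(line.slice(6)) as { token: string }).token);
}
```

Each parsed token is appended to the assistant message as it arrives, which is what makes the response appear word by word.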
Cancellation
Streaming can be cancelled mid-generation:
- Click the Cancel button
- Uses AbortController to terminate the request
- Partial response is preserved in the session
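The cancellation behavior above can be sketched with an AbortController; the token loop and the simulated click timing are for illustration only:

```typescript
// Sketch: consume a token stream, stopping when the AbortSignal fires.
// Tokens received before cancellation are kept, as described above.
function consumeTokens(
  tokens: string[],
  controller: AbortController,
  cancelAfter: number, // pretend the user clicks Cancel after N tokens
): string {
  let partial = "";
  for (const tok of tokens) {
    if (controller.signal.aborted) break; // request terminated
    partial += tok;
    if (--cancelAfter === 0) controller.abort(); // Cancel button clicked
  }
  return partial; // partial response, preserved in the session
}
```

In the real app the same signal would be passed to `fetch`, so aborting tears down the network request as well as the render loop.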
Session History
Access previous chat sessions from the history panel:
Session List
All sessions are ordered by most recently updated. Each entry shows:
- Original search query
- Last updated timestamp
- Number of messages
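The history panel's data can be sketched as one summary row per session, sorted most-recently-updated first (the field names are assumptions):

```typescript
// Sketch of the history panel data: summary rows sorted by last update,
// newest first.
interface SessionSummary {
  query: string;       // original search query
  updatedAt: number;   // last-updated timestamp (Unix ms)
  messageCount: number;
}

function sessionList(sessions: SessionSummary[]): SessionSummary[] {
  return [...sessions].sort((a, b) => b.updatedAt - a.updatedAt);
}
```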
Session Restore
Click any session to restore it. Restoring brings back:
- Search results
- All messages
- Active filters
Backup & Recovery
Automatic Backup
Chat sessions survive browser refresh and tab closure. Clearing browser data triggers recovery from the automatic backup.
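A sketch of the recovery check at startup, under the assumption that recovery means falling back to the backup copy when the primary store comes back empty. A Map stands in for the backup storage, and the key name "chat-backup" is hypothetical:

```typescript
// Sketch: if the primary SQLite store is empty but a backup exists,
// restore from the backup; otherwise keep the live sessions.
function recoverSessions(
  dbSessions: string[],
  storage: Map<string, string>, // stand-in for the backup store
): string[] {
  if (dbSessions.length > 0) return dbSessions; // primary store intact
  const backup = storage.get("chat-backup");
  return backup ? (JSON.parse(backup) as string[]) : [];
}
```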
Best Practices
Effective Prompting
Reference Results
“Which of these repositories is best for X?”
Forces the LLM to compare repositories in the current context.
Ask for Specifics
“What authentication methods does this library support?”
Directs the LLM to extract specific details from README content.
Managing Sessions
Clear Context
Start a new search to load different repositories into context
Refine Filters
Adjust filters to change which repositories are in context without losing chat history
Session Cleanup
Delete old sessions from history to free storage space
Export Conversations
Copy useful conversations from the UI (no built-in export yet)
Provider Configuration
Configure which LLM provider to use for chat:
- Remote (OpenAI): requires an API key; high-quality responses; usage costs apply
- Local (Ollama)
- Browser (WebLLM)
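The three options map cleanly onto a discriminated union; the field names and the idea of a configurable Ollama URL are assumptions for illustration:

```typescript
// Sketch of the three provider options as a discriminated union.
type ProviderConfig =
  | { kind: "remote"; apiKey: string }  // OpenAI: key required, costs apply
  | { kind: "local"; baseUrl: string }  // Ollama running on your machine
  | { kind: "browser" };                // WebLLM, fully in-page

function needsApiKey(config: ProviderConfig): boolean {
  return config.kind === "remote";
}
```

Modeling the choice this way lets the settings UI show the API key field only when the remote provider is selected.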
Provider settings are accessible via the gear icon in the chat composer.
Privacy & Security
Local Storage
All chat sessions are stored locally in SQLite and are never sent to GitStarRecall servers.
Remote Providers
OpenAI and other remote providers receive:
- Your questions
- Repository context snippets
Local Providers
Prompts sent to Ollama and WebLLM never leave your device: 100% private, with no external network requests.
API Keys
API keys are stored in browser localStorage only and are never transmitted except to the configured provider.