Providers and settings

Provider      Type      Default model            Default URL
OpenAI        external  gpt-4.1-mini             https://api.openai.com/v1
Anthropic     external  claude-3-5-haiku-latest  https://api.anthropic.com/v1
OpenRouter    external  openai/gpt-4.1-mini      https://openrouter.ai/api/v1
Azure OpenAI  external  gpt-4.1-mini             no fixed default endpoint
Ollama        local     llama3.1:8b              http://127.0.0.1:11434
LM Studio     local     local-model              http://127.0.0.1:1234/v1
Each provider requires the following settings:

  • OpenAI, OpenRouter: apiKey, baseUrl, model
  • Anthropic: apiKey, baseUrl, model
  • Azure OpenAI: apiKey, endpoint, deployment, apiVersion
  • Ollama, LM Studio: baseUrl, model

If configuration is incomplete, the extension falls back to a heuristic provider that prefers review outcomes over risky automatic moves.
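The completeness check and fallback described above can be sketched as follows. The setting key names come from the lists on this page, but the storage shape, provider identifiers, and the exact fallback logic are assumptions for illustration, not the extension's real schema:

```typescript
// Illustrative sketch only: key names mirror the per-provider settings lists
// above; everything else here is an assumed shape, not the real implementation.
type Provider = "openai" | "anthropic" | "openrouter" | "azure" | "ollama" | "lmstudio";

// Settings each provider needs before it counts as configured.
const REQUIRED_KEYS: Record<Provider, string[]> = {
  openai: ["apiKey", "baseUrl", "model"],
  anthropic: ["apiKey", "baseUrl", "model"],
  openrouter: ["apiKey", "baseUrl", "model"],
  azure: ["apiKey", "endpoint", "deployment", "apiVersion"],
  ollama: ["baseUrl", "model"],
  lmstudio: ["baseUrl", "model"],
};

// A provider is complete when every required key has a non-empty value.
function isComplete(provider: Provider, settings: Record<string, string | undefined>): boolean {
  return REQUIRED_KEYS[provider].every((k) => Boolean(settings[k]));
}

// With incomplete configuration, prefer a review outcome over an automatic move.
function chooseAction(provider: Provider, settings: Record<string, string | undefined>): "auto_move" | "review" {
  return isComplete(provider, settings) ? "auto_move" : "review";
}
```

For example, an Azure OpenAI entry with only an apiKey would yield `review`, while a fully configured local Ollama entry would permit `auto_move`.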

Turns automatic processing of newly created bookmarks on or off.

Processing modes:

  • auto_move
  • review

Content detail levels:

  • metadata_only
  • metadata_excerpt
  • full_content

Allows rich content to be sent to external providers. Local providers such as Ollama and LM Studio can already use full content locally.
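The interaction between content levels and provider type can be sketched as below. The level names and the external/local distinction come from this page; the field names, the gating function, and the choice to downgrade ungated external requests to metadata_only are assumptions:

```typescript
// Hedged sketch: level names and provider types are from the documentation;
// the gating and payload logic are illustrative assumptions.
type ContentLevel = "metadata_only" | "metadata_excerpt" | "full_content";

interface Bookmark { title: string; url: string; excerpt?: string; content?: string }

// External providers only receive rich content when the user has opted in;
// local providers (Ollama, LM Studio) may use full content regardless.
function effectiveLevel(
  requested: ContentLevel,
  providerType: "external" | "local",
  allowRichExternal: boolean
): ContentLevel {
  if (providerType === "local" || allowRichExternal) return requested;
  return "metadata_only"; // assumed downgrade when rich content is not allowed
}

// Build the payload actually sent to the model for a given level.
function buildPayload(b: Bookmark, level: ContentLevel): Record<string, string> {
  const payload: Record<string, string> = { title: b.title, url: b.url };
  if (level !== "metadata_only" && b.excerpt) payload.excerpt = b.excerpt;
  if (level === "full_content" && b.content) payload.content = b.content;
  return payload;
}
```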

Folder creation options:

  • auto_create
  • suggest_only
  • existing_only

Folder depth limits:

  • one_level
  • two_levels
  • three_levels
  • unlimited
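A depth limit like the one above can be applied by truncating a suggested folder path. The depth names are from the list; the clamp function and array-of-segments path representation are illustrative assumptions:

```typescript
// Sketch under assumptions: depth names come from the settings list above;
// the path representation and clamping are illustrative.
type Depth = "one_level" | "two_levels" | "three_levels" | "unlimited";

const DEPTH_LIMIT: Record<Depth, number> = {
  one_level: 1,
  two_levels: 2,
  three_levels: 3,
  unlimited: Number.POSITIVE_INFINITY,
};

// Truncate a suggested folder path to the configured nesting depth.
function clampDepth(path: string[], depth: Depth): string[] {
  return path.slice(0, DEPTH_LIMIT[depth]);
}
```

For instance, a suggested path of Dev/JS/React under two_levels would become Dev/JS.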

Excluded folders can be matched by path or ID and are removed from both source and destination handling.
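Exclusion matching might look like the sketch below. Matching by exact ID, exact path, or path prefix is an assumption on my part; the extension's actual matching rules may differ:

```typescript
// Hedged sketch: the documentation says folders match by path or ID; the
// prefix rule for subfolders is an assumed interpretation.
interface Folder { id: string; path: string }

function isExcluded(folder: Folder, excluded: string[]): boolean {
  return excluded.some(
    (e) => e === folder.id || e === folder.path || folder.path.startsWith(e + "/")
  );
}
```

An excluded folder is then skipped both as a source of bookmarks to process and as a candidate destination.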