Commit message log
Add `tag:` as an alternative syntax to `#` for tag search queries,
and `!` as an alternative to `-` for negating qualifiers. This provides
more intuitive syntax options for users who prefer text-based qualifiers
over special characters.
Co-authored-by: Claude <noreply@anthropic.com>
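For example, with this change a query like `#recipes -is:broken` can equivalently be written as `tag:recipes !is:broken` (the tag name here is purely illustrative).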
* Enhance OLLAMA_KEEP_ALIVE variable description
Expanded the explanation of the OLLAMA_KEEP_ALIVE variable with examples for better clarity: `-1m` to keep the model loaded indefinitely, `0` to unload it immediately.
* Enhance OLLAMA_KEEP_ALIVE variable description
Updated the description of the OLLAMA_KEEP_ALIVE variable to include examples for better clarity.
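As an illustration of those values (the semantics follow Ollama's keep-alive behaviour): `OLLAMA_KEEP_ALIVE=-1m` keeps the model loaded indefinitely, `OLLAMA_KEEP_ALIVE=0` unloads it immediately after each request, and a plain duration such as `5m` keeps it loaded for five minutes of idle time.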
Update CRAWLER_ALLOWED_INTERNAL_HOSTNAMES documentation with tailscale examples (#2355)
* Update CRAWLER_ALLOWED_INTERNAL_HOSTNAMES documentation with tailscale examples
* docs: Update CRAWLER_ALLOWED_INTERNAL_HOSTNAMES documentation with tailscale examples
Added to general documentation, not just version v0.30.0 docs
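A hypothetical value of the kind those examples cover: `CRAWLER_ALLOWED_INTERNAL_HOSTNAMES=grafana.my-tailnet.ts.net` lets the crawler fetch a service that is only reachable over the tailnet (the hostname is made up; see the documentation for the exact format and whether multiple hostnames are accepted).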
* fix(docs): update Ollama instructions and clarify AI provider options
* fix(docs): correct formatting of note on random string generation
---------
Co-authored-by: mark <7497389+Mxrk@users.noreply.github.com>
* feat: add OpenTelemetry tracing infrastructure
Introduce distributed tracing capabilities using OpenTelemetry:
- Add @opentelemetry packages to shared-server for tracing
- Create tracing utility module with span helpers (withSpan, addSpanEvent, etc.)
- Add tRPC middleware for automatic span creation on API calls
- Initialize tracing in API and workers entry points
- Add demo instrumentation to bookmark creation and crawler worker
- Add configuration options (OTEL_TRACING_ENABLED, OTEL_EXPORTER_OTLP_ENDPOINT, etc.)
- Document tracing configuration in environment variables docs
When enabled, traces are collected for tRPC calls, bookmark creation flow,
and crawler operations, with support for any OTLP-compatible backend (Jaeger, Tempo, etc.)
* refactor: remove tracing from workers for now
Keep tracing infrastructure but remove worker instrumentation:
- Remove tracing initialization from workers entry point
- Remove tracing instrumentation from crawler worker
- Fix formatting in tracing files
The tracing infrastructure remains available for future use.
* add hono and next tracing
* remove extra span logging
* more fixes
* update config
* some fixes
* upgrade packages
* remove unneeded packages
---------
Co-authored-by: Claude <noreply@anthropic.com>
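For orientation, a minimal sketch of the `withSpan` helper mentioned above, built on the public `@opentelemetry/api` surface; the actual utility in shared-server (and the tRPC middleware around it) may differ in naming and error handling:

```typescript
import { trace, SpanStatusCode, type Span } from "@opentelemetry/api";

const tracer = trace.getTracer("karakeep");

// Run `fn` inside an active span, record failures, and always end the span.
export async function withSpan<T>(
  name: string,
  fn: (span: Span) => Promise<T>,
): Promise<T> {
  return tracer.startActiveSpan(name, async (span) => {
    try {
      return await fn(span);
    } catch (err) {
      span.recordException(err as Error);
      span.setStatus({ code: SpanStatusCode.ERROR });
      throw err;
    } finally {
      span.end();
    }
  });
}

// Illustrative call site, e.g. during bookmark creation:
// await withSpan("bookmarks.create", async (span) => {
//   span.setAttribute("bookmark.type", "link");
//   /* ...create the bookmark... */
// });
```

Because `startActiveSpan` makes the span current for the duration of the callback, nested `withSpan` calls produce parent/child relationships in the exported trace automatically.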
* feat: support archiving as pdf
* add support for manually triggering pdf downloads
* fix submenu
* menu cleanup
* fix store pdf
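A rough sketch of what archiving a page as PDF involves on the crawler side, assuming a Puppeteer-style browser (the repo's actual worker wiring, queueing, and asset storage are not shown):

```typescript
import puppeteer from "puppeteer";

// Render a URL to a PDF buffer; persisting it as a bookmark asset is omitted.
async function archiveAsPdf(url: string): Promise<Uint8Array> {
  const browser = await puppeteer.launch();
  try {
    const page = await browser.newPage();
    await page.goto(url, { waitUntil: "networkidle2" });
    return await page.pdf({ format: "A4", printBackground: true });
  } finally {
    await browser.close();
  }
}
```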
Add OPENAI_PROXY_URL configuration and support for proxy in OpenAIInferenceClient (#2231)
* Add OPENAI_PROXY_URL configuration and support for proxy in OpenAIInferenceClient
* docs: add OPENAI_PROXY_URL configuration for proxy support in OpenAI API requests
* format
---------
Co-authored-by: Mohamed Bassem <me@mbassem.com>
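The commit does not show the client wiring, but with the official OpenAI Node SDK, honoring such a variable typically looks like the sketch below (`https-proxy-agent` is one common choice, and `httpAgent` is the option name in SDK v4; newer SDK majors configure this differently, and the repo may use another mechanism entirely):

```typescript
import OpenAI from "openai";
import { HttpsProxyAgent } from "https-proxy-agent";

// Route OpenAI API traffic through OPENAI_PROXY_URL when it is set.
const proxyUrl = process.env.OPENAI_PROXY_URL;

const client = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
  httpAgent: proxyUrl ? new HttpsProxyAgent(proxyUrl) : undefined,
});
```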
* docs: add RSS feeds integration documentation
Add detailed documentation for RSS feed integration covering:
- Publishing lists as RSS feeds with authentication tokens
- Consuming external RSS feeds with automatic bookmark creation
- Feed scheduling and management features
- Configuration options and troubleshooting guides
- API access and best practices
* changes
---------
Co-authored-by: Claude <noreply@anthropic.com>
- Added the ASSET_PREPROCESSING_JOB_TIMEOUT_SEC environment variable with a default of 60 seconds (increased from the previously hardcoded 30 seconds)
- Updated worker to use the configurable timeout from serverConfig
- Added documentation for the new configuration option
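A sketch of how a setting like this is typically read into the server config (only the variable name and default come from the commit; the surrounding code is illustrative):

```typescript
// Illustrative config plumbing; the real serverConfig module may differ.
const assetPreprocessingJobTimeoutSec = Number(
  process.env.ASSET_PREPROCESSING_JOB_TIMEOUT_SEC ?? "60", // default: 60 seconds
);
```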
Add a new search qualifier `is:broken` that allows users to filter bookmarks
with broken or failed links. This matches the functionality on the broken links
settings page, where a link is considered broken if:
- crawlStatus is "failure"
- crawlStatusCode is less than 200
- crawlStatusCode is greater than 299
The qualifier supports negation with `-is:broken` to find working links.
Changes:
- Add brokenLinks matcher type definition
- Update search query parser to handle is:broken qualifier
- Implement query execution logic for broken links filtering
- Add autocomplete support with translations
- Add parser tests
- Update search query language documentation
Co-authored-by: Claude <noreply@anthropic.com>
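The rule above reduces to a small predicate; this sketch uses the field names from the commit message (their exact types are assumptions), while the real change applies the same logic inside the search query executor:

```typescript
interface LinkCrawlInfo {
  crawlStatus: "success" | "failure" | null;
  crawlStatusCode: number | null;
}

// Mirrors the broken-links settings page: failed crawl or a non-2xx status code.
function isBroken(link: LinkCrawlInfo): boolean {
  if (link.crawlStatus === "failure") return true;
  if (link.crawlStatusCode === null) return false;
  return link.crawlStatusCode < 200 || link.crawlStatusCode > 299;
}
```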
* Update 13-community-projects.md
Add Karakeep integration for Home Assistant
* Update docs/docs/13-community-projects.md
Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>
* Update docs/docs/13-community-projects.md
Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>
* Update 13-community-projects.md
---------
Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>
* Add LOG_LEVEL to configuration options
Added LOG_LEVEL configuration option for application logging.
* Add missing trailing pipe
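For example, `LOG_LEVEL=debug` enables more verbose output; the accepted level names depend on the logging library the server uses.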
* Update 13-community-projects.md
Updated the Raycast Extension link
* tests: Add a test for listing lists
* fix link
* empty
* empty
---------
Co-authored-by: Mohamed Bassem <me@mbassem.com>
* fix: Stricter SSRF validation
* skip dns resolution if running in proxy context
* more fixes
* Add LRU cache
* change the env variable for internal hostnames
* make dns resolution timeout configurable
* upgrade ipaddr
* handle ipv6
* handle proxy bypass for request interceptor
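The core of the stricter check, sketched with `ipaddr.js` (which the commit upgrades) and Node's resolver; the real validator additionally skips DNS resolution when a proxy is configured, honors the allowed-internal-hostnames list, and caches results in an LRU:

```typescript
import { lookup } from "node:dns/promises";
import ipaddr from "ipaddr.js";

// Address ranges the crawler should never reach (sketch; not an exhaustive list).
const BLOCKED_RANGES = new Set([
  "private",
  "loopback",
  "linkLocal",
  "uniqueLocal",
  "carrierGradeNat",
  "reserved",
]);

async function assertPubliclyRoutable(hostname: string): Promise<void> {
  const addresses = await lookup(hostname, { all: true }); // covers A and AAAA records
  for (const { address } of addresses) {
    if (BLOCKED_RANGES.has(ipaddr.parse(address).range())) {
      throw new Error(`Refusing to crawl ${hostname}: resolves to ${address}`);
    }
  }
}
```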
* docs: Add Azure configuration details for OpenAI-compatible API
* Update docs/docs/14-guides/05-different-ai-providers.md
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
---------
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
* refactor: generalize admin maintenance queue
* more fixes
fix: update OpenAI API to use max_completion_tokens instead of max_tokens (#2000)
* fix: update OpenAI API to use max_completion_tokens instead of max_tokens
The OpenAI API has deprecated max_tokens in favor of max_completion_tokens
for newer models. This change updates both text and image model calls.
* feat: add support for max_completion_tokens in OpenAI inference configuration
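In terms of the OpenAI Node SDK, the change amounts to sending the newer parameter, roughly as below (model name and prompt are placeholders):

```typescript
import OpenAI from "openai";

const client = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

// Newer models reject `max_tokens`; `max_completion_tokens` is its replacement.
async function summarize(prompt: string) {
  return client.chat.completions.create({
    model: "gpt-4o-mini", // illustrative model name
    messages: [{ role: "user", content: prompt }],
    max_completion_tokens: 1024,
  });
}
```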
* feat: support passing multiple proxy values
* fix typo
* trim and filter
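Assuming the proxies are supplied as one delimited string (the comma separator is an assumption), the trim-and-filter step is essentially:

```typescript
// e.g. "http://proxy-a:8080, http://proxy-b:8080," -> two clean entries
function parseProxyList(raw: string): string[] {
  return raw
    .split(",")
    .map((value) => value.trim())
    .filter((value) => value.length > 0);
}
```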
Added a new community project for syncing links from Hacker News and Reddit to Karakeep.
This adds documentation for `WORKERS_HOST` and clarifies token
requirements for `PROMETHEUS_AUTH_TOKEN`.
* fix(search): include link titles in title matcher
* docs(search): add title qualifier
* docs: remove title qualifier from v0.27 guide
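For example, a query such as `title:invoice` (the term is illustrative) now also matches against the crawled link's title; see the search query language docs for the exact syntax.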
* feat: Add cookie support for browser page access
Implemented cookie functionality for browser page access, including BROWSER_COOKIE_PATH configuration to specify the cookies JSON file path.
* fix the docs
---------
Co-authored-by: lizz <lizong1204@gmail.com>
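A sketch of how such a cookies file can be consumed on the browser side, assuming a Puppeteer-style page and a JSON array of cookie objects (the exact file format the feature expects is described in the docs):

```typescript
import { readFile } from "node:fs/promises";
import type { Page, Protocol } from "puppeteer";

// Load cookies from the configured JSON file and apply them to the page.
async function applyCookies(page: Page): Promise<void> {
  const cookiePath = process.env.BROWSER_COOKIE_PATH;
  if (!cookiePath) return;
  const cookies = JSON.parse(
    await readFile(cookiePath, "utf8"),
  ) as Protocol.Network.CookieParam[];
  await page.setCookie(...cookies);
}
```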