path: root/packages/shared/config.ts

Commit log (message, author, date, files changed, lines removed/added):
* feat: Add LLM-based OCR as alternative to Tesseract (#2442) (Mohamed Bassem, 2026-02-01, 1 file, -0/+2)

    feat(ocr): add LLM-based OCR support alongside Tesseract

    Add support for using configured LLM inference providers (OpenAI or
    Ollama) for OCR text extraction from images as an alternative to
    Tesseract.

    Changes:
    - Add OCR_USE_LLM environment variable flag (default: false)
    - Add buildOCRPrompt function for LLM-based text extraction
    - Add readImageTextWithLLM function in the asset preprocessing worker
    - Update extractAndSaveImageText to route between Tesseract and LLM OCR
    - Update documentation with the new configuration option

    When OCR_USE_LLM is enabled, the system uses the configured inference
    model to extract text from images. If no inference provider is
    configured, it falls back to Tesseract.

    https://claude.ai/code/session_01Y7h7kDAmqXKXEWDmWbVkDs

    * format

    Co-authored-by: Claude <noreply@anthropic.com>
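The routing described in this commit can be sketched as follows. This is a minimal illustration, not the project's actual code: the function names `readImageTextWithLLM` and `extractAndSaveImageText` come from the commit message, but the stub bodies and the `OcrConfig` shape here are assumptions.

```typescript
// Illustrative sketch of routing between Tesseract and LLM OCR.
interface OcrConfig {
  ocrUseLlm: boolean; // from the OCR_USE_LLM flag (default: false)
  inferenceProviderConfigured: boolean; // e.g. OpenAI or Ollama is set up
}

// Placeholder for the real Tesseract call.
async function readImageTextWithTesseract(_image: Uint8Array): Promise<string> {
  return "tesseract-text";
}

// Placeholder for the real LLM inference call.
async function readImageTextWithLLM(_image: Uint8Array): Promise<string> {
  return "llm-text";
}

// Route between Tesseract and LLM OCR, falling back to Tesseract
// when no inference provider is configured.
async function extractImageText(
  cfg: OcrConfig,
  image: Uint8Array,
): Promise<string> {
  if (cfg.ocrUseLlm && cfg.inferenceProviderConfigured) {
    return readImageTextWithLLM(image);
  }
  return readImageTextWithTesseract(image);
}
```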
* feat: add openai service tier configuration option (#2339) (Robert Rosca, 2026-01-03, 1 file, -0/+2)
* feat: Add open telemetry (#2318) (Mohamed Bassem, 2025-12-29, 1 file, -0/+12)

    feat: add OpenTelemetry tracing infrastructure

    Introduce distributed tracing capabilities using OpenTelemetry:
    - Add @opentelemetry packages to shared-server for tracing
    - Create tracing utility module with span helpers (withSpan, addSpanEvent, etc.)
    - Add tRPC middleware for automatic span creation on API calls
    - Initialize tracing in API and workers entry points
    - Add demo instrumentation to bookmark creation and crawler worker
    - Add configuration options (OTEL_TRACING_ENABLED, OTEL_EXPORTER_OTLP_ENDPOINT, etc.)
    - Document tracing configuration in environment variables docs

    When enabled, traces are collected for tRPC calls, the bookmark creation
    flow, and crawler operations, with support for any OTLP-compatible
    backend (Jaeger, Tempo, etc.)

    * refactor: remove tracing from workers for now

    Keep the tracing infrastructure but remove worker instrumentation:
    - Remove tracing initialization from the workers entry point
    - Remove tracing instrumentation from the crawler worker
    - Fix formatting in tracing files

    The tracing infrastructure remains available for future use.

    * add hono and next tracing
    * remove extra span logging
    * more fixes
    * update config
    * some fixes
    * upgrade packages
    * remove unneeded packages

    Co-authored-by: Claude <noreply@anthropic.com>
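The `withSpan` helper mentioned in this commit follows a common tracing pattern: start a span, run a callback, and always end the span even when the callback throws. The real implementation uses the @opentelemetry SDK; the dependency-free stand-in below only mimics that shape, and its `Span` type is invented for illustration.

```typescript
// Illustrative span type; the real one comes from @opentelemetry/api.
interface Span {
  name: string;
  events: string[];
  ended: boolean;
}

// Run `fn` inside a span, guaranteeing the span ends even on errors.
async function withSpan<T>(
  name: string,
  fn: (span: Span) => Promise<T>,
): Promise<T> {
  const span: Span = { name, events: [], ended: false };
  try {
    return await fn(span);
  } finally {
    span.ended = true; // spans must always end, even when fn throws
  }
}
```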
* feat: add the ability to specify a different changelog version (Mohamed Bassem, 2025-12-29, 1 file, -0/+2)

* feat: support archiving as pdf (#2309) (Mohamed Bassem, 2025-12-27, 1 file, -0/+2)

    * feat: support archiving as pdf
    * add support for manually triggering pdf downloads
    * fix submenu
    * menu cleanup
    * fix store pdf
* feat: add OPENAI_PROXY_URL configuration and support for proxy in OpenAI client (#2231) (rzxczxc, 2025-12-27, 1 file, -0/+2)

    * Add OPENAI_PROXY_URL configuration and support for proxy in OpenAIInferenceClient
    * docs: add OPENAI_PROXY_URL configuration for proxy support in OpenAI API requests
    * format

    Co-authored-by: Mohamed Bassem <me@mbassem.com>
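A proxy URL option like the one this commit adds is typically read from the environment and validated once at startup. The sketch below is an assumption about how that might look; only the `OPENAI_PROXY_URL` variable name comes from the commit.

```typescript
// Read and validate an optional proxy URL from the environment.
// Returns undefined when the variable is unset or blank.
function parseProxyUrl(
  env: Record<string, string | undefined>,
): string | undefined {
  const raw = env.OPENAI_PROXY_URL?.trim();
  if (!raw) return undefined;
  const url = new URL(raw); // throws on malformed values, failing fast at startup
  if (url.protocol !== "http:" && url.protocol !== "https:") {
    throw new Error(`Unsupported proxy protocol: ${url.protocol}`);
  }
  return url.toString();
}
```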
* feat: Add user settings to disable auto tagging/summarization (#2275) (Mohamed Bassem, 2025-12-22, 1 file, -0/+2)

    feat: Add per-user settings to disable auto-tagging and auto-summarization

    This commit adds user-level controls for AI features when they are
    enabled on the server. Users can now toggle auto-tagging and
    auto-summarization on/off from the AI Settings page.

    Changes:
    - Added autoTaggingEnabled and autoSummarizationEnabled fields to the user table
    - Updated user settings schemas and API endpoints to handle the new fields
    - Modified inference workers to check user preferences before processing
    - Added toggle switches to the AI Settings page (only visible when the server has the features enabled)
    - Generated database migration for the new fields
    - Exposed enableAutoTagging and enableAutoSummarization in the client config

    The settings default to null (use the server default). When explicitly
    set to false, the user's bookmarks will skip the respective AI processing.

    * revert migration
    * i18n

    Co-authored-by: Claude <noreply@anthropic.com>

* feat: Add limits on number of rss feeds and webhooks per user (Mohamed Bassem, 2025-12-13, 1 file, -0/+6)
* feat: make asset preprocessing worker timeout configurable (Claude, 2025-12-10, 1 file, -0/+2)

    - Added the ASSET_PREPROCESSING_JOB_TIMEOUT_SEC environment variable
      with a default of 60 seconds (increased from the hardcoded 30 seconds)
    - Updated the worker to use the configurable timeout from serverConfig
    - Added documentation for the new configuration option
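Parsing a numeric timeout with a default, as this commit describes, can be sketched like this. The variable name and the 60-second default come from the commit message; the helper itself is illustrative (the project's real config goes through a schema-based parser).

```typescript
// Parse a positive number of seconds from an env string, with a default.
function parseTimeoutSec(raw: string | undefined, defaultSec: number): number {
  if (raw === undefined || raw.trim() === "") return defaultSec;
  const n = Number(raw);
  if (!Number.isFinite(n) || n <= 0) {
    throw new Error(`Invalid timeout value: ${raw}`);
  }
  return n;
}

// Usage mirroring the commit: default of 60 seconds.
const assetPreprocessingTimeoutSec = parseTimeoutSec(
  process.env.ASSET_PREPROCESSING_JOB_TIMEOUT_SEC,
  60,
);
```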
* feat: add support for turnstile on signup (Mohamed Bassem, 2025-11-30, 1 file, -0/+22)

* fix: making serverConfig readonly (Mohamed Bassem, 2025-11-28, 1 file, -1/+3)

* feat: add crawler domain rate limiting (#2115) (Mohamed Bassem, 2025-11-09, 1 file, -0/+10)

* feat: Make search job timeout configurable (Mohamed Bassem, 2025-11-02, 1 file, -0/+2)
* fix: Stricter SSRF validation (#2082) (Mohamed Bassem, 2025-11-02, 1 file, -1/+24)

    * fix: Stricter SSRF validation
    * skip dns resolution if running in proxy context
    * more fixes
    * Add LRU cache
    * change the env variable for internal hostnames
    * make dns resolution timeout configurable
    * upgrade ipaddr
    * handle ipv6
    * handle proxy bypass for request interceptor
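The core of an SSRF guard like the one hardened here is refusing to fetch URLs whose resolved address lands in a private or loopback range. The commit uses the `ipaddr` package and also handles IPv6, DNS timeouts, and proxy bypass; this dependency-free sketch covers the IPv4 check only and is not the project's actual code.

```typescript
// Well-known non-routable IPv4 ranges as [network, prefix length] pairs.
const PRIVATE_V4_RANGES: Array<[number, number]> = [
  [0x0a000000, 8],  // 10.0.0.0/8
  [0xac100000, 12], // 172.16.0.0/12
  [0xc0a80000, 16], // 192.168.0.0/16
  [0x7f000000, 8],  // 127.0.0.0/8 (loopback)
  [0xa9fe0000, 16], // 169.254.0.0/16 (link-local)
];

// Convert dotted-quad notation to an unsigned 32-bit integer.
function ipv4ToInt(ip: string): number {
  const parts = ip.split(".").map(Number);
  if (
    parts.length !== 4 ||
    parts.some((p) => !Number.isInteger(p) || p < 0 || p > 255)
  ) {
    throw new Error(`Invalid IPv4 address: ${ip}`);
  }
  return ((parts[0] << 24) | (parts[1] << 16) | (parts[2] << 8) | parts[3]) >>> 0;
}

// True if the address falls inside any private/loopback/link-local range.
function isPrivateIpv4(ip: string): boolean {
  const addr = ipv4ToInt(ip);
  return PRIVATE_V4_RANGES.some(([net, prefix]) => {
    const mask = (~0 << (32 - prefix)) >>> 0;
    return ((addr & mask) >>> 0) === net;
  });
}
```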
* feat: Allow configuring inline asset size threshold (Mohamed Bassem, 2025-10-26, 1 file, -0/+2)
* fix: update OpenAI API to use max_completion_tokens instead of max_tokens (#2000) (Benjamin Michaelis, 2025-10-25, 1 file, -0/+2)

    * fix: update OpenAI API to use max_completion_tokens instead of max_tokens

      The OpenAI API has deprecated max_tokens in favor of
      max_completion_tokens for newer models. This change updates both text
      and image model calls.

    * feat: add support for max_completion_tokens in OpenAI inference configuration
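The parameter switch this commit describes can be sketched as below. The `useMaxCompletionTokens` flag name is an assumption for illustration; the commit only says the choice is configurable.

```typescript
interface TokenLimitOptions {
  limit: number;
  useMaxCompletionTokens: boolean; // illustrative flag name
}

// Emit the request parameter under whichever key the target model expects:
// `max_completion_tokens` for newer OpenAI models, `max_tokens` for older ones.
function tokenLimitParams(opts: TokenLimitOptions): Record<string, number> {
  return opts.useMaxCompletionTokens
    ? { max_completion_tokens: opts.limit }
    : { max_tokens: opts.limit };
}
```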
* feat: support passing multiple proxy values (#2039) (Mohamed Bassem, 2025-10-12, 1 file, -2/+18)

    * feat: support passing multiple proxy values
    * fix typo
    * trim and filter
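Accepting multiple proxy values in one environment variable usually means splitting a delimited list, with the "trim and filter" step the commit mentions. The comma-separated format here is an assumption for illustration.

```typescript
// Split a comma-separated proxy list, tolerating spaces around commas
// and dropping empty entries (e.g. trailing commas).
function parseProxyList(raw: string | undefined): string[] {
  if (!raw) return [];
  return raw
    .split(",")
    .map((p) => p.trim())
    .filter((p) => p.length > 0);
}
```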
* feat: Add cookie support for browser page access (Mohamed Bassem, 2025-09-07, 1 file, -0/+2)

    feat: Add cookie support for browser page access

    Implemented cookie functionality for browser page access, including the
    BROWSER_COOKIE_PATH configuration to specify the cookies JSON file path.

    * fix the docs

    Co-authored-by: lizz <lizong1204@gmail.com>

* feat(workers): add worker enable/disable lists (#1885) (Mohamed Bassem, 2025-09-07, 1 file, -0/+20)
* feat: Export prometheus metrics from the workers (MohamedBassem, 2025-08-22, 1 file, -0/+8)

* feat: generate a random prometheus token on startup (MohamedBassem, 2025-08-22, 1 file, -1/+3)

* fix: Trim trailing slashes from nextauth urls. Fixes #1799 (MohamedBassem, 2025-08-03, 1 file, -1/+5)

* fix: fix hidden env variable parse error. fixes #1790 (MohamedBassem, 2025-07-27, 1 file, -167/+164)

* feat: Support NO_COLOR for logging. Fixes #1778 (MohamedBassem, 2025-07-27, 1 file, -0/+2)

* refactor: Extract meilisearch as a plugin (MohamedBassem, 2025-07-27, 1 file, -8/+0)

* feat: Hide AI settings tab if inference is not configured. #1781 (Mohamed Bassem, 2025-07-26, 1 file, -3/+2)
* feat: Add a max output tokens env variable (Mohamed Bassem, 2025-07-20, 1 file, -0/+2)

* feat: Allow setting browserless crawling per user (Mohamed Bassem, 2025-07-19, 1 file, -4/+8)

* feat: Allow enabling journaling mode on the db (Mohamed Bassem, 2025-07-19, 1 file, -0/+6)

* fix: Rename the proxy settings such that they don't interfere with other requests (Mohamed Bassem, 2025-07-19, 1 file, -6/+6)
* feat: Add stripe based subscriptions (Mohamed Bassem, 2025-07-13, 1 file, -0/+28)

* feat: Add proper proxy support. fixes #1265 (Mohamed Bassem, 2025-07-13, 1 file, -0/+10)

* feat: Add API ratelimits (Mohamed Bassem, 2025-07-10, 1 file, -0/+6)

* feat: Add support for email verification (Mohamed Bassem, 2025-07-10, 1 file, -117/+151)

* feat: Add prometheus monitoring. Fixes #758 (Mohamed Bassem, 2025-07-06, 1 file, -0/+6)

* feat(workers): Allow customizing max parallelism for a bunch of workers. Fixes #724 (Mohamed Bassem, 2025-07-05, 1 file, -6/+22)
* feat: Add support for S3 as an asset storage layer (#1703) (Mohamed Bassem, 2025-07-04, 1 file, -0/+21)

    * feat: Add support for S3 as an asset storage layer. Fixes #305
    * some minor fixes
    * use bulk deletion api
    * stream the file to s3

* feat: Add support for public lists (#1511) (Mohamed Bassem, 2025-06-01, 1 file, -0/+7)

    * WIP: public lists
    * Drop viewing modes
    * Add the public endpoint for assets
    * regen the openapi spec
    * proper handling for different asset types
    * Add num bookmarks and a no bookmark banner
    * Correctly set page title
    * Add a not-found page
    * merge the RSS and public list endpoints
    * Add e2e tests for the public endpoints
    * Redesign the share list modal
    * Make NEXTAUTH_SECRET not required
    * properly render text bookmarks
    * rebase migration
    * fix public token tests
    * Add more tests

* feat: Generate RSS feeds from lists (#1507) (Mohamed Bassem, 2025-05-31, 1 file, -0/+5)

    * refactor: Move bookmark utils from shared-react to shared
    * Expose RSS feeds for lists
    * Add e2e tests
    * Slightly improve the look of the share dialog
    * allow specifying a limit in the rss endpoint

* feat: Disable the AI summary button if AI is not configured. Fixes #649 (Mohamed Bassem, 2025-05-18, 1 file, -0/+3)

* feat: Add AI auto summarization. Fixes #1163 (Mohamed Bassem, 2025-05-18, 1 file, -0/+4)

* feat: Change default text model to 4.1-mini (Mohamed Bassem, 2025-04-15, 1 file, -1/+1)

* fix: Allow using JSON mode for ollama users. Fixes #1160 (Mohamed Bassem, 2025-04-13, 1 file, -2/+17)

* feat: Allow storing assets in a separate directory. Fixes #1091 (Mohamed Bassem, 2025-03-29, 1 file, -0/+3)

* feat(workers): Add CRAWLER_SCREENSHOT_TIMEOUT_SEC (#1155) (Chang-Yen Tseng, 2025-03-27, 1 file, -0/+2)
* feat(auth): Added env variable for OAuth timeout (#1136) (Kaio Cesar, 2025-03-22, 1 file, -0/+2)

    * feat(auth): add configurable OAuth timeout option
    * fix(config): change OAUTH_TIMEOUT to use z.coerce.number for better type handling
    * docs: Added instructions for OAUTH_TIMEOUT flag
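This commit switches OAUTH_TIMEOUT to zod's `z.coerce.number()`. The dependency-free stand-in below mimics what that coercion does: run the raw env string through `Number(...)` and reject values that do not coerce cleanly. The fallback value in the usage line is illustrative, not the project's actual default.

```typescript
// Mimic z.coerce.number() for a single optional env value:
// blank/unset falls back, non-numeric throws.
function coerceNumber(raw: unknown, fallback: number): number {
  if (raw === undefined || raw === null || raw === "") return fallback;
  const n = Number(raw);
  if (Number.isNaN(n)) {
    throw new Error(`Expected a number, got: ${String(raw)}`);
  }
  return n;
}

// Usage (fallback value is a hypothetical example):
const oauthTimeout = coerceNumber(process.env.OAUTH_TIMEOUT, 3500);
```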
* feat(workers): allows videoWorker to use ytdlp command line arguments specified in the config. Fixes #775 #792 (#1117) (erik-nilcoast, 2025-03-16, 1 file, -0/+5)

* fix: Move away from JSON outputs to structured outputs. Fixes #1047 (Mohamed Bassem, 2025-03-02, 1 file, -0/+2)
* fix: custom fetch wrapper for ollama inference. Fixes #656 (#1032) (Gavin Mogan, 2025-02-16, 1 file, -0/+2)

    * Add configurable fetch timeout for Ollama client
    * Worker service needs access to the .env file
    * repair typescript types
    * Update customFetch.ts
    * update the config docs

    Co-authored-by: sbarbett <shane@barbetta.me>
    Co-authored-by: Mohamed Bassem <me@mbassem.com>
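A custom fetch wrapper with a configurable timeout, in the spirit of this fix, can be sketched with `AbortSignal.timeout` (Node 17.3+). The wrapper name and shape are assumptions; the project's actual `customFetch.ts` may differ.

```typescript
// Return a fetch-compatible function that aborts requests
// taking longer than timeoutMs.
function fetchWithTimeout(timeoutMs: number): typeof fetch {
  return (input, init = {}) =>
    fetch(input, {
      ...init,
      // abort the request if it exceeds the configured timeout
      signal: AbortSignal.timeout(timeoutMs),
    });
}
```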
* fix: Loosen the restriction about the browser address env variable. Fixes #1000 (Mohamed Bassem, 2025-02-09, 1 file, -2/+2)