path: root/packages/shared/inference.ts
* fix: update OpenAI API to use max_completion_tokens instead of max_tokens (#2000)
  Benjamin Michaelis, 2025-10-25, 1 file changed (-2/+6)
  * fix: update OpenAI API to use max_completion_tokens instead of max_tokens
    The OpenAI API has deprecated max_tokens in favor of max_completion_tokens for
    newer models. This change updates both text and image model calls.
  * feat: add support for max_completion_tokens in OpenAI inference configuration
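  A rough sketch of the parameter switch with the official openai npm package, for
  illustration only (not Karakeep's actual code; the model name and limit are placeholders):

    // Sketch only: illustrates the max_tokens -> max_completion_tokens switch
    // using the official `openai` npm package. Model and limit are placeholders.
    import OpenAI from "openai";

    const client = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

    async function summarize(text: string): Promise<string | null> {
      const response = await client.chat.completions.create({
        model: "gpt-4o-mini",
        messages: [{ role: "user", content: `Summarize:\n${text}` }],
        // Newer models reject `max_tokens`; `max_completion_tokens` is the
        // replacement and also accounts for reasoning tokens where applicable.
        max_completion_tokens: 1024,
      });
      return response.choices[0]?.message.content ?? null;
    }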
* feat: Add a max output tokens env variable
  Mohamed Bassem, 2025-07-20, 1 file changed (-1/+3)
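  A small sketch of how such a cap could be read and validated from the environment;
  the variable name and default below are hypothetical, not necessarily Karakeep's:

    // Hypothetical example: the real environment variable name may differ.
    const DEFAULT_MAX_OUTPUT_TOKENS = 2048;

    function maxOutputTokens(): number {
      const raw = process.env.INFERENCE_MAX_OUTPUT_TOKENS;
      const parsed = raw ? Number.parseInt(raw, 10) : NaN;
      // Fall back to the default when the variable is unset or not a positive number.
      return Number.isFinite(parsed) && parsed > 0 ? parsed : DEFAULT_MAX_OUTPUT_TOKENS;
    }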
* feat: Add karakeep metadata to openai calls
  Mohamed Bassem, 2025-05-17, 1 file changed (-0/+4)
* fix: Allow using JSON mode for ollama users. Fixes #1160
  Mohamed Bassem, 2025-04-13, 1 file changed (-15/+40)
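  For reference, Ollama exposes JSON mode through the request's format field. A hedged
  sketch with the ollama npm client; the model name is a placeholder:

    // Sketch of Ollama's JSON mode via the `ollama` npm package.
    import { Ollama } from "ollama";

    const ollama = new Ollama({ host: process.env.OLLAMA_BASE_URL });

    async function tagsAsJson(prompt: string): Promise<unknown> {
      const response = await ollama.chat({
        model: "llama3.1",
        messages: [{ role: "user", content: prompt }],
        // "json" asks the model to emit valid JSON; the prompt should still
        // describe the expected shape, since no schema is enforced here.
        format: "json",
      });
      return JSON.parse(response.message.content);
    }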
* fix: Move away from JSON outputs to structured outputs. Fixes #1047
  Mohamed Bassem, 2025-03-02, 1 file changed (-9/+21)
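  Structured outputs constrain the model to a JSON schema rather than merely asking for
  valid JSON. A sketch using the OpenAI SDK's zod helper; the schema and model below are
  illustrative, not the project's actual ones:

    // Sketch of OpenAI structured outputs via a json_schema response_format.
    import OpenAI from "openai";
    import { z } from "zod";
    import { zodResponseFormat } from "openai/helpers/zod";

    const TagsSchema = z.object({ tags: z.array(z.string()) });

    const client = new OpenAI();

    async function inferTags(content: string) {
      const completion = await client.chat.completions.create({
        model: "gpt-4o-mini",
        messages: [{ role: "user", content: `Suggest tags for:\n${content}` }],
        // Unlike JSON mode, this constrains the output to the declared schema,
        // not just to syntactically valid JSON.
        response_format: zodResponseFormat(TagsSchema, "tags"),
      });
      const raw = completion.choices[0]?.message.content ?? "{}";
      return TagsSchema.parse(JSON.parse(raw));
    }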
* fix: custom fetch wrapper for ollama inference. Fixes #656 (#1032)
  Gavin Mogan, 2025-02-16, 1 file changed (-0/+2)
  * Add configurable fetch timeout for Ollama client
  * Worker service needs access to the .env file
  * repair typescript types
  * Update customFetch.ts
  * update the config docs

  Co-authored-by: sbarbett <shane@barbetta.me>
  Co-authored-by: Mohamed Bassem <me@mbassem.com>
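  The Ollama client accepts a custom fetch implementation, so a timeout can be layered
  on top of the platform fetch roughly as follows; the environment variable name and
  default are assumptions for illustration:

    // Rough sketch of a timeout-enforcing fetch passed to the Ollama client.
    import { Ollama } from "ollama";

    const timeoutMs = Number(process.env.OLLAMA_FETCH_TIMEOUT_MS ?? 300_000);

    const customFetch: typeof fetch = (input, init) =>
      fetch(input, {
        ...init,
        // Abort the request if the server does not respond within the timeout.
        // Real code would merge this with any caller-supplied signal
        // (e.g. via AbortSignal.any) instead of overriding it.
        signal: AbortSignal.timeout(timeoutMs),
      });

    const ollama = new Ollama({
      host: process.env.OLLAMA_BASE_URL,
      fetch: customFetch,
    });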
* fix: Fix missing handling for AbortSignal in inference client
  Mohamed Bassem, 2025-02-01, 1 file changed (-28/+53)
* fix: Abort all IO when workers timeout instead of detaching. Fixes #742
  Mohamed Bassem, 2025-02-01, 1 file changed (-12/+37)
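  Both abort-related fixes above come down to threading an AbortSignal from the worker
  into the underlying HTTP requests. A simplified illustration of the pattern, with
  hypothetical names rather than the actual inference client interface:

    // Simplified: abort the inference request when a worker deadline elapses,
    // instead of leaving the HTTP call running detached in the background.
    import OpenAI from "openai";

    const client = new OpenAI();

    async function inferWithDeadline(prompt: string, timeoutMs: number) {
      const controller = new AbortController();
      const timer = setTimeout(() => controller.abort(), timeoutMs);
      try {
        // The OpenAI SDK accepts a per-request signal, so aborting the
        // controller cancels the in-flight request.
        return await client.chat.completions.create(
          { model: "gpt-4o-mini", messages: [{ role: "user", content: prompt }] },
          { signal: controller.signal },
        );
      } finally {
        clearTimeout(timer);
      }
    }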
* feat: Add support for embeddings in the inference interface (#403)
  Mohammed Farghal, 2024-12-29, 1 file changed (-0/+32)
  * support embeddings generation in inference.ts
    (cherry picked from commit 9ae8773ad13ed87af8f72f167bdd56e02ea66f15)
  * make AI worker generate embeddings for text bookmark
  * make AI worker generate embeddings for text bookmark
  * fix unintentional change -- inference image model
  * support embeddings for PDF bookmarks
  * Upgrade drizzle-kit
    Existing version is not working with the upgraded version of drizzle-orm. I removed
    the "driver" to match the new schema of the Config. Quoting from their Config:
    * `driver` - optional param that is responsible for explicitly providing a driver
      to use when accessing a database
    * *Possible values*: `aws-data-api`, `d1-http`, `expo`, `turso`, `pglite`
    * If you don't use AWS Data API, D1, Turso or Expo - you don't need this driver.
      You can check a driver strategy choice here: https://orm.
  * fix formatting and lint
  * add comments about truncate content
  * Revert "Upgrade drizzle-kit"
    This reverts commit 08a02c8df4ea403de65986ed1265940c6c994a20.
  * revert keep alive field in Ollama
  * change the interface to accept multiple inputs
  * docs

  Co-authored-by: Mohamed Bassem <me@mbassem.com>
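  The embeddings support, including the "change the interface to accept multiple inputs"
  item above, maps onto a batched embeddings call roughly like this; the model name is a
  placeholder and the return shape is simplified:

    // Sketch of batched embedding generation with the OpenAI SDK.
    import OpenAI from "openai";

    const client = new OpenAI();

    async function embedTexts(texts: string[]): Promise<number[][]> {
      const response = await client.embeddings.create({
        model: "text-embedding-3-small",
        // The endpoint accepts an array, so several bookmark chunks can be
        // embedded in a single request.
        input: texts,
      });
      return response.data.map((item) => item.embedding);
    }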
* docs: fix typo in inference.ts (#640)
  Ikko Eltociear Ashimine, 2024-11-17, 1 file changed (-1/+1)
  successfull -> successful
* feature: Add a summarize with AI button for links
  Mohamed Bassem, 2024-10-27, 1 file changed (-8/+39)
* refactor: Move inference to the shared package
  Mohamed Bassem, 2024-10-26, 1 file changed (-0/+155)