| Commit message | Author | Files | +/- |
|---|---|---|---|
| chore: worker tracing (#2321) | Mohamed Bassem | 12 | -821/+1030 |
| feat: Add automated bookmark backup feature (#2182)<br>Implements a comprehensive automated backup feature for user bookmarks.<br>Database schema: a backupSettings table for user backup preferences (enabled, frequency, retention), a backups table tracking backup records with status and metadata, a new BACKUP asset type for compressed backup files, and migration 0066_add_backup_tables.sql.<br>Background workers: a BackupSchedulingWorker cron job (runs daily at midnight UTC) and a BackupWorker that processes individual backup jobs; deterministic scheduling spreads jobs across 24 hours based on a hash of the user ID; supports daily and weekly frequencies; automated retention cleanup deletes old backups per user settings.<br>Export and compression: reuses the existing export functionality, compresses exports with Node.js built-in zlib (gzip level 9), stores the compressed backups as assets with metadata, and tracks backup size and bookmark count for statistics.<br>tRPC API: backups.getSettings, backups.updateSettings, backups.list, backups.get, backups.delete, backups.download (base64-encoded file), and backups.triggerBackup.<br>UI: a BackupSettings component with an enable/disable toggle, frequency selection (daily/weekly), retention period configuration (1-365 days), a backup list with download and delete actions, a manual trigger button, backup stats (size, bookmark count, status), and a backups page added to the settings navigation.<br>Technical details: Restate queue system for distributed job processing, idempotency keys to prevent duplicate backups, worker concurrency of 2 jobs at a time, a 10-minute timeout for large exports, error handling and logging throughout, and a type-safe implementation with Zod schemas.<br>A follow-up refactor simplified settings and asset handling: backup settings moved from the separate table into columns on the users table, the download mutation was removed in favor of direct /api/assets/{id} links, and storage quota is validated via QuotaService.checkStorageQuota before backup assets are created.<br>Later commits in the PR switched from gzip to zip, added streaming JSON export, e2e tests, backup polling, a download redirect, rate limiting, error handling for background backups, and i18n, plus assorted UI, idempotency, and test fixes.<br>Co-authored-by: Claude <noreply@anthropic.com> | Mohamed Bassem | 32 | -8/+5697 |
| refactor: generalize tidy assets queue into admin maintenance (#2059) | Mohamed Bassem | 10 | -159/+227 |
| feat: Restate-based queue plugin (#2011)<br>Initial Restate integration with retries, delays, idempotency, concurrency limits, priorities, and admin stats; adds an ID provider, handles onComplete failures, fails calls once retries are exhausted, adds the public key and fixes logging, upgrades the SDK, and adds tests. | Mohamed Bassem | 24 | -13/+1049 |
| refactor: Move callsites to liteque to be behind a plugin | Mohamed Bassem | 39 | -405/+707 |
| feat(workers): add worker enable/disable lists (#1885) | Mohamed Bassem | 3 | -44/+71 |
| feat: Export prometheus metrics from the workers | MohamedBassem | 17 | -34/+181 |
| refactor: Extract meilisearch as a plugin | MohamedBassem | 26 | -155/+524 |
| feat: Add AI auto summarization. Fixes #1163 | Mohamed Bassem | 34 | -2639/+4843 |
| feat: Implement generic rule engine (#1318)<br>Adds the schema and backend logic for the new rule engine, the worker logic and event firing, and the UI changes. When a referenced list or tag is deleted, the corresponding event/action is cleaned up as well; smart lists are not shown in rule-engine events; privacy validations cover attached tag and list IDs; the rules logic was moved into models. | Mohamed Bassem | 42 | -40/+5787 |
| chore: Rename hoarder packages to karakeep | MohamedBassem | 230 | -644/+654 |
| feat(webhook): Implement webhook functionality for bookmark events (#852)<br>Adds a WebhookWorker to handle webhook requests, triggers webhooks from the crawler worker after video processing, and extends configuration to support webhook URLs, token, and timeout (documented, including the new Docker environment variables). Introduces zWebhookRequestSchema for validating webhook requests; the operation type was changed from "create" to "crawled", and the retry logic supports multiple delivery attempts.<br>Co-authored-by: Mohamed Bassem <me@mbassem.com> | 玄猫 | 6 | -11/+237 |
| refactor: Move asset preprocessing to its own worker out of the inference worker | Mohamed Bassem | 7 | -120/+258 |
| feature: Schedule RSS feed refreshes every hour | Mohamed Bassem | 5 | -11/+66 |
| feature: Add support for subscribing to RSS feeds. Fixes #202 | Mohamed Bassem | 16 | -3/+2280 |
| feature: Archive videos using yt-dlp. Fixes #215 (#525)<br>Adds a worker that downloads videos from webpages depending on environment variables, introduces a new video asset type, drops the dependency on the yt-dlp wrapper, updates the documentation and OpenAPI specs, stops logging an error when a URL is not supported, and better handles supported websites that don't download anything. | kamtschatka | 17 | -71/+403 |
| feature: Introduce a mechanism to cleanup dangling assets | MohamedBassem | 10 | -8/+351 |
| fix(workers): Shutdown workers on SIGTERM | MohamedBassem | 2 | -0/+9 |
| refactor: Replace the usage of bullMQ with the hoarder sqlite-based queue (#309) | Mohamed Bassem | 13 | -344/+128 |
| feature: Include server version in the admin UI. Fixes #66 | MohamedBassem | 8 | -14/+92 |
| format: Add missing lint and format, and format the entire repo | MohamedBassem | 57 | -192/+255 |
| fix(workers): Fix the leaky browser instances in workers during development | MohamedBassem | 3 | -29/+46 |
| structure: Create apps dir and copy tooling dir from t3-turbo repo | MohamedBassem | 396 | -9511/+10350 |
| feature: Add full text search support | MohamedBassem | 17 | -12/+440 |
| db: Migrate from prisma to drizzle | MohamedBassem | 41 | -975/+2177 |
| fix: Attempt to improve the openai prompt a bit | MohamedBassem | 2 | -4/+7 |
| fix: Fix build for workers package and add it to CI | MohamedBassem | 8 | -70/+106 |
| Migrating away from bun to yarn | MohamedBassem | 15 | -105/+5148 |
| [refactor] Move the different packages to the package subdir | MohamedBassem | 128 | -2716/+2713 |
| [feature] Add openAI integration for extracting tags from articles | MohamedBassem | 9 | -19/+239 |
| [refactor] Rename the crawlers package to workers | MohamedBassem | 8 | -126/+126 |
| Implement metadata fetching logic in the crawler | MohamedBassem | 29 | -264/+439 |
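The backup commit describes deterministic scheduling that spreads backup jobs across 24 hours based on a hash of the user ID. A minimal sketch of that idea, assuming SHA-256 and the hypothetical name backupHourFor (not Karakeep's actual code):

```typescript
import { createHash } from "node:crypto";

// Hypothetical sketch: derive a stable hour-of-day (0-23) from a user ID so
// daily backup jobs spread evenly across 24 hours instead of all firing at
// midnight. The hash choice and function name are assumptions.
function backupHourFor(userId: string): number {
  const digest = createHash("sha256").update(userId).digest();
  // Interpret the first 4 bytes as an unsigned 32-bit int, reduce to a slot.
  return digest.readUInt32BE(0) % 24;
}
```

Because the hash is deterministic, a scheduler can recompute each user's slot on every cron tick without persisting per-user schedules.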
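The export-and-compression step in the backup commit (gzip level 9 via Node's built-in zlib, before a later commit in the PR switched to zip) could look roughly like this; the function names and the JSON envelope are illustrative assumptions:

```typescript
import { gzipSync, gunzipSync, constants } from "node:zlib";

// Illustrative sketch, not Karakeep's actual implementation: serialize the
// bookmark export to JSON and compress at maximum gzip level (9) before
// storing the result as a backup asset.
function compressExport(bookmarks: unknown[]): Buffer {
  const json = JSON.stringify({ bookmarks });
  return gzipSync(json, { level: constants.Z_BEST_COMPRESSION }); // level 9
}

function decompressExport(data: Buffer): unknown[] {
  return JSON.parse(gunzipSync(data).toString("utf8")).bookmarks;
}
```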
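The backup commit also mentions idempotency keys to prevent duplicate backups. One plausible shape, entirely an assumption, is a key derived from the user ID plus the backup period, so re-enqueuing the same user's backup for the same day is deduplicated by the queue:

```typescript
import { createHash } from "node:crypto";

// Assumed sketch of idempotency-key derivation: the same user and the same
// calendar day always yield the same key, so a queue that dedupes on the
// key cannot run two backups for one user in one day.
function backupIdempotencyKey(userId: string, date: Date): string {
  const day = date.toISOString().slice(0, 10); // YYYY-MM-DD
  return createHash("sha256").update(`backup:${userId}:${day}`).digest("hex");
}
```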
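The webhook commit adds retry logic supporting multiple delivery attempts. A deliberately simplified, synchronous sketch of such a retry loop follows; the real worker presumably awaits HTTP calls and backs off between attempts, and all names here are assumptions:

```typescript
// Simplified sketch of retry-until-success delivery. Returns the number of
// attempts used; rethrows the last error once maxAttempts is exhausted.
function deliverWithRetries(send: () => void, maxAttempts = 3): number {
  let lastError: unknown;
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    try {
      send();
      return attempt;
    } catch (err) {
      lastError = err;
    }
  }
  throw lastError;
}
```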