From 5cbce67fdae7ef697dd999b0f1e3cc6ed9c53e3f Mon Sep 17 00:00:00 2001
From: MohamedBassem
Date: Wed, 27 Mar 2024 14:50:01 +0000
Subject: docs: Add a warning around the experimental nature of ollama, and
 add update instructions

---
 docs/docs/02-installation.md  | 16 +++++++++++++++-
 docs/docs/03-configuration.md |  5 +++++
 2 files changed, 20 insertions(+), 1 deletion(-)

(limited to 'docs')

diff --git a/docs/docs/02-installation.md b/docs/docs/02-installation.md
index 50069e31..70fc3bb1 100644
--- a/docs/docs/02-installation.md
+++ b/docs/docs/02-installation.md
@@ -31,6 +31,8 @@ MEILI_MASTER_KEY=another_random_string
 
 You **should** change the random strings. You can use `openssl rand -base64 36` to generate the random strings.
 
+Using `HOARDER_VERSION=release` will pull the latest stable version. You might want to pin the version instead to control the upgrades (e.g. `HOARDER_VERSION=0.10.0`). Check the latest versions [here](https://github.com/MohamedBassem/hoarder-app/pkgs/container/hoarder-web).
+
 Persistent storage and the wiring between the different services is already taken care of in the docker compose file.
 
 Keep in mind that every time you change the `.env` file, you'll need to re-run `docker compose up`.
@@ -47,7 +49,9 @@ To enable automatic tagging, you'll need to configure OpenAI. This is optional t
 
 Learn more about the costs of using openai [here](/openai).
-- If you want to use Ollama (https://ollama.com/) instead for local inference.
+- [EXPERIMENTAL] If you want to use Ollama (https://ollama.com/) instead for local inference.
+
+  **Note:** The quality of the tags you'll get will depend on the quality of the model you choose. Running local models is a recent addition and not as battle-tested as using OpenAI, so proceed with care (and potentially expect a bunch of inference failures).
   - Make sure ollama is running.
   - Set the `OLLAMA_BASE_URL` env variable to the address of the ollama API.
@@ -55,6 +59,7 @@ Learn more about the costs of using openai [here](/openai).
   - Set `INFERENCE_IMAGE_MODEL` to the model you want to use for image inference in ollama (for example: `llava`)
   - Make sure that you `ollama pull`-ed the models that you want to use.
 
+
 ### 5. Start the service
@@ -64,3 +69,12 @@ Start the service by running:
 ```
 docker compose up -d
 ```
+
+Then visit `http://localhost:3000` and you should be greeted with the Sign In page.
+
+
+## Updating
+
+Updating Hoarder depends on what you used for the `HOARDER_VERSION` env variable:
+- If you pinned the app to a specific version, bump the version and re-run `docker compose up -d`. This should pull the new version for you.
+- If you used `HOARDER_VERSION=release`, you'll need to force Docker to pull the latest version by running `docker compose up --pull always -d`.
diff --git a/docs/docs/03-configuration.md b/docs/docs/03-configuration.md
index f18bd647..0300717f 100644
--- a/docs/docs/03-configuration.md
+++ b/docs/docs/03-configuration.md
@@ -17,6 +17,11 @@ The app is mainly configured by environment variables. All the used environment
 
 Either `OPENAI_API_KEY` or `OLLAMA_BASE_URL` need to be set for automatic tagging to be enabled. Otherwise, automatic tagging will be skipped.
 
+:::warning
+- The quality of the tags you'll get will depend on the quality of the model you choose.
+- Running local models is a recent addition and not as battle-tested as using OpenAI, so proceed with care (and potentially expect a bunch of inference failures).
+:::
+
 | Name | Required | Default | Description |
 | --------------------- | -------- | -------------------- | ----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
 | OPENAI_API_KEY | No | Not set | The OpenAI key used for automatic tagging. More on that in [here](/openai). |
-- 
cgit v1.2.3-70-g09d2
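
Putting the pieces together, a minimal `.env` along the lines this patch describes might look like the sketch below. All values are placeholders, and the Ollama URL is an assumed example rather than something the patch specifies; only variable names that actually appear in the docs above are used.

```shell
# Sketch of a .env for the docker compose setup described in the docs.
# Every value here is a placeholder, not a real secret.

# Pin an image tag to control upgrades; "release" tracks the latest
# stable version instead (0.10.0 is just an example tag).
HOARDER_VERSION=0.10.0

# Random string; generate with: openssl rand -base64 36
MEILI_MASTER_KEY=another_random_string

# Optional: set to enable automatic tagging via OpenAI.
OPENAI_API_KEY=sk-placeholder

# Or, for [EXPERIMENTAL] local inference with Ollama instead of OpenAI
# (URL and model below are illustrative assumptions):
# OLLAMA_BASE_URL=http://host.docker.internal:11434
# INFERENCE_IMAGE_MODEL=llava
```

After changing `.env`, re-run `docker compose up -d`; when tracking `HOARDER_VERSION=release`, use `docker compose up --pull always -d` to force a fresh pull.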