 docs/versioned_docs/version-v0.30.0/03-configuration/01-environment-variables.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/docs/versioned_docs/version-v0.30.0/03-configuration/01-environment-variables.md b/docs/versioned_docs/version-v0.30.0/03-configuration/01-environment-variables.md
index 5584e620..a8bf8ba6 100644
--- a/docs/versioned_docs/version-v0.30.0/03-configuration/01-environment-variables.md
+++ b/docs/versioned_docs/version-v0.30.0/03-configuration/01-environment-variables.md
@@ -92,7 +92,7 @@ Either `OPENAI_API_KEY` or `OLLAMA_BASE_URL` need to be set for automatic taggin
| Name | Required | Default | Description |
| ------------------------------------ | -------- | ---------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
-| OPENAI_API_KEY | No | Not set | The OpenAI key used for automatic tagging. More on that in [here](../integrations/openai). |
+| OPENAI_API_KEY | No | Not set | The OpenAI key used for automatic tagging. More on that in [here](../administration/openai). |
| OPENAI_BASE_URL | No | Not set | If you just want to use OpenAI you don't need to pass this variable. If, however, you want to use some other openai compatible API (e.g. azure openai service), set this to the url of the API. |
| OPENAI_PROXY_URL | No | Not set | HTTP proxy server URL for OpenAI API requests (e.g., `http://proxy.example.com:8080`). |
| OLLAMA_BASE_URL | No | Not set | If you want to use ollama for local inference, set the address of ollama API here. |
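As a usage sketch for the variables documented in the table above: either `OPENAI_API_KEY` or `OLLAMA_BASE_URL` enables automatic tagging. The key value and the Ollama address below are placeholders, not values from this repository; the Ollama URL assumes its common default port.

```shell
# Option 1: automatic tagging via OpenAI (placeholder key shown)
export OPENAI_API_KEY="sk-your-key-here"
# Optional: point at an OpenAI-compatible API instead (e.g. Azure OpenAI)
# export OPENAI_BASE_URL="https://your-endpoint.example.com/v1"

# Option 2: local inference via Ollama (assumes Ollama's default port)
# export OLLAMA_BASE_URL="http://localhost:11434"
```

Only one of the two options needs to be configured; if both are unset, automatic tagging is simply disabled.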