πŸ“ authentication section in docs
sestinj committed Feb 29, 2024
1 parent c580a2f commit 8a82097
Showing 1 changed file with 21 additions and 0 deletions.
21 changes: 21 additions & 0 deletions docs/docs/model-setup/configuration.md
@@ -52,6 +52,27 @@ For many cases, either Continue will have a built-in provider or the API you use

However, if neither of these is the case, you will need to wire up a new LLM object. Learn how to do this [here](#defining-a-custom-llm-provider).

## Authentication

If you need to send custom headers for authentication, you may use the `requestOptions.headers` property like in this example with Ollama:

```json title="~/.continue/config.json"
{
"models": [
{
"title": "Ollama",
"provider": "ollama",
"model": "llama2-7b",
"requestOptions": {
"headers": {
"Authorization": "Bearer xxx"
}
}
}
]
}
```

## Customizing the Chat Template

Most open-source models expect a specific chat format, for example llama2 and codellama expect the input to look like `"[INST] How do I write bubble sort in Rust? [/INST]"`. Continue will automatically attempt to detect the correct prompt format based on the `model` value that you provide, but if you are receiving nonsense responses, you can use the `template` property to explicitly set the format that you expect. The options are: `["llama2", "alpaca", "zephyr", "phind", "anthropic", "chatml", "openchat", "neural-chat", "none"]`.
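
For instance, a model entry that explicitly forces the llama2 prompt format might look like the sketch below (the title and model name are illustrative placeholders, assuming `template` is set alongside the other model properties as described above):

```json title="~/.continue/config.json"
{
  "models": [
    {
      "title": "CodeLlama",
      "provider": "ollama",
      "model": "codellama-7b",
      "template": "llama2"
    }
  ]
}
```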
