Showing 34 changed files with 278 additions and 176 deletions.
@@ -0,0 +1,8 @@
## 0.0.53 - 2024-07-10
### Added
* Support for .prompt files
* New onboarding experience
### Fixed
* Indexing fixes from VS Code versions merged into IntelliJ
* Improved codebase indexing reliability and testing
* Fixes for autocomplete text positioning and timing
@@ -0,0 +1,6 @@
## 0.0.54 - 2024-07-13
### Added
* Partial autocomplete acceptance
* Autocomplete status bar spinner
### Fixed
* Fixed duplicate completion bug and others
This file was deleted.
@@ -0,0 +1,16 @@
## 0.8.42 - 2024-07-02

### Added

- Support for Gemini 1.5 Pro
- Link to code in the sidebar when using codebase retrieval
- Smoother onboarding experience
- .prompt files, a way of saving and sharing slash commands
- Support for Claude 3.5 Sonnet, Deepseek Coder v2, and other new models
- Support for comments in config.json
- Specify multiple autocomplete models and switch between them (see the sketch after this changelog entry)
- Improved bracket matching strategy reduces noisy completions

### Fixed

- Numerous reliability upgrades to codebase indexing
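To make the two config.json items above concrete, here is a minimal sketch of what such a file might look like. It is illustrative only: the `//` comment syntax and the array form of `tabAutocompleteModel` are assumptions based on the changelog wording, and the model entries are placeholders rather than anything this release prescribes.

```json title="~/.continue/config.json"
{
  // Comments like this one are assumed to be allowed, per "Support for comments in config.json" above.
  // Other settings, such as "models", are omitted for brevity.
  "tabAutocompleteModel": [
    {
      "title": "Starcoder 2 3b",
      "provider": "ollama",
      "model": "starcoder2:3b"
    },
    {
      // A second entry, assuming the array form is what lets you switch between autocomplete models.
      "title": "DeepSeek Coder 6.7b",
      "provider": "ollama",
      "model": "deepseek-coder:6.7b-base"
    }
  ]
}
```

With more than one entry, switching would presumably happen from the extension's UI rather than by editing this file each time, per the "switch between them" wording above.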
@@ -0,0 +1,4 @@
## 0.8.43 - 2024-07-08
### Added
* Improved indexing reliability and testing
* Quick Actions: use CodeLens to quickly take common actions like adding docstrings
This file was deleted.
This file was deleted.
@@ -1,2 +1,3 @@
extensions/vscode/continue_rc_schema.json
**/.continueignore
**/.continueignore
CHANGELOG.md
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
This file was deleted.
@@ -0,0 +1,94 @@
---
title: Using Llama 3.1 with Continue
description: How to use Llama 3.1 with Continue
keywords: [llama, meta, togetherai, ollama, replicate]
---

# Using Llama 3.1 with Continue

Continue makes it easy to code with the latest open-source models, including the entire Llama 3.1 family of models.

If you haven't already installed Continue, you can do that [here for VS Code](https://marketplace.visualstudio.com/items?itemName=Continue.continue) or [here for JetBrains](https://plugins.jetbrains.com/plugin/22707-continue). For more general information on customizing Continue, read [our customization docs](../customization/overview.md).

Below we share some of the easiest ways to get up and running, depending on your use case.

## Ollama

Ollama is the fastest way to get up and running with local language models. We recommend trying Llama 3.1 8b, which is impressive for its size and will perform well on most hardware.
1. Download Ollama [here](https://ollama.ai/) (it should walk you through the rest of these steps)
2. Open a terminal and run `ollama run llama3.1:8b`
3. Change your Continue config file like this:

```json title="~/.continue/config.json"
{
  "models": [
    {
      "title": "Llama 3.1 8b",
      "provider": "ollama",
      "model": "llama3.1:8b"
    }
  ]
}
```
## Groq

Groq provides the fastest available inference for open-source language models, including the entire Llama 3.1 family.

1. Obtain an API key [here](https://console.groq.com/keys)
2. Update your Continue config file like this:

```json title="~/.continue/config.json"
{
  "models": [
    {
      "title": "Llama 3.1 405b",
      "provider": "groq",
      "model": "llama3.1-405b",
      "apiKey": "<API_KEY>"
    }
  ]
}
```

## Together AI

Together AI provides fast and reliable inference of open-source models. You'll be able to run the 405b model with good speed.

1. Create an account [here](https://api.together.xyz/signup)
2. Copy your API key that appears on the welcome screen
3. Update your Continue config file like this:

```json title="~/.continue/config.json"
{
  "models": [
    {
      "title": "Llama 3.1 405b",
      "provider": "together",
      "model": "llama3.1-405b",
      "apiKey": "<API_KEY>"
    }
  ]
}
```

## Replicate

Replicate makes it easy to host and run open-source AI with an API.

1. Get your Replicate API key [here](https://replicate.ai/)
2. Change your Continue config file like this:

```json title="~/.continue/config.json"
{
  "models": [
    {
      "title": "Llama 3.1 405b",
      "provider": "replicate",
      "model": "llama3.1-405b",
      "apiKey": "<API_KEY>"
    }
  ]
}
```