Add Windows support + docs, closes #22, closes #25 (#23)
* feat: Add Windows Support, closes #22

* ci: GitHub Actions Runner

* Update scripts/models.ps1

Co-authored-by: sourcery-ai[bot] <58596630+sourcery-ai[bot]@users.noreply.github.com>

* Update scripts/build/debug.sh

Co-authored-by: sourcery-ai[bot] <58596630+sourcery-ai[bot]@users.noreply.github.com>

* ci: Create `release.ps1`

* ci: Exclude `cl` build on Ubuntu

* ci: Remove hardcoded `cl` location

* docs: Fix documentation, closes #25

* ci: Build release `llama.cpp` in `release.sh`

* fix: Enable WebView 2

* Add `WebView2` NuGet Package

* ci: Create run scripts for local testing

* fix: Force package installation by script

---------

Co-authored-by: sourcery-ai[bot] <58596630+sourcery-ai[bot]@users.noreply.github.com>
greynewell and sourcery-ai[bot] authored Sep 25, 2024
1 parent ad8b792 commit a5d917f
Showing 21 changed files with 287 additions and 234 deletions.
24 changes: 17 additions & 7 deletions .github/workflows/cmake.yml
@@ -13,27 +13,37 @@ jobs:
strategy:
fail-fast: false
matrix:
os: [ubuntu-latest, macos-latest]
os: [ubuntu-latest, macos-latest, windows-latest]
build_type: [Release]
c_compiler: [gcc, clang]
c_compiler: [clang, cl]
include:
- os: ubuntu-latest
c_compiler: gcc
cpp_compiler: g++
- os: ubuntu-latest
c_compiler: clang
cpp_compiler: clang++
- os: macos-latest
c_compiler: clang
cpp_compiler: clang++
- os: windows-latest
c_compiler: cl
cpp_compiler: cl
exclude:
- os: ubuntu-latest
c_compiler: cl
- os: macos-latest
c_compiler: gcc
c_compiler: cl
- os: windows-latest
c_compiler: clang

steps:
- uses: actions/checkout@v4
with:
submodules: 'true'

- name: Build (Linux/macOS)
- name: Build on Windows
if: runner.os == 'Windows'
run: .\scripts\build\release.ps1
shell: pwsh

- name: Build on Linux/macOS
if: runner.os != 'Windows'
run: ./scripts/build/release.sh
8 changes: 4 additions & 4 deletions .gitmodules
@@ -1,6 +1,6 @@
[submodule "JUCE"]
path = JUCE
url = [email protected]:juce-framework/JUCE.git
[submodule "llama.cpp"]
path = llama.cpp
url = [email protected]:ggerganov/llama.cpp.git
url = https://github.com/ggerganov/llama.cpp.git
[submodule "JUCE"]
path = JUCE
url = https://github.com/juce-framework/JUCE.git
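Because this commit flips both submodule URLs from SSH to HTTPS, clones made before it still carry the old `git@` remotes in `.git/config`. A typical refresh uses standard git commands; the sketch below runs them in a scratch repo so it is side-effect free anywhere, but in a real checkout you would run the two `git submodule` commands from the repository root:

```bash
# Scratch repo so the demo is side-effect free; skip this line in a real checkout.
cd "$(mktemp -d)" && git init -q .

git submodule sync --recursive           # re-read submodule URLs from .gitmodules
git submodule update --init --recursive  # fetch/check out submodules at their recorded commits
echo "submodules resynced"
```

`git submodule sync` only rewrites the recorded URLs; the `update --init --recursive` step is what actually fetches JUCE and llama.cpp after the switch.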
155 changes: 99 additions & 56 deletions CMakeLists.txt
@@ -10,7 +10,8 @@ juce_add_plugin(musegpt
PLUGIN_CODE Muse
FORMATS AAX AU AUv3 VST3 Standalone
PRODUCT_NAME "musegpt"
COPY_PLUGIN_AFTER_BUILD TRUE)
COPY_PLUGIN_AFTER_BUILD TRUE
NEEDS_WEBVIEW2 TRUE)
juce_generate_juce_header(musegpt)

target_sources(musegpt
@@ -24,6 +25,7 @@ target_compile_definitions(musegpt
JUCE_LOGGING=1 # Enable logging
JUCE_STRICT_REFCOUNTEDPOINTER=1
JUCE_WEB_BROWSER=1
JUCE_USE_WIN_WEBVIEW2_WITH_STATIC_LINKING=1
JUCE_USE_CURL=0
JUCE_VST3_CAN_REPLACE_VST2=0)

@@ -68,58 +70,99 @@ PRIVATE
$<$<CONFIG:Release>:-O3>
)

# Add llama-server as a binary resource
add_custom_command(
OUTPUT ${CMAKE_BINARY_DIR}/llama-server
COMMAND ${CMAKE_COMMAND} -E copy
${CMAKE_SOURCE_DIR}/build/llama.cpp/bin/llama-server
${CMAKE_BINARY_DIR}/llama-server
DEPENDS ${CMAKE_SOURCE_DIR}/build/llama.cpp/bin/llama-server
)
add_custom_target(copy_llama_server ALL DEPENDS ${CMAKE_BINARY_DIR}/llama-server)

# Ensure the executable is copied into the bundle
set_source_files_properties(${CMAKE_BINARY_DIR}/llama-server PROPERTIES MACOSX_PACKAGE_LOCATION Resources)
target_sources(musegpt PRIVATE ${CMAKE_BINARY_DIR}/llama-server)

# Copy llama-server to VST plugin format's output directory
add_custom_command(TARGET musegpt POST_BUILD
COMMAND ${CMAKE_COMMAND} -E copy
${CMAKE_BINARY_DIR}/llama-server
$<TARGET_FILE_DIR:musegpt>/VST3/musegpt.vst3/Contents/Resources/llama-server
)

# Copy llama-server to Standalone plugin format's output directory
add_custom_command(TARGET musegpt POST_BUILD
COMMAND ${CMAKE_COMMAND} -E copy
${CMAKE_BINARY_DIR}/llama-server
$<TARGET_FILE_DIR:musegpt>/Standalone/musegpt.app/Contents/Resources/llama-server
)

# Copy model weights to VST plugin format's output directory
add_custom_command(TARGET musegpt POST_BUILD
COMMAND ${CMAKE_COMMAND} -E copy
${CMAKE_SOURCE_DIR}/models/gemma-2b-it.fp16.gguf
$<TARGET_FILE_DIR:musegpt>/VST3/musegpt.vst3/Contents/Resources/gemma-2b-it.fp16.gguf
)

# Copy model weights to Standalone plugin format's output directory
add_custom_command(TARGET musegpt POST_BUILD
COMMAND ${CMAKE_COMMAND} -E copy
${CMAKE_SOURCE_DIR}/models/gemma-2b-it.fp16.gguf
$<TARGET_FILE_DIR:musegpt>/Standalone/musegpt.app/Contents/Resources/gemma-2b-it.fp16.gguf
)

# Copy model weights to AAX plugin format's output directory
add_custom_command(TARGET musegpt POST_BUILD
COMMAND ${CMAKE_COMMAND} -E copy
${CMAKE_SOURCE_DIR}/models/gemma-2b-it.fp16.gguf
$<TARGET_FILE_DIR:musegpt>/AAX/musegpt.aaxplugin/Contents/Resources/gemma-2b-it.fp16.gguf
)

# Copy model weights to AU plugin format's output directory
add_custom_command(TARGET musegpt POST_BUILD
COMMAND ${CMAKE_COMMAND} -E copy
${CMAKE_SOURCE_DIR}/models/gemma-2b-it.fp16.gguf
$<TARGET_FILE_DIR:musegpt>/AU/musegpt.component/Contents/Resources/gemma-2b-it.fp16.gguf
)
# Add llama-server as a binary resource and copy files
if(WIN32)
# Windows-specific commands
add_custom_command(
OUTPUT ${CMAKE_BINARY_DIR}/$<CONFIG>/llama-server.exe
COMMAND ${CMAKE_COMMAND} -E copy
${CMAKE_SOURCE_DIR}/build/llama.cpp/bin/$<CONFIG>/llama-server.exe
${CMAKE_BINARY_DIR}/$<CONFIG>/llama-server.exe
DEPENDS ${CMAKE_SOURCE_DIR}/build/llama.cpp/bin/$<CONFIG>/llama-server.exe
)
add_custom_target(copy_llama_server ALL DEPENDS ${CMAKE_BINARY_DIR}/$<CONFIG>/llama-server.exe)
set_source_files_properties(${CMAKE_BINARY_DIR}/$<CONFIG>/llama-server.exe PROPERTIES MACOSX_PACKAGE_LOCATION Resources)
target_sources(musegpt PRIVATE ${CMAKE_BINARY_DIR}/$<CONFIG>/llama-server.exe)

# Copy llama-server to VST plugin format's output directory
add_custom_command(TARGET musegpt POST_BUILD
COMMAND ${CMAKE_COMMAND} -E copy
${CMAKE_BINARY_DIR}/$<CONFIG>/llama-server.exe
$<TARGET_FILE_DIR:musegpt>/VST3/musegpt.vst3/Contents/Resources/llama-server.exe
)

# Copy llama-server to Standalone plugin format's output directory
add_custom_command(TARGET musegpt POST_BUILD
COMMAND ${CMAKE_COMMAND} -E copy
${CMAKE_BINARY_DIR}/$<CONFIG>/llama-server.exe
$<TARGET_FILE_DIR:musegpt>/musegpt.exe/llama-server.exe
)

# Copy model weights to VST plugin format's output directory
add_custom_command(TARGET musegpt POST_BUILD
COMMAND ${CMAKE_COMMAND} -E copy
${CMAKE_SOURCE_DIR}/models/gemma-2b-it.fp16.gguf
$<TARGET_FILE_DIR:musegpt>/VST3/musegpt.vst3/Contents/Resources/gemma-2b-it.fp16.gguf
)

# Copy model weights to Standalone plugin format's output directory
add_custom_command(TARGET musegpt POST_BUILD
COMMAND ${CMAKE_COMMAND} -E copy
${CMAKE_SOURCE_DIR}/models/gemma-2b-it.fp16.gguf
$<TARGET_FILE_DIR:musegpt>/musegpt.exe/gemma-2b-it.fp16.gguf
)
else()
# Non-Windows commands
add_custom_command(
OUTPUT ${CMAKE_BINARY_DIR}/llama-server
COMMAND ${CMAKE_COMMAND} -E copy
${CMAKE_SOURCE_DIR}/build/llama.cpp/bin/llama-server
${CMAKE_BINARY_DIR}/llama-server
DEPENDS ${CMAKE_SOURCE_DIR}/build/llama.cpp/bin/llama-server
)
add_custom_target(copy_llama_server ALL DEPENDS ${CMAKE_BINARY_DIR}/llama-server)
set_source_files_properties(${CMAKE_BINARY_DIR}/llama-server PROPERTIES MACOSX_PACKAGE_LOCATION Resources)
target_sources(musegpt PRIVATE ${CMAKE_BINARY_DIR}/llama-server)

# Copy llama-server to VST plugin format's output directory
add_custom_command(TARGET musegpt POST_BUILD
COMMAND ${CMAKE_COMMAND} -E copy
${CMAKE_BINARY_DIR}/llama-server
$<TARGET_FILE_DIR:musegpt>/VST3/musegpt.vst3/Contents/Resources/llama-server
)

# Copy llama-server to Standalone plugin format's output directory
add_custom_command(TARGET musegpt POST_BUILD
COMMAND ${CMAKE_COMMAND} -E copy
${CMAKE_BINARY_DIR}/llama-server
$<TARGET_FILE_DIR:musegpt>/Standalone/musegpt.app/Contents/Resources/llama-server
)

# Copy model weights to VST plugin format's output directory
add_custom_command(TARGET musegpt POST_BUILD
COMMAND ${CMAKE_COMMAND} -E copy
${CMAKE_SOURCE_DIR}/models/gemma-2b-it.fp16.gguf
$<TARGET_FILE_DIR:musegpt>/VST3/musegpt.vst3/Contents/Resources/gemma-2b-it.fp16.gguf
)

# Copy model weights to Standalone plugin format's output directory
add_custom_command(TARGET musegpt POST_BUILD
COMMAND ${CMAKE_COMMAND} -E copy
${CMAKE_SOURCE_DIR}/models/gemma-2b-it.fp16.gguf
$<TARGET_FILE_DIR:musegpt>/Standalone/musegpt.app/Contents/Resources/gemma-2b-it.fp16.gguf
)

# Copy model weights to AAX plugin format's output directory
add_custom_command(TARGET musegpt POST_BUILD
COMMAND ${CMAKE_COMMAND} -E copy
${CMAKE_SOURCE_DIR}/models/gemma-2b-it.fp16.gguf
$<TARGET_FILE_DIR:musegpt>/AAX/musegpt.aaxplugin/Contents/Resources/gemma-2b-it.fp16.gguf
)

# Copy model weights to AU plugin format's output directory
add_custom_command(TARGET musegpt POST_BUILD
COMMAND ${CMAKE_COMMAND} -E copy
${CMAKE_SOURCE_DIR}/models/gemma-2b-it.fp16.gguf
$<TARGET_FILE_DIR:musegpt>/AU/musegpt.component/Contents/Resources/gemma-2b-it.fp16.gguf
)
endif()
96 changes: 9 additions & 87 deletions README.md
@@ -1,19 +1,16 @@
# [musegpt](https://github.com/greynewell/musegpt) [![GitHub Repo stars](https://img.shields.io/github/stars/greynewell/musegpt)](https://github.com/greynewell/musegpt/stargazers)

[![CMake](https://github.com/greynewell/musegpt/actions/workflows/cmake.yml/badge.svg?branch=main)](https://github.com/greynewell/musegpt/actions/workflows/cmake.yml) [![License: AGPL v3](https://img.shields.io/badge/License-AGPL%20v3-blue.svg)](https://www.gnu.org/licenses/agpl-3.0) [![Platform Support](https://img.shields.io/badge/platform-macOS%20%7C%20Linux-blue)](#supported-platforms) [![C++](https://img.shields.io/badge/c++-17-%2300599C.svg?logo=c%2B%2B&logoColor=white)](https://isocpp.org/) [![JUCE](https://img.shields.io/badge/JUCE-8-8DC63F?logo=juce&logoColor=white)](https://juce.com/) [![llama.cpp](https://img.shields.io/badge/llama.cpp-feff4aa-violet?logoColor=white)](https://github.com/ggerganov/llama.cpp/commit/feff4aa8461da7c432d144c11da4802e41fef3cf)
[![CMake](https://github.com/greynewell/musegpt/actions/workflows/cmake.yml/badge.svg?branch=main)](https://github.com/greynewell/musegpt/actions/workflows/cmake.yml) [![License: AGPL v3](https://img.shields.io/badge/License-AGPL%20v3-blue.svg)](https://github.com/greynewell/musegpt/blob/main/LICENSE) [![Platform Support](https://img.shields.io/badge/platform-Windows%20%7C%20macOS%20%7C%20Linux-blue)](https://musegpt.org/requirements.html) [![C++](https://img.shields.io/badge/c++-17-%2300599C.svg?logo=c%2B%2B&logoColor=white)](https://musegpt.org/requirements.html) [![JUCE](https://img.shields.io/badge/JUCE-8-8DC63F?logo=juce&logoColor=white)](https://musegpt.org/requirements.html) [![llama.cpp](https://img.shields.io/badge/llama.cpp-feff4aa-violet?logoColor=white)](https://musegpt.org/requirements.html)

Run local Large Language Models (LLMs) in your Digital Audio Workstation (DAW) to create music.
Run local Large Language Models (LLMs) in your Digital Audio Workstation (DAW) to provide inspiration, instructions, and analysis for your music creation.

## Table of Contents

- [Features](#features)
- [Demo](#demo)
- [Installation](#installation)
- [Getting Started](#getting-started)
- [Requirements](#requirements)
- [Installation](#installation)
- [Usage](#usage)
- [Supported Platforms](#supported-platforms)
- [Supported Models](#supported-models)
- [Architecture](#architecture)
- [Contributing](#contributing)
- [License](#license)
@@ -36,94 +33,19 @@ For more information about plans for upcoming features, check out the [Roadmap o

*Click the image above to watch a demo of musegpt in action.*

## Installation

To install `musegpt`, you can download the latest binaries from [Releases](https://github.com/greynewell/musegpt/releases).

If you want to build from source, follow these steps:

1. **Clone the repository:**

```bash
git clone --recurse-submodules -j2 https://github.com/greynewell/musegpt.git
cd musegpt
```

2. **Install dependencies:**

Ensure you have the required dependencies installed. See [Requirements](#requirements) for details.

3. **Build the project:**

Run the shell build script:

```bash
./scripts/build/debug.sh
```

or

```bash
./scripts/build/release.sh
```

4. **Install the plugin:**

CMake will automatically copy the built VST3, AU, or AAX plugin to your DAW's plugin directory.
- **macOS:** `~/Library/Audio/Plug-Ins/VST3/`
- **Linux:** `~/.vst3/`
## Getting Started
After installing musegpt, open your DAW and rescan for new plugins. Load `musegpt` as a plugin and start interacting with the LLM to enhance your music creation process!
## System Prompt
Feel free to experiment with the system prompt to customize the behavior of the LLM. Here's a suggestion to get you started:

> You are a helpful assistant that lives inside a musician's Digital Audio Workstation. Help them by giving them step-by-step instructions about music—composition, writing, performance, production, and engineering—in a creative and methodical way.
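Under the hood, llama.cpp's bundled `llama-server` speaks an OpenAI-compatible chat API, so a system prompt like the one above travels as the first message of each request. A minimal sketch of such a request follows; the `localhost:8080` address, port, and endpoint path are assumptions for illustration, not confirmed musegpt configuration:

```bash
# Build a chat payload that carries the system prompt as the first message.
# `read -d ''` returns non-zero at end of heredoc, hence the `|| true`.
read -r -d '' PAYLOAD <<'JSON' || true
{
  "messages": [
    {"role": "system",
     "content": "You are a helpful assistant that lives inside a musician's Digital Audio Workstation."},
    {"role": "user",
     "content": "Suggest a chord progression in D minor."}
  ]
}
JSON
echo "$PAYLOAD"

# With the plugin's embedded server running, you could POST it (assumed endpoint):
# curl -s http://localhost:8080/v1/chat/completions -H 'Content-Type: application/json' -d "$PAYLOAD"
```

Changing only the `system` message is usually enough to steer the assistant's persona without touching the plugin itself.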
## Requirements

- **Operating System:**
- macOS 10.11 or later
- Linux (mainstream distributions)
- **DAW Support:** Any DAW that supports VST3 plugins (Ableton Live, FL Studio, Logic Pro, Pro Tools, etc.)
- **Dependencies:**
- [JUCE](https://juce.com/) (Audio application framework)
- [llama.cpp](https://github.com/ggerganov/llama.cpp) (LLM inference library)
- C++17 compatible compiler (e.g., GCC 7+, Clang 5+, MSVC 2017+)
- [CMake](https://cmake.org/) 3.15 or later
## Usage
1. **Load the Plugin:**
In your DAW, add musegpt as a VST3 plugin on a track.
You'll need a C++17 compatible compiler, CMake, and Python 3.10 or later. See [Requirements](https://musegpt.org/requirements.html) for more details on supported Operating Systems, models, DAWs, and more.

2. **Interact with the LLM:**
Use the plugin's interface to chat with the integrated LLM. You can input MIDI or audio data for analysis (features under development).

3. **Create Music:**

Leverage the power of AI to inspire new musical ideas, assist with composition, or generate creative suggestions.

## Supported Platforms

musegpt is cross-platform and supports the following operating systems:

- **macOS:** macOS 10.11 or later
- **Linux:** Mainstream distributions
## Installation

## Supported Models
To install `musegpt`, you can download the latest binaries from [Releases](https://github.com/greynewell/musegpt/releases).

musegpt currently supports the following models:
If you want to build from source, follow the [Installation](https://musegpt.org/installation.html) instructions.

- **gemma-2b-it.fp16.gguf**
## Usage

Any model compatible with `llama.cpp` should work with `musegpt`. Feel free to experiment with different models to find the best one for your needs—and raise a pull request!
Please refer to the [Usage](https://musegpt.org/usage.html) section of the documentation.

## Architecture

6 changes: 5 additions & 1 deletion docs/features.md
@@ -1,5 +1,9 @@
# Features [![GitHub Repo stars](https://img.shields.io/github/stars/greynewell/musegpt)](https://github.com/greynewell/musegpt/stargazers)

**musegpt** allows you to run local Large Language Models directly within your DAW, enhancing your music creation process by providing AI-powered assistance.

Current features include:

- ✅ LLM chat
- ✅ VST3 plugin
- ✅ MIDI input
@@ -9,7 +13,7 @@
- ❌ MIDI generation (Upcoming)
- ❌ Audio generation (Upcoming)

**musegpt** allows you to run local Large Language Models directly within your DAW, enhancing your music creation process by providing AI-powered assistance.
To see upcoming features, check out the [GitHub issues](https://github.com/greynewell/musegpt/issues) and [Roadmap on GitHub Projects](https://github.com/greynewell/musegpt/projects/1).

---

