[Docs] Add documentation page
Neet-Nestor committed Dec 9, 2024
1 parent 3c46e33 commit c2a471c
Showing 16 changed files with 932 additions and 1 deletion.
39 changes: 39 additions & 0 deletions .github/workflows/build-doc.yaml
Original file line number Diff line number Diff line change
@@ -0,0 +1,39 @@
name: Build Docs

on:
  push:
    branches:
      - main

jobs:
  test_linux:
    name: Deploy Docs
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v2
        with:
          submodules: recursive

      - name: Configuring build Environment
        run: |
          sudo apt-get update
          python -m pip install -U pip wheel
      - name: Setup Ruby
        uses: ruby/setup-ruby@v1
        with:
          ruby-version: '3.0'

      - name: Installing dependencies
        run: |
          python -m pip install -r docs/requirements.txt
          gem install jekyll jekyll-remote-theme
      - name: Deploying on GitHub Pages
        if: github.ref == 'refs/heads/main'
        run: |
          git remote set-url origin https://x-access-token:${{ secrets.MLC_GITHUB_TOKEN }}@github.com/$GITHUB_REPOSITORY
          git config --global user.email "mlc-gh-actions-bot@nomail"
          git config --global user.name "mlc-gh-actions-bot"
          ./scripts/gh_deploy_site.sh
20 changes: 20 additions & 0 deletions docs/Makefile
Original file line number Diff line number Diff line change
@@ -0,0 +1,20 @@
# Minimal makefile for Sphinx documentation
#

# You can set these variables from the command line, and also
# from the environment for the first two.
SPHINXOPTS ?=
SPHINXBUILD ?= python -m sphinx
SOURCEDIR = .
BUILDDIR = _build

# Put it first so that "make" without argument is like "make help".
help:
	@$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)

.PHONY: help Makefile

# Catch-all target: route all unknown targets to Sphinx using the new
# "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS).
%: Makefile
	@$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
30 changes: 30 additions & 0 deletions docs/README.md
Original file line number Diff line number Diff line change
@@ -0,0 +1,30 @@
# MLC-LLM Documentation

The documentation is built with [Sphinx](https://www.sphinx-doc.org/en/master/).

## Dependencies

Run the following command in this directory to install dependencies first:

```bash
pip3 install -r requirements.txt
```

## Build the Documentation

Then you can build the documentation by running:

```bash
make html
```

## View the Documentation

Run the following command to start a simple HTTP server:

```bash
cd _build/html
python3 -m http.server
```

Then you can view the documentation in your browser at `http://localhost:8000` (the port can be customized by appending `-p PORT_NUMBER` to the python command above).
87 changes: 87 additions & 0 deletions docs/_static/img/mlc-logo-with-text-landscape.svg
102 changes: 102 additions & 0 deletions docs/conf.py
Original file line number Diff line number Diff line change
@@ -0,0 +1,102 @@
# -*- coding: utf-8 -*-
import os
import sys

import tlcpack_sphinx_addon

# -- General configuration ------------------------------------------------

sys.path.insert(0, os.path.abspath("../python"))
sys.path.insert(0, os.path.abspath("../"))
autodoc_mock_imports = ["torch"]

# General information about the project.
project = "web-llm"
author = "WebLLM Contributors"
copyright = "2023, %s" % author

# Version information.

version = "0.2.77"
release = "0.2.77"

extensions = [
    "sphinx_tabs.tabs",
    "sphinx_toolbox.collapse",
    "sphinxcontrib.httpdomain",
    "sphinx.ext.autodoc",
    "sphinx.ext.napoleon",
    "sphinx_reredirects",
]

redirects = {"get_started/try_out": "../index.html#getting-started"}

source_suffix = [".rst"]

language = "en"

exclude_patterns = ["_build", "Thumbs.db", ".DS_Store"]

# The name of the Pygments (syntax highlighting) style to use.
pygments_style = "sphinx"

# A list of ignored prefixes for module index sorting.
# If true, `todo` and `todoList` produce output, else they produce nothing.
todo_include_todos = False

# -- Options for HTML output ----------------------------------------------

# The theme is set by the make target
import sphinx_rtd_theme

html_theme = "sphinx_rtd_theme"
html_theme_path = [sphinx_rtd_theme.get_html_theme_path()]

templates_path = []

html_static_path = []

footer_copyright = "© 2023 MLC LLM"
footer_note = " "

html_logo = "_static/img/mlc-logo-with-text-landscape.svg"

html_theme_options = {
    "logo_only": True,
}

header_links = [
    ("Home", "https://webllm.mlc.ai/"),
    ("Github", "https://github.com/mlc-ai/web-llm"),
    ("Discord Server", "https://discord.gg/9Xpy2HGBuD"),
]

header_dropdown = {
    "name": "Other Resources",
    "items": [
        ("WebLLM Chat", "https://chat.webllm.ai/"),
        ("MLC Course", "https://mlc.ai/"),
        ("MLC Blog", "https://blog.mlc.ai/"),
        ("MLC LLM", "https://llm.mlc.ai/"),
    ],
}

html_context = {
    "footer_copyright": footer_copyright,
    "footer_note": footer_note,
    "header_links": header_links,
    "header_dropdown": header_dropdown,
    "display_github": True,
    "github_user": "mlc-ai",
    "github_repo": "mlc-llm",
    "github_version": "main/docs/",
    "theme_vcs_pageview_mode": "edit",
    # "header_logo": "/path/to/logo",
    # "header_logo_link": "",
    # "version_selecter": "",
}


# add additional overrides
templates_path += [tlcpack_sphinx_addon.get_templates_path()]
html_static_path += [tlcpack_sphinx_addon.get_static_path()]
6 changes: 6 additions & 0 deletions docs/developer/add_models.rst
Original file line number Diff line number Diff line change
@@ -0,0 +1,6 @@
Adding Models
=============

WebLLM allows you to compile custom language models using `MLC LLM <https://llm.mlc.ai/>`_ and then serve the compiled model through WebLLM.

For instructions on how to compile and add custom models to WebLLM, see the `MLC LLM documentation <https://llm.mlc.ai/docs/deploy/webllm.html>`_.
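
As a rough sketch of what serving such a model could look like, the snippet below registers a
compiled model through an app config and then loads it by its ``model_id``. The URLs, the model
name, and the exact ``model_list`` field names here are illustrative assumptions; follow the MLC
LLM guide above for the authoritative format.

.. code-block:: typescript

   import * as webllm from "@mlc-ai/web-llm";

   // Hypothetical locations for a model compiled with MLC LLM.
   const appConfig: webllm.AppConfig = {
     model_list: [
       {
         model: "https://huggingface.co/my-org/MyModel-q4f16_1-MLC",       // compiled weights
         model_id: "MyModel-q4f16_1-MLC",                                  // id passed to the engine
         model_lib: "https://my-host.example/MyModel-q4f16_1-webgpu.wasm", // compiled model library
       },
     ],
   };

   // Load the custom model with the same engine API used for prebuilt models.
   const engine = await webllm.CreateMLCEngine("MyModel-q4f16_1-MLC", { appConfig });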
35 changes: 35 additions & 0 deletions docs/developer/building_from_source.rst
Original file line number Diff line number Diff line change
@@ -0,0 +1,35 @@
Building From Source
====================

Clone the Repository
--------------------

.. code-block:: bash

   git clone https://github.com/mlc-ai/web-llm.git
   cd web-llm

Install Dependencies
--------------------

.. code-block:: bash

   npm install

Build the Project
-----------------

.. code-block:: bash

   npm run build
Test Changes
------------

To test your changes, you can reuse an existing example or create a new example that exercises your new functionality.

Then, to test the effect of your code change in an example, edit ``examples/<example>/package.json`` and change ``"@mlc-ai/web-llm": "^0.2.xx"`` to ``"@mlc-ai/web-llm": "../.."`` so that it references your local code:

.. code-block:: bash

   cd examples/<example>
   # Modify the package.json
   npm install
   npm start
35 changes: 35 additions & 0 deletions docs/index.rst
Original file line number Diff line number Diff line change
@@ -0,0 +1,35 @@
👋 Welcome to WebLLM
====================

`GitHub <https://github.com/mlc-ai/web-llm>`_ | `WebLLM Chat <https://chat.webllm.ai/>`_ | `NPM <https://www.npmjs.com/package/@mlc-ai/web-llm>`_ | `Discord <https://discord.gg/9Xpy2HGBuD>`_

WebLLM is a high-performance in-browser language model inference engine that brings large language models (LLMs) to web browsers with hardware acceleration. With WebGPU support, it allows developers to build AI-powered applications directly within the browser environment, removing the need for server-side processing and ensuring privacy.

It provides a specialized runtime for the web backend of MLCEngine, leverages
`WebGPU <https://www.w3.org/TR/webgpu/>`_ for local acceleration, offers an OpenAI-compatible API,
and includes built-in support for web workers to keep heavy computation off the UI thread.
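
For a quick taste of the OpenAI-style API, here is a minimal sketch (the model id is just one
example from the prebuilt list, and the snippet assumes an ES-module context where top-level
``await`` is available):

.. code-block:: typescript

   import * as webllm from "@mlc-ai/web-llm";

   // Download (or load from cache) the model and initialize the engine in the browser.
   const engine = await webllm.CreateMLCEngine(
     "Llama-3.1-8B-Instruct-q4f32_1-MLC",
     { initProgressCallback: (report) => console.log(report.text) },
   );

   // OpenAI-compatible chat completion, served entirely client-side.
   const reply = await engine.chat.completions.create({
     messages: [{ role: "user", content: "What does WebGPU do for WebLLM?" }],
   });
   console.log(reply.choices[0].message.content);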

Key Features
------------
- 🌐 In-Browser Inference: Run LLMs directly in the browser
- 🚀 WebGPU Acceleration: Leverage hardware acceleration for optimal performance
- 🔄 OpenAI API Compatibility: Seamless integration with standard AI workflows
- 📦 Multiple Model Support: Works with Llama, Phi, Gemma, Mistral, and more

Start exploring WebLLM by `chatting with WebLLM Chat <https://chat.webllm.ai/>`_, then start building web apps with high-performance local LLM inference using the guides and tutorials below.
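
Because model execution is heavy, WebLLM also offers worker-based engines. Below is a minimal
sketch of the web worker setup; the helper names (``CreateWebWorkerMLCEngine``,
``WebWorkerMLCEngineHandler``), the example model id, and the bundler-specific
``new URL(..., import.meta.url)`` pattern are assumptions to adapt to your project.

.. code-block:: typescript

   // worker.ts: host the engine off the main thread.
   import { WebWorkerMLCEngineHandler } from "@mlc-ai/web-llm";

   const handler = new WebWorkerMLCEngineHandler();
   self.onmessage = (msg: MessageEvent) => handler.onmessage(msg);

.. code-block:: typescript

   // main.ts: talk to the worker through the same OpenAI-style interface.
   import { CreateWebWorkerMLCEngine } from "@mlc-ai/web-llm";

   const engine = await CreateWebWorkerMLCEngine(
     new Worker(new URL("./worker.ts", import.meta.url), { type: "module" }),
     "Llama-3.1-8B-Instruct-q4f32_1-MLC",
   );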

.. toctree::
:maxdepth: 2
:caption: User Guide

user/get_started.rst
user/basic_usage.rst
user/advanced_usage.rst
user/api_reference.rst

.. toctree::
:maxdepth: 2
:caption: Developer Guide

developer/building_from_source.rst
developer/add_models.rst
35 changes: 35 additions & 0 deletions docs/make.bat
Original file line number Diff line number Diff line change
@@ -0,0 +1,35 @@
@ECHO OFF

pushd %~dp0

REM Command file for Sphinx documentation

if "%SPHINXBUILD%" == "" (
set SPHINXBUILD=sphinx-build
)
set SOURCEDIR=.
set BUILDDIR=_build

%SPHINXBUILD% >NUL 2>NUL
if errorlevel 9009 (
	echo.
	echo.The 'sphinx-build' command was not found. Make sure you have Sphinx
	echo.installed, then set the SPHINXBUILD environment variable to point
	echo.to the full path of the 'sphinx-build' executable. Alternatively you
	echo.may add the Sphinx directory to PATH.
	echo.
	echo.If you don't have Sphinx installed, grab it from
	echo.https://www.sphinx-doc.org/
	exit /b 1
)

if "%1" == "" goto help

%SPHINXBUILD% -M %1 %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O%
goto end

:help
%SPHINXBUILD% -M help %SOURCEDIR% %BUILDDIR% %SPHINXOPTS% %O%

:end
popd
8 changes: 8 additions & 0 deletions docs/requirements.txt
Original file line number Diff line number Diff line change
@@ -0,0 +1,8 @@
sphinx-tabs == 3.4.1
sphinx-rtd-theme
sphinx == 5.2.3
sphinx-toolbox == 3.4.0
tlcpack-sphinx-addon==0.2.2
sphinxcontrib_httpdomain==1.8.1
sphinxcontrib-napoleon==0.7
sphinx-reredirects==0.1.2
