Commit

More docs

samwillis committed Aug 5, 2024
1 parent 1108bcb commit 5f16f03
Showing 10 changed files with 322 additions and 67 deletions.
4 changes: 2 additions & 2 deletions cibuild.sh
@@ -388,12 +388,12 @@ do

mkdir -p /tmp/web/dist
mkdir -p /tmp/web/examples
mkdir -p /tmp/web/benchmarks
mkdir -p /tmp/web/benchmark

PGLITE=$(pwd)/packages/pglite
cp -r ${PGLITE}/dist/* /tmp/web/dist/
cp -r ${PGLITE}/examples/* /tmp/web/examples/
cp -r ${WORKSPACE}/packages/benchmark/dist/* /tmp/web/benchmarks/
cp -r ${WORKSPACE}/packages/benchmark/dist/* /tmp/web/benchmark/
;;
esac
shift
22 changes: 21 additions & 1 deletion docs/benchmarks.md
@@ -18,14 +18,32 @@

# Benchmarks

There are two sets of micro-benchmarks, one testing [round trip time](#round-trip-time-benchmarks) for both PGlite and wa-sqlite, and [another](#sqlite-benchmark-suite) based on the [SQLite speed test](https://sqlite.org/src/file?name=tool/speedtest.tcl&ci=trunk) which was ported for the [wa-sqlite benchmarks](https://rhashimoto.github.io/wa-sqlite/demo/benchmarks.html).

We also have a set of [native baseline](#native-baseline) results where we have compared native SQLite (via the Node better-sqlite3 package) to full Postgres.

Comparing Postgres to SQLite is a little difficult as they are quite different databases, particularly when you then throw in the complexities of WASM. These benchmarks therefore provide a view of performance only as a starting point to investigate the differences between the two and the improvements we can make going forward.

The other thing to consider when analysing the speed is the performance of the various VFS implementations providing persistence to both PGlite and wa-sqlite, and the performance of the underlying storage.

The key findings are:

1. wa-sqlite is a little faster than PGlite when run purely in memory. This is to be expected as it is a simpler database with fewer features; it's designed to go fast. Having said that, PGlite is not slow - it's well within the range you would expect when [comparing native SQLite to Postgres](#native-baseline).

2. For single row CRUD inserts and updates, PGlite is faster than wa-sqlite. This is likely due to PGlite using the Postgres WAL, whereas wa-sqlite is only using the SQLite rollback journal mode and not its WAL.

3. An fsync or flush to the underlying storage can be quite slow, particularly in the browser with IndexedDB for PGlite, or OPFS for wa-sqlite. Both offer some level of "relaxed durability" that can be used to accelerate these queries, which is likely suitable for many embedded use cases (see the sketch below).
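
Both databases can be put into this mode when they are instantiated. A minimal sketch for PGlite, assuming the `relaxedDurability` constructor option described in the API docs:

```js
import { PGlite } from "@electric-sql/pglite";

// Persist to IndexedDB, but don't wait for each flush to storage to complete
// before returning query results (assumed `relaxedDurability` option).
const db = new PGlite("idb://benchmark-db", { relaxedDurability: true });

await db.query("select 1;");
```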

We are going to continue to use these micro-benchmarks to feed back into the development of PGlite, and update them and the findings as we move forward.

The results below were run on an M2 MacBook Air.

## Round-trip-time benchmarks

These tests run a series of inserts/updates/deletes to find the average time to execute the type of CRUD operations that are regularly used in an app.
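
To make the methodology concrete, here is a sketch of the kind of timing loop involved (illustrative only - not the actual benchmark harness; the `db` instance and `bench` table are assumed):

```js
// Illustrative RTT measurement: average time per single-statement round trip.
const runs = 1000;
const start = performance.now();
for (let i = 0; i < runs; i++) {
  await db.query("INSERT INTO bench (value) VALUES ($1);", [i]);
}
console.log(`average round trip: ${((performance.now() - start) / runs).toFixed(2)} ms`);
```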

Values are average ms - lower is better.

![](./public/img/benckmark/rtt.svg)

| Test | PGlite Memory | PGlite IDB | PGlite IDB<br>_relaxed durability_ | PGlite OPFS AHP | PGlite OPFS AHP<br>_relaxed durability_ | SQLite Memory | SQLite IDB | SQLite IDB<br>_relaxed durability_ | SQLite IDB BatchAtomic | SQLite IDB BatchAtomic<br>_relaxed durability_ | SQLite OPFS | SQLite OPFS AHP |
@@ -47,6 +65,8 @@ These tests run a series of inserts/updates/deletes to find the average time to

The SQLite benchmark suite, converted to web for wa-sqlite, performs a number of large queries to test the performance of the SQL engine.

Values are seconds to complete the test - lower is better.

![](./public/img/benckmark/sqlite-suite.svg)

| Test | PGlite<br>Memory | PGlite<br>IDB FS | PGlite<br>IDB FS<br>_relaxed durability_ | PGlite<br>OPFS Access Handle Pool | PGlite<br>OPFS Access Handle Pool<br>_relaxed durability_ | wa-sqlite<br>Memory (sync) | wa-sqlite<br>Memory (async) | wa-sqlite<br>DB Minimal | wa-sqlite<br>IDB Minimal<br>_relaxed durability_ | wa-sqlite<br>IDB Batch Atomic | wa-sqlite<br>IDB Batch Atomic<br>_relaxed durability_ | wa-sqlite<br>OPFS | wa-sqlite<br>OPFS Access Handle Pool |
38 changes: 37 additions & 1 deletion docs/docs/about.md
@@ -1 +1,37 @@
# What is PGlite

PGlite is a WASM Postgres build packaged into a TypeScript/JavaScript client library that enables you to run Postgres in the browser, Node.js and Bun, with no need to install any other dependencies. It's under 3mb gzipped, and has support for many [Postgres extensions](../extensions/), including [pgvector](../extensions/#pgvector).

Unlike previous "Postgres in the browser" projects, PGlite does not use a Linux virtual machine - it is simply Postgres in WASM.

It's being developed by [ElectricSQL](https://electric-sql.com/) for our use case of embedding into applications, either locally or at the edge, allowing users to sync a subset of their Postgres database.

However, there are many more use cases for PGlite beyond its use as an embedded application database:

- Unit and CI testing<br>
PGlite is very fast to start and tear down, perfect for unit tests; you can have a unique fresh Postgres database for each test.

- Local development<br>
You can use PGlite as an alternative to a full local Postgres for local development, massively simplifying your development environment.

- Remote development, or local web containers<br>
As PGlite is so lightweight, it can be easily embedded into remote containerised development environments, or in-browser [web containers](https://webcontainers.io).

- On-device or edge AI and RAG<br>
PGlite has full support for [pgvector](../extensions/#pgvector), enabling a local or edge retrieval augmented generation (RAG) workflow.
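
As a small illustration of that workflow, here is a sketch of loading pgvector and running a similarity query, assuming the `vector` export from `@electric-sql/pglite/vector` and the `extensions` constructor option:

```js
import { PGlite } from "@electric-sql/pglite";
import { vector } from "@electric-sql/pglite/vector";

// Load the pgvector extension, then enable it in the database.
const db = new PGlite({ extensions: { vector } });
await db.exec("CREATE EXTENSION IF NOT EXISTS vector;");

// Store tiny 3-dimensional embeddings and find the nearest one by cosine distance.
await db.exec("CREATE TABLE docs (id SERIAL PRIMARY KEY, embedding vector(3));");
await db.exec("INSERT INTO docs (embedding) VALUES ('[1,2,3]'), ('[4,5,6]');");
const nearest = await db.query(
  "SELECT id FROM docs ORDER BY embedding <=> $1::vector LIMIT 1;",
  ["[3,1,2]"]
);
```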

We are very keen to establish PGlite as an open source, open contribution project, and to build a community around it to develop its capabilities for all use cases.

Getting started with PGlite is super easy: just install and import the NPM package, then create your embedded database:

```js
import { PGlite } from "@electric-sql/pglite";

const db = new PGlite();
await db.query("select 'Hello world' as message;");
// -> { rows: [ { message: "Hello world" } ] }
```

It can be used as an ephemeral in-memory database, or with persistence either to the file system (Node/Bun) or indexedDB (Browser).

Read more in our [getting started guide](./index.md).
119 changes: 97 additions & 22 deletions docs/docs/index.md
@@ -1,46 +1,46 @@
# Getting started with PGlite

PGlite can be used in both Node/Bun and the browser, and works with any JavaScript framework.

## Install and start in Node/Bun

Install into your project:

::: code-group

```bash [npm]
npm install @electric-sql/pglite
```

```bash [pnpm]
pnpm install @electric-sql/pglite
```

```bash [yarn]
yarn add @electric-sql/pglite
```

```bash [bun]
bun install @electric-sql/pglite
```

:::

To use the in-memory Postgres:

```js
import { PGlite } from "@electric-sql/pglite";

const db = new PGlite();
await db.query("select 'Hello world' as message;");
// -> { rows: [ { message: "Hello world" } ] }
```

or to persist to the native filesystem:

```js
const db = new PGlite("./path/to/pgdata");
```

## Install and start in the browser

It can be installed and imported using your usual package manager:

@@ -57,12 +57,87 @@ Then for an in-memory Postgres:

```js
const db = new PGlite()
await db.query("select 'Hello world' as message;")
// -> { rows: [ { message: "Hello world" } ] }
```

or to persist the database to indexedDB:

```js
const db = new PGlite("idb://my-pgdata");
```

## Making a query

There are two methods for querying the database, `.query` and `.exec`; the former supports parameters, and the latter supports multiple statements.

First, let's create a table and insert some test data using the `.exec` method:

```js
await db.exec(`
  CREATE TABLE IF NOT EXISTS todo (
    id SERIAL PRIMARY KEY,
    task TEXT,
    done BOOLEAN DEFAULT false
  );
  INSERT INTO todo (task, done) VALUES ('Install PGlite from NPM', true);
  INSERT INTO todo (task, done) VALUES ('Load PGlite', true);
  INSERT INTO todo (task, done) VALUES ('Create a table', true);
  INSERT INTO todo (task, done) VALUES ('Insert some data', true);
  INSERT INTO todo (task) VALUES ('Update a task');
`)
```

The `.exec` method is perfect for migrations or batch inserts with raw SQL.
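
For example, here is a minimal hand-rolled migration pattern built on `.exec` (the `migrations` table and version tracking are illustrative, not a built-in PGlite feature):

```js
// Each migration is plain SQL applied with .exec and recorded once.
const migrations = [
  {
    name: "01-create-todo",
    sql: "CREATE TABLE IF NOT EXISTS todo (id SERIAL PRIMARY KEY, task TEXT, done BOOLEAN DEFAULT false);",
  },
  {
    name: "02-add-due-date",
    sql: "ALTER TABLE todo ADD COLUMN IF NOT EXISTS due_date DATE;",
  },
];

await db.exec("CREATE TABLE IF NOT EXISTS migrations (name TEXT PRIMARY KEY);");

for (const { name, sql } of migrations) {
  const applied = await db.query("SELECT 1 FROM migrations WHERE name = $1;", [name]);
  if (applied.rows.length === 0) {
    await db.exec(sql);
    await db.query("INSERT INTO migrations (name) VALUES ($1);", [name]);
  }
}
```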

Now, let's retrieve an item using the `.query` method:

```js
const ret = await db.query(`
  SELECT * from todo WHERE id = 1;
`)
console.log(ret.rows)

// Output:
[
  {
    id: 1,
    task: "Install PGlite from NPM",
    done: true
  }
]
```

## Using parametrised queries

When working with user-supplied values it's always best to use parametrised queries; these are supported on the `.query` method.

We can use this to update a task:

```js
const ret = await db.query(
"UPDATE todo SET task = $2, done = $3 WHERE id = $1",
[
5,
"Update a task using parametrised queries",
true
]
)
```

## What next?

- To learn more about [querying](./api.md#query) and [transactions](./api.md#transaction) you can read the main [PGlite API documentation](./api.md).

- There is also a [live-query extension](./live-queries.md) that enables reactive queries to update a UI when the underlying database changes - see the sketch at the end of this list.

- PGlite has a number of built-in [virtual file systems](./filesystems.md) to provide persistence for the database.

- There are [framework hooks](./framework-hooks.md) to make working with PGlite within React and Vue much easier, with less boilerplate.

- As PGlite has only a single exclusive connection to the database, we provide a [multi-tab worker](./multi-tab-worker.md) to enable sharing a PGlite instance between multiple browser tabs.

- There is a [REPL component](./repl.md) that can be easily embedded into a web-app to aid in debugging and development, or as part of a database application itself.

- We maintain a [list of ORMs and query builders](./orm-support.md) that support PGlite.

- PGlite supports both Postgres extensions and PGlite Plugins via its [extensions API](./api.md#optionsextensions), and there is a list of [supported extensions](../extensions/).

- We have a [page of examples](../examples.md) that you can open to test out PGlite in the browser.
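
As a taster of the live-query extension mentioned above, here is a minimal sketch assuming the `live` export from `@electric-sql/pglite/live` and the `db.live.query` callback API described in the live queries docs:

```js
import { PGlite } from "@electric-sql/pglite";
import { live } from "@electric-sql/pglite/live";

// Load the live extension so that `db.live` is available.
const db = await PGlite.create({ extensions: { live } });
await db.exec("CREATE TABLE IF NOT EXISTS todo (id SERIAL PRIMARY KEY, task TEXT);");

// The callback fires with the initial result set and again whenever it changes.
await db.live.query("SELECT * FROM todo ORDER BY id;", [], (results) => {
  console.log(results.rows);
});

// Inserting a row triggers the callback with the updated rows.
await db.query("INSERT INTO todo (task) VALUES ($1);", ["Try live queries"]);
```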
22 changes: 22 additions & 0 deletions packages/pglite/examples/dump-data-dir.html
@@ -0,0 +1,22 @@
<!doctype html>
<html>
  <head>
    <title>PGlite Dump Datadir Example</title>
    <link rel="stylesheet" href="./styles.css" />
    <script src="./utils.js"></script>
    <script type="importmap">
      {
        "imports": {
          "@electric-sql/pglite": "../dist/index.js"
        }
      }
    </script>
  </head>
  <body>
    <h1>PGlite Dump Datadir Example</h1>
    <div class="script-plus-log">
      <script type="module" src="./dumpDataDir.js"></script>
      <div id="log"></div>
    </div>
  </body>
</html>
32 changes: 32 additions & 0 deletions packages/pglite/examples/dump-data-dir.js
@@ -0,0 +1,32 @@
import { PGlite } from "../dist/index.js";

const pg = new PGlite();
await pg.exec(`
  CREATE TABLE IF NOT EXISTS test (
    id SERIAL PRIMARY KEY,
    name TEXT
  );
`);
await pg.exec("INSERT INTO test (name) VALUES ('test');");

const file = await pg.dumpDataDir();

if (typeof window !== "undefined") {
  // Download the dump
  const url = URL.createObjectURL(file);
  const a = document.createElement("a");
  a.href = url;
  a.download = file.name;
  a.click();
} else {
  // Save the dump to a file using node fs
  // (writeFileSync expects a Buffer/TypedArray, not a raw ArrayBuffer)
  const fs = await import("fs");
  fs.writeFileSync(file.name, Buffer.from(await file.arrayBuffer()));
}

const pg2 = new PGlite({
  loadDataDir: file,
});

const rows = await pg2.query("SELECT * FROM test;");
console.log(rows);