From bbfab003a9d6ba28589581dbd2b32dbed4962645 Mon Sep 17 00:00:00 2001
From: Mateusz Baran
Date: Mon, 2 Dec 2024 20:07:32 +0100
Subject: [PATCH] Restructure docs

---
 docs/src/basic_usage.md | 4 +++-
 1 file changed, 3 insertions(+), 1 deletion(-)

diff --git a/docs/src/basic_usage.md b/docs/src/basic_usage.md
index 643c1d0..eb1bf5c 100644
--- a/docs/src/basic_usage.md
+++ b/docs/src/basic_usage.md
@@ -1,7 +1,7 @@
 # Basic usage
 
 You can calculate Riemannian gradient of a function defined in its embedding in multiple ways.
-For example, `rb_onb_fd51` corresponds to a finite differencing scheme and `rb_onb_fwdd` calculates gradient using [`ForwardDiff.jl`](https://github.com/JuliaDiff/ForwardDiff.jl). [`DifferentiationInterface.jl`](https://github.com/JuliaDiff/DifferentiationInterface.jl) is used to select the backend.
+[`DifferentiationInterface.jl`](https://github.com/JuliaDiff/DifferentiationInterface.jl) can be used to select the backend.
 
 ```@example
 using ManifoldDiff
@@ -24,4 +24,6 @@
 println(ManifoldDiff.gradient(s2, f, q, rb_onb_fwdd))
 println(ManifoldDiff.gradient(s2, f, q, rb_proj_zyg))
 ```
+In this example, `rb_onb_fd51` corresponds to a finite differencing scheme, `rb_onb_fwdd` calculates the gradient using [`ForwardDiff.jl`](https://github.com/JuliaDiff/ForwardDiff.jl), and `rb_proj_zyg` uses [`Zygote.jl`](https://github.com/FluxML/Zygote.jl) for reverse-mode automatic differentiation.
+
 [`TangentDiffBackend`](@ref) reduces dimensionality of the problem to the intrinsic dimension of the manifold, while [`RiemannianProjectionBackend`](@ref) relies on converting Euclidean gradient in the embedding to the Riemannian one.
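
The `RiemannianProjectionBackend` approach mentioned in the patched documentation converts a Euclidean gradient computed in the embedding into a Riemannian one. As a rough, language-neutral sketch of that idea (written in Python rather than Julia, with hypothetical helper names; on the unit sphere the projection is grad f(p) = egrad − ⟨egrad, p⟩ p, and the Euclidean gradient below is approximated by central finite differences, loosely analogous to what `rb_onb_fd51` does):

```python
def euclidean_grad(f, p, h=1e-6):
    # Central finite differences in the embedding R^n
    # (illustrative helper, not part of ManifoldDiff.jl).
    g = []
    for i in range(len(p)):
        pp = list(p)
        pm = list(p)
        pp[i] += h
        pm[i] -= h
        g.append((f(pp) - f(pm)) / (2 * h))
    return g

def riemannian_grad_sphere(f, p, h=1e-6):
    # Project the Euclidean gradient onto the tangent space at p
    # on the unit sphere: grad f(p) = egrad - <egrad, p> p.
    g = euclidean_grad(f, p, h)
    dot = sum(a * b for a, b in zip(g, p))
    return [a - dot * b for a, b in zip(g, p)]

# f(p) = p_z: at the north pole the Riemannian gradient vanishes,
# while at (1, 0, 0) the Euclidean gradient is already tangent.
f = lambda p: p[2]
print(riemannian_grad_sphere(f, [0.0, 0.0, 1.0]))
print(riemannian_grad_sphere(f, [1.0, 0.0, 0.0]))
```

This is only a conceptual illustration; in practice `ManifoldDiff.gradient` with a `RiemannianProjectionBackend` handles the embedding, backend selection, and projection for arbitrary manifolds.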