Commit
Merge pull request #147 from sathvikbhagavan/sb/docs
Few docs cleanup
Vaibhavdixit02 authored Jan 9, 2024
2 parents 3ae0794 + 5a5a967 commit ffe5474
Showing 8 changed files with 24 additions and 40 deletions.
4 changes: 4 additions & 0 deletions .gitignore
@@ -3,3 +3,7 @@
*.jl.mem
Manifest.toml
*.DS_Store

# Build artifacts for creating documentation generated by the Documenter package
docs/build/
docs/site/
2 changes: 2 additions & 0 deletions docs/make.jl
@@ -4,6 +4,7 @@ cp("./docs/Manifest.toml", "./docs/src/assets/Manifest.toml", force = true)
cp("./docs/Project.toml", "./docs/src/assets/Project.toml", force = true)

ENV["GKSwstype"] = "100"
ENV["JULIA_DEBUG"] = "Documenter"
using Plots, CairoMakie

include("pages.jl")
@@ -12,6 +13,7 @@ makedocs(sitename = "GlobalSensitivity.jl",
authors = "Vaibhav Kumar Dixit",
modules = [GlobalSensitivity],
clean = true, doctest = false, linkcheck = true,
linkcheck_ignore = [r"https://www.sciencedirect.com/*"],
warnonly = [:missing_docs],
format = Documenter.HTML(assets = ["assets/favicon.ico"],
canonical = "https://docs.sciml.ai/GlobalSensitivity/stable/"),
20 changes: 7 additions & 13 deletions docs/src/tutorials/juliacon21.md
@@ -18,7 +18,6 @@ p = [1.5,1.0,3.0,1.0]
prob = ODEProblem(f,u0,tspan,p)
t = collect(range(0, stop=10, length=200))
f1 = function (p)
prob1 = remake(prob;p=p)
sol = solve(prob1,Tsit5();saveat=t)
@@ -36,30 +35,30 @@ Colorbar(fig[2, 2], hm)
fig
```

![heatmapreg](https://user-images.githubusercontent.com/23134958/127019339-607b8d0b-6c38-4a18-b62e-e3ea0ae40941.png)

```@example lv
using StableRNGs
_rng = StableRNG(1234)
morris_sens = gsa(f1, Morris(), bounds, rng = _rng)
fig = Figure(resolution = (600, 400))
scatter(fig[1,1], [1,2,3,4], morris_sens.means_star[1,:], color = :green, axis = (xticksvisible = false, xticklabelsvisible = false, title = "Prey",))
scatter(fig[1,2], [1,2,3,4], morris_sens.means_star[2,:], color = :red, axis = (xticksvisible = false, xticklabelsvisible = false, title = "Predator",))
CairoMakie.scatter(fig[1,1], [1,2,3,4], morris_sens.means_star[1,:], color = :green, axis = (xticksvisible = false, xticklabelsvisible = false, title = "Prey",))
CairoMakie.scatter(fig[1,2], [1,2,3,4], morris_sens.means_star[2,:], color = :red, axis = (xticksvisible = false, xticklabelsvisible = false, title = "Predator",))
fig
```

![morrisscat](https://user-images.githubusercontent.com/23134958/127019346-2b5548c5-f4ec-4547-9f8f-af3e4b4c317c.png)

```@example lv
sobol_sens = gsa(f1, Sobol(), bounds, samples=500)
efast_sens = gsa(f1, eFAST(), bounds, samples=500)
```@example lv
fig = Figure(resolution = (600, 400))
barplot(fig[1,1], [1,2,3,4], sobol_sens.S1[1, :], color = :green, axis = (xticksvisible = false, xticklabelsvisible = false, title = "Prey (Sobol)", ylabel = "First order"))
barplot(fig[2,1], [1,2,3,4], sobol_sens.ST[1, :], color = :green, axis = (xticksvisible = false, xticklabelsvisible = false, ylabel = "Total order"))
barplot(fig[1,2], [1,2,3,4], efast_sens.S1[1, :], color = :red, axis = (xticksvisible = false, xticklabelsvisible = false, title = "Prey (eFAST)"))
barplot(fig[2,2], [1,2,3,4], efast_sens.ST[1, :], color = :red, axis = (xticksvisible = false, xticklabelsvisible = false))
fig
```

```@example lv
fig = Figure(resolution = (600, 400))
barplot(fig[1,1], [1,2,3,4], sobol_sens.S1[2, :], color = :green, axis = (xticksvisible = false, xticklabelsvisible = false, title = "Predator (Sobol)", ylabel = "First order"))
barplot(fig[2,1], [1,2,3,4], sobol_sens.ST[2, :], color = :green, axis = (xticksvisible = false, xticklabelsvisible = false, ylabel = "Total order"))
@@ -68,9 +67,6 @@ barplot(fig[2,2], [1,2,3,4], efast_sens.ST[2, :], color = :red, axis = (xticksvi
fig
```

![sobolefastprey](https://user-images.githubusercontent.com/23134958/127019361-8d625107-7f9c-44b5-a0dc-489bd512b7dc.png)
![sobolefastpred](https://user-images.githubusercontent.com/23134958/127019358-8bd0d918-e6fd-4929-96f1-d86330d91c69.png)

```@example lv
using QuasiMonteCarlo
samples = 500
@@ -80,7 +76,6 @@ sampler = SobolSample()
A,B = QuasiMonteCarlo.generate_design_matrices(samples,lb,ub,sampler)
sobol_sens_desmat = gsa(f1,Sobol(),A,B)
f_batch = function (p)
prob_func(prob,i,repeat) = remake(prob;p=p[:,i])
ensemble_prob = EnsembleProblem(prob,prob_func=prob_func)
@@ -126,6 +121,5 @@ CairoMakie.scatter!(fig[2,2], sobol_sens.S1[4][2,2:end], label = "Predator", mar
title = Label(fig[0,:], "First order Sobol indices")
legend = Legend(fig[2,3], ax)
fig
```

![timeseriessobollv](https://user-images.githubusercontent.com/23134958/156987652-85958bde-ae73-4f71-8555-318f779257ad.png)
2 changes: 0 additions & 2 deletions docs/src/tutorials/parallelized_gsa.md
@@ -96,8 +96,6 @@ p2_ = bar(["a","b","c","d"],sobol_result.S1[2,:],title="First Order Indices pred
plot(p1,p2,p1_,p2_)
```

![sobolbars](https://user-images.githubusercontent.com/23134958/127019349-686f968d-7c8a-4dc4-abdc-c70f58b043dd.png)

## Parallelizing the Global Sensitivity Analysis

In all the previous examples, `f(p)` was calculated serially. However, we can parallelize our computations
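To prepare for that, it helps to recall the batched calling convention `gsa` uses with `batch = true`: the function receives a `d × n` matrix `p`, one parameter set per column, and returns one output per column. A minimal toy sketch of that convention (the quadratic objective here is made up for illustration, not the tutorial's model):

```julia
# Toy sketch of the batch calling convention: p is a d × n matrix with one
# parameter set per column; the function returns one output per column.
f_batch = function (p)
    [sum(abs2, @view p[:, i]) for i in 1:size(p, 2)]
end

f_batch([1.0 2.0; 3.0 4.0])   # two parameter sets -> [10.0, 20.0]
```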
30 changes: 8 additions & 22 deletions docs/src/tutorials/shapley.md
@@ -16,7 +16,7 @@ trained using the [SciML ecosystem](https://sciml.ai/).

As the first step, let's generate the dataset.

```julia
```@example shapley
using GlobalSensitivity, OrdinaryDiffEq, Flux, SciMLSensitivity, LinearAlgebra
using Optimization, OptimizationOptimisers, Distributions, Copulas, CairoMakie
@@ -38,7 +38,7 @@ a 2-layer neural network with 10 hidden units in the first layer and the second
We will use the `Chain` function from `Flux` to define our NN. A detailed tutorial
is available [here](https://docs.sciml.ai/SciMLSensitivity/stable/examples/neural_ode/neural_ode_flux/).

```julia
```@example shapley
dudt2 = Flux.Chain(x -> x .^ 3,
Flux.Dense(2, 10, tanh),
Flux.Dense(10, 2))
@@ -88,7 +88,7 @@ for the parameters. We will use the standard `Normal` distribution for all the p
First, let's assume no correlation between the parameters; hence the covariance matrix
is passed as the identity matrix.

```julia
```@example shapley
d = length(θ)
mu = zeros(Float32, d)
#covariance matrix for the copula
@@ -113,28 +113,21 @@ end
shapley_effects = gsa(batched_loss_n_ode, Shapley(;n_perms = 100, n_var = 100, n_outer = 10), input_distribution, batch = true)
```

```julia
[ Info: Since `n_perms` wasn't set the exact version of Shapley will be used
GlobalSensitivity.ShapleyResult{Vector{Float64}, Vector{Float64}}([0.11597691741361442, 0.10537266345858425, -0.011525418832504125, 0.019080490852392638, 0.0670556101993216, -0.0008750631360554604, -0.06145135053766362, 0.04681820267596843, -0.052194422120816236, 0.003179470815545183 … 0.017116063071811214, -0.01996361592991698, 0.04992156377132031, -0.026031285685145327, -0.05478798810114203, -0.08297800907245817, -0.0007407741548139723, -0.004732287469108539, 0.0387866269216672, 0.003387278477023375], [0.24632058593157738, 0.2655832791843761, 0.2501184872448763, 0.24778944872968212, 0.24095751488197525, 0.25542101332993983, 0.23147124018855053, 0.2559000952299014, 0.2445090237211431, 0.24422366968866593 … 0.26334364579925296, 0.2533883745742706, 0.278493369461382, 0.251080668076158, 0.2513237168712358, 0.2565579384455956, 0.24086196097600907, 0.23509270698557308, 0.2424725703788857, 0.245598352488518], [-0.3668114310122772, -0.4151705637427929, -0.5017576538324616, -0.4665868286577843, -0.40522111896934987, -0.5015002492627375, -0.5151349813072227, -0.45474598397463833, -0.5314321086142567, -0.47549892177424 … -0.4990374826947246, -0.5166048300954874, -0.49592544037298836, -0.5181493951144149, -0.5473824731687641, -0.5858315684258255, -0.47283021766779176, -0.46551399316083175, -0.43645961102094877, -0.4779854924004719], [0.5987652658395061, 0.6259158906599613, 0.47870681616745336, 0.5047478103625695, 0.5393323393679931, 0.4997501229906266, 0.39223228023189544, 0.5483823893265752, 0.4270432643726243, 0.48185786340533043 … 0.5332696088383471, 0.4766775982356534, 0.5957685679156289, 0.4660868237441243, 0.43780649696648005, 0.4198755502809092, 0.4713486693581638, 0.4560494182226147, 0.5140328648642831, 0.4847600493545186])
```
```julia
```@example shapley
fig = Figure(resolution = (600, 400))
ax = barplot(fig[1,1], collect(1:54), shapley_effects.shapley_effects, color = :green)
ylims!(ax.axis, 0.0, 0.2)
CairoMakie.ylims!(ax.axis, 0.0, 0.2)
ax.axis.xticks = (1:54, ["θ$i" for i in 1:54])
ax.axis.ylabel = "Shapley Indices"
ax.axis.xlabel = "Parameters"
ax.axis.xticklabelrotation = 1
display(fig)
```

![shapnocorr](https://github.com/SciML/GlobalSensitivity.jl/assets/23134958/d102a91a-a4ed-4850-ae0b-dea19acf38f7)

Now let's assume some correlation between the parameters. We will use a correlation of 0.09 between
all the parameters.

```julia
```@example shapley
Corrmat = fill(0.09f0, d, d)
for i in 1:d
Corrmat[i, i] = 1.0f0
@@ -146,20 +139,13 @@ input_distribution = SklarDist(copula, marginals)
shapley_effects = gsa(batched_loss_n_ode, Shapley(;n_perms = 100, n_var = 100, n_outer = 100), input_distribution, batch = true)
```

```julia
[ Info: Since `n_perms` was set the random version of Shapley will be used
GlobalSensitivity.ShapleyResult{Vector{Float64}, Vector{Float64}}([0.05840668971922802, 0.11052100820850451, 0.04371662911807708, 0.004023059190511713, 0.062410433686220866, 0.02585247272606055, 0.02522310040104824, -0.002943508605003048, 0.0274450019985079, -0.00470104493101865 … 0.03521661871178529, -0.0029363975954207434, -0.01733340691251318, 0.030698550673315273, 0.004089097508271804, 0.01120562788725685, 0.020296678638214393, 0.05371236372397007, 0.041498171895387174, -0.0013082313559600266], [0.17059528049109549, 0.17289231112530734, 0.15914829962624458, 0.1353652701868229, 0.16197334249699738, 0.14706777933327114, 0.13542553650272246, 0.13442651137664902, 0.1187565412647469, 0.14591351295876914 … 0.14961042326057278, 0.11239722253111033, 0.14973761270843763, 0.15863981406666988, 0.14738095559109785, 0.11703801097937833, 0.1573849698258368, 0.16993657676693683, 0.1501327274957186, 0.1755876094816281], [-0.27596006004331913, -0.22834792159709788, -0.2682140381493623, -0.2612928703756612, -0.255057317607894, -0.2624003747671509, -0.24021095114428775, -0.2664194709032351, -0.205317818880396, -0.29069153033020617 … -0.2580198108789374, -0.223234953756397, -0.3108191278210509, -0.28023548489735767, -0.28477757545028, -0.21818887363232467, -0.28817786222042574, -0.2793633267392261, -0.25276197399622125, -0.3454599459399511], [0.3927734394817752, 0.4493899380141069, 0.3556472963855164, 0.2693389887566846, 0.3798781849803357, 0.314105320219272, 0.29065715194638425, 0.260532453693229, 0.2602078228774118, 0.2812894404681689 … 0.32845304830250793, 0.2173621585655555, 0.2761523139960246, 0.34163258624398823, 0.2929557704668236, 0.24060012940683836, 0.3287712194968545, 0.3867880541871663, 0.3357583177869956, 0.34284348322803104])
```
```julia
```@example shapley
fig = Figure(resolution = (600, 400))
ax = barplot(fig[1,1], collect(1:54), shapley_effects.shapley_effects, color = :green)
ylims!(ax.axis, 0.0, 0.2)
CairoMakie.ylims!(ax.axis, 0.0, 0.2)
ax.axis.xticks = (1:54, ["θ$i" for i in 1:54])
ax.axis.ylabel = "Shapley Indices"
ax.axis.xlabel = "Parameters"
ax.axis.xticklabelrotation = 1
display(fig)
```
![shapcorr](https://github.com/SciML/GlobalSensitivity.jl/assets/23134958/c48be7e3-811a-49de-8388-4af1e03d0663)
2 changes: 1 addition & 1 deletion joss/paper.bib
@@ -57,7 +57,7 @@ @article{Campolongo2007

@article{Morris1991,
abstract = {A computational model is a representation of some physical or other system of interest, first expressed mathematically and then implemented in the form of a computer program; it may be viewed as a function of inputs that, when evaluated, produces outputs. Motivation for this article comes from computational models that are deterministic, complicated enough to make classical mathematical analysis impractical and that have a moderate-to-large number of inputs. The problem of designing computational experiments to determine which inputs have important effects on an output is considered. The proposed experimental plans are composed of individually randomized one-factor-at-a-time designs, and data analysis is based on the resulting random sample of observed elementary effects, those changes in an output due solely to changes in a particular input. Advantages of this approach include a lack of reliance on assumptions of relative sparsity of important inputs, monotonicity of outputs with respect to inputs, or adequacy of a low-order polynomial as an approximation to the computational model.},
annote = {Contains useful justification of distinction of Morris method from LHS and similar methods.},
annotate = {Contains useful justification of distinction of Morris method from LHS and similar methods.},
author = {Morris, Max D},
doi = {10.2307/1269043},
issn = {0040-1706},
2 changes: 1 addition & 1 deletion src/DGSM_sensitivity.jl
@@ -10,7 +10,7 @@ The DGSM method takes a probability distribution for each of the
parameters, and samples are obtained from the distributions to create
random parameter sets. Derivatives of the function being analyzed are
then computed at the sampled parameters and specific statistics of those
derivatives are used. The paper by [Sobol and Kucherenko](http://www.sciencedirect.com/science/article/pii/S0378475409000354)
derivatives are used. The paper by [Sobol and Kucherenko](https://www.sciencedirect.com/science/article/pii/S0378475409000354)
discusses the relationship between the DGSM results, `tao` and
`sigma`, and the Morris elementary effects and Sobol indices.
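As a hedged illustration of that recipe (a toy sketch, not the package's implementation; the test function, sample count, and step size are made up), one can sample parameter sets, estimate the partial derivatives by finite differences, and take statistics of them:

```julia
using Statistics, Random

# Toy sketch of the DGSM recipe: sample parameter sets, estimate partial
# derivatives by finite differences, and take statistics of those derivatives.
rng = MersenneTwister(1)
f(x) = x[1] + 2 * x[2]^2           # made-up test function
d, n, h = 2, 10_000, 1e-6

derivs = map(1:n) do _
    x = rand(rng, d)               # one random parameter set on [0, 1]^d
    [(f(x .+ h .* (1:d .== i)) - f(x)) / h for i in 1:d]
end
D = reduce(hcat, derivs)           # d × n matrix of ∂f/∂xᵢ samples

tao = vec(mean(D .^ 2, dims = 2))  # E[(∂f/∂xᵢ)²], cf. the `tao` statistic
sigma = vec(var(D, dims = 2))      # Var(∂f/∂xᵢ), cf. the `sigma` statistic
```

For this toy function, `tao[1] ≈ 1` and `sigma[1] ≈ 0` since ∂f/∂x₁ is constant, while `tao[2]` and `sigma[2]` pick up the x₂ dependence.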
2 changes: 1 addition & 1 deletion src/eFAST_sensitivity.jl
@@ -14,7 +14,7 @@ x_{i}(s) = G_{i}(sin ω_{i}s), ∀ i=1,2,...,N
```
where s is a scalar variable varying over the range ``-∞ < s < +∞``, ``G_{i}`` are transformation functions
and ``{ω_{i}}, ∀ i=1,2,...,N`` is a set of different (angular) frequencies, to be properly selected, associated with each factor for all ``N`` (`samples`) number of parameter sets.
For more details, on the transformation used and other implementation details you can go through [ A. Saltelli et al.](http://dx.doi.org/10.1080/00401706.1999.10485594).
For more details on the transformation used and other implementation details, you can go through [A. Saltelli et al.](https://dx.doi.org/10.1080/00401706.1999.10485594).
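As a hedged sketch of that search-curve idea (illustrative values only: `G(u) = 1/2 + asin(u)/π` is one commonly used transformation onto `[0, 1]`, and the frequencies and grid here are made up, not the package's choices):

```julia
# Sketch of eFAST-style search curves x_i(s) = G_i(sin(ω_i s)):
# each input traverses its normalized range as s sweeps the real line.
G(u) = 0.5 + asin(u) / π           # maps the sine's output onto [0, 1]
ω = [11, 35]                       # illustrative integer frequencies
s = range(-π, π; length = 257)
X = [G(sin(ω[i] * sj)) for sj in s, i in eachindex(ω)]  # samples × inputs design
```

Varying `s` makes every input oscillate at its own frequency, which is what lets the variance contributions be separated in the frequency domain.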
## API
