Commit

Built site for gh-pages
Quarto GHA Workflow Runner committed Nov 25, 2024
1 parent 9f420aa commit 41bdb63
Showing 20 changed files with 76 additions and 76 deletions.
2 changes: 1 addition & 1 deletion .nojekyll
Original file line number Diff line number Diff line change
@@ -1 +1 @@
-44379a01
+c63b1e4b
4 changes: 2 additions & 2 deletions Aalto2023.html
@@ -6,7 +6,7 @@

<meta name="viewport" content="width=device-width, initial-scale=1.0, user-scalable=yes">

-<meta name="dcterms.date" content="2024-11-23">
+<meta name="dcterms.date" content="2024-11-25">

<title>Bayesian Data Analysis course - Aalto 2023 – Bayesian Data Analysis course</title>
<style>
@@ -177,7 +177,7 @@ <h1 class="title">Bayesian Data Analysis course - Aalto 2023</h1>
<div>
<div class="quarto-title-meta-heading">Published</div>
<div class="quarto-title-meta-contents">
-<p class="date">November 23, 2024</p>
+<p class="date">November 25, 2024</p>
</div>
</div>

4 changes: 2 additions & 2 deletions Aalto2024.html
@@ -6,7 +6,7 @@

<meta name="viewport" content="width=device-width, initial-scale=1.0, user-scalable=yes">

-<meta name="dcterms.date" content="2024-11-23">
+<meta name="dcterms.date" content="2024-11-25">

<title>Bayesian Data Analysis course - Aalto 2024 – Bayesian Data Analysis course</title>
<style>
@@ -206,7 +206,7 @@ <h1 class="title">Bayesian Data Analysis course - Aalto 2024</h1>
<div>
<div class="quarto-title-meta-heading">Published</div>
<div class="quarto-title-meta-contents">
-<p class="date">November 23, 2024</p>
+<p class="date">November 25, 2024</p>
</div>
</div>

4 changes: 2 additions & 2 deletions BDA3_notes.html
@@ -6,7 +6,7 @@

<meta name="viewport" content="width=device-width, initial-scale=1.0, user-scalable=yes">

-<meta name="dcterms.date" content="2024-11-23">
+<meta name="dcterms.date" content="2024-11-25">

<title>Bayesian Data Analysis course - BDA3 notes – Bayesian Data Analysis course</title>
<style>
@@ -349,7 +349,7 @@ <h1 class="title">Bayesian Data Analysis course - BDA3 notes</h1>
<div>
<div class="quarto-title-meta-heading">Published</div>
<div class="quarto-title-meta-contents">
-<p class="date">November 23, 2024</p>
+<p class="date">November 25, 2024</p>
</div>
</div>

4 changes: 2 additions & 2 deletions FAQ.html
@@ -6,7 +6,7 @@

<meta name="viewport" content="width=device-width, initial-scale=1.0, user-scalable=yes">

-<meta name="dcterms.date" content="2024-11-23">
+<meta name="dcterms.date" content="2024-11-25">

<title>Bayesian Data Analysis course - FAQ – Bayesian Data Analysis course</title>
<style>
@@ -215,7 +215,7 @@ <h1 class="title">Bayesian Data Analysis course - FAQ</h1>
<div>
<div class="quarto-title-meta-heading">Published</div>
<div class="quarto-title-meta-contents">
-<p class="date">November 23, 2024</p>
+<p class="date">November 25, 2024</p>
</div>
</div>

4 changes: 2 additions & 2 deletions assignments.html
@@ -6,7 +6,7 @@

<meta name="viewport" content="width=device-width, initial-scale=1.0, user-scalable=yes">

-<meta name="dcterms.date" content="2024-11-23">
+<meta name="dcterms.date" content="2024-11-25">

<title>Bayesian Data Analysis course - Assignments – Bayesian Data Analysis course</title>
<style>
@@ -166,7 +166,7 @@ <h1 class="title">Bayesian Data Analysis course - Assignments</h1>
<div>
<div class="quarto-title-meta-heading">Published</div>
<div class="quarto-title-meta-contents">
-<p class="date">November 23, 2024</p>
+<p class="date">November 25, 2024</p>
</div>
</div>

2 changes: 1 addition & 1 deletion assignments/search.json
@@ -158,7 +158,7 @@
"href": "template9.html#r2-model",
"title": "Notebook for Assignment 9",
"section": "3.3 R2 Model",
"text": "3.3 R2 Model\nEstimate the model, amend the code below\n\nfit2 &lt;- brm(Gmat ~ ., data = studentstd_Gmat,\n seed = SEED,\n normalize = FALSE,\n prior=c(prior(R2D2(mean_R2 = 0.5, prec_R2 = 1, cons_D2 = 1,\n autoscale = TRUE),class=b)),\n backend = \"cmdstanr\")\n\nStart sampling\n\n\nRunning MCMC with 4 sequential chains...\n\nChain 1 Iteration: 1 / 2000 [ 0%] (Warmup) \nChain 1 Iteration: 100 / 2000 [ 5%] (Warmup) \nChain 1 Iteration: 200 / 2000 [ 10%] (Warmup) \nChain 1 Iteration: 300 / 2000 [ 15%] (Warmup) \nChain 1 Iteration: 400 / 2000 [ 20%] (Warmup) \nChain 1 Iteration: 500 / 2000 [ 25%] (Warmup) \nChain 1 Iteration: 600 / 2000 [ 30%] (Warmup) \nChain 1 Iteration: 700 / 2000 [ 35%] (Warmup) \nChain 1 Iteration: 800 / 2000 [ 40%] (Warmup) \nChain 1 Iteration: 900 / 2000 [ 45%] (Warmup) \nChain 1 Iteration: 1000 / 2000 [ 50%] (Warmup) \nChain 1 Iteration: 1001 / 2000 [ 50%] (Sampling) \nChain 1 Iteration: 1100 / 2000 [ 55%] (Sampling) \nChain 1 Iteration: 1200 / 2000 [ 60%] (Sampling) \nChain 1 Iteration: 1300 / 2000 [ 65%] (Sampling) \nChain 1 Iteration: 1400 / 2000 [ 70%] (Sampling) \nChain 1 Iteration: 1500 / 2000 [ 75%] (Sampling) \nChain 1 Iteration: 1600 / 2000 [ 80%] (Sampling) \nChain 1 Iteration: 1700 / 2000 [ 85%] (Sampling) \nChain 1 Iteration: 1800 / 2000 [ 90%] (Sampling) \nChain 1 Iteration: 1900 / 2000 [ 95%] (Sampling) \nChain 1 Iteration: 2000 / 2000 [100%] (Sampling) \nChain 1 finished in 0.7 seconds.\nChain 2 Iteration: 1 / 2000 [ 0%] (Warmup) \nChain 2 Iteration: 100 / 2000 [ 5%] (Warmup) \nChain 2 Iteration: 200 / 2000 [ 10%] (Warmup) \nChain 2 Iteration: 300 / 2000 [ 15%] (Warmup) \nChain 2 Iteration: 400 / 2000 [ 20%] (Warmup) \nChain 2 Iteration: 500 / 2000 [ 25%] (Warmup) \nChain 2 Iteration: 600 / 2000 [ 30%] (Warmup) \nChain 2 Iteration: 700 / 2000 [ 35%] (Warmup) \nChain 2 Iteration: 800 / 2000 [ 40%] (Warmup) \nChain 2 Iteration: 900 / 2000 [ 45%] (Warmup) \nChain 2 Iteration: 1000 / 2000 [ 50%] (Warmup) \nChain 2 
Iteration: 1001 / 2000 [ 50%] (Sampling) \nChain 2 Iteration: 1100 / 2000 [ 55%] (Sampling) \nChain 2 Iteration: 1200 / 2000 [ 60%] (Sampling) \nChain 2 Iteration: 1300 / 2000 [ 65%] (Sampling) \nChain 2 Iteration: 1400 / 2000 [ 70%] (Sampling) \nChain 2 Iteration: 1500 / 2000 [ 75%] (Sampling) \nChain 2 Iteration: 1600 / 2000 [ 80%] (Sampling) \nChain 2 Iteration: 1700 / 2000 [ 85%] (Sampling) \nChain 2 Iteration: 1800 / 2000 [ 90%] (Sampling) \nChain 2 Iteration: 1900 / 2000 [ 95%] (Sampling) \nChain 2 Iteration: 2000 / 2000 [100%] (Sampling) \nChain 2 finished in 0.6 seconds.\nChain 3 Iteration: 1 / 2000 [ 0%] (Warmup) \nChain 3 Iteration: 100 / 2000 [ 5%] (Warmup) \nChain 3 Iteration: 200 / 2000 [ 10%] (Warmup) \nChain 3 Iteration: 300 / 2000 [ 15%] (Warmup) \nChain 3 Iteration: 400 / 2000 [ 20%] (Warmup) \nChain 3 Iteration: 500 / 2000 [ 25%] (Warmup) \nChain 3 Iteration: 600 / 2000 [ 30%] (Warmup) \nChain 3 Iteration: 700 / 2000 [ 35%] (Warmup) \nChain 3 Iteration: 800 / 2000 [ 40%] (Warmup) \nChain 3 Iteration: 900 / 2000 [ 45%] (Warmup) \nChain 3 Iteration: 1000 / 2000 [ 50%] (Warmup) \nChain 3 Iteration: 1001 / 2000 [ 50%] (Sampling) \nChain 3 Iteration: 1100 / 2000 [ 55%] (Sampling) \nChain 3 Iteration: 1200 / 2000 [ 60%] (Sampling) \nChain 3 Iteration: 1300 / 2000 [ 65%] (Sampling) \nChain 3 Iteration: 1400 / 2000 [ 70%] (Sampling) \nChain 3 Iteration: 1500 / 2000 [ 75%] (Sampling) \nChain 3 Iteration: 1600 / 2000 [ 80%] (Sampling) \nChain 3 Iteration: 1700 / 2000 [ 85%] (Sampling) \nChain 3 Iteration: 1800 / 2000 [ 90%] (Sampling) \nChain 3 Iteration: 1900 / 2000 [ 95%] (Sampling) \nChain 3 Iteration: 2000 / 2000 [100%] (Sampling) \nChain 3 finished in 0.7 seconds.\nChain 4 Iteration: 1 / 2000 [ 0%] (Warmup) \nChain 4 Iteration: 100 / 2000 [ 5%] (Warmup) \nChain 4 Iteration: 200 / 2000 [ 10%] (Warmup) \nChain 4 Iteration: 300 / 2000 [ 15%] (Warmup) \nChain 4 Iteration: 400 / 2000 [ 20%] (Warmup) \nChain 4 Iteration: 500 / 2000 [ 25%] (Warmup) \nChain 4 
Iteration: 600 / 2000 [ 30%] (Warmup) \nChain 4 Iteration: 700 / 2000 [ 35%] (Warmup) \nChain 4 Iteration: 800 / 2000 [ 40%] (Warmup) \nChain 4 Iteration: 900 / 2000 [ 45%] (Warmup) \nChain 4 Iteration: 1000 / 2000 [ 50%] (Warmup) \nChain 4 Iteration: 1001 / 2000 [ 50%] (Sampling) \nChain 4 Iteration: 1100 / 2000 [ 55%] (Sampling) \nChain 4 Iteration: 1200 / 2000 [ 60%] (Sampling) \nChain 4 Iteration: 1300 / 2000 [ 65%] (Sampling) \nChain 4 Iteration: 1400 / 2000 [ 70%] (Sampling) \nChain 4 Iteration: 1500 / 2000 [ 75%] (Sampling) \nChain 4 Iteration: 1600 / 2000 [ 80%] (Sampling) \nChain 4 Iteration: 1700 / 2000 [ 85%] (Sampling) \nChain 4 Iteration: 1800 / 2000 [ 90%] (Sampling) \nChain 4 Iteration: 1900 / 2000 [ 95%] (Sampling) \nChain 4 Iteration: 2000 / 2000 [100%] (Sampling) \nChain 4 finished in 0.7 seconds.\n\nAll 4 chains finished successfully.\nMean chain execution time: 0.7 seconds.\nTotal execution time: 3.0 seconds.\n\n\nVisualise Posteriors\n\n# Plot marginal posteriors\ndraws2 &lt;- as_draws_df(fit2, variable=paste0('b_',predictors)) |&gt;\n set_variables(predictors)\np &lt;- mcmc_areas(draws2,\n prob_outer=0.98, area_method = \"scaled height\") +\n xlim(c(-3,3))\n\nScale for x is already present.\nAdding another scale for x, which will replace the existing scale.\n\np &lt;- p + scale_y_discrete(limits = rev(levels(p$data$parameter)))\n\nScale for y is already present.\nAdding another scale for y, which will replace the existing scale.\n\np\n\n\n\n\n\n\n\n\nFind prior predictive R2. This should be equal to the prior you set on R2 if all predictors are scaled to have unit variance. 
Since binary predictors were not standardised, the prior predictive might look slightly different, let’s compute it for illustration\n\nppR2_2&lt;-numeric()\nsims &lt;- 4000\n\nfor (i in 1:sims) {\n sigma2 &lt;- rstudent_t(1,3,0,3)^2\n R2 &lt;- rbeta(1,1,2)\n tau2 &lt;- R2/(1-R2)\n psi &lt;- as.numeric(rdirichlet(1,rep(1,dim(X)[2])))\n beta &lt;- rnorm(dim(X)[2])*sqrt(sigma2*tau2*psi)\n mu &lt;- as.matrix(X)%*%as.vector(beta)\n muvar &lt;- var(mu)\n ppR2_2[i] &lt;- muvar/(muvar+sigma2)\n}\n\nNow, find the predictive R2 and compare to the posterior.\n\n# Prior vs posterior R2\ndata &lt;- data.frame(Prior=ppR2_2,Posterior=bayes_R2(fit2, summary=FALSE)) \nnames(data) &lt;- c(\"Prior\",\"Posterior\")\nmcmc_hist(data,\n breaks=seq(0,1,length.out=100),\n facet_args = list(nrow = 2)) +\n facet_text(size = 13) +\n scale_x_continuous(limits = c(0,1), expand = c(0, 0),\n labels = c(\"0\",\"0.25\",\"0.5\",\"0.75\",\"1\")) +\n theme(axis.line.y = element_blank()) +\n xlab(\"Bayesian R^2\")\n\n\n\n\n\n\n\n\nSummaries of R2 and LOO-R2\n\n# Bayes-R2 and LOO-R2\nbayes_R2(fit2) |&gt; as.data.frame() |&gt; tt()\n\n\n\n \n\n \n \n \n \n \n \n Estimate\n Est.Error\n Q2.5\n Q97.5\n \n \n \n \n \n 0.2527685\n 0.03639153\n 0.180655\n 0.3217405\n \n \n \n \n\n\nloo_R2(fit2) |&gt; as.data.frame() |&gt; tt()\n\n\n\n \n\n \n \n \n \n \n \n Estimate\n Est.Error\n Q2.5\n Q97.5\n \n \n \n \n \n 0.2153758\n 0.03280936\n 0.1509374\n 0.2779119",
"text": "3.3 R2 Model\nEstimate the model, amend the code below\n\nfit2 &lt;- brm(Gmat ~ ., data = studentstd_Gmat,\n seed = SEED,\n normalize = FALSE,\n prior=c(prior(R2D2(mean_R2 = 0.5, prec_R2 = 1, cons_D2 = 1,\n autoscale = TRUE),class=b)),\n backend = \"cmdstanr\")\n\nStart sampling\n\n\nRunning MCMC with 4 sequential chains...\n\nChain 1 Iteration: 1 / 2000 [ 0%] (Warmup) \nChain 1 Iteration: 100 / 2000 [ 5%] (Warmup) \nChain 1 Iteration: 200 / 2000 [ 10%] (Warmup) \nChain 1 Iteration: 300 / 2000 [ 15%] (Warmup) \nChain 1 Iteration: 400 / 2000 [ 20%] (Warmup) \nChain 1 Iteration: 500 / 2000 [ 25%] (Warmup) \nChain 1 Iteration: 600 / 2000 [ 30%] (Warmup) \nChain 1 Iteration: 700 / 2000 [ 35%] (Warmup) \nChain 1 Iteration: 800 / 2000 [ 40%] (Warmup) \nChain 1 Iteration: 900 / 2000 [ 45%] (Warmup) \nChain 1 Iteration: 1000 / 2000 [ 50%] (Warmup) \nChain 1 Iteration: 1001 / 2000 [ 50%] (Sampling) \nChain 1 Iteration: 1100 / 2000 [ 55%] (Sampling) \nChain 1 Iteration: 1200 / 2000 [ 60%] (Sampling) \nChain 1 Iteration: 1300 / 2000 [ 65%] (Sampling) \nChain 1 Iteration: 1400 / 2000 [ 70%] (Sampling) \nChain 1 Iteration: 1500 / 2000 [ 75%] (Sampling) \nChain 1 Iteration: 1600 / 2000 [ 80%] (Sampling) \nChain 1 Iteration: 1700 / 2000 [ 85%] (Sampling) \nChain 1 Iteration: 1800 / 2000 [ 90%] (Sampling) \nChain 1 Iteration: 1900 / 2000 [ 95%] (Sampling) \nChain 1 Iteration: 2000 / 2000 [100%] (Sampling) \nChain 1 finished in 0.7 seconds.\nChain 2 Iteration: 1 / 2000 [ 0%] (Warmup) \nChain 2 Iteration: 100 / 2000 [ 5%] (Warmup) \nChain 2 Iteration: 200 / 2000 [ 10%] (Warmup) \nChain 2 Iteration: 300 / 2000 [ 15%] (Warmup) \nChain 2 Iteration: 400 / 2000 [ 20%] (Warmup) \nChain 2 Iteration: 500 / 2000 [ 25%] (Warmup) \nChain 2 Iteration: 600 / 2000 [ 30%] (Warmup) \nChain 2 Iteration: 700 / 2000 [ 35%] (Warmup) \nChain 2 Iteration: 800 / 2000 [ 40%] (Warmup) \nChain 2 Iteration: 900 / 2000 [ 45%] (Warmup) \nChain 2 Iteration: 1000 / 2000 [ 50%] (Warmup) \nChain 2 
Iteration: 1001 / 2000 [ 50%] (Sampling) \nChain 2 Iteration: 1100 / 2000 [ 55%] (Sampling) \nChain 2 Iteration: 1200 / 2000 [ 60%] (Sampling) \nChain 2 Iteration: 1300 / 2000 [ 65%] (Sampling) \nChain 2 Iteration: 1400 / 2000 [ 70%] (Sampling) \nChain 2 Iteration: 1500 / 2000 [ 75%] (Sampling) \nChain 2 Iteration: 1600 / 2000 [ 80%] (Sampling) \nChain 2 Iteration: 1700 / 2000 [ 85%] (Sampling) \nChain 2 Iteration: 1800 / 2000 [ 90%] (Sampling) \nChain 2 Iteration: 1900 / 2000 [ 95%] (Sampling) \nChain 2 Iteration: 2000 / 2000 [100%] (Sampling) \nChain 2 finished in 0.6 seconds.\nChain 3 Iteration: 1 / 2000 [ 0%] (Warmup) \nChain 3 Iteration: 100 / 2000 [ 5%] (Warmup) \nChain 3 Iteration: 200 / 2000 [ 10%] (Warmup) \nChain 3 Iteration: 300 / 2000 [ 15%] (Warmup) \nChain 3 Iteration: 400 / 2000 [ 20%] (Warmup) \nChain 3 Iteration: 500 / 2000 [ 25%] (Warmup) \nChain 3 Iteration: 600 / 2000 [ 30%] (Warmup) \nChain 3 Iteration: 700 / 2000 [ 35%] (Warmup) \nChain 3 Iteration: 800 / 2000 [ 40%] (Warmup) \nChain 3 Iteration: 900 / 2000 [ 45%] (Warmup) \nChain 3 Iteration: 1000 / 2000 [ 50%] (Warmup) \nChain 3 Iteration: 1001 / 2000 [ 50%] (Sampling) \nChain 3 Iteration: 1100 / 2000 [ 55%] (Sampling) \nChain 3 Iteration: 1200 / 2000 [ 60%] (Sampling) \nChain 3 Iteration: 1300 / 2000 [ 65%] (Sampling) \nChain 3 Iteration: 1400 / 2000 [ 70%] (Sampling) \nChain 3 Iteration: 1500 / 2000 [ 75%] (Sampling) \nChain 3 Iteration: 1600 / 2000 [ 80%] (Sampling) \nChain 3 Iteration: 1700 / 2000 [ 85%] (Sampling) \nChain 3 Iteration: 1800 / 2000 [ 90%] (Sampling) \nChain 3 Iteration: 1900 / 2000 [ 95%] (Sampling) \nChain 3 Iteration: 2000 / 2000 [100%] (Sampling) \nChain 3 finished in 0.7 seconds.\nChain 4 Iteration: 1 / 2000 [ 0%] (Warmup) \nChain 4 Iteration: 100 / 2000 [ 5%] (Warmup) \nChain 4 Iteration: 200 / 2000 [ 10%] (Warmup) \nChain 4 Iteration: 300 / 2000 [ 15%] (Warmup) \nChain 4 Iteration: 400 / 2000 [ 20%] (Warmup) \nChain 4 Iteration: 500 / 2000 [ 25%] (Warmup) \nChain 4 
Iteration: 600 / 2000 [ 30%] (Warmup) \nChain 4 Iteration: 700 / 2000 [ 35%] (Warmup) \nChain 4 Iteration: 800 / 2000 [ 40%] (Warmup) \nChain 4 Iteration: 900 / 2000 [ 45%] (Warmup) \nChain 4 Iteration: 1000 / 2000 [ 50%] (Warmup) \nChain 4 Iteration: 1001 / 2000 [ 50%] (Sampling) \nChain 4 Iteration: 1100 / 2000 [ 55%] (Sampling) \nChain 4 Iteration: 1200 / 2000 [ 60%] (Sampling) \nChain 4 Iteration: 1300 / 2000 [ 65%] (Sampling) \nChain 4 Iteration: 1400 / 2000 [ 70%] (Sampling) \nChain 4 Iteration: 1500 / 2000 [ 75%] (Sampling) \nChain 4 Iteration: 1600 / 2000 [ 80%] (Sampling) \nChain 4 Iteration: 1700 / 2000 [ 85%] (Sampling) \nChain 4 Iteration: 1800 / 2000 [ 90%] (Sampling) \nChain 4 Iteration: 1900 / 2000 [ 95%] (Sampling) \nChain 4 Iteration: 2000 / 2000 [100%] (Sampling) \nChain 4 finished in 0.7 seconds.\n\nAll 4 chains finished successfully.\nMean chain execution time: 0.7 seconds.\nTotal execution time: 3.1 seconds.\n\n\nVisualise Posteriors\n\n# Plot marginal posteriors\ndraws2 &lt;- as_draws_df(fit2, variable=paste0('b_',predictors)) |&gt;\n set_variables(predictors)\np &lt;- mcmc_areas(draws2,\n prob_outer=0.98, area_method = \"scaled height\") +\n xlim(c(-3,3))\n\nScale for x is already present.\nAdding another scale for x, which will replace the existing scale.\n\np &lt;- p + scale_y_discrete(limits = rev(levels(p$data$parameter)))\n\nScale for y is already present.\nAdding another scale for y, which will replace the existing scale.\n\np\n\n\n\n\n\n\n\n\nFind prior predictive R2. This should be equal to the prior you set on R2 if all predictors are scaled to have unit variance. 
Since binary predictors were not standardised, the prior predictive might look slightly different, let’s compute it for illustration\n\nppR2_2&lt;-numeric()\nsims &lt;- 4000\n\nfor (i in 1:sims) {\n sigma2 &lt;- rstudent_t(1,3,0,3)^2\n R2 &lt;- rbeta(1,1,2)\n tau2 &lt;- R2/(1-R2)\n psi &lt;- as.numeric(rdirichlet(1,rep(1,dim(X)[2])))\n beta &lt;- rnorm(dim(X)[2])*sqrt(sigma2*tau2*psi)\n mu &lt;- as.matrix(X)%*%as.vector(beta)\n muvar &lt;- var(mu)\n ppR2_2[i] &lt;- muvar/(muvar+sigma2)\n}\n\nNow, find the predictive R2 and compare to the posterior.\n\n# Prior vs posterior R2\ndata &lt;- data.frame(Prior=ppR2_2,Posterior=bayes_R2(fit2, summary=FALSE)) \nnames(data) &lt;- c(\"Prior\",\"Posterior\")\nmcmc_hist(data,\n breaks=seq(0,1,length.out=100),\n facet_args = list(nrow = 2)) +\n facet_text(size = 13) +\n scale_x_continuous(limits = c(0,1), expand = c(0, 0),\n labels = c(\"0\",\"0.25\",\"0.5\",\"0.75\",\"1\")) +\n theme(axis.line.y = element_blank()) +\n xlab(\"Bayesian R^2\")\n\n\n\n\n\n\n\n\nSummaries of R2 and LOO-R2\n\n# Bayes-R2 and LOO-R2\nbayes_R2(fit2) |&gt; as.data.frame() |&gt; tt()\n\n\n\n \n\n \n \n \n \n \n \n Estimate\n Est.Error\n Q2.5\n Q97.5\n \n \n \n \n \n 0.2527685\n 0.03639153\n 0.180655\n 0.3217405\n \n \n \n \n\n\nloo_R2(fit2) |&gt; as.data.frame() |&gt; tt()\n\n\n\n \n\n \n \n \n \n \n \n Estimate\n Est.Error\n Q2.5\n Q97.5\n \n \n \n \n \n 0.2148029\n 0.03319469\n 0.1472044\n 0.2790236",
"crumbs": [
"Templates",
"Notebook for Assignment 9"
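The changed `search.json` entry above carries the notebook's prior predictive R2 simulation: draw the residual variance, an R2 value, the implied global scale `tau2 = R2 / (1 - R2)`, and Dirichlet-distributed per-coefficient shares `psi`, then compute how much of the total variance the linear predictor explains. The same loop can be re-sketched outside R; below is a NumPy version of that simulation. The design matrix `X` here is a synthetic stand-in (the course's student data is not part of this commit), and `default_rng` replaces the R samplers `rstudent_t`, `rbeta`, and `rdirichlet`.

```python
import numpy as np

# NumPy re-sketch of the prior predictive R2 loop from the diffed notebook.
# X is a hypothetical standardised design matrix, not the real data.
rng = np.random.default_rng(1)
n, p = 200, 10
X = rng.standard_normal((n, p))

sims = 4000
ppR2 = np.empty(sims)
for i in range(sims):
    sigma2 = (3.0 * rng.standard_t(3)) ** 2   # rstudent_t(1, 3, 0, 3)^2
    R2 = rng.beta(1.0, 2.0)                   # rbeta(1, 1, 2)
    tau2 = R2 / (1.0 - R2)                    # prior R2 -> global scale
    psi = rng.dirichlet(np.ones(p))           # rdirichlet(1, rep(1, p))
    beta = rng.standard_normal(p) * np.sqrt(sigma2 * tau2 * psi)
    mu = X @ beta
    muvar = mu.var(ddof=1)                    # R's var() uses n - 1
    ppR2[i] = muvar / (muvar + sigma2)
```

With all predictors standardised to unit variance, `muvar` concentrates near `sigma2 * tau2`, so the draws of `ppR2` roughly recover the Beta(1, 2) prior placed on R2 (mean about 1/3), which is the check the notebook text describes.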
Binary file modified assignments/template3_files/figure-html/unnamed-chunk-10-2.png
