<!DOCTYPE html>
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
<meta charset="utf-8" />
<meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
<meta name="generator" content="pandoc" />
<meta name="viewport" content="width=device-width, initial-scale=1">
<meta name="author" content="Luis Damiano (Universidad Nacional de Rosario), Brian Peterson (University of Washington), Michael Weylandt (Rice University)" />
<title>A Tutorial on Hidden Markov Models using Stan</title>
\(
% Math operators
\newcommand{\argmax}{\arg\!\max}
\newcommand{\argmin}{\arg\!\min}
\newcommand\ev[1]{E\left\langle#1\right\rangle}
\newcommand\vv[1]{V\left\langle#1\right\rangle}
% Math commands
\newcommand{\mat}[1]{\mathbf{#1}}
% Math symbols
\renewcommand{\NN}{\mathcal{N}}
\renewcommand{\UU}{\mathcal{U}}
\renewcommand{\LL}{\mathcal{L}}
\renewcommand{\RR}{\mathbb{R}}
\)
<style type="text/css">code{white-space: pre;}</style>
<style type="text/css">
div.sourceCode { overflow-x: auto; }
table.sourceCode, tr.sourceCode, td.lineNumbers, td.sourceCode {
margin: 0; padding: 0; vertical-align: baseline; border: none; }
table.sourceCode { width: 100%; line-height: 100%; }
td.lineNumbers { text-align: right; padding-right: 4px; padding-left: 4px; color: #aaaaaa; border-right: 1px solid #aaaaaa; }
td.sourceCode { padding-left: 5px; }
code > span.kw { color: #007020; font-weight: bold; } /* Keyword */
code > span.dt { color: #902000; } /* DataType */
code > span.dv { color: #40a070; } /* DecVal */
code > span.bn { color: #40a070; } /* BaseN */
code > span.fl { color: #40a070; } /* Float */
code > span.ch { color: #4070a0; } /* Char */
code > span.st { color: #4070a0; } /* String */
code > span.co { color: #60a0b0; font-style: italic; } /* Comment */
code > span.ot { color: #007020; } /* Other */
code > span.al { color: #ff0000; font-weight: bold; } /* Alert */
code > span.fu { color: #06287e; } /* Function */
code > span.er { color: #ff0000; font-weight: bold; } /* Error */
code > span.wa { color: #60a0b0; font-weight: bold; font-style: italic; } /* Warning */
code > span.cn { color: #880000; } /* Constant */
code > span.sc { color: #4070a0; } /* SpecialChar */
code > span.vs { color: #4070a0; } /* VerbatimString */
code > span.ss { color: #bb6688; } /* SpecialString */
code > span.im { } /* Import */
code > span.va { color: #19177c; } /* Variable */
code > span.cf { color: #007020; font-weight: bold; } /* ControlFlow */
code > span.op { color: #666666; } /* Operator */
code > span.bu { } /* BuiltIn */
code > span.ex { } /* Extension */
code > span.pp { color: #bc7a00; } /* Preprocessor */
code > span.at { color: #7d9029; } /* Attribute */
code > span.do { color: #ba2121; font-style: italic; } /* Documentation */
code > span.an { color: #60a0b0; font-weight: bold; font-style: italic; } /* Annotation */
code > span.cv { color: #60a0b0; font-weight: bold; font-style: italic; } /* CommentVar */
code > span.in { color: #60a0b0; font-weight: bold; font-style: italic; } /* Information */
</style>
<link href="data:text/css;charset=utf-8,body%20%7B%0Abackground%2Dcolor%3A%20%23fff%3B%0Amargin%3A%201em%20auto%3B%0Amax%2Dwidth%3A%20700px%3B%0Aoverflow%3A%20visible%3B%0Apadding%2Dleft%3A%202em%3B%0Apadding%2Dright%3A%202em%3B%0Afont%2Dfamily%3A%20%22Open%20Sans%22%2C%20%22Helvetica%20Neue%22%2C%20Helvetica%2C%20Arial%2C%20sans%2Dserif%3B%0Afont%2Dsize%3A%2014px%3B%0Aline%2Dheight%3A%201%2E35%3B%0A%7D%0A%23header%20%7B%0Atext%2Dalign%3A%20center%3B%0A%7D%0A%23TOC%20%7B%0Aclear%3A%20both%3B%0Amargin%3A%200%200%2010px%2010px%3B%0Apadding%3A%204px%3B%0Awidth%3A%20400px%3B%0Aborder%3A%201px%20solid%20%23CCCCCC%3B%0Aborder%2Dradius%3A%205px%3B%0Abackground%2Dcolor%3A%20%23f6f6f6%3B%0Afont%2Dsize%3A%2013px%3B%0Aline%2Dheight%3A%201%2E3%3B%0A%7D%0A%23TOC%20%2Etoctitle%20%7B%0Afont%2Dweight%3A%20bold%3B%0Afont%2Dsize%3A%2015px%3B%0Amargin%2Dleft%3A%205px%3B%0A%7D%0A%23TOC%20ul%20%7B%0Apadding%2Dleft%3A%2040px%3B%0Amargin%2Dleft%3A%20%2D1%2E5em%3B%0Amargin%2Dtop%3A%205px%3B%0Amargin%2Dbottom%3A%205px%3B%0A%7D%0A%23TOC%20ul%20ul%20%7B%0Amargin%2Dleft%3A%20%2D2em%3B%0A%7D%0A%23TOC%20li%20%7B%0Aline%2Dheight%3A%2016px%3B%0A%7D%0Atable%20%7B%0Amargin%3A%201em%20auto%3B%0Aborder%2Dwidth%3A%201px%3B%0Aborder%2Dcolor%3A%20%23DDDDDD%3B%0Aborder%2Dstyle%3A%20outset%3B%0Aborder%2Dcollapse%3A%20collapse%3B%0A%7D%0Atable%20th%20%7B%0Aborder%2Dwidth%3A%202px%3B%0Apadding%3A%205px%3B%0Aborder%2Dstyle%3A%20inset%3B%0A%7D%0Atable%20td%20%7B%0Aborder%2Dwidth%3A%201px%3B%0Aborder%2Dstyle%3A%20inset%3B%0Aline%2Dheight%3A%2018px%3B%0Apadding%3A%205px%205px%3B%0A%7D%0Atable%2C%20table%20th%2C%20table%20td%20%7B%0Aborder%2Dleft%2Dstyle%3A%20none%3B%0Aborder%2Dright%2Dstyle%3A%20none%3B%0A%7D%0Atable%20thead%2C%20table%20tr%2Eeven%20%7B%0Abackground%2Dcolor%3A%20%23f7f7f7%3B%0A%7D%0Ap%20%7B%0Amargin%3A%200%2E5em%200%3B%0A%7D%0Ablockquote%20%7B%0Abackground%2Dcolor%3A%20%23f6f6f6%3B%0Apadding%3A%200%2E25em%200%2E75em%3B%0A%7D%0Ahr%20%7B%0Aborder%2Dstyle%3A%20solid%3B%0Aborder%3A%20none%3B%0Aborde
r%2Dtop%3A%201px%20solid%20%23777%3B%0Amargin%3A%2028px%200%3B%0A%7D%0Adl%20%7B%0Amargin%2Dleft%3A%200%3B%0A%7D%0Adl%20dd%20%7B%0Amargin%2Dbottom%3A%2013px%3B%0Amargin%2Dleft%3A%2013px%3B%0A%7D%0Adl%20dt%20%7B%0Afont%2Dweight%3A%20bold%3B%0A%7D%0Aul%20%7B%0Amargin%2Dtop%3A%200%3B%0A%7D%0Aul%20li%20%7B%0Alist%2Dstyle%3A%20circle%20outside%3B%0A%7D%0Aul%20ul%20%7B%0Amargin%2Dbottom%3A%200%3B%0A%7D%0Apre%2C%20code%20%7B%0Abackground%2Dcolor%3A%20%23f7f7f7%3B%0Aborder%2Dradius%3A%203px%3B%0Acolor%3A%20%23333%3B%0Awhite%2Dspace%3A%20pre%2Dwrap%3B%20%0A%7D%0Apre%20%7B%0Aborder%2Dradius%3A%203px%3B%0Amargin%3A%205px%200px%2010px%200px%3B%0Apadding%3A%2010px%3B%0A%7D%0Apre%3Anot%28%5Bclass%5D%29%20%7B%0Abackground%2Dcolor%3A%20%23f7f7f7%3B%0A%7D%0Acode%20%7B%0Afont%2Dfamily%3A%20Consolas%2C%20Monaco%2C%20%27Courier%20New%27%2C%20monospace%3B%0Afont%2Dsize%3A%2085%25%3B%0A%7D%0Ap%20%3E%20code%2C%20li%20%3E%20code%20%7B%0Apadding%3A%202px%200px%3B%0A%7D%0Adiv%2Efigure%20%7B%0Atext%2Dalign%3A%20center%3B%0A%7D%0Aimg%20%7B%0Abackground%2Dcolor%3A%20%23FFFFFF%3B%0Apadding%3A%202px%3B%0Aborder%3A%201px%20solid%20%23DDDDDD%3B%0Aborder%2Dradius%3A%203px%3B%0Aborder%3A%201px%20solid%20%23CCCCCC%3B%0Amargin%3A%200%205px%3B%0A%7D%0Ah1%20%7B%0Amargin%2Dtop%3A%200%3B%0Afont%2Dsize%3A%2035px%3B%0Aline%2Dheight%3A%2040px%3B%0A%7D%0Ah2%20%7B%0Aborder%2Dbottom%3A%204px%20solid%20%23f7f7f7%3B%0Apadding%2Dtop%3A%2010px%3B%0Apadding%2Dbottom%3A%202px%3B%0Afont%2Dsize%3A%20145%25%3B%0A%7D%0Ah3%20%7B%0Aborder%2Dbottom%3A%202px%20solid%20%23f7f7f7%3B%0Apadding%2Dtop%3A%2010px%3B%0Afont%2Dsize%3A%20120%25%3B%0A%7D%0Ah4%20%7B%0Aborder%2Dbottom%3A%201px%20solid%20%23f7f7f7%3B%0Amargin%2Dleft%3A%208px%3B%0Afont%2Dsize%3A%20105%25%3B%0A%7D%0Ah5%2C%20h6%20%7B%0Aborder%2Dbottom%3A%201px%20solid%20%23ccc%3B%0Afont%2Dsize%3A%20105%25%3B%0A%7D%0Aa%20%7B%0Acolor%3A%20%230033dd%3B%0Atext%2Ddecoration%3A%20none%3B%0A%7D%0Aa%3Ahover%20%7B%0Acolor%3A%20%236666ff%3B%20%7D%0Aa%3Avisited%20%7B%0Acolor%3A%20%238000
80%3B%20%7D%0Aa%3Avisited%3Ahover%20%7B%0Acolor%3A%20%23BB00BB%3B%20%7D%0Aa%5Bhref%5E%3D%22http%3A%22%5D%20%7B%0Atext%2Ddecoration%3A%20underline%3B%20%7D%0Aa%5Bhref%5E%3D%22https%3A%22%5D%20%7B%0Atext%2Ddecoration%3A%20underline%3B%20%7D%0A%0Acode%20%3E%20span%2Ekw%20%7B%20color%3A%20%23555%3B%20font%2Dweight%3A%20bold%3B%20%7D%20%0Acode%20%3E%20span%2Edt%20%7B%20color%3A%20%23902000%3B%20%7D%20%0Acode%20%3E%20span%2Edv%20%7B%20color%3A%20%2340a070%3B%20%7D%20%0Acode%20%3E%20span%2Ebn%20%7B%20color%3A%20%23d14%3B%20%7D%20%0Acode%20%3E%20span%2Efl%20%7B%20color%3A%20%23d14%3B%20%7D%20%0Acode%20%3E%20span%2Ech%20%7B%20color%3A%20%23d14%3B%20%7D%20%0Acode%20%3E%20span%2Est%20%7B%20color%3A%20%23d14%3B%20%7D%20%0Acode%20%3E%20span%2Eco%20%7B%20color%3A%20%23888888%3B%20font%2Dstyle%3A%20italic%3B%20%7D%20%0Acode%20%3E%20span%2Eot%20%7B%20color%3A%20%23007020%3B%20%7D%20%0Acode%20%3E%20span%2Eal%20%7B%20color%3A%20%23ff0000%3B%20font%2Dweight%3A%20bold%3B%20%7D%20%0Acode%20%3E%20span%2Efu%20%7B%20color%3A%20%23900%3B%20font%2Dweight%3A%20bold%3B%20%7D%20%20code%20%3E%20span%2Eer%20%7B%20color%3A%20%23a61717%3B%20background%2Dcolor%3A%20%23e3d2d2%3B%20%7D%20%0A" rel="stylesheet" type="text/css" />
</head>
<body>
<h1 class="title toc-ignore">A Tutorial on Hidden Markov Models using Stan</h1>
<h4 class="author"><em>Luis Damiano (Universidad Nacional de Rosario)<a href="#fn1" class="footnoteRef" id="fnref1"><sup>1</sup></a>, Brian Peterson (University of Washington), Michael Weylandt (Rice University)</em></h4>
<h4 class="date"><em>December 17th, 2017</em></h4>
<div id="TOC">
<ul>
<li><a href="#the-hidden-markov-model"><span class="toc-section-number">1</span> The Hidden Markov Model</a><ul>
<li><a href="#model-specification"><span class="toc-section-number">1.1</span> Model specification</a></li>
<li><a href="#the-generative-model"><span class="toc-section-number">1.2</span> The generative model</a></li>
<li><a href="#characteristics"><span class="toc-section-number">1.3</span> Characteristics</a></li>
<li><a href="#inference"><span class="toc-section-number">1.4</span> Inference</a></li>
<li><a href="#parameter-estimation"><span class="toc-section-number">1.5</span> Parameter estimation</a></li>
<li><a href="#worked-example"><span class="toc-section-number">1.6</span> Worked example</a></li>
</ul></li>
<li><a href="#the-input-output-hidden-markov-model"><span class="toc-section-number">2</span> The Input-Output Hidden Markov Model</a><ul>
<li><a href="#definitions"><span class="toc-section-number">2.1</span> Definitions</a></li>
<li><a href="#inference-1"><span class="toc-section-number">2.2</span> Inference</a></li>
<li><a href="#parameter-estimation-1"><span class="toc-section-number">2.3</span> Parameter estimation</a></li>
<li><a href="#a-simulation-example"><span class="toc-section-number">2.4</span> A simulation example</a></li>
</ul></li>
<li><a href="#a-markov-switching-garch-model"><span class="toc-section-number">3</span> A Markov Switching GARCH Model</a></li>
<li><a href="#acknowledgements"><span class="toc-section-number">4</span> Acknowledgements</a></li>
<li><a href="#original-computing-environment"><span class="toc-section-number">5</span> Original Computing Environment</a></li>
<li><a href="#references"><span class="toc-section-number">6</span> References</a></li>
</ul>
</div>
<style type="text/css">
body { max-width: 940px !important; }
.main-container {
max-width: 940px;
margin-left: auto;
margin-right: auto;
}
code {
/*color: inherit;
background-color: rgba(0, 0, 0, 0.04);*/
}
img {
max-width:100%;
height: auto;
border: 0px !important;
}
.tabbed-pane {
padding-top: 12px;
}
button.code-folding-btn:focus {
outline: none;
}
</style>
<p>This case study documents the implementation in <a href="http://mc-stan.org/">Stan</a> <span class="citation">(Carpenter et al. 2017)</span> of the Hidden Markov Model (HMM) for unsupervised learning <span class="citation">(Baum and Petrie 1966; Baum and Eagon 1967; Baum and Sell 1968; Baum et al. 1970; Baum 1972)</span>. Additionally, we present the adaptations needed for the Input-Output Hidden Markov Model (IOHMM), an architecture proposed by <span class="citation">Bengio and Frasconi (1994)</span> to map input sequences, sometimes called the control signal, to output sequences. Compared to the HMM, it aims to be especially effective at learning long-term dependencies, that is, when the relationship between inputs and outputs spans many time steps. Finally, we illustrate the use of HMMs as a component within more complex constructions with a volatility model taken from the econometrics literature. In all cases, we provide a fully Bayesian estimation of the model parameters and inference on hidden quantities, namely the filtered and smoothed posterior distributions of the hidden states, as well as the jointly most probable state path.</p>
<p><em>A Tutorial on Hidden Markov Models using Stan</em> is distributed under the <a href="cc-by-v4.0.md">Creative Commons Attribution 4.0 International Public License</a>. Accompanying code is distributed under the <a href="gnu-gpl-v3.0.md">GNU General Public License v3.0</a>. See the <a href="README.md">README</a> file for details. All files are available in the <a href="https://github.com/luisdamiano/stancon18">stancon18</a> GitHub repository.</p>
<hr />
<div id="the-hidden-markov-model" class="section level1">
<h1><span class="header-section-number">1</span> The Hidden Markov Model</h1>
<p>Real-world processes produce observable outputs characterized as signals. These can be discrete or continuous in nature, can be pure or embed uncertainty about the measurements and the explanatory model, come from a stationary or non-stationary source, among many other variations. These signals are modeled to allow for both theoretical descriptions and practical applications. The model itself can be deterministic or stochastic, in which case the signal is characterized as a parametric random process whose parameters can be estimated in a well-defined manner.</p>
<p>Many real-world signals exhibit significant autocorrelation and an extensive literature exists on different means to characterize and model different forms of autocorrelation. One of the simplest and most intuitive is the higher-order Markov process, which extends the “memory” of a standard Markov process beyond the single previous observation. The higher-order Markov process, unfortunately, is not as analytically tractable as its standard version and poses difficulties for statistical inference. A more parsimonious approach assumes that the observed sequence is a noisy observation of an underlying hidden process represented as a first-order Markov chain. In other terms, long-range dependencies between observations are mediated via latent variables. It is important to note that the Markov property is only assumed for the hidden states, and the observations are assumed conditionally independent given these latent states. While the observations may not exhibit any Markov behavior, the simple Markovian structure of the hidden states is sufficient to allow easy inference.</p>
<div id="model-specification" class="section level2">
<h2><span class="header-section-number">1.1</span> Model specification</h2>
<p>The HMM involves two interconnected models. The state model consists of a discrete-time, discrete-state<a href="#fn2" class="footnoteRef" id="fnref2"><sup>2</sup></a> first-order Markov chain <span class="math inline">\(z_t \in \{1, \dots, K\}\)</span> that transitions according to <span class="math inline">\(p(z_t | z_{t-1})\)</span>. In turn, the observation model is governed by <span class="math inline">\(p(\mat{y}_t | z_t)\)</span>, where <span class="math inline">\(\mat{y}_t\)</span> are the observations, emissions or outputs.<a href="#fn3" class="footnoteRef" id="fnref3"><sup>3</sup></a> The corresponding joint distribution is</p>
<p><span class="math display">\[
p(\mat{z}_{1:T}, \mat{y}_{1:T})
= p(\mat{z}_{1:T}) p(\mat{y}_{1:T} | \mat{z}_{1:T})
= \left[ p(z_1) \prod_{t=2}^{T}{p(z_t | z_{t-1})} \right] \left[ \prod_{t=1}^{T}{p(\mat{y}_t | z_{t})} \right].
\]</span></p>
<p>This is a specific instance of the state space model family in which the latent variables are discrete. Each single time slice corresponds to a mixture distribution with component densities given by <span class="math inline">\(p(\mat{y}_t | z_t)\)</span>, thus the HMM may be interpreted as an extension of a mixture model in which the choice of component for each observation is not selected independently but depends on the choice of component for the previous observation. In the special case of a simple mixture model for an independent and identically distributed sample, all the rows of the transition matrix are identical, so that the conditional distribution <span class="math inline">\(p(z_t | z_{t-1})\)</span> is independent of <span class="math inline">\(z_{t-1}\)</span>.</p>
<p>When the output is discrete, the observation model commonly takes the form of an observation matrix</p>
<p><span class="math display">\[
p(\mat{y}_t | z_t = k, \mat{\theta}) = \text{Categorical}(\mat{y}_t | \mat{\theta}_k)
\]</span></p>
<p>Alternatively, if the output is continuous, the observation model is frequently a conditional Gaussian <span class="math display">\[
p(\mat{y}_t | z_t = k, \mat{\theta}) = \mathcal{N}(\mat{y}_t | \mat{\mu}_k, \mat{\Sigma}_k).
\]</span></p>
<p>The latter is equivalent to a Gaussian mixture model with cluster membership ruled by Markovian dynamics, also known as Markov Switching Models (MSM). In this context, multiple sequential observations tend to share the same location until they suddenly jump into a new cluster.</p>
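<p>For concreteness, the joint density above can be evaluated directly for a given state path and observation sequence. The following is a minimal R sketch, assuming Gaussian emissions; the function name and inputs (<code>z</code>, <code>y</code>, <code>pi1</code>, <code>A</code>, <code>mu</code>, <code>sigma</code>) are ours, chosen for illustration:</p>
<pre class="r"><code># Joint log density log p(z_{1:T}, y_{1:T}) under the factorization above
hmm_joint_lpdf &lt;- function(z, y, pi1, A, mu, sigma) {
  T &lt;- length(y)
  lp &lt;- log(pi1[z[1]])                 # initial state term p(z_1)
  for (t in 2:T)
    lp &lt;- lp + log(A[z[t - 1], z[t]])  # transition terms p(z_t | z_{t-1})
  for (t in 1:T)                       # emission terms p(y_t | z_t)
    lp &lt;- lp + dnorm(y[t], mu[z[t]], sigma[z[t]], log = TRUE)
  lp
}</code></pre>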
<p>The non-stochastic quantities of the model are the length of the observed sequence <span class="math inline">\(T\)</span> and the number of hidden states <span class="math inline">\(K\)</span>. The observed sequence <span class="math inline">\(\mat{y}_t\)</span> is stochastic but known. The parameters of the models are <span class="math inline">\(\mat{\theta} = (\mat{\pi}_1, \mat{\theta}_h, \mat{\theta}_o)\)</span>, where <span class="math inline">\(\mat{\pi}_1\)</span> is the initial state distribution, <span class="math inline">\(\mat{\theta}_h\)</span> are the parameters of the hidden model and <span class="math inline">\(\mat{\theta}_o\)</span> are the parameters of the state-conditional density function <span class="math inline">\(p(\mat{y}_t | z_t)\)</span>. The form of <span class="math inline">\(\mat{\theta}_h\)</span> and <span class="math inline">\(\mat{\theta}_o\)</span> depends on the specification of each model. In the case under study, state transition is characterized by the <span class="math inline">\(K \times K\)</span> transition matrix with simplex rows <span class="math inline">\(\mat{A} = \{a_{ij}\}\)</span> with <span class="math inline">\(a_{ij} = p(z_t = j | z_{t-1} = i)\)</span>.</p>
<p>The following Stan code illustrates the case of continuous observations where emissions are modeled as sampled from the Gaussian distribution with parameters <span class="math inline">\(\mu_k\)</span> and <span class="math inline">\(\sigma_k\)</span> for <span class="math inline">\(k \in \{1, \dots, K\}\)</span>. Adaptation for categorical observations should follow the guidelines outlined in the manual <span class="citation">(Stan Development Team 2017c, sec. 10.6)</span>.</p>
<pre class="stan"><code>data {
int<lower=1> T; // number of observations (length)
int<lower=1> K; // number of hidden states
real y[T]; // observations
}
parameters {
// Discrete state model
simplex[K] pi1; // initial state probabilities
simplex[K] A[K]; // transition probabilities
// A[i][j] = p(z_t = j | z_{t-1} = i)
// Continuous observation model
ordered[K] mu; // observation means
real<lower=0> sigma[K]; // observation standard deviations
}</code></pre>
</div>
<div id="the-generative-model" class="section level2">
<h2><span class="header-section-number">1.2</span> The generative model</h2>
<p>We write a routine in the R programming language for our generative model. Broadly speaking, this involves three steps:</p>
<ol style="list-style-type: decimal">
<li>The generation of parameters according to the priors <span class="math inline">\(\mat{\theta}^{(0)} \sim p(\mat{\theta})\)</span>.</li>
<li>The generation of the hidden path <span class="math inline">\(\mat{z}_{1:T}^{(0)}\)</span> according to the transition model parameters.</li>
<li>The generation of the observed quantities based on the sampling distribution <span class="math inline">\(\mat{y}_t^{(0)} \sim p(\mat{y}_t | \mat{z}_{1:T}^{(0)}, \mat{\theta}^{(0)})\)</span>.</li>
</ol>
<p>We break down the description of our code into these three steps.</p>
<div class="sourceCode"><pre class="sourceCode r"><code class="sourceCode r">runif_simplex <-<span class="st"> </span>function(T) {
x <-<span class="st"> </span>-<span class="kw">log</span>(<span class="kw">runif</span>(T))
x /<span class="st"> </span><span class="kw">sum</span>(x)
}
hmm_generate <-<span class="st"> </span>function(K, T) {
<span class="co"># 1. Parameters</span>
pi1 <-<span class="st"> </span><span class="kw">runif_simplex</span>(K)
A <-<span class="st"> </span><span class="kw">t</span>(<span class="kw">replicate</span>(K, <span class="kw">runif_simplex</span>(K)))
mu <-<span class="st"> </span><span class="kw">sort</span>(<span class="kw">rnorm</span>(K, <span class="dv">10</span> *<span class="st"> </span><span class="dv">1</span>:K, <span class="dv">1</span>))
sigma <-<span class="st"> </span><span class="kw">abs</span>(<span class="kw">rnorm</span>(K))
<span class="co"># 2. Hidden path</span>
z <-<span class="st"> </span><span class="kw">vector</span>(<span class="st">"numeric"</span>, T)
z[<span class="dv">1</span>] <-<span class="st"> </span><span class="kw">sample</span>(<span class="dv">1</span>:K, <span class="dt">size =</span> <span class="dv">1</span>, <span class="dt">prob =</span> pi1)
for (t in <span class="dv">2</span>:T)
z[t] <-<span class="st"> </span><span class="kw">sample</span>(<span class="dv">1</span>:K, <span class="dt">size =</span> <span class="dv">1</span>, <span class="dt">prob =</span> A[z[t -<span class="st"> </span><span class="dv">1</span>], ])
<span class="co"># 3. Observations</span>
y <-<span class="st"> </span><span class="kw">vector</span>(<span class="st">"numeric"</span>, T)
for (t in <span class="dv">1</span>:T)
y[t] <-<span class="st"> </span><span class="kw">rnorm</span>(<span class="dv">1</span>, mu[z[t]], sigma[z[t]])
<span class="kw">list</span>(<span class="dt">y =</span> y, <span class="dt">z =</span> z,
<span class="dt">theta =</span> <span class="kw">list</span>(<span class="dt">pi1 =</span> pi1, <span class="dt">A =</span> A,
<span class="dt">mu =</span> mu, <span class="dt">sigma =</span> sigma))
}</code></pre></div>
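<p>As a usage sketch, the routine can be exercised as follows; the seed and sizes are arbitrary choices for illustration:</p>
<pre class="r"><code>set.seed(900)
dataset &lt;- hmm_generate(K = 3, T = 500)
str(dataset$theta)  # drawn parameters: pi1, A, mu, sigma
table(dataset$z)    # number of visits to each hidden state</code></pre>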
<div id="generating-parameters-from-the-priors" class="section level3">
<h3><span class="header-section-number">1.2.1</span> Generating parameters from the priors</h3>
<p>The parameters to be generated include the <span class="math inline">\(K\)</span>-sized initial state distribution vector <span class="math inline">\(\mat{\pi}_1\)</span> and the <span class="math inline">\(K \times K\)</span> transition matrix <span class="math inline">\(\mat{A}\)</span>. There are <span class="math inline">\((K-1)(K+1)\)</span> free parameters as the vector and each row of the matrix are simplexes.</p>
<p>We set up uniform priors for <span class="math inline">\(\mat{\pi}_1\)</span> and <span class="math inline">\(\mat{A}\)</span>, a weakly informative Gaussian for the location parameter <span class="math inline">\(\mu_k\)</span> and a weakly informative half-Gaussian that ensures positivity for the scale parameters <span class="math inline">\(\sigma_k\)</span>. An ordinal constraint is imposed on the location parameter to restrict the exploration of the symmetric, degenerate mixture posterior surface to a single ordering of the parameters, thus solving the non-identifiability issues inherent to the model density <span class="citation">(Betancourt 2017)</span>. In the simulation routine, the location parameters are adjusted to ensure that the observations are well-separated. We refer the reader to the Stan Development Team’s <a href="https://github.com/stan-dev/stan/wiki/Prior-Choice-Recommendations">Prior Choice Recommendations</a> Wiki article for advice on selecting priors which are simultaneously computationally efficient and statistically reasonable. Given the fixed quantity <span class="math inline">\(K\)</span>, we draw one sample from the prior distributions <span class="math inline">\(\mat{\theta}^{(0)} \sim p(\mat{\theta})\)</span>.</p>
</div>
<div id="generating-the-hidden-path" class="section level3">
<h3><span class="header-section-number">1.2.2</span> Generating the hidden path</h3>
<p>The initial hidden state is drawn from a multinomial distribution with one trial and event probabilities given by the initial state probability vector <span class="math inline">\(\mat{\pi}_1^{(0)}\)</span>. Given the fixed quantity <span class="math inline">\(T\)</span>, each subsequent hidden state <span class="math inline">\(z_t^{(0)}\)</span> for <span class="math inline">\(t \in \{2, \dots, T\}\)</span> is then drawn from a multinomial distribution with one trial and event probabilities given by the <span class="math inline">\(i\)</span>-th row of the transition matrix <span class="math inline">\(\mat{A}^{(0)}\)</span>, where <span class="math inline">\(i\)</span> is the state at the previous time step, <span class="math inline">\(z_{t-1}^{(0)} = i\)</span>.</p>
</div>
<div id="generating-data-from-the-sampling-distribution" class="section level3">
<h3><span class="header-section-number">1.2.3</span> Generating data from the sampling distribution</h3>
<p>The observations conditioned on the hidden states are drawn from a univariate Gaussian density with parameters <span class="math inline">\(\mu_k^{(0)}\)</span> and <span class="math inline">\(\sigma_k^{(0)}\)</span>.</p>
</div>
</div>
<div id="characteristics" class="section level2">
<h2><span class="header-section-number">1.3</span> Characteristics</h2>
<p>One of the most powerful properties of the HMM is its ability to exhibit some degree of invariance to local warping of the time axis. By allowing for compression or stretching of the time axis, the model accommodates variations in speed. By specification of the latent model, the density function of the duration <span class="math inline">\(\tau\)</span> in state <span class="math inline">\(i\)</span> is given by</p>
<p><span class="math display">\[
p_i(\tau) = (A_{ii})^{\tau - 1} (1 - A_{ii}) \propto \exp (\tau \ln A_{ii}),
\]</span></p>
<p>which represents the probability that a sequence spends precisely <span class="math inline">\(\tau\)</span> steps in state <span class="math inline">\(i\)</span>. The expected duration conditional on starting in that state is</p>
<p><span class="math display">\[
\bar{\tau}_i = \sum_{\tau = 1}^{\infty}{\tau p_i(\tau)} = \frac{1}{1 - A_{ii}}.
\]</span></p>
<p>The density is an exponentially decaying function of <span class="math inline">\(\tau\)</span>, thus longer durations are less probable than shorter ones. In applications where this proves unrealistic, the diagonal coefficients of the transition matrix <span class="math inline">\(A_{ii} \ \forall \ i\)</span> may be set to zero and each state <span class="math inline">\(i\)</span> is explicitly associated with a probability distribution of possible duration times <span class="math inline">\(p(\tau | i)\)</span> <span class="citation">(Rabiner 1990)</span>.</p>
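<p>The geometric form of the state duration can be checked by simulation. The sketch below, with an arbitrary self-transition probability, compares the empirical mean run length against the theoretical value <span class="math inline">\(1 / (1 - A_{ii})\)</span>; note that R's <code>rgeom</code> counts failures before the first success, hence the added 1:</p>
<pre class="r"><code>set.seed(9000)
A11 &lt;- 0.9                           # self-transition probability of state i
# A run of length tau means tau - 1 self-transitions before leaving the state
tau &lt;- rgeom(10000, prob = 1 - A11) + 1
mean(tau)                            # should be close to 1 / (1 - A11) = 10</code></pre>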
</div>
<div id="inference" class="section level2">
<h2><span class="header-section-number">1.4</span> Inference</h2>
<p>There are several quantities of interest that can be inferred via different algorithms. Our code contains the implementation of the most relevant methods for unsupervised data: forward, forward-backward and Viterbi decoding algorithms. We acknowledge the authors of the Stan Manual for the thorough illustrations and code snippets, some of which served as a starting point for our own code. As estimation is treated later, we assume that model parameters <span class="math inline">\(\mat{\theta}\)</span> are known.</p>
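<p>As a point of reference for the table below, the filtered probabilities <span class="math inline">\(p(z_t | \mat{y}_{1:t})\)</span> can be computed with a normalized forward recursion. The following minimal R sketch assumes Gaussian emissions and known parameters; the function name and argument conventions are ours, not those of the accompanying code:</p>
<pre class="r"><code># Normalized forward filter: returns a T x K matrix with rows p(z_t | y_{1:t})
hmm_filter &lt;- function(y, pi1, A, mu, sigma) {
  T &lt;- length(y); K &lt;- length(pi1)
  alpha &lt;- matrix(0, T, K)
  # t = 1: prior times emission density, then normalize
  alpha[1, ] &lt;- pi1 * dnorm(y[1], mu, sigma)
  alpha[1, ] &lt;- alpha[1, ] / sum(alpha[1, ])
  for (t in 2:T) {
    # Predict (one-step transition), update with the emission density, normalize
    alpha[t, ] &lt;- (alpha[t - 1, ] %*% A) * dnorm(y[t], mu, sigma)
    alpha[t, ] &lt;- alpha[t, ] / sum(alpha[t, ])
  }
  alpha
}</code></pre>
<p>Normalizing at each step avoids the numerical underflow that affects the unnormalized recursion for long sequences; the Stan implementation works with log-sum-exp for the same reason.</p>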
<table style="width:100%;">
<caption>
Summary of the hidden quantities and their corresponding inference algorithm.
</caption>
<thead>
<tr>
<th style="text-align:left;">
Name
</th>
<th style="text-align:left;">
Hidden Quantity
</th>
<th style="text-align:left;">
Availability at
</th>
<th style="text-align:left;">
Algorithm
</th>
<th style="text-align:left;">
Complexity
</th>
</tr>
</thead>
<tbody>
<tr>
<td style="text-align:left;">
Filtering
</td>
<td style="text-align:left;">
<span class="math inline">\(p(z_t | \mat{y}_{1:t})\)</span>
</td>
<td style="text-align:left;">
<span class="math inline">\(t\)</span> (online)
</td>
<td style="text-align:left;">
Forward
</td>
<td style="text-align:left;">
<span class="math inline">\(O(K^2T)\)</span>; <span class="math inline">\(O(KT)\)</span> if left-to-right
</td>
</tr>
<tr>
<td style="text-align:left;">
Smoothing
</td>
<td style="text-align:left;">
<span class="math inline">\(p(z_t | \mat{y}_{1:T})\)</span>
</td>
<td style="text-align:left;">
<span class="math inline">\(T\)</span> (offline)
</td>
<td style="text-align:left;">
Forward-backward
</td>
<td style="text-align:left;">
<span class="math inline">\(O(K^2T)\)</span>; <span class="math inline">\(O(KT)\)</span> if left-to-right
</td>
</tr>
<tr>
<td style="text-align:left;">
Fixed lag smoothing
</td>
<td style="text-align:left;">
<span class="math inline">\(p(z_{t - \ell} | \mat{y}_{1:t})\)</span>, <span class="math inline">\(\ell \ge 1\)</span>
</td>
<td style="text-align:left;">
<span class="math inline">\(t + \ell\)</span> (lagged)
</td>
<td style="text-align:left;">
Forward-backward
</td>
<td style="text-align:left;">
<span class="math inline">\(O(K^2T)\)</span>; <span class="math inline">\(O(KT)\)</span> if left-to-right
</td>
</tr>
<tr>
<td style="text-align:left;">
State prediction
</td>
<td style="text-align:left;">
<span class="math inline">\(p(z_{t+h} | \mat{y}_{1:t})\)</span>, <span class="math inline">\(h\ge 1\)</span>
</td>
<td style="text-align:left;">
<span class="math inline">\(t\)</span>
</td>
<td style="text-align:left;">
</td>
<td style="text-align:left;">
</td>
</tr>
<tr>
<td style="text-align:left;">
Observation prediction
</td>
<td style="text-align:left;">
<span class="math inline">\(p(y_{t+h} | \mat{y}_{1:t})\)</span>, <span class="math inline">\(h\ge 1\)</span>
</td>
<td style="text-align:left;">
<span class="math inline">\(t\)</span>
</td>
<td style="text-align:left;">
</td>
<td style="text-align:left;">
</td>
</tr>
<tr>
<td style="text-align:left;">
MAP Estimation
</td>
<td style="text-align:left;">
<span class="math inline">\(\argmax_{\mat{z}_{1:T}} p(\mat{z}_{1:T} | \mat{y}_{1:T})\)</span>
</td>
<td style="text-align:left;">
<span class="math inline">\(T\)</span>
</td>
<td style="text-align:left;">
Viterbi decoding
</td>
<td style="text-align:left;">
<span class="math inline">\(O(K^2T)\)</span>
</td>
</tr>
<tr>
<td style="text-align:left;">
Log likelihood
</td>
<td style="text-align:left;">
<span class="math inline">\(p(\mat{y}_{1:T})\)</span>
</td>
<td style="text-align:left;">
<span class="math inline">\(T\)</span>
</td>
<td style="text-align:left;">
Forward
</td>
<td style="text-align:left;">
<span class="math inline">\(O(K^2T)\)</span>; <span class="math inline">\(O(KT)\)</span> if left-to-right
</td>
</tr>
</tbody>
</table>
<div id="filtering" class="section level3">
<h3><span class="header-section-number">1.4.1</span> Filtering</h3>
<p>A filter infers the posterior distribution of the hidden state at a given step <span class="math inline">\(t\)</span> based on all the information available up to that point, <span class="math inline">\(p(z_t | \mat{y}_{1:t})\)</span>. It achieves better noise reduction than an estimate based on the current observation alone, <span class="math inline">\(p(z_t | \mat{y}_{t})\)</span>. The filtering process can be run online, recursively updating the posterior as new data stream in.</p>
<p>Filtered marginals can be computed recursively using the forward algorithm <span class="citation">(Baum and Eagon 1967)</span>. Let <span class="math inline">\(\psi_t(j) = p(\mat{y}_t | z_t = j)\)</span> be the local evidence at step <span class="math inline">\(t\)</span> and <span class="math inline">\(\Psi(i, j) = p(z_t = j | z_{t-1} = i)\)</span> be the transition probability. First, the one-step-ahead predictive density is computed</p>
<p><span class="math display">\[
p(z_t = j | \mat{y}_{1:t-1}) = \sum_{i}{\Psi(i, j) p(z_{t-1} = i | \mat{y}_{1:t-1})}.
\]</span></p>
<p>Acting as prior information, this quantity is updated with the data observed at step <span class="math inline">\(t\)</span> using Bayes&#8217; rule,</p>
<span class="math display">\[\begin{align*}
\label{eq:filtered-beliefstate}
\alpha_t(j)
& \triangleq p(z_t = j | \mat{y}_{1:t}) \\
&= p(z_t = j | \mat{y}_{t}, \mat{y}_{1:t-1}) \\
&= Z_t^{-1} \psi_t(j) p(z_t = j | \mat{y}_{1:t-1}) \\
\end{align*}\]</span>
<p>where the normalization constant is given by</p>
<p><span class="math display">\[
Z_t
\triangleq p(\mat{y}_t | \mat{y}_{1:t-1})
= \sum_{l=1}^{K}{p(\mat{y}_{t} | z_t = l) p(z_t = l | \mat{y}_{1:t-1})}
= \sum_{l=1}^{K}{\psi_t(l) p(z_t = l | \mat{y}_{1:t-1})}.
\]</span></p>
<p>This predict-update cycle results in the filtered belief states at step <span class="math inline">\(t\)</span>. As this algorithm only requires the evaluation of the quantities <span class="math inline">\(\psi_t(j)\)</span> for each value of <span class="math inline">\(z_t\)</span> for every <span class="math inline">\(t\)</span> and fixed <span class="math inline">\(\mat{y}_t\)</span>, the posterior distribution of the latent states is independent of the form of the observation density or indeed of whether the observed variables are continuous or discrete <span class="citation">(Jordan 2003)</span>. In other words, <span class="math inline">\(\alpha_t(j)\)</span> does not depend on the complete form of the density function <span class="math inline">\(p(\mat{y} | \mat{z})\)</span> but only on the point values <span class="math inline">\(p(\mat{y}_t | z_t = j)\)</span> for every <span class="math inline">\(\mat{y}_t\)</span> and <span class="math inline">\(z_t\)</span>.</p>
<p>Let <span class="math inline">\(\mat{\alpha}_t\)</span> be a <span class="math inline">\(K\)</span>-sized vector with the filtered belief states at step <span class="math inline">\(t\)</span>, <span class="math inline">\(\mat{\psi}_t(j)\)</span> be the <span class="math inline">\(K\)</span>-sized vector of local evidence at step <span class="math inline">\(t\)</span>, <span class="math inline">\(\mat{\Psi}\)</span> be the transition matrix and <span class="math inline">\(\mat{u} \odot \mat{v}\)</span> be the Hadamard product, representing element-wise vector multiplication. Then, the Bayesian updating procedure can be expressed in matrix notation as</p>
<p><span class="math display">\[
\mat{\alpha}_t \propto \mat{\psi}_t \odot (\mat{\Psi}^T \mat{\alpha}_{t-1}).
\]</span></p>
<p>In addition to computing the hidden states, the algorithm yields the log likelihood</p>
<p><span class="math display">\[
\LL = \log p(\mat{y}_{1:T} | \mat{\theta}) = \sum_{t=1}^{T}{\log p(\mat{y}_{t} | \mat{y}_{1:t-1})} = \sum_{t=1}^{T}{\log Z_t}.
\]</span></p>
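<p>Before turning to Stan, the predict-update cycle can be sketched in a few lines of pure Python. The transition matrix, initial distribution and local evidence below are hypothetical values for a two-state chain; only the recursion itself mirrors the equations above:</p>

```python
import math

# Hypothetical two-state example: A = transition matrix, pi1 = initial
# distribution, psi[t][j] = local evidence p(y_t | z_t = j) for T = 3 steps.
A   = [[0.9, 0.1], [0.2, 0.8]]
pi1 = [0.5, 0.5]
psi = [[0.8, 0.1], [0.7, 0.2], [0.1, 0.6]]

def forward(pi1, A, psi):
    """Returns filtered beliefs alpha_t and the log likelihood sum_t log Z_t."""
    K = len(pi1)
    alpha, loglik, prev = [], 0.0, pi1
    for t, ev in enumerate(psi):
        # Predict: one-step-ahead density (no transition before the first step).
        pred = prev if t == 0 else [
            sum(A[i][j] * prev[i] for i in range(K)) for j in range(K)]
        # Update: multiply by the local evidence and normalize by Z_t.
        unnorm = [ev[j] * pred[j] for j in range(K)]
        Z = sum(unnorm)
        loglik += math.log(Z)
        prev = [u / Z for u in unnorm]
        alpha.append(prev)
    return alpha, loglik

alpha, loglik = forward(pi1, A, psi)
```

<p>Each <code>alpha[t]</code> sums to one, and accumulating the normalization constants yields the log likelihood as a by-product.</p>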
<pre class="stan"><code>transformed parameters {
vector[K] logalpha[T];
{ // Forward algorithm log p(z_t = j | y_{1:t})
real accumulator[K];
for (j in 1:K)
logalpha[1, j] = log(pi1[j]) + normal_lpdf(y[1] | mu[j], sigma[j]);
for (t in 2:T) {
for (j in 1:K) { // j = current (t)
for (i in 1:K) { // i = previous (t-1)
// Murphy (2012) p. 609 eq. 17.48
// belief state + transition prob + local evidence at t
accumulator[i] = logalpha[t-1, i] + log(A[i, j]) + normal_lpdf(y[t] | mu[j], sigma[j]);
}
logalpha[t, j] = log_sum_exp(accumulator);
}
}
} // Forward
}</code></pre>
<p>The Stan code makes evident that the time complexity of the algorithm is <span class="math inline">\(O(K^2T)\)</span>: there are <span class="math inline">\(K \times K\)</span> iterations within each of the <span class="math inline">\(T\)</span> iterations of the outer loop. Brute-forcing through all <span class="math inline">\(K^T\)</span> possible hidden state sequences would be prohibitive for realistic problems, as the time complexity <span class="math inline">\(O(K^TT)\)</span> increases exponentially with the sequence length.</p>
<p>The implementation is representative of the matrix notation in <span class="citation">Murphy (2012 eq. 17.48)</span>. The <code>accumulator</code> variable carries the element-wise operations for all possible previous states which are later combined as indicated by the matrix multiplication.</p>
<p>Since computations should be done in the log domain to avoid numerical underflow, multiplications are translated into sums of logs. Furthermore, we use Stan&#8217;s <code>log_sum_exp</code> function, which computes linear sums on the log scale while preventing underflow and overflow in the exponentiation <span class="citation">(Stan Development Team 2017c, 192)</span>. In consequence, <code>logalpha</code> represents the forward quantity on the log scale and needs to be exponentiated and normalized for interpretability.</p>
<pre class="stan"><code>generated quantities {
vector[K] alpha[T];
{ // Forward algorithm
for (t in 1:T)
alpha[t] = softmax(logalpha[t]);
} // Forward
}</code></pre>
<p>Since the unnormalized forward probability is sufficient to compute the posterior log density and estimate the parameters, it should be part of either the <code>model</code> or the <code>transformed parameters</code> blocks. We chose the latter to keep track of the estimates. We expand on estimation afterward.</p>
</div>
<div id="smoothing" class="section level3">
<h3><span class="header-section-number">1.4.2</span> Smoothing</h3>
<p>A smoother infers the posterior distribution of the hidden state at a given step based on all the observations or evidence, <span class="math inline">\(p(z_t | \mat{y}_{1:T})\)</span>. Although noise and uncertainty are significantly reduced as a result of conditioning on past and future data, the smoothing process can only be run offline.</p>
<p>Inference can be done by means of the forward-backward algorithm, which also plays an important role in the Baum-Welch algorithm for learning model parameters <span class="citation">(Baum and Eagon 1967; Baum et al. 1970)</span>. Let <span class="math inline">\(\gamma_t(j)\)</span> be the desired smoothed posterior marginal,</p>
<p><span class="math display">\[
\gamma_t(j)
\triangleq p(z_t = j | \mat{y}_{1:T}),
\]</span></p>
<p><span class="math inline">\(\alpha_t(j)\)</span> be the filtered belief state at the step <span class="math inline">\(t\)</span> as defined previously, and <span class="math inline">\(\beta_t(j)\)</span> be the conditional likelihood of future evidence given that the hidden state at step <span class="math inline">\(t\)</span> is <span class="math inline">\(j\)</span>,</p>
<p><span class="math display">\[
\beta_t(j)
\triangleq p(\mat{y}_{t+1:T} | z_t = j).
\]</span></p>
<p>Then, the chain of smoothed marginals can be segregated into the past and the future components by conditioning on the belief state <span class="math inline">\(z_t\)</span>,</p>
<p><span class="math display">\[
\gamma_t(j)
= \frac{\alpha_t(j) \beta_t(j)}{p(\mat{y}_{1:T})}
\propto \alpha_t(j) \beta_t(j).
\]</span></p>
<p>The future component can be computed recursively from right to left:</p>
<span class="math display">\[\begin{align*}
\beta_{t-1}(i)
&= p(\mat{y}_{t:T} | z_{t-1} = i) \\
&= \sum_{j=1}^{K}{p(z_t =j, \mat{y}_{t}, \mat{y}_{t+1:T} | z_{t-1} = i)} \\
&= \sum_{j=1}^{K}{p(\mat{y}_{t+1:T} | z_t = j)p(z_t = j, \mat{y}_{t} | z_{t-1} = i)} \\
&= \sum_{j=1}^{K}{p(\mat{y}_{t+1:T} | z_t = j)p(\mat{y}_t | z_t = j)p(z_t = j | z_{t-1} = i)} \\
&= \sum_{j=1}^{K}{\beta_t(j) \psi_t(j) \Psi(i, j)}
\end{align*}\]</span>
<p>Let <span class="math inline">\(\mat{\beta}_t\)</span> be a <span class="math inline">\(K\)</span>-sized vector with the conditional likelihood of future evidence given the hidden state at step <span class="math inline">\(t\)</span>. Then, the backward procedure can be expressed in matrix notation as</p>
<p><span class="math display">\[
\mat{\beta}_{t-1} \propto \mat{\Psi} (\mat{\psi}_t \odot \mat{\beta}_{t}).
\]</span></p>
<p>At the last step, the base case is given by <span class="math display">\[
\beta_{T}(i)
= p(\mat{y}_{T+1:T} | z_{T} = i) = p(\varnothing | z_{T} = i) = 1.
\]</span></p>
<p>Intuitively, the forward-backward algorithm passes information from left to right and then from right to left, combining the two at each node. A straightforward implementation of the algorithm runs in <span class="math inline">\(O(K^2 T)\)</span> time because of the <span class="math inline">\(K \times K\)</span> matrix multiplication at each step. Despite the frequent description as two subsequent passes, these procedures are not inherently sequential and share no information; as a result, they could be implemented in parallel.</p>
<p>There is a significant reduction if the transition matrix is sparse. Inference for a left-to-right (upper triangular) transition matrix, a model where the state index increases or stays the same as time passes, runs in <span class="math inline">\(O(TK)\)</span> time <span class="citation">(Bakis 1976; Jelinek 1976)</span>. Additional assumptions about the form of the transition matrix may ease complexity further, for example decreasing the time to <span class="math inline">\(O(TK\log K)\)</span> if <span class="math inline">\(\Psi(i, j) \propto \exp(-\sigma^2 |\mat{z}_i - \mat{z}_j|)\)</span>. Finally, ad-hoc data pre-processing strategies may help control complexity, for example by pruning nodes with low conditional probability of occurrence.</p>
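<p>The two recursions can be combined in a short pure-Python sketch before the Stan implementation. The two-state transition matrix and local evidence below are hypothetical values; the backward pass and the combination <code>gamma[t] &#8733; alpha[t] * beta[t]</code> follow the equations above:</p>

```python
# Hypothetical two-state example: Psi = transition matrix, pi1 = initial
# distribution, psi[t][j] = local evidence p(y_t | z_t = j).
Psi = [[0.9, 0.1], [0.2, 0.8]]
pi1 = [0.5, 0.5]
psi = [[0.8, 0.1], [0.7, 0.2], [0.1, 0.6]]
K, T = 2, 3

def forward(pi1, Psi, psi):
    """Filtered beliefs alpha_t(j), normalized at every step."""
    alpha, prev = [], pi1
    for t, ev in enumerate(psi):
        pred = prev if t == 0 else [
            sum(Psi[i][j] * prev[i] for i in range(K)) for j in range(K)]
        unnorm = [ev[j] * pred[j] for j in range(K)]
        Z = sum(unnorm)
        prev = [u / Z for u in unnorm]
        alpha.append(prev)
    return alpha

def backward(Psi, psi):
    """Right-to-left recursion for beta_t(i) = p(y_{t+1:T} | z_t = i)."""
    beta = [[1.0] * K for _ in range(T)]  # base case: beta_T(i) = 1
    for t in range(T - 2, -1, -1):
        for i in range(K):
            beta[t][i] = sum(Psi[i][j] * psi[t + 1][j] * beta[t + 1][j]
                             for j in range(K))
    return beta

# Smoothed marginals: gamma_t(j) proportional to alpha_t(j) * beta_t(j).
alpha, beta = forward(pi1, Psi, psi), backward(Psi, psi)
gamma = []
for a, b in zip(alpha, beta):
    unnorm = [x * y for x, y in zip(a, b)]
    Z = sum(unnorm)
    gamma.append([u / Z for u in unnorm])
```

<p>At the last step the smoothed and filtered marginals coincide, because the backward base case is uninformative.</p>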
<pre class="stan"><code>functions {
vector normalize(vector x) {
return x / sum(x);
}
}
generated quantities {
vector[K] alpha[T];
vector[K] logbeta[T];
vector[K] loggamma[T];
vector[K] beta[T];
vector[K] gamma[T];
{ // Forward algorithm
for (t in 1:T)
alpha[t] = softmax(logalpha[t]);
} // Forward
{ // Backward algorithm log p(y_{t+1:T} | z_t = j)
real accumulator[K];
for (j in 1:K)
logbeta[T, j] = 0; // base case: beta_T(j) = 1, so log beta_T(j) = 0
for (tforward in 0:(T-2)) {
int t;
t = T - tforward;
for (j in 1:K) { // j = previous (t-1)
for (i in 1:K) { // i = next (t)
// Murphy (2012) Eq. 17.58
// backwards t + transition prob + local evidence at t
accumulator[i] = logbeta[t, i] + log(A[j, i]) + normal_lpdf(y[t] | mu[i], sigma[i]);
}
logbeta[t-1, j] = log_sum_exp(accumulator);
}
}
for (t in 1:T)
beta[t] = softmax(logbeta[t]);
} // Backward
{ // Forward-backward algorithm log p(z_t = j | y_{1:T})
for (t in 1:T)
loggamma[t] = logalpha[t] + logbeta[t];
for (t in 1:T)
gamma[t] = softmax(loggamma[t]);
} // Forward-backward
}</code></pre>
<p>The reader should not be deceived by the similarity to the code shown in the filtering section. Note that the indices of the transition matrix are transposed and that the local evidence now corresponds to the next state. We iterate the time index in reverse because backward ranges are not available in Stan.</p>
<p>The forward-backward algorithm was designed to exploit via recursion the conditional independencies in the HMM. First, the posterior marginal probability of the latent states at a given time step is broken down into two quantities: the past and the future components. Second, taking advantage of the Markov properties, each of the two are further broken down into simpler pieces via conditioning and marginalizing, thus creating an efficient predict-update cycle.</p>
<p>This strategy makes otherwise unfeasible calculations possible. Consider for example a time series with <span class="math inline">\(T = 100\)</span> observations and <span class="math inline">\(K = 5\)</span> hidden states. Summing the joint probability over all possible state sequences would involve <span class="math inline">\(2 \times 100 \times 5^{100} \approx 10^{72}\)</span> computations, while the forward and backward passes only take about <span class="math inline">\(3,000\)</span> each. Moreover, one pass may be avoided depending on the goal of the analysis: summing the forward probabilities at the last time step is enough to compute the likelihood, and the backward recursion would only be needed if the posterior probabilities of the states were also required.</p>
</div>
<div id="map-viterbi" class="section level3">
<h3><span class="header-section-number">1.4.3</span> MAP: Viterbi</h3>
<p>It is also of interest to compute the most probable state sequence or path,</p>
<p><span class="math display">\[
\mat{z}^* = \argmax_{\mat{z}_{1:T}} p(\mat{z}_{1:T} | \mat{y}_{1:T}).
\]</span></p>
<p>The jointly most probable sequence of states can be inferred via maximum <em>a posteriori</em> (MAP) estimation. Note that the jointly most probable sequence is not necessarily the same as the sequence of marginally most probable states given by the maximizer of the posterior marginals (MPM),</p>
<p><span class="math display">\[
\mat{\hat{z}} = (\argmax_{z_1} p(z_1 | \mat{y}_{1:T}), \ \dots, \ \argmax_{z_T} p(z_T | \mat{y}_{1:T})),
\]</span></p>
<p>which maximizes the expected number of correct individual states.</p>
<p>The MAP estimate is always globally consistent: while a state may be locally most probable at a given step, the Viterbi or max-sum algorithm decodes the single most probable path <span class="citation">(Viterbi 1967)</span>. Furthermore, the MPM sequence may have zero joint probability if it includes two successive states that, while individually the most probable, are connected by a zero in the transition matrix. On the other hand, MPM may be considered more robust since the state at each step is estimated by averaging over its neighbors rather than conditioning on a specific value of them.</p>
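<p>A small numeric example makes this pathology concrete. The three-state, two-step chain below is hypothetical, with observations taken as uninformative so that the joint probability of a path is just its prior probability; the two individually most probable states are joined by a zero transition, so the MPM sequence is jointly impossible while the MAP path is not:</p>

```python
from itertools import product

# Hypothetical 3-state, 2-step chain with uninformative observations,
# chosen so that the MPM sequence has zero joint probability.
pi1 = [0.6, 0.4, 0.0]
A = [[0.0, 0.5, 0.5],   # state 0 can never be followed by state 0
     [1.0, 0.0, 0.0],
     [1.0, 0.0, 0.0]]

def joint(z1, z2):
    """Joint probability of the path (z1, z2)."""
    return pi1[z1] * A[z1][z2]

# Posterior marginals at each step, and the two competing decoders.
m1 = [sum(joint(i, j) for j in range(3)) for i in range(3)]
m2 = [sum(joint(i, j) for i in range(3)) for j in range(3)]
mpm = (max(range(3), key=m1.__getitem__), max(range(3), key=m2.__getitem__))
map_ = max(product(range(3), repeat=2), key=lambda p: joint(*p))

print(mpm, joint(*mpm))    # (0, 0) 0.0 -> individually most probable, jointly impossible
print(map_, joint(*map_))  # (1, 0) 0.4 -> the Viterbi path
```

<p>State 0 is marginally most probable at both steps, yet the sequence (0, 0) is ruled out by the zero self-transition.</p>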
<p>The Viterbi algorithm applies the max-sum recursion in a forward fashion, plus a traceback procedure to recover the most probable path. In simple terms, once the most probable state <span class="math inline">\(z_t\)</span> is estimated, the procedure conditions the previous states on it. Let <span class="math inline">\(\delta_t(j)\)</span> be the probability of arriving at state <span class="math inline">\(j\)</span> at step <span class="math inline">\(t\)</span> given that the most probable path was taken,</p>
<p><span class="math display">\[
\delta_t(j)
\triangleq \max_{z_1, \dots, z_{t-1}} p(\mat{z}_{1:t-1}, z_t = j | \mat{y}_{1:t}).
\]</span></p>
<p>The most probable path to state <span class="math inline">\(j\)</span> at step <span class="math inline">\(t\)</span> consists of the most probable path to some other state <span class="math inline">\(i\)</span> at point <span class="math inline">\(t-1\)</span>, followed by a transition from <span class="math inline">\(i\)</span> to <span class="math inline">\(j\)</span>,</p>
<p><span class="math display">\[
\delta_t(j)
= \max_{i} \delta_{t-1}(i) \ \Psi(i, j) \ \psi_t(j).
\]</span></p>
<p>Additionally, the most likely previous state on the most probable path to <span class="math inline">\(j\)</span> at step <span class="math inline">\(t\)</span> is given by <span class="math display">\[
a_t(j)
= \argmax_{i} \delta_{t-1}(i) \ \Psi(i, j) \ \psi_t(j).
\]</span></p>
<p>By initializing with <span class="math inline">\(\delta_1(j) = \pi_j \psi_1(j)\)</span> and terminating with the most probable final state <span class="math inline">\(z_T^* = \argmax_{i} \delta_T(i)\)</span>, the most probable sequence of states is estimated using the traceback,</p>
<p><span class="math display">\[
z_t^* = a_{t+1}(z_{t+1}^*).
\]</span></p>
<p>It is advisable to work in the log domain to avoid numerical underflow,</p>
<p><span class="math display">\[
\delta_t(j)
\triangleq \max_{\mat{z}_{1:t-1}} \log p(\mat{z}_{1:t-1}, z_t = j | \mat{y}_{1:t})
= \max_{i} \delta_{t-1}(i) + \log \Psi(i, j) + \log \psi_t(j).
\]</span></p>
<p>As with the forward-backward algorithm, the time complexity of Viterbi is <span class="math inline">\(O(K^2T)\)</span> and the space complexity is <span class="math inline">\(O(KT)\)</span>. If the transition matrix has the form <span class="math inline">\(\Psi(i, j) \propto \exp(-\sigma^2 ||\mat{z}_i - \mat{z}_j||^2)\)</span>, the implementation runs in <span class="math inline">\(O(TK)\)</span> time.</p>
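<p>The max-sum recursion and traceback can be sketched in pure Python; the two-state parameters below are hypothetical, and ties are broken toward the lower state index:</p>

```python
import math

# Hypothetical two-state example: Psi = transition matrix, pi1 = initial
# distribution, psi[t][j] = local evidence p(y_t | z_t = j).
Psi = [[0.9, 0.1], [0.2, 0.8]]
pi1 = [0.5, 0.5]
psi = [[0.8, 0.1], [0.7, 0.2], [0.1, 0.6]]

def viterbi(pi1, Psi, psi):
    """Max-sum recursion in the log domain plus traceback."""
    K, T = len(pi1), len(psi)
    delta = [[0.0] * K for _ in range(T)]
    bptr  = [[0] * K for _ in range(T)]   # most likely previous state
    for j in range(K):
        delta[0][j] = math.log(pi1[j]) + math.log(psi[0][j])
    for t in range(1, T):
        for j in range(K):
            scores = [delta[t - 1][i] + math.log(Psi[i][j]) for i in range(K)]
            bptr[t][j] = max(range(K), key=lambda i: scores[i])
            delta[t][j] = scores[bptr[t][j]] + math.log(psi[t][j])
    # Traceback from the most probable final state.
    z = [max(range(K), key=lambda j: delta[T - 1][j])]
    for t in range(T - 1, 0, -1):
        z.append(bptr[t][z[-1]])
    return list(reversed(z))

print(viterbi(pi1, Psi, psi))  # [0, 0, 0]
```

<p>For a sequence this short, the result can be checked against brute-force enumeration of all <code>K**T</code> paths; both decoders agree on the path above.</p>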
<pre class="stan"><code>generated quantities {
int<lower=1, upper=K> zstar[T];
real logp_zstar;
{ // Viterbi algorithm
int bpointer[T, K]; // backpointer to the most likely previous state on the most probable path
real delta[T, K]; // max prob for the sequence up to t
// that ends with an emission from state k
for (j in 1:K)
delta[1, j] = log(pi1[j]) + normal_lpdf(y[1] | mu[j], sigma[j]);
for (t in 2:T) {
for (j in 1:K) { // j = current (t)
delta[t, j] = negative_infinity();
for (i in 1:K) { // i = previous (t-1)
real logp;
logp = delta[t-1, i] + log(A[i, j]) + normal_lpdf(y[t] | mu[j], sigma[j]);
if (logp > delta[t, j]) {
bpointer[t, j] = i;
delta[t, j] = logp;
}
}
}
}
logp_zstar = max(delta[T]);
for (j in 1:K)
if (delta[T, j] == logp_zstar)
zstar[T] = j;
for (t in 1:(T - 1)) {
zstar[T - t] = bpointer[T - t + 1, zstar[T - t + 1]];
}
}
}</code></pre>
<p>The variable <code>delta</code> is a straightforward implementation of the corresponding equation. <code>bpointer</code> stores the traceback needed to compute the most probable sequence of states after the most probable final state <code>zstar</code> is computed.</p>
</div>
</div>
<div id="parameter-estimation" class="section level2">
<h2><span class="header-section-number">1.5</span> Parameter estimation</h2>
<p>The model likelihood can be derived from the definition of the quantity <span class="math inline">\(\gamma_t(j)\)</span>: since the smoothed marginals must sum to one over all values of the latent variable, the likelihood at any time index <span class="math inline">\(t\)</span> can be written in terms of the unnormalized forward and backward quantities as</p>
<p><span class="math display">\[
p(\mat{y}_{1:T}) = \sum_{i = 1}^{K}{\alpha_t(i) \beta_{t}(i)}.
\]</span></p>
<p>The last step <span class="math inline">\(T\)</span> has two convenient characteristics. First, the recurrent nature of the forward probability implies that the last iteration retains the information of all the intermediate state probabilities. Second, the base case for the backward quantity is <span class="math inline">\(\beta_{T}(i) = 1\)</span>. Consequently, the log likelihood reduces to</p>
<p><span class="math display">\[
\LL = \log \sum_{i = 1}^{K}{\alpha_T(i)}.
\]</span></p>
<pre class="stan"><code>model {
target += log_sum_exp(logalpha[T]); // Note: update based only on last logalpha
}</code></pre>
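<p>The numerical point matters here: by the end of a long series the entries of <code>logalpha[T]</code> are large negative numbers, and exponentiating them directly underflows to zero. A stabilized log-sum-exp, sketched below in Python with hypothetical values, factors out the maximum before exponentiating, which is what Stan&#8217;s <code>log_sum_exp</code> does internally:</p>

```python
import math

def log_sum_exp(xs):
    """log(sum_i exp(x_i)) computed stably by factoring out max(xs)."""
    m = max(xs)
    return m + math.log(sum(math.exp(x - m) for x in xs))

# Hypothetical unnormalized log forward quantities at the last step.
logalpha_T = [-1050.0, -1047.2, -1052.3]

# Naive evaluation underflows: every exp(x) is 0.0 in double precision,
# so log(sum(exp(x))) would be log(0).
assert all(math.exp(x) == 0.0 for x in logalpha_T)

# The stabilized version returns a finite log likelihood close to max(xs).
loglik = log_sum_exp(logalpha_T)
print(loglik)
```

<p>The result is always bounded between <code>max(xs)</code> and <code>max(xs) + log(K)</code>, so no intermediate quantity can overflow or underflow.</p>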
<p>As we expect high multimodality in the posterior density, we feed the sampler with initialization values for the location and scale parameters produced by a clustering algorithm. Although <span class="math inline">\(K\)</span>-means is not a good choice for these data, since it ignores their time-dependent nature, it provides an educated guess sufficient for initialization.</p>
<div class="sourceCode"><pre class="sourceCode r"><code class="sourceCode r">hmm_init <-<span class="st"> </span>function(K, y) {
clasif <-<span class="st"> </span><span class="kw">kmeans</span>(y, K)
init.mu <-<span class="st"> </span><span class="kw">by</span>(y, clasif$cluster, mean)
init.sigma <-<span class="st"> </span><span class="kw">by</span>(y, clasif$cluster, sd)
init.order <-<span class="st"> </span><span class="kw">order</span>(init.mu)
<span class="kw">list</span>(
<span class="dt">mu =</span> init.mu[init.order],
<span class="dt">sigma =</span> init.sigma[init.order]
)
}
hmm_fit <-<span class="st"> </span>function(K, y) {
<span class="kw">rstan_options</span>(<span class="dt">auto_write =</span> <span class="ot">TRUE</span>)
<span class="kw">options</span>(<span class="dt">mc.cores =</span> parallel::<span class="kw">detectCores</span>())
stan.model =<span class="st"> 'stan/hmm_gaussian.stan'</span>
stan.data =<span class="st"> </span><span class="kw">list</span>(
<span class="dt">T =</span> <span class="kw">length</span>(y),
<span class="dt">K =</span> K,
<span class="dt">y =</span> y
)
<span class="kw">stan</span>(<span class="dt">file =</span> stan.model,
<span class="dt">data =</span> stan.data, <span class="dt">verbose =</span> T,
<span class="dt">iter =</span> <span class="dv">400</span>, <span class="dt">warmup =</span> <span class="dv">200</span>,
<span class="dt">thin =</span> <span class="dv">1</span>, <span class="dt">chains =</span> <span class="dv">1</span>,
<span class="dt">cores =</span> <span class="dv">1</span>, <span class="dt">seed =</span> <span class="dv">900</span>,
<span class="dt">init =</span> function(){<span class="kw">hmm_init</span>(K, y)})
}</code></pre></div>
</div>
<div id="worked-example" class="section level2">
<h2><span class="header-section-number">1.6</span> Worked example</h2>
<p>We draw one sample of length <span class="math inline">\(T = 500\)</span> from a data generating process with <span class="math inline">\(K = 3\)</span> latent states. We fit the model using the Stan code and the initialization methodology introduced previously.</p>
<div class="sourceCode"><pre class="sourceCode r"><code class="sourceCode r"><span class="kw">set.seed</span>(<span class="dv">900</span>)
K <-<span class="st"> </span><span class="dv">3</span>
T_length <-<span class="st"> </span><span class="dv">500</span>
dataset <-<span class="st"> </span><span class="kw">hmm_generate</span>(K, T_length)
fit <-<span class="st"> </span><span class="kw">hmm_fit</span>(K, dataset$y)</code></pre></div>
<pre><code>##
## TRANSLATING MODEL 'hmm_gaussian' FROM Stan CODE TO C++ CODE NOW.
## successful in parsing the Stan model 'hmm_gaussian'.
##
## CHECKING DATA AND PREPROCESSING FOR MODEL 'hmm_gaussian' NOW.
##
## COMPILING MODEL 'hmm_gaussian' NOW.
##
## STARTING SAMPLER FOR MODEL 'hmm_gaussian' NOW.
##
## SAMPLING FOR MODEL 'hmm_gaussian' NOW (CHAIN 1).
##
## Gradient evaluation took 0.002 seconds
## 1000 transitions using 10 leapfrog steps per transition would take 20 seconds.
## Adjust your expectations accordingly!
##
##
## Iteration: 1 / 400 [ 0%] (Warmup)
## Iteration: 40 / 400 [ 10%] (Warmup)
## Iteration: 80 / 400 [ 20%] (Warmup)
## Iteration: 120 / 400 [ 30%] (Warmup)
## Iteration: 160 / 400 [ 40%] (Warmup)
## Iteration: 200 / 400 [ 50%] (Warmup)
## Iteration: 201 / 400 [ 50%] (Sampling)
## Iteration: 240 / 400 [ 60%] (Sampling)
## Iteration: 280 / 400 [ 70%] (Sampling)
## Iteration: 320 / 400 [ 80%] (Sampling)
## Iteration: 360 / 400 [ 90%] (Sampling)
## Iteration: 400 / 400 [100%] (Sampling)
##
## Elapsed Time: 28.452 seconds (Warm-up)
## 3.768 seconds (Sampling)
## 32.22 seconds (Total)</code></pre>
<p>The estimates are precise, as expected when dealing with simulated data. The Markov chains are well behaved, as diagnosed by the low Monte Carlo standard error, the high effective sample size and the near-one shrink factor of <span class="citation">Gelman and Rubin (1992)</span>. Although not shown, further diagnostics confirm satisfactory mixing, convergence and the absence of divergences. Point estimates and posterior intervals are provided by rstan&#8217;s <code>summary</code> function.</p>
<table>
<caption>Estimated parameters and hidden quantities. <em>MCSE = Monte Carlo Standard Error, SE = Standard Error, ESS = Effective Sample Size</em>.</caption>
<thead>
<tr class="header">
<th></th>
<th align="right">True</th>
<th align="right">Mean</th>
<th align="right">MCSE</th>
<th align="right">SE</th>
<th align="right"><span class="math inline">\(q_{10\%}\)</span></th>
<th align="right"><span class="math inline">\(q_{50\%}\)</span></th>
<th align="right"><span class="math inline">\(q_{90\%}\)</span></th>
<th align="right">ESS</th>
<th align="right"><span class="math inline">\(\hat{R}\)</span></th>
</tr>
</thead>
<tbody>
<tr class="odd">
<td>pi1[1]</td>
<td align="right">0.14</td>
<td align="right">0.35</td>
<td align="right">0.02</td>
<td align="right">0.24</td>
<td align="right">0.05</td>
<td align="right">0.31</td>
<td align="right">0.69</td>
<td align="right">200.00</td>
<td align="right">1.00</td>
</tr>
<tr class="even">
<td>pi1[2]</td>
<td align="right">0.38</td>
<td align="right">0.29</td>
<td align="right">0.02</td>
<td align="right">0.22</td>
<td align="right">0.04</td>
<td align="right">0.26</td>
<td align="right">0.61</td>
<td align="right">200.00</td>
<td align="right">1.01</td>
</tr>
<tr class="odd">
<td>pi1[3]</td>
<td align="right">0.47</td>
<td align="right">0.36</td>
<td align="right">0.02</td>
<td align="right">0.22</td>
<td align="right">0.08</td>
<td align="right">0.35</td>
<td align="right">0.66</td>
<td align="right">200.00</td>
<td align="right">1.00</td>
</tr>
<tr class="even">
<td>A[1,1]</td>
<td align="right">0.03</td>
<td align="right">0.02</td>
<td align="right">0.00</td>
<td align="right">0.02</td>
<td align="right">0.00</td>
<td align="right">0.02</td>
<td align="right">0.05</td>
<td align="right">198.89</td>
<td align="right">1.00</td>
</tr>
<tr class="odd">
<td>A[1,2]</td>
<td align="right">0.54</td>
<td align="right">0.53</td>
<td align="right">0.00</td>
<td align="right">0.05</td>
<td align="right">0.47</td>
<td align="right">0.53</td>
<td align="right">0.59</td>
<td align="right">200.00</td>
<td align="right">1.00</td>
</tr>
<tr class="even">
<td>A[1,3]</td>
<td align="right">0.43</td>
<td align="right">0.45</td>
<td align="right">0.00</td>
<td align="right">0.05</td>
<td align="right">0.38</td>
<td align="right">0.45</td>
<td align="right">0.50</td>
<td align="right">200.00</td>
<td align="right">1.00</td>
</tr>
<tr class="odd">
<td>A[2,1]</td>
<td align="right">0.56</td>
<td align="right">0.57</td>
<td align="right">0.00</td>
<td align="right">0.04</td>
<td align="right">0.52</td>
<td align="right">0.57</td>
<td align="right">0.63</td>
<td align="right">200.00</td>
<td align="right">1.00</td>
</tr>
<tr class="even">
<td>A[2,2]</td>
<td align="right">0.31</td>
<td align="right">0.30</td>
<td align="right">0.00</td>
<td align="right">0.04</td>
<td align="right">0.24</td>
<td align="right">0.30</td>
<td align="right">0.35</td>
<td align="right">200.00</td>
<td align="right">1.00</td>
</tr>
<tr class="odd">
<td>A[2,3]</td>
<td align="right">0.13</td>
<td align="right">0.13</td>
<td align="right">0.00</td>
<td align="right">0.03</td>
<td align="right">0.10</td>
<td align="right">0.13</td>
<td align="right">0.17</td>
<td align="right">200.00</td>
<td align="right">1.00</td>
</tr>
<tr class="even">
<td>A[3,1]</td>
<td align="right">0.20</td>
<td align="right">0.17</td>
<td align="right">0.00</td>
<td align="right">0.05</td>
<td align="right">0.12</td>
<td align="right">0.17</td>
<td align="right">0.24</td>
<td align="right">200.00</td>
<td align="right">1.00</td>
</tr>
<tr class="odd">
<td>A[3,2]</td>
<td align="right">0.72</td>
<td align="right">0.79</td>
<td align="right">0.00</td>
<td align="right">0.05</td>
<td align="right">0.72</td>
<td align="right">0.80</td>
<td align="right">0.85</td>
<td align="right">123.20</td>
<td align="right">1.00</td>
</tr>
<tr class="even">
<td>A[3,3]</td>
<td align="right">0.07</td>
<td align="right">0.03</td>
<td align="right">0.00</td>
<td align="right">0.02</td>
<td align="right">0.01</td>
<td align="right">0.03</td>
<td align="right">0.07</td>
<td align="right">78.75</td>
<td align="right">1.00</td>
</tr>
<tr class="odd">
<td>mu[1]</td>
<td align="right">8.94</td>
<td align="right">9.16</td>
<td align="right">0.01</td>
<td align="right">0.14</td>
<td align="right">8.98</td>
<td align="right">9.15</td>
<td align="right">9.34</td>
<td align="right">200.00</td>
<td align="right">1.00</td>
</tr>
<tr class="even">
<td>mu[2]</td>
<td align="right">18.73</td>
<td align="right">18.64</td>
<td align="right">0.03</td>
<td align="right">0.29</td>
<td align="right">18.29</td>
<td align="right">18.63</td>
<td align="right">18.99</td>
<td align="right">107.77</td>
<td align="right">1.00</td>
</tr>
<tr class="odd">
<td>mu[3]</td>
<td align="right">29.23</td>
<td align="right">29.52</td>
<td align="right">0.01</td>
<td align="right">0.17</td>
<td align="right">29.30</td>
<td align="right">29.52</td>
<td align="right">29.76</td>
<td align="right">200.00</td>
<td align="right">1.00</td>
</tr>
<tr class="even">
<td>sigma[1]</td>
<td align="right">0.19</td>
<td align="right">1.80</td>
<td align="right">0.01</td>
<td align="right">0.12</td>
<td align="right">1.65</td>
<td align="right">1.79</td>
<td align="right">1.97</td>
<td align="right">200.00</td>
<td align="right">1.01</td>
</tr>
<tr class="odd">
<td>sigma[2]</td>
<td align="right">3.65</td>
<td align="right">3.98</td>
<td align="right">0.02</td>
<td align="right">0.24</td>
<td align="right">3.67</td>
<td align="right">3.97</td>
<td align="right">4.30</td>
<td align="right">100.08</td>
<td align="right">1.01</td>
</tr>
<tr class="even">
<td>sigma[3]</td>
<td align="right">1.69</td>
<td align="right">1.75</td>
<td align="right">0.01</td>
<td align="right">0.15</td>
<td align="right">1.57</td>
<td align="right">1.74</td>
<td align="right">1.94</td>
<td align="right">200.00</td>
<td align="right">1.00</td>
</tr>
</tbody>
</table>
<p>We extract the samples for some quantities of interest, namely the filtered probability vector <span class="math inline">\(\mat{\alpha}_t\)</span>, the smoothed probability vector <span class="math inline">\(\mat{\gamma}_t\)</span> and the most probable hidden path <span class="math inline">\(\mat{z}^*\)</span>. As an informal check that our software recovers the hidden states correctly, we observe that the filtered probabilities, the smoothed probabilities and the most likely path are all reasonably close to the true values. As expected, because we used an ordering constraint to break symmetry, the MAP estimate successfully recovers the (simulated) hidden path without label switching.</p>
<div class="sourceCode"><pre class="sourceCode r"><code class="sourceCode r">alpha <-<span class="st"> </span><span class="kw">extract</span>(fit, <span class="dt">pars =</span> <span class="st">'alpha'</span>)[[<span class="dv">1</span>]]
gamma <-<span class="st"> </span><span class="kw">extract</span>(fit, <span class="dt">pars =</span> <span class="st">'gamma'</span>)[[<span class="dv">1</span>]]</code></pre></div>
<div class="sourceCode"><pre class="sourceCode r"><code class="sourceCode r">alpha_med <-<span class="st"> </span><span class="kw">apply</span>(alpha, <span class="kw">c</span>(<span class="dv">2</span>, <span class="dv">3</span>), function(x) { <span class="kw">quantile</span>(x, <span class="kw">c</span>(<span class="fl">0.50</span>)) })
alpha_hard <-<span class="st"> </span><span class="kw">apply</span>(alpha_med, <span class="dv">1</span>, which.max)
<span class="kw">table</span>(<span class="dt">true =</span> dataset$z, <span class="dt">estimated =</span> alpha_hard)</code></pre></div>
<pre><code>## estimated
## true 1 2 3
## 1 155 0 0
## 2 6 226 1
## 3 0 5 107</code></pre>
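<p>As a quick sanity check, the overall accuracy implied by the confusion table above is the diagonal mass over the total, that is (155 + 226 + 107) / 500 = 0.976. A minimal one-liner computing the same quantity directly from the objects already defined above:</p>
<div class="sourceCode"><pre class="sourceCode r"><code class="sourceCode r">mean(alpha_hard == dataset$z)  # 488 / 500 = 0.976</code></pre></div>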
<div class="sourceCode"><pre class="sourceCode r"><code class="sourceCode r">zstar <-<span class="st"> </span><span class="kw">extract</span>(fit, <span class="dt">pars =</span> <span class="st">'zstar'</span>)[[<span class="dv">1</span>]]
<span class="kw">plot_statepath</span>(zstar, dataset$z)</code></pre></div>
<p><img src="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAA6wAAAJACAMAAAC+FBs7AAAA5FBMVEUAAAAAADoAAGYAOjoAOmYAOpAAZmYAZpAAZrYAzQA6AAA6ADo6AGY6OgA6Ojo6OmY6OpA6ZpA6ZrY6kLY6kNtmAABmAChmADpmAGZmOgBmOjpmOpBmZpBmkLZmkNtmtrZmtttmtv+QOgCQOjqQOmaQZgCQZjqQkGaQkJCQkLaQttuQtv+Q29uQ2/+2ZgC2Zjq2ZpC2kDq2kGa227a229u22/+2/7a2/9u2//++vr7T09PbkDrbkGbbtmbbtpDb25Db27bb2//b/7bb/9vb////AAD/tmb/25D/27b//7b//9v////hmhurAAAACXBIWXMAAA7DAAAOwwHHb6hkAAAgAElEQVR4nO2dC3vcyHKeR7KUSFrbu16TsTfxZSWvj5MTiydxvLGQObc41GhJ/P//EwwwAOqKAUYghRbf75mHaADV1dVV9c6N3NWuRggVod2XDgAhNE/AilAhAlaEChGwIlSIgBWhQgSsCBUiYEWoEAErQoUIWBEqRMCKUCECVoQKEbAiVIiAFaFCBKwIFSJgRagQAStChQhYESpEwIpQIQJWhAoRsCJUiIAVoUIErAgVImBFqBABK0KFCFgRKkTAilAhAlaEChGwIlSIgBWhQgSsCBUiYEWoEAErQoUIWB9Z9//yZ7vds7/4+XEW+82b3e77drjfPXvfX323e3nbm9zsnn+Ixud9/48PZy6glQWsj6sGlFbP3j7GaofjUlftcG1Yf/nB2LoLaG0B6+NqvzvpUTpbECqGSpfC6mwXvSyjSwSsj6ub3bMfm1eh5vX1+NJ6/5vmNfb79lXu+I712Y9tx3dtf3fdviaOJg1u/9ycvOiY+13zbvpPfqy1k06//8/Nrb+9PS52fA3v7MXs0yvrccUXv+4W0+N4xaN+27+Hvzk94fz+h+b47fvxgojmt4/4hv9JCFgfV6fXn09vjiQ2PB714sP49tjCKkz6F+WWvpturC06newaHjWsg/8OVrmiHGcrCidtjO3xMBicLvjZvN6uJmB9XDUd/Oxv//V00lD584nbputf3R5flQyswqSZ+uLn48/O/C9vf7k+WfcWrZphg2Jnpt4G716eZnewyhXN6tGKRzUhvbqtf7c7BdZC/h9vT8Zd1ONsaYxWEbA+rk6vYS9+fTu80d2PgDZ3NazSZN++dW5AeNWTsf+Lf1UWrU6v3TfdFAnr25PTDtZhFbli+9IYrnhUM3ohnmi6Bf94/Hq7D0nMlsZoFQHrY+v4YbN7o9i0c/8usv+C1n5mFSYn8o4vWPL7XGlxVH9rP05pZWc3P1oEj4vJcbJi73t3+jjcP7v8qjPuYdVbGozRKgLWx9f9H//pzbG/D7uhs3sgBKzte0lhYnA78aMsjuo9HY6vihOwyhXlOFmx0y8/dLf++/g2ePfiv/3xeoBVzhbGaBUB66Nq+Gx5ehHrf9savLK2psIkf2VVv7Jd8Mo6rCjHyYq9fvmnP+u/vOpeSK96g/6CiGYwRqsIWB9VTV8/+7umeX//RrygHSU/QXaMHHanb2l6E4lOZ3749tcapdFT8JnVoJ5/Zg1XHHX/j8OUQ/+h9mr8zKqj6YzRKgLWx9XwRxEdcs9+rH+5Hr+bvdmdvte5an8Ve6VMJDqN+ffHF8eGFWHRKv022MAqV5TjZMWj2u+gj29v+3vtWsdIX/ULjLOlMVpFwPq4kr/R7H+lKX9xqr6luVImCp2bgXjppNX4e9ZJWOWKcpyteNRvxo/Hx0+nz/+t382r04UPYrYwRqsIWB9Z9+23wacvSX/5VcNl9zc+8m+IPv3QXP1f3W9BRhONTvsXTH9nnHRq/4KpvTUJ6/3/fLP7kx9P3+qKcbZiv+ru25/r7u+cXvx8/Db4xa/3/Z8uvfhZRjMao1UErJsSf2CLcgHrpgSsKBewbkrAinIB6
6YErCgXsCJUiIAVoUIErAgVImBFqBABK0KFCFgRKkTAilAhAlaEChGwIlSIgBWhQgSsCBUiYEWoEAErQoUIWBEqRMCKUCECVoQKEbAiVIiAFaFCBKwIFSJgRagQAStChQhYESpEwIpQIQJWhAoRsCJUiIAVoUIErAgVopVh3SGEFupLwbquO4S+fgErQoUIWBEqRA8E6/277k12+o+IAitCC/UwsO53V93g0A8+yx1C6IFgvX83ILp/efvZ7hBC9QPBenf9th8ekjfCwIrQQvHKilAheqjPrKeXVj6zIrSWHujb4Lvr7tvg5HUVWBFaLH7PilAh2iasr1+/Dq60V/sfnUZLPSW5PJyfZs8NYVw5iKa/3h2tqfalbfuh3Mt4OhWLXi4M1IQbLSryMPzU8Y/7s9tVyXeB6h1M5FIH/NqsVXsvOj9yZ0GsQXR2hvZo6+MrV4/WOme1DGqyF8RdUxkfs9JDw7q/5Ntg26wioZH8lOTycK4Kfj4EvVQUzTDDmmpf2vbMziZi0ctFaXJLTC7qgn8tn9PcreQ5pbbzzufShhCspb3o6zYYa+NL6mYoj8ZvUDkXnBkIi7wX9N2wLUI98ivrvP+AYCrRgfyU5DKwLoG1AtbRSRCxnv4VwjrP3VSiA/kpyWVgBVY7A1g/152P+bS9un6tcj1a6inJ5eE8qOVUCOPKQTSinoGp9qVt+6HcS1DgIBa9XBioCTdaVORB9tnprJIbddtVyXeB6h1M5FIH/NqsVXsvOj9yZ0GsQXR2hvZo6+MrJ57odc4qGdRkL4i7pjI+ZqWHhPWw2z17f5m7KrxS1eJHJa75Kcnl4byK1shDqMwVFU2lZlhTfVq5K3YvVRCcj0UvFwZqwg2vjEtV6jTYn92uSr4L9Fz8VXB2XNzmsLITzebkzoJYA9kZOuO2Pr5yna2bLotWTcVRqbumMtM9+UCw3ux2V5++u5V/eLjIHbCeiUUvFwYKrKGA1ejm5W19076qXvjnhsB6Jha9XBgosIYCVq329fTTN0dYL/xDfmA9E4teLgwUWEMBq9bd9fEvgu//b80rq/UFrMC6LVjHP4XosF3uDljPxKKXCwMF1lDAarXvvgY+7JLvl4BV3QTWfnFgndA2f88KrMAKrE7AOisEYAVWMQBWKWA9E4teLgwUWEMB62IBq7wErP3iwDohYJ0VArACqxgAqxSwnolFLxcGCqyhgHWxgFVeAtZ+cWCdELDOCgFYgVUMgFUKWM/EopcLAwXWUMC6WMAqLwFrvziwTghYZ4UArMAqBsAqBaxnYtHLhYECayhgXSxglZeAtV8cWCcErLNCAFZgFQNglQLWM7Ho5cJAgTUUsC4WsMpLwNovDqwTAtZZIQArsIoBsEoB65lY9HJhoMAaClgXC1jlJWDtFwfWCQHrrBCAFVjFAFilgPVMLHq5MFBgDQWsiwWs8hKw9osD64SAdVYIwAqsYgCsUsB6Jha9XBgosIYC1sUCVnkJWPvFgXVCwDorBGAFVjEAVilgPROLXi4MFFhDAetiAau8BKz94sA6IWCdFQKwAqsYAKsUsJ6JRS8XBgqsoYB1sYBVXgLWfnFgnRCwzgoBWIFVDIBVCljPxKKXCwMF1lDAulhLYW2rWPXbawZVpXY71Nme6cvD6cnD3BCqSl+x0YgZ1lT7EuuPN/tgqlMfVCbseGtV3grRc4u5Ugnz0VklvJr9jacmBdFGq/j5ZVxa18fBWuk70ouwEMFEscb1tTO6XauZaiMq7D5MDWu/3DhZ7cOuX9lsj/sOIx4ErLNCAFZgBdZEwAqswGoFrLNCAFZgBdZEwAqswGoFrLNCAFZgBdZEwAqswGoFrLNCAFZgBdZEwAqswGoFrLNCAFZgBdZEwAqswGoFrLNCAFZgBdZEwAqswGoFrLNCAFZgBdZEwAqswGoFrLNCAFZgBdZEwAqswGoFrLNCAFZgBdZEwAqswGoFrLNCAFZgBdZEwAqswGoFrLNCAFZgBdZEwAqsw
GoFrLNCAFZgBdZEwAqswGoFrLNCAFZgBdZEwAqswGoFrLNCAFZgBdZEwAqswGoFrLNCAFZgBdZEwAqswGoFrLNCAFZgBdZEwAqswGr1ILDeXV81Pw+73e75h8vcASuwAqvVw8G6f3l7HL29yB2wAiuwWj0YrCdMW2SXuwNWYAVWqweD9dObFtZD8kYYWMVNYO2tgHVKvLLOCgFYgfWrhXXX6FXdf9W03B2wAiuwWj3Ur24aXp+9rw+7hFVgBVZ51lsB65T4PeusEIAVWIE1EbACK7BaPTSse74NVqfACqybhdV6GXTG0HdBJWpw3KXtat3eY0tV3k9rbe9EC45nmp9hZRlCdWrlyq0qTj2sPaujx7GcaSzjMXgy6ltJLmEWFd0kwq5NIGp/4zNJ1ft0KUzS7p9sxvvj2WnpYYK2E1ZyR2JnQayBVC56awVQnDG1tUogO+ZMBCVK4J+3qjHlZp1twTrbHbBOxjIegVV5AdZFhuu4A9bJWMYjsCovwLrIsNW+eaPb/VHEZZ9ZgRVYgdXqYWDdP3tf310f/yoCWLUvYAXWbcF6/+6q/fnyFliNL2AF1m3B2v+HcTcvb4FV+wJWYN0WrN0ra6ObV8CqfQErsG4L1uHN79119v+KAFZxE1iBtb85pQf7Nrh7I3z/DliBFVjl+puD9bPdAetkLOMRWJUXYF1kuI47YJ2MZTwCq/ICrIsM13EHrJOxjEdgVV6AdZHhOu6AdTKW8QisyguwLjJcxx2wTsYyHoFVeQHWRYbruAPWyVjGI7AqL8C6yHAdd8A6Gct4BFblBVgXGa7jDlgnYxmPwKq8AOsiw3XcAetkLOMRWJUXYF1kuI47YJ2MZTwCq/ICrIsM13EHrJOxjEdgVV6AdZHhOu6AdTKW8QisyguwLjJcxx2wTsYyHoFVeQHWRYbruAPWyVjGI7AqL8C6yHAdd8A6Gct4BFblBVgXGa7jDlgnYxmPwKq8AOsiw3XcAetkLOMRWJUXYF1kuI47YJ2MZTwCq/ICrIsM13EHrJOxjEdgVV6AdZHhOu6AdTKW8QisyguwLjJcxx2wTsYyHoFVeQHWRYbruAPWyVjGI7AqL8C6yHAdd8A6Gct4BFblBVgXGa7jDlgnYxmPwKq8AOsiw3XcAetkLOMRWJUXYF1kuI47YJ2MZTwCq/ICrIsM13EHrJOxjEdgVV6AdZHhOu6AdTKW8QisyguwLjJcxx2wTsYyHoFVeQHWRYbruAPWyVjGI7AqL8C6yHAdd8A6Gct4BFblBVgXGa7jDlgnYxmPwKq8AOsiw3XcAetkLOMRWJUXYF1kuI471wWydWTbjLDKOX3Vx4G7Xk3nxrSb5seEMLZvPd6IfdkuGyPpQfUcuD2oJwf3ZHSK4gysul/so5Ib0bYK1tqt3kesquGyMc4cXUpYVX51yJVMicyFizWsr8peVcnJ1Xg7glUtpdcazmT0td/CmBr5EOWYFrCmAQArsALrHHfACqwuDcA6U8AKrDIyYJUzgFWfAmvoAliVF2BdZLiOO2AFVpcGYJ0pYAVWGRmwyhnAqk+BNXQBrMoLsC4yXMcdsAKrSwOwzhSwAquMDFjlDGDVp8AaugBW5QVYFxmu4w5YgdWlAVhnCliBVUYGrHIGsOpTYA1dAKvyAqyLDNdxB6zA6tIArDMFrMAqIwNWOQNY9Smwhi6AVXkB1kWG67gDVmB1aQDWmQJWYJWRAaucAaz6FFhDF8CqvADrIsN13AErsLo0AOtMASuwysiAVc4AVn0KrKELYFVegHWR4TrugBVYXRqAdaaAFVhlZMAqZwCrPgXW0AWwKi/AushwHXfACqwuDcA6U8AKrDIyYJUzgFWfAmvoAliVF2BdZLiOO2AFVpcGYJ0pYAVWGRmwyhnAqk+BNXQBrMoLsC4ybHX/btfq+YcL3QErsLo0AOtMLYJ1v7vqBod+sNQdsAKrSwOwztQSWO/fDYjuX95e5A5YgdWlAVhnagmsd9dv++EheSMMr
MAqzoD1i8HKK6stELCaXALr+BDlmNZDfWY9vbTymdX4AlYzE1hlOaZ1Eax3f/3h9CPT3XX3bXDyugqswAqsjwFr/2uZFMRl7mIBK7C6NADrTDWG+6u761fH8d1/Sn9/usDdlD5Gqqv2R9WdVfq0rsI5iUJrtawyaM2bwrXj083umvBVncKyk+Vp7aO1N9v5E7vRLob9WxfyzjCjUnbpSvVpq3p/gyNzUUjspf44JrRzV4ule/9jFlurY/MO0VUfdSIHq/rY1mZnKiy9hI5JzugScrIeZ/a+hsjtLrvl6zEb/XWTBbeF1ijwOGTzs6ARhjfNx8/m3W2LafYdr9f+om+D/XbEbocdmlw9uB5xqckMKOVB+TvhM1Q4deWtznPXwjrDqq6XPTNP+BqHycLhQp+1/PQOpxmcCV29O76m3l3//fWz90tgtV4GTZpNbQdYlZ4irFNvPBbpq4X1+OPu+m3zifXqclhnrjtrs9WMva+qR1xqfjfkQfk7kW38dnblrc5z135CnGG1HqxitcRj/Fz2Oel5xFfWt8ffy7z83Rf5zFqpD1ndB47hdFkCQ+v8M6tcuf/gU6nBMGP6M+tgW6c3049KoQvTt6MLeUd9yh/tug9e4WdW+dnNxTr7M+sYav8Z8OOQSxlIn7nq9Jl1NAk+s56+szE7U7HqJcS27AyxmpjZb/HjR5ve083uOcV9Zq11UEkvRM09foT/HGiEYfeZ9fj7009v8j/QP2nfvNFtf9V62WdW//Vk8A1hJb7Fs1/JqS/bxu/m1Hdv7qvPWq8j7o8rm+9Qk68GzVd+1bCpIEp1sx/L70CFxSmWWi4nQjDfOqso3bnYvV4omCL3Jb/tTGb5/YsVdYJU5iq9tMmu2XAd2Qax9pu0KdC+KjHTraYeg+24mFluSI3agr1pZ4hCZ1r6bXBL4N31GVj3zQfb7ptjYB0LFEepbvZjMayFBbAKb5FtEGu/SZsC7eurg3W2uj83vH/38hZYRYHiKNXNfiyGtbAAVuEtsg1i7TdpU6B9PV1Y+z/kv3l5C6xjgeIo1c1+LIa1sABW4S2yDWLtN2lToH09XViHP+S/eQWsY4HiKNXNfiyGtbAAVuEtsg1i7TdpU6B9PV1Yhze//R9RLHfnmQPWIZZaLgeswHqxYav+P7u5fweswAqslb9pZ4hCZ9rw/zDNPVRJamBV/SZv2jvAaiIB1kUC1vBmPxbDWlgAq/AW2Qax9pu0KdC+gPUz3HnmgHWIpZbLASuwXmy4jjvPHLAOsdRyOWAF1osN13HnmQPWIZZaLgeswHqx4TruPHPAOsRSy+WAFVgvNlzHnWcOWIdYarkcsALrxYbruPPMAesQSy2XA1ZgvdhwHXeeOWAdYqnlcsAKrBcbruPOMwesQyy1XA5YgfViw3XceeaAdYillssBK7BebLiOO88csA6x1HI5YAXWiw3XceeZA9YhllouB6zAerHhOu48c8A6xFLL5YAVWC82XMedZw5Yh1hquRywAuvFhuu488wB6xBLLZcDVmC92HAdd545YB1iqeVywAqsFxuu484zB6xDLLVcDliB9WLDddx55oB1iKWWywErsF5suI47zxywDrHUcjlgBdaLDddx55kD1iGWWi4HrMB6seE67jxzwDrEUsvlgBVYLzZcx51nDliHWGq5HLAC68WG67jzzAHrEEstlwNWYL3YcB13njlgHWKp5XLACqwXG67jzjMHrEMstVwOWIH1YsN13HnmgHWIpZbLASuwXmy4jjvPHLAOsdRyOWAF1osN13HnmQPWIZZaLgeswHqx4TruPHPAOsRSy+WAFVgvNlzHnWcOWIdYarkcsALrxYbruPPMAesQSy2XA1ZgvdhwHXeeOWAdYqnlcsAKrBcbruPOMzcH1u7HkOkhkzqjstimwUIX48pVXdX2mp0huk9cG9fVgS6DVfep7LdK3ozu6Fydh1U1q9yXhXUcmFz2++uCWBFW9Yx1FtbKejkL6zBjWE0XchLWs
aJmC7rcYqDKMamtwqpqXfWbHUipT+dV345VpXOt8mYyOnRob9KvksA6NlHlr6mLpvt0jYctDNHK6mlCRdlE1P1yo8n4hKVuBndOPS7tzBTVSKJ5h2ZOO9Ln0hSwh1UWSiZCLCpsDTwxrMpLHqv0IsM5DfVMNSOBtY6uqdSYLSSwihDGVkgErMBqFwJWOQNYgXXcYK2DMkNdVmAF1hkCVmA1ANhJwKpSA6zACqzAek7ACqwGADsJWFVqgBVYgRVYzwlYgdUAYCcBq0oNsAIrsALrOQErsBoA7CRgVakBVmAFVmA9J2AFVgOAnQSsKjXACqzACqznBKzAagCwk4BVpQZYgRVYgfWcgBVYDQB2ErCq1AArsAIrsJ4TsAKrAcBOAlaVGmAFVmAF1nMCVmA1ANhJwKpSA6zACqzAek7ACqwGADsJWFVqgBVYgRVYzwlYgdUAYCcBq0oNsAIrsALrOQErsBoA7CRgVakBVmAFVmA9J2AFVgOAnQSsKjXACqzACqznBKzAagCwk4BVpQZYgRVYgfWcgBVYDQB2ErCq1BQC6931VfPzsNvtnn+41B2wjomQI2AF1rNaDOv+5e1x9PZCd8A6JkKOgBVYz2oprCdMW2QvcQesYyLkCFiB9ayWwvrpTQvrIXkjDKzACqwiC18UVl5ZgRVYxRa2C+uu0au6/6rpEnfAOiZCjoAVWM9q4a9uGl6fva8Pu4RVYB03WOugzFCXFViBdYb4PSuwGgDsJGBVqQFWYAVWYD2nC2Hd820wsAJrGbBaL4POWSpUx3wOZ7VMQGUSY2FVzNYutZUcPDys0tEMWDVNHlYTlMqf3WI1hCCisQudhbUa9+RhHcrlYNWFqvXZkF8RonSYw1opWxurDkuF6JrrwWDVXabyZkOY0nbfBqewjvuV2a2EhWrkekxDJdMoTCpVoHrM6Xi/VtDW4zU7Y/gh27aW65pe0Tf7cR1dHmIZLvfpkIEK2zEyya6G1S80ekpsK7GQnKXSL1nt+R62rhI0ZK4SzmV+jZVodHM/itU8B+gCjFtRLTY8L0RES9uxGUVHqdSMHpOnZmCt5QBYgRVYP0PACqzAKrMArMAKrMA6qe4vmDpd/G0wsLrLwAqss7TolfX+Xfofss51B6z+MrAC6ywtext8/+7VZ7oDVn8ZWIF1lhZ+Zj3skv/qfK47YPWXgRVYZ4kvmIAVWGtjC6ytgNVfBlZgnSVgBVZgrY0tsLYCVn8ZWIF1loAVWIG1NrbA2gpY/WVgBdZZAlZgBdba2AJrK2D1l4EVWGcJWIEVWGtjC6ytgNVfBlZgnSVgBVZgrY0tsLYCVn8ZWIF1loAVWIG1NrbA2gpY/WVgBdZZAlZgBdba2AJrK2D1l4EVWGcJWIEVWGtjC6ytgNVfBlZgnSVgBVZgrY0tsLYCVn8ZWIF1loAVWIG1NrbA2gpY/WVgBdZZAlZgBdba2AJrK2D1l4EVWGcJWIEVWGtjC6ytgNVfBlZgnSVgBVZgrY0tsLYCVn8ZWIF1loAVWIG1NrbA2gpY/WVgBdZZAlZgBdba2AJrK2D1l4EVWGcJWIEVWGtjC6ytgNVfBlZgnaUiYD1d1tU3STYXJDZV4mKAVQRgYBUz3PoZrDJEEZSeogCUoWtY9c3wzjJYzYI5rGJgYLWPtWCVgZyHNbSahFUVWi4ol81gtV3WL2jLLa2Kh9XWurYgydOxMT1pg6lqzFqZnFxUmYvK2eoF58KqPJjqZUMbtd5T5YITxj7sykYYLprsxgIQTzLJEw9TKHM2lV9rpakxtYhjjVrENVeaMV1IO0McVUXVgqbcysr1SSxgFV6SfotqaBokaW9XBNM3lb/phjZq1zEmOGGctV7U1HKY7MYBEE4yyRMPUyhzNpVfa6VyYIqfxBq1iGuuNGO6kHaGOKqKqgVNuZWV65NYwCq8JP0W1dA0SNLergimbyp/0w1t1K5jTHDCOGu9qKnlMNmNAyCcZJInHqZQ5mwqv9ZK5cAUP4k1a
hHXXGnGdCHtDHFUFVULmnIrK9cnsYBVeEn6LaqhaZCkvV0RTN9U/qYb2qhdx5jghHHWelFTy2GyGwdAOMkkTzxMoczZVH6tlcqBKX4Sa9QirrnSjOlC2hniqCqqFjTlVlauT2IBq/CS9FtUQ9MgSXu7Ipi+qfxNN7RRu44xwQnjrPWippbDZDcOgHCSSZ54mEKZs6n8WiuVA1P8JNaoRVxzpRnThbQzxFFVVC1oyq2sXJ/EAlbhJem3qIamQZL2dkUwfVP5m25oo3YdY4ITxlnrRU0th8luHADhJJM88TCFMmdT+bVWKgem+EmsUYu45kozpgtpZ4ijqqha0JRbWbk+iQWswkvSb1ENTYMk7e2KYPqm8jfd0EbtOsYEJ4yz1ouaWg6T3TgAwkkmeeJhCmXOpvJrrVQOTPGTWKMWcc2VZkwX0s4QR1VRtaApt7JyfRILWIWXpN+iGpoGSdrbFcH0TeVvuqGN2nWMCU4YZ60XNbUcJrtxAISTTPLEwxTKnE3l11qpHJjiJ7FGLeKaK82YLqSdIY6qompBU25l5fokFrAKL0m/RTU0DZK0tyuC6ZvK33RDG7XrGBOcMM5aL2pqOUx24wAIJ5nkiYcplDmbyq+1UjkwxU9ijVrENVeaMV1IO0McVUXVgqbcysr1SSxgFV6SfotqaBokaW9XBNM3lb/phjZq1zEmOGGctV7U1HKY7MYBEE4yyRMPUyhzNpVfa6VyYIqfxBq1iGuuNGO6kHaGOKqKqgVNuZWV65NYwCq8JP0W1dA0SNLergimbyp/0w1t1K5jTHDCOGu9qKnlMNmNAyCcZJInHqZQ5mwqv9ZK5cAUP4k1ahHXXGnGdCHtDHFUFVULmnIrK9cnsYBVeEn6LaqhaZCkvV0RTN9U/qYb2qhdx5jghHHWelFTy2GyGwdAOMkkTzxMoczZVH6tlcqBKX4Sa9QirrnSjOlC2hniqCqqFjTlVlauT2IBq/CS9FtUQ9MgSXu7Ipi+qfxNN7RRu44xwQnjrPWippbDZDcOgHCSSZ54mEKZs6n8WiuVA1P8JNaoRVxzpRnThbQzxFFVVC1oyq2sXJ/EAlbhJem3qIamQZL2dkUwfVP5m25oo3YdY4ITxlnrRU0th8luHADhJJM88TCFMmdT+bVWKgem+EmsUYu45kozpgtpZ4ijqqha0JRbWbk+iQWswkvSb1ENTYMk7e2KYPqm8jfd0EbtOsYEJ4yz1ouaWg6T3TgAwkkmeeJhCmXOpvJrrVQOTPGTWKMWcc2VZkwX0s4QR1VRtaApt7JyfRILWIWXpN+iGpoGSdrbFcH0TeVvuqGN2nWMCU4YZ60XNbUcJrtxAISTTPLEwxTKnE3l14lik1AAABBZSURBVFqpHJjiJ7FGLeKaK82YLqSdIY6qompBU25l5fokFrAKL0m/RTU0DZK0tyuC6ZvK33RDG7XrGBOcMM5aL2pqOUx24wAIJ5nkiYcplDmbyq+1UjkwxU9ijVrENVeaMV1IO0McVUXVgqbcysr1SSxgFV6SfotqaBokaW9XBNM3lb/phjZq1zEmOGGctV7U1HKY7MYBEE4yyRMPUyhzNpVfa6VyYIqfxBq1iGuuNGO6kHaGOKqKqgVNuZWV65NYwCq8JP0W1dA0SNLergimbyp/0w1t1K5jTHDCOGu9qKnlMNmNAyCcZJInHqZQ5mwqv9ZK5cAUP4k1ahHXXGnGdCHtDHFUFVULmnIrK9cnsYBVeEn6LaqhaZCkvV0RTN9U/qYb2qhdx5jghHHWelFTy2GyGwdAOMkkTzxMoczZVH6tlcqBKX4Sa9QirrnSjOlC2hniqCqqFjTlVlauT2IBq/CS9FtUQ9MgSXu7Ipi+qfxNN7RRu44xwQnjrPWippbDZDcOgHCSSZ54mEKZs6n8WiuVA1P8JNaoRVxzpRnThbQzxFFVVC1oyq2sXJ/EAlbhJem3qIamQZL2dkUwfVP5m25oo3YdY4ITxlnrRU0th8luHADhJJM88
TCFMmdT+bVWKgem+EmsUYu45kozpgtpZ4ijqqha0JRbWbk+iQWswkvSb1ENTYMk7e2KYPqm8jfd0EbtOsYEJ4yz1ouaWg6T3TgAwkkmeeJhCmXOpvJrrVQOTPGTWKMWcc2VZkwX0s4QR1VRtaApt7JyfRILWIWXpN+iGpoGSdrbFcH0TeVvuqGN2nWMCU4YZ60XNbUcJrtxAISTTPLEwxTKnE3l11qpHJjiJ7FGLeKaK82YLqSdIY6qompBU25l5fokFrAKL0m/RTU0DZK0tyuC6ZvK33RDG7XrGBOcMM5aL2pqOUx24wAIJ5nkiYcplDmbyq+1UjkwxU9ijVrENVeaMV1IO0McVUXVgqbcysr1SSxgFV6SfotqaBokaW9XBNM3lb/phjZq1zEmOGGctV7U1HKY7MYBEE4yyRMPUyhzNpVfa6VyYIqfxBq1iGuuNGO6kHaGOKqKqgVNuZWV65NYwCq8JP0W1dA0SNLergimbyp/0w1t1K5jTHDCOGu9qKnlMNmNAyCcZJInHqZQ5mwqv9ZK5cAUP4k1ahHXXGnGdCHtDHFUFVULmnIrK9cnsYBVeEn6LaqhaZCkvV0RTN9U/qYb2qhdx5jghHHWelFTy2GyGwdAOMkkTzxMoczZVH6tlcqBKX4Sa9QirrnSjOlC2hniqCqqFjTlVlauT2IBq/CS9FtUQ9MgSXu7Ipi+qfxNN7RRu44xwQnjrPWippbDZDcOgHCSSZ54mEKZs6n8WiuVA1P8JNaoRVxzpRnThbQzxFFVVC1oyq2sXJ/EAlbhJem3qIamQZL2dkUwfVP5m25oo3YdY4ITxlnrRU0th8luHADhJJM88TCFMmdT+bVWKgem+EmsUYu45kozpgtpZ4ijqqha0JRbWbk+iQWswkvSb1ENTYMk7e2KYPqm8jfd0EbtOsYEJ4yz1ouaWg6T3TgAwkkmeeJhCmXOpvJrrVQOTPGTWKMWcc2VZkwX0s4QR1VRtaApt7JyfRILWIWXpN+iGpoGSdrbFcH0TeVvuqGN2nWMCU4YZ60XNbUcJrtxAISTTPLEwxTKnE3l11qpHJjiJ7FGLeKaK82YLqSdIY6qompBU25l5fok1gPBev9u1+r5h4vdJbmqZF6MSdQso6lre1eXpN+iGpoGSdrbFcH0TeVvuqGN2nWMCU4YZ60XNbUcJrtxAISTTPLEwxTKnE3l11qpHJjiJ7FGLeKaK82YLqSdIY6qompBU25l5fok1sPAut9ddYNDP1juLslVJfNiTKJmGU1d27u6JP0W1dA0SNLergimbyp/0w1t1K5jTHDCOGu9qKnlMNmNAyCcZJInHqZQ5mwqv9ZK5cAUP4k1ahHXXGnGdCHtDHFUFVULmnIrK9cnsR4E1vt3A6L7l7cXuktyVcm8GJOoWUZT1/auLkm/RTU0DZK0tyuC6ZvK33RDG7XrGBOcMM5aL2pqOUx24wAIJ5nkiYcplDmbyq+1UjkwxU9ijVrENVeaMV1IO0McVUXVgqbcysr1SawHgfXu+m0/PCRvhIHV9F1g58aVPFWLVOZmeKeydpWbodNyDoBwkkmeeJhCmbOp/ForlQNT/CTWqEVcc6UZ04W0M8RRVVQtaMqtrFyfxOKVVXhJ+i2qoWmQpL1dEUzfVP6mG9qoXceY4IRx1npRU8thshsHQDjJJE88TKHM2VR+rZXKgSl+EmvUIq650ozpQtoZ4qgqqhY05VZWrk9iPdRn1tNLK59Z1bK2byp/0w1t1K5jTHDCOGu9qKnlMNmNAyCcZJInHqZQ5mwqv9ZK5cAUP4k1ahHXXGnGdCHtDHFUFVULmnIrK9cnsR7o2+C76+7b4OR1FVhl6CYoM7RRu44xwQnjrPWippbDZDcOgHCSSZ54mEKZs6n8WiuVA1P8JNaoRVxzpRnThbQzxFFVVC1oyq2sXJ/E4veswkvSb1ENTYMk7e2KYPqm8jfd0EbtOsYEJ4yz1ouaWg6T3TgAwkkmeeJhCmXOpvJrrVQOT
PGTWKMWcc2VZkwX0s4QR1VRtaApt7JyfRJrs7B2r8zNBk6D9ko9nsnT00VtMFrWcoa54LxELk7O1U25oI+xrs01se4YrQgxsHNj4dwsIoPS+Yu2KCaHi0a7OXaSWFra6knjzWqXyKTf5drl11qpHMj7fl/GrV1CbzUotLeRtvaaSk3UIj5vYZ8kemhY9xd+G9xvbah4TFSq4X7vaziYCzNczFYdDMWymcNaBmWGNur+NI9wcqkwWLNQtJu6f9aU96NJNsR8efdEuyTkiQUsg/MdT1vr/dfRNZWa1FtcVVX0DIoz0Cw3nPaiQp5hCKw66v40j3ByqTBYs1C0G2B9grDOdqf34DIJrGmEk0uFwZqFot1IA3nRTbIh5ssDa9DonwfNcsOV3Jl81/oTij4dtxvNkx8oRudyDZNL52LI5c5d0zfHK/FnEelBFa8O7NxYODeLyKB0/qItisnhotFupIEM203SaQ9l0u9y7fJrrVQO5H2/L+PWLqG3GhTa20hbe02lJmoRn7ewTxJtFlaEkBawIlSIHgTW/k8ijrr4b4MRQkoP88p6/y79D1kvcYcQerC3wffvXq3pDiH0YJ9ZD7u3k/eBFaGF4gsmhAoRsCJUiIAVoUIErAgVImBFqBB9MVgRQgv1hWDdzFpLtNW4NhsYcS3USoEB63bj2mxgxLVQwLqathrXZgMjroUC1tW01bg2GxhxLRSwrqatxrXZwIhroYB1NW01rs0GRlwLBayraatxbTYw4looYF1NW41rs4ER10IB62raalybDYy4FqpAWBFCnyFgRagQAStChQhYESpEwIpQIQJWhAoRsCJUiIAVoUIErAgVImBFqBABK0KFCFgRKkTAilAhejRYD7vds/ePtdgcffrT9p+vHOLaRoD373a73dVxtLHA9i6cbcTV6Oblbb25uLp/zPhVvWZgjwXroQnzsIEcDrq7bv+t2SGubQR4/64JYX+s8cYC2zfZ0uFsI676yMAR1q3F9embUwgrBvZIsN6/O75a3Jz5110fUYfuH28f4tpIgJ/eHP9JzYaMjQV2d33V/eu8G4urbl/CGlg3F9fh9O+OrxnYI8E6NOHjLHdWh91Vm80hrk0F2Dz/bjGwI6zbi2v/8h8bWDcX1/5E5ZqBPRas7ZuCwxdPoVAHax/XpgK8EfFsKLB98x5uc3E1cRw/s24urps/7758WDOwR4K1e6++gY8So9q0DXFtKcDmZX+DgR3a3ttaXMc3l0dYtxbX3fXxg/TNuoUE1npLNT7q0H+/tLXA7t+9vN1aXPuGiS3C2qnpsQJh3cqbE6Gtvg0+tL+52WBg3WfpbcXVRrHJt8Gtms+pBb4N3srHfqGNfsG0737Lur3A6q73thXX/vSPJm4trpMaQAv8gmkrX6gLHbb4q5um+962x40F1nXaYXO/Ump1s8Ff3TxIwp7sH0Wc3pBs7Hfpn95cnUYbC2wAYmNxHXWzxT+KaKm8WTdhj/bnhvtN/BGY0OnTwxDXJgI8va07BrKtwOqb45vN42BjcdX9nxtuLa4HSBh/yI9QIQJWhAoRsCJUiIAVoUIErAgVImBFqBABK0KFCFgRKkTAilAhAlaEChGwIlSIgBWhQgSsCBUiYEWoEAErQoUIWBEqRMCKUCECVoQKEbAiVIiAFaFCBKwIFSJgRagQAStChQhYESpEwIpQIQJWhAoRsCJUiID1iejw9ktHgD5XwPo0dHcNrMULWJ+GgPUrELA+CX16s9u1/zAiKljA+jTEK+tXIGB9GgLWr0DA+jQErF+BgPVpCFi/AgHr0xCwfgUC1qchYP0KBKxPQ3fXV186BPS5AtYnoht+z1q8gBWhQgSsCBUiYEWoEAErQoUIWBEqRMCKUCECVoQKEbAiVIiAFaFCBKwIFSJgRagQAStChQhYESpEwIpQIQJWhAoRsCJUiIAVoUIErAgVImBFqBABK0KFCFgRKkTAilAhAlaEChGwIlSIgBWhQgSsCBUiYEWoEAErQ
oUIWBEqRMCKUCECVoQKEbAiVIiAFaFCBKwIFSJgRagQAStChegJw/rpzdu63j//IK/d/ZfbyPK7WzOWV/R0f2e3i5J8aC6/HdfTC9/99YdgyiX6+PHjssXv3+12r1Za/PXr19Hlqa3fHG+hWF81rP8n0nD3059/X9//w7cKixjC2bA2l92d3S6i9dOffqg/ffN+sFbT7q6frwTrx48RrVOL7181vK7Dy+vXIa1Tqx9erfhE9dXpa4Y1ZFXA+t1/vb37m58+1Pvd7qp9RblqfrxsOufTtz80ry7Nz+cf2nvd+dHuVX/vu9vD6RVI3DxO/0N3OiqBte3Qdr1+5uDx/qf/vVbDJrBOLH7U/mqVxTNYp1cH1lRfM6xn9Om7f3l/+MufPjTP9Hd/9b57Tj+9bn7zvnl1aX7W3b3mDfPx/Lvbo8Vp/IfG8ubtaNzdbF5Zv3mv2y2GtXm/d3z17F6KTzN7jys2bAzr9OKrrZ7AOr36nrfBqZ40rL97+9t//ulDg2nTKZ/eHJ/bxze5+6vjz9O95o1b82rTmDx7342buUcGrwbj08227+7/4b1cJ/7M2r3ZPU7uZx56jw//mXV68b96H05ZruQz6+Tq9fCcgayeNKz//jc//VsP6/ELp93bKVgPz9tX2R7WV4Ob2/FmBGuudo1h5mF89/wYbwWzxY9vKB5e6dZXexP+FepJw/r/fvX9ff82eH/E8kq9DT4+9Yu3wU1HHZ4Nb4n/0NoMxqebs2Ft+rTu1hhn9h4fHNapxY9PRg+rqdX7J04U6UnDeru/uu++YHrb/tLg5e3dtfiC6Qhue687v7ve/Yfrt8kXTN3NZvofZr6y7ts3fs2Ef+9nim9ZHvqVdWLxG/mO9NFXPy7PC2umJwxrruQXMysYI3S5gDUQsKItClgRKkTAilAhAlaEChGwIlSIgBWhQgSsCBUiYEWoEP1/rXGxT3zrIOEAAAAASUVORK5CYII=" width="\textwidth" /></p>
<div class="sourceCode"><pre class="sourceCode r"><code class="sourceCode r"><span class="kw">table</span>(<span class="dt">true =</span> dataset$z, <span class="dt">estimated =</span> <span class="kw">apply</span>(zstar, <span class="dv">2</span>, median))</code></pre></div>
<pre><code>## estimated
## true 1 2 3
## 1 154 1 0
## 2 4 227 2
## 3 0 5 107</code></pre>
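<p>As with the filtered classification, the agreement of the most probable path with the true path can be summarized in one number; again 488 of the 500 observations fall on the diagonal of the table above. The intermediate name <code>zstar_hard</code> is introduced here only for clarity:</p>
<div class="sourceCode"><pre class="sourceCode r"><code class="sourceCode r">zstar_hard <- apply(zstar, 2, median)
mean(zstar_hard == dataset$z)  # 488 / 500 = 0.976</code></pre></div>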
<p>Finally, we plot the observed series colored according to the jointly most likely state. We find only a negligible number of misclassifications, a consequence of the stochastic nature of the estimates.</p>
<div class="sourceCode"><pre class="sourceCode r"><code class="sourceCode r"><span class="kw">plot_outputvit</span>(<span class="dt">x =</span> dataset$y,
<span class="dt">z =</span> dataset$z,
<span class="dt">zstar =</span> zstar,
<span class="dt">main =</span> <span class="st">"Most probable path"</span>)</code></pre></div>
<p><img src="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAA6wAAAJACAMAAAC+FBs7AAAA7VBMVEUAAAAAADoAAGYAAP8AOjoAOmYAOpAAZmYAZpAAZrYAzQA6AAA6ADo6AGY6OgA6Ojo6OmY6OpA6ZmY6ZpA6ZrY6kLY6kNtmAABmADpmAGZmOgBmOjpmOmZmOpBmkJBmkLZmkNtmtrZmtttmtv+QOgCQOjqQOmaQZgCQZjqQkDqQkGaQkLaQtpCQttuQtv+Q29uQ2/+2ZgC2Zjq2ZpC2kDq2kGa225C227a229u22/+2/7a2/9u2///T09PbkDrbkGbbtmbbtpDb25Db27bb29vb2//b/7bb////AAD/tmb/25D/27b//7b//9v///+okv/ZAAAACXBIWXMAAA7DAAAOwwHHb6hkAAAgAElEQVR4nO2diaPcunXe571bq5XkJH5pquu6SRdLSZyuummbNq04dZ06mpF18f//OeUCnAUAd5AEht/nZ90ZEsvBwfmRPCBn5mIgCCpCl6MNgCBomgArBBUiwApBhQiwQlAhAqwQVIgAKwQVIsAKQYUIsEJQIQKsEFSIACsEFSLACkGFCLBCUCECrBBUiAArBBUiwApBhQiwQlAhAqwQVIgAKwQVIsAKQYUIsEJQIQKsEFSIACsEFSLACkGFCLBCUCECrBBUiAArBBUiwApBhQiwQlAhAqwQVIgAKwQVIsAKQYUIsEJQIQKsmenlcrm8+epeve0v+Ppfv0xu8sdYUbH59ZPtc5Lanq+XHz5PrgElEWDNTA2iLQbfnwdh/cOvogTGm0wKa9czYN1fgDUzNbBePtQvvr0fhLWHwBlFl8La1QOs+wuwZqYa1j9pybleHKz/919dLv/k37Qw/Z8/rs+7//zvLdMWtpqe/1EX+Vm9uWHoP/7q8uPfy1p2f1Ot3vyruuIvPuvNFtbXv61b/7OvbAq3y/Vsz3VH/6Uu/jMQu58Aa2aqWfjT9/VJ6/XTj3/TwdpSe2GAW0g1rJ2aU93VlRW13P6m+J2Lis0drO2Fd83fFzaFCnM9gpUbhfYRYM1MNQu//lRfB397/+a/tbDWl8Mdex8amt5+Nb9tL5PFRWxdpT6V/rYtXZd78/fm/8lazf43dn9N5T/72uwMNtelX5pmun1+u7pedxnc1O46gPYRYM1MNSEfX2p07pcP1xY/S2Xzp4blZ/+TyklYP7ot1+61rFX/88NnUeMf/vsf24ZpcwtrfShowLtSw7JdVc/C+rFdBQOsuwmwZqaGkHuNw8vl45VPet2CTv36IhJRhrV92S752HUfWUvtf/1Nd/X6VlVri7crWu6y1283qNd11Jzrd3fRaQVYM1Nzvqux+XfPP3y+N2w4HO7NmewPv+po+s8xWNsSV7rvw7VeNO4/+w//8NxBJzc35/I4rE0buh5gPUaANTO9uDNoS48+s9Z//vDv/7hbNlp2Zu2yzu/PPWfWj54pA/UA6/4CrJnpxa24vjV3P2ftSrz+dZeI9uSs3XlR56y0vz3T8gKT2+xy1reeKT31AOsxAqyZqYX13j4YUbOhV4Przf/ia3Mt3G6ghxL0anC32VsNdvvbzX/4ZI8CcrG3PVn/8Gvzh2d6PqKn3tXeIwKsewuwZqb2vFWzUaPQwSrvmP6tvvHp3WdVDxbF77N2S1RuoSh+n1UeBGL17vY+K2DdW4A1M71YNjpk+Qmmf9vu/W2Tsf6ieaSoedqoe7aoqfK/f2OfUOITLteSTzA1q7o/+09XeyHtPcH0h9+8dw86+e2Kel3PgHV/AdbyNeMx4SzahRYKsJYvwHoSAdbyBVhPIsBavgDrSQRYIagQAVYIKkSAFYIKEWCFoEIEWCGoEAFWCCpEgBWCChFghaBCBFghqBABVggqRIAVggoRYIWgQgRYIagQAVYIKkSAFYIKEWCFoEIEWCGoEAFWCCpEgBWCChFghaBCNAfW78/Nb3GKr4KHIGg/z
Yb12nx1+/fnj+PFIQhKqbmwWkyv7seLIAjaSXNh/fa+hfWOC2EI2lk4s0JQIZoHq/3RP7vUBEHQjpp566bmtf2tX7AKQXsr8X3WCwRBM7UvrPP7hSCo05aw3sUv2SdoDoLOrY1gfblcPnz76Wv/QxGAFYJmahtYX958NS/tWbXv1g1ghaCZ2gTW9nz67ecNrH0PRQBWCJqpjWBt7ti8/t7gzApBybTNZfDVnU97H4oArBA0UxstMF27ZeD7pe9DN4AVgmZq5/usWzUHQY8vwApBhQiwQlAhOi+s7969O9oECJqj08L67h1ohcoSYIWg2TomdgArBM3VQcFzWliRs0KLBVj3VnW0AVCpAqx7C7BCS4WcdWcBVmixDgkewApB8wVY9xVghRYLsO6r1P7G8vKJBFj3VWJ/48btmQRY9xVghRYLsO4rwAotFmDdV8hZj1PxvgKs+yq5v7G8PFXlX4UA1n0FWA9T8bBWgHVfAdbDBFgXCbBm3ODjqnBWAeveAqwHqnBfAdadBVgPVCG+6rsCAKw7C7Cu1YqL2TJ81ZtbA9adVQ6smWZ4a5aJAOsCAdaMG7TKde0UsO4swJpxg1blwDrdzjJgRc66RXNLNMPf04LwbLAGbplhaCGw9tkJWHfWdH9PDMKz5azBiAHrtgKsE3Q0rNmGNmDdVYB1gjaFdUrTuYa2b9fD5ayAdYPmliiPnHXSgSDX0A7smmxoriPyBVjTN7dEc/w9qSxgBaybqiRYE6+zANbVSgprlstogHVZc6nvYOQB60PlrGtgzfMGFWBd1tyDwjql2pahscapgHVXAdZ0ZXOHNeq+WV71iwLWXVUQrI+Zs+4Ha5yGOYwEZZGz7qqSYE18NQhY+7dObCHtanCOmTlgXdocYF2jHiznXQUD1m4zYB0TYF2lvo97LW8BsO4qwJqubO6w9jSzwg8PCysdlQDr0uYA6zoB1oni6/0+lwHWMaVdQtwG1vkGAdYFhmwrwLq+ubQ35+YG6WhP1TKDYnaMLeQsU66w2tEC1hEB1qllx7tKB6vfymPD6kabD6xjOStgHRdgXaax08R0nQVWsgWwLm3u0Jx1GqxpctYhWOd3QK1NhrW3j+Nh3ek5J8C6trnpqxKpP8oyOWddcH6Yl7MuOHXPhrW/j8Nz1r2eIAasa5ubDOukKZ19+TdWISGs3sYHh9X33MAIJw0/Ac/7wjrR4I1gff10afXjlyTNWQHWZaeWuTnrcbC2/Q4NccrwU5x9d4C1MbIzdKrB28B6vXzoXtzdi1XNOSWGdYbD84HVOwpPm+XKb0ZWOzxnZc918zY4e9NOrNvCWje/HtZ3TgfD+vqJEL2++bq6OVLinLVMWKNrM9O6kNeZXG26V4+FdSyzpUpbw5omb84F1u/PH93Le8+F8MawTokawCq29q9o9RjSVtgG1uFPyfd1Ica0cc6aHNZDc9bjz6yANVa/H1a/nVFYuwrb5KzDiEyAdciMYX/R3u1hFTnrVL9tlbPaU+vCnLXHGYA1eJMoZ80F1gEfpoF1GDTeu33OKsd7KKz1hXC3GtxzXh1prs+jgHVk10gXVbDFvUwN61OjCRapd8OwTs9ZN4fVpLl1kwus65orB1ZlJwfa6PnOP7PFO4y/2wLWpDnrk9CIRerdCKyjB8Phy2gy+3hYhZU5w3ohDZUqBlZtqMy8hmkNF3iiHcbfpYSVT4/TWvVaiML69DRIa+xiNhWsY5fKZuw4OilnnQDrYC/SymxgvV8uP3xe1FwpOWvpsLZ2JIa1Y7Qdn6NVesMO/TBYRxzmQ7oM1mnn71xgfblcPnz76au8iTOruaHZmlKSNg84LT2srtSZYW35pPHpd9TlmWANQyEzWF/efDUv7Vl14a2bVLAOYZM+Z6VSpeSs6WHtzqXsdnuW3QnW8Zx1d1hjEZhVztqeT7/9vIF14UMRW8Pa5moJYFXbGNZx5wfcmCGCt4I1ec7qslQeiqOViszKWfXBcNC0cdiDIQzsXQers
Dsagf54j4W1ubv6+nuT65m1O6NkBuvAtfFmsDZ/E8Nq91Zik56DiIP6YO1ZExgwbjNYvcOGaqUXR1MArObqzqcdtvObSwVrzwUpYN0V1oj9pcHqW1IZjq0Yj9xLLAKzgtVcu2Xg+6VnfWk3WOP7Aes2sD7FYX16SFhp0zCssQ7zgnVtcxvDOj9njZ5HvG0F5qzpYXV7b7cbb1wK6445q7c0BVinN7c1rG2QroM1XChZC2v/DJcH6+1GtK6BVft30LR1sPo3ffy/kZyVGR3KWQHraMntYY3cglgDq98YYA0N2QjW1vWjsHqWVEYyGvYHWB8X1q41wHoArJ3rl8DaF2SDUyk2AVZ/81JYvdsNOjTfWW0Iaz+fe8LqX+BVvCkGq4nnrBH7e2CtjoJ1LGedAWu3emlfxY0BrPHNC2H1Fg00rMxqGlhdcznAqoYVLJ3wE319sLrVYNo369bNcbB6NVPAGjmeq/KA1d+8Gaxh0C2FNXaaPgpWbckZYNXnVMA6v7nzwerFfnawutPPKKxPtrlKbMka1m5Q6WCl1gCrUJ8rVNGFsI7mrJGgezxYjWTVbh/MWStHq4O1e585rJxlJoGVnIqc1SqS50WLLoXVCzCfJfc2Bax0KF4Pq78gpAtNgNUPMLHZQSwaC2F151YbzvTON+YUsPbGwslgffdIsLrZXQ2rvHiX3M6AtScMJ8NqP3xe0cu4/YAVsIZFzwqrSrojsPKmSbCKM+4IrLGvdckc1lk5a1cWsLIC4/38NLaCGm0AsJpYSMyGlbePwRr5wrQlsIYJ33awhm7ohdWehQEryzfeO4uKsypglVJXwfNhDRe5K31rYxKsJvgq0giskSEpWCPHYsA6qdScgkmaG4bV5RhngbUKbYmbKd+qnDWI4SisnLsJ9znqZ8Fq9MezHxnWSC4GWAFr/N1Qu14VwBrdNwtWP2cVi1Neg0b4LGrDo8Kqc1bA2mPmA8E6K2f1V7/GYfXWymfA2pUFrKzo4Y+32tNsKbDKyAhbzwfW+Tmr78KEsEb8G7bVKbivNAqrn9ED1hXNjcAaD/BoAxnAqiIjbD0jWOmtcp8XeIXAGjlCcv10sCJn7YNVOwyw+j2pMC0QVuenBLDGnM71E8LKdY0qER1CMBLAarRre/ZnBWvQwXJYVZKpquQOKzlqMqzv6FuQfFuHYU2Xs7q68mYVYDXFwjolZwWsS2ANK0yFNZiAVbB6j4EshrXv0t3qbLCKC60dV4PpLc0GYA2rJYV1OGdVRVbD6j9guRTW3gOM1clglbN7AKw8GxvCOjtn1WeJA2GdnbMOwToSGfqotgpW8WDlk/v47gisMWcC1hJhbdcXV8AaRvXusFbasKmwzl4NDug+AFb9ZKX9TvNeO7qlsMCZld33WLD6/n08WLs7d4B1CFY/nTgQVv8XB+TH7UM7uj57YH20nJVX6RfBWkLOClhHYQ0OeotgjeaslSo1Dmv4tTViSw+s7/pg7bPZ6mywirnaCtaIVeGUHw2rXLh8QFjHIyMY9HJY/a4Aa/6wuhlcC+v2Oau3dglY08DqbBqC1ZwJ1pU5axGwBsYlhtW701AgrKM56xawqrG56iJDpSMIZ7ERWMWh+PFhJQeMwBqhQr4/MazuS5LcyxJh9f14JKyuLp/uB2GtaJhVxd09MKwVYI2aOQVWee3b3SMErClgvQFWteVRYa12hNUtLLnY1F9ASLCKBSh/lTsFrFW4qXhYb3RtDljNMKzeLSrAGoeV2KTYFFQ6p/qPuvbBWvV593Swtq4KfzcPsJoYrP7DH8XC+s5+rn47WNUW2iTiNnjUNTGsYc1yYa2dVXnHNvG7eYDVPDKs3cr+obCGj7oCVrEngNU/tg3fugGsC2CtSoc1fFyfDOuH9SkO65PGWTyS4y0XA9YeWI3z7hNgHYV1ds6aAFZ/W5ezzIe1tWUurOLBVr+nYVjdltba1pEKVnHjsN2gl4uTwyoHkSGswvMDsLp3nEEA1kFYvSoHwHq7U
fwvh3VyzroW1s7aAFa13OSq8HJxalj5gTSjYKV+codV3LTmn/cZeJAfsEaqsGvj3GQLqwud42BVdStKc6fAqgM6Nn5TCqz6bTA2W12uK3UbO3aDKwRv+IBVFX90WBfnrHbLfFjJ+bQHsLauoyvfik60gDVzWJPkrNNh9aLWNywOq4nnrFTiKQ4r3W1NDWtfzloUrMGKMGA1c2DlMNoRVlPFiuq2joeVVoNd0ScNqzOCMdoSVh8aCZvvyPxgdWtJwf2bKbA2jQBWUwKswu7tYbXP1syBVVygAtbI2Noe7FpS5Xht9wPW9LCGS5kDJpQNq82m3TnANmKLPgHWBLDar2tp9gNWwBq8WwSryEC7ojIjPRZWTuYGYVXXmSGsvXPg2bIPrLFYAKyAlXcOwspnUhduTxLweM6qxnUIrO3SdeRH1ZfDykUA67rmAqP19AFWamoKrCJnNXwfv6KzFDcqb904a8StGzm2XWFtDzfhRwwSwVothbWysJpK/CapXA0GrDnAWkW2rYfVVk0Oqw694D6DKJE1rOKM6l4tgDXgaBWsTxZWctkTYM0WVvk18oXAqi8mA1jpM+ddI0/ZwGov4G0J+Q6wLixY6/vzh/rf++Vy+fHLsuYKgpU+77MKViPHJ6KZH4rStlBTS2A1dGbSXTObDKtLao+F1RCrDBVdwstWEsFa6beRsRGsTxrWpyJhvb752rz6uKg5wNqV5IcCPVuoqYWwVtSjByvd6O/qt7GXAax0zhcU0XHkOFjdEYOC0b0vC1aLaYvs/OZmwFrFqvTCyt7ZCFb3b09b+cNKmWHlffj8eFi5bLfzEFi92KO0vzJ0RCkO1m/vW1jvPRfCBcIqWxKwejnrGKze7GwCqyBAVZkGa9/XumwHq9fBQbBWcu9kWE1wbFsB63BE4sw6zYQhWDl608O6LGddCWvPF6YdC+tTHNYnIyqopvaCNXo/qShYL7XeGrfUNL85wCqYWA1rPPT6YTXRryLdEFae1CFY3Va+2MgC1vAL04qC1bS8/vDZ3C89rALWrGHlIMsQ1u6ea/sRvx1h5QkJYDX07UscCQOwkpOzgXVlc4C1TFjVztj4tYmuAzGpArZBWN2dV+GTA2F1G6jsI8B6IQ2XA6xbw0rmlwSrO6VOgzU+v4DV17XGsVtgevzV4INhDcf3kLA2j+HaJ4TbnNXCKgkCrLMLNrrW+er352aFaTNYK+UwvwpgLQZWejcK65OFtUOrO7GWDqvu5whYXz99aP998xWwem09DKwaOa+jPWA1lVkCq/IFYOWHDF/efN0EVjfebGH142QGrBQLR8Ba5Q3rU5uzEqxPgHVtQePOrLVe3j44rDRN1OciWLsTxsPCGtq/AFbxiHxniXqQH7AuKtjIIfr9ue9zN4BVmErf5ODGBliVEdYyelC5tUTf2RyGlX6MEbBGdO3WgutzLGBVbaWEtTLBvoeGNf4YLrfSCyt9GvbpAFjDWMgN1rXNAdblsDqL0sHKLB0Lq3qwT3+tywisshpgnSvAKkxVOStgFbNdiWJ8bhXQTYM1hDwFrHQFA1i9DQ8Nq+GxAFbqPwKrO7saN32TYBWM8in5QFg5UgCrAaxep48EqwVlBqz8Sbpmq/zcH2BN0RxgdQuYe8Bqb18+KKzidmxbwS40AdbJAqzC1Aisbs1pB1hv7sGgx4RV3o7tTKRbQIA1SXNlwSonuDhYb0OwVnnBWi2D1ffFZrCS9YB1V1gjAeZF/2lhpX4PhpVxA6wLCyZpLjtYTSW3bA+rn7P2PYOTClZq7NFgfYrD+gRYp2sNrO154NFhFWFvKnpsQo4tEawVufNRYXXFui+fa14D1llaASt9PaeYEr+KiBrAOgIrv3tsWG3cxGGtZE3AOqu5XWAd9M0DwBoE2kPB6iznVnaCldgErJ0Aq4Z1Uc4agZWaAayAdbJS5KxngtV4nVMTAaxdRPpddH/5BF3RRwfyhlXQqGEld3MrUVhNPGflIT4urK+/n1p+UnO9C
mElBtxMBbBWfnH1wiv1sLCq+zB+DYbVvpoGK/0sQJmw8u9EWhMjt26Ww1plC2v3LWg1tH/V91uOs5rrFWBVsKph7A4r5R2A1ZQEq7k3X7Rvrv0/vDqvuT49FqyyZgxWN+LNYeWcNSGskXR6Pqz07UpbwOoeN7RDjT1u2A8rPZxSHqzm9VPzFd19v4oxWQ8Oq246CawR9EI7zXDOKhiZkLO6oQ3DGluong2rPcRQzCeAlaqKD5y3/48+yN8LK30xwBRY2an8izxRWHkIm8La0trzC8kzBFg9U5PBqiPer0Heoh7HYR3OWaN3lUxGsPJH5NoC8Y/IJYFVXK64X+Q5FtaXy+XXny49P+Q4XYDVMzVnWI1+mxGs0p29sMa+u+kMsH5/bq+A7yfIWY3fnkSOrnG2gFVHadD7kbDycNnI2C1gMxNWm7NuBGvkJ1QnwzonZ80N1r/oTqknWA0egpVnYhqs/HYxrImeYNoAVjHW5bDaGZ0Kq+f6MVjD727ySg/AavSWIVjzy1nT6CFh5RhKDqu42ASsAlYJ1QCsFV0AK++lhtWI9pyrzg0rT+2jwipmMV9Y2cgyYPU2z4e1eyFy615YKboisFbU6Oawfv/l5/bv/cCcNQNYe3JWwLoFrHxDOANY3QPG7X/umJ05rH0/YTNZhcPKPt8J1kU5Kz0IrGHlcCsBVrrLuyGsPL1zYL3d3CI239u2lTOA9YV/uHztUxFj/fpmnx7W6HnSH7aGlc7GGlb3jJOGld5OgPVmP70CWDtXOo9mBSufWVcLsHo7C4KVbooeCasbw86wypx1CFYvRJQle8GaTIDV2wlYe2CN56zulubusLphsyOyhfX7s70M3jhnBawJYJ2Vs+YLKxuaG6zGTVeYs+YAq9XqxWDAugesvhMq7kvGyzxYKTaTwlpFYFVc0P9mwCoX5fye08GqDLclJsBKBzzV7aAWXga/vJ1ab2G/62B1VyU5wOqSHL9/r6OSYJX4xMayA6xTclZ1u8vvuQ9Wa4BLIDaD1a1U7QDr1vdZ18EauSlpvFK7wUrLh37/Xkf7wyp4E39LgdXQdmolIayu6kPAuvV9VsAKWM8Cq13sZjN7tQxW9+0uywVYvY5WwkprSdvBqv5XAqwqZ3VIsIkVlz4AVspZablO+yymhavBW3+edRqsghQdEafLWfkuTRxWvSUrWO3oFsAq3BSHVbjd3WWZCOv2OSuNaTtYk2kqrOruw2RYZaxvC6u+honB6kVNZeR1D3c0A1YRSIC10m2OwyrvPffDqtqVPlRGAdZW1mx9Xz8/WCkGzARYLaSyDnc0HVZ5iQZY58JK3pOlj4TViIWm1LC2F8Krr4LPCKsrmx7WGyfpgFW+FYa5y5n8YBUBkxjWe/NNpOb6w9pHhB8BVlMlgdUFz2JY7YWUG99usLpeI49KmexgdTtPBOvrp+6LDV/WnlsPyll5O08DrxlGS9K7GKxzclZ5JRay1wtr5ffu56y3g2DlA6k+pJqtYL3xT31OhZV7IUpW5qw8xdnDuteHz9lsCd0msPIlpImWpHdRWIUJY7AypNLsBbCKQHKknAHWrpfZsDrveuycAFZ3Zr3udWZV0B0Iq7pyZSInwOoMNQzpAlgl7yKQCFZRHLBKWMk754PV3Nsv+L6u/prvwmCVJy+yxmgTpsEq4sbuH8lZJazWjBBWI4rvBeuOOeuNPjUKWKfI+4jc2k/J7Q6rWFboStM0TMlZt4SV/yyC1TKTHFbhcVWaYaUQC7ydGFYL3IyclZpNBStdvKSB1V0hbQprMq2E1Tl4CqwtjHT+CWGNu6gcWKlNessMScuKh9U4T8yBVeWssoYoPQ4rz38SWGkeN4X1qAUmEeFzYe1mOoBVBnjcANVespxVDkX2PglWL2e90fed5AWrvC4PpkYhswOstp5nsyq9CaxVNrDu96mbA2EV18c8m7K2iFu20XgTlBRW2XjkmzDzgFWteAVTo5CZBis/X61g9a4yyod1hNX8v90wE
ayUs6q58vrSBsiVp9Ww8sXYMKwV0X0UrM66nGD1DAy72RrWBTlrP6ymL2dNCusR3264Gla+C07tGHa46ksbkBJWscyRAFbbXxRWnnXAmhBWNi0JrF78yGkb0EYLTN1PLg+sGO8IKzfxYLBGc1ae9cNgTZ6zHgireCWuzRLBKv/bANaJ3254dZfJ977r5WWwUizvAWvKnNWdBHmifVjl0GQ0D8EqSnu4OPSPgdWPvG1g3SVn5Ve06iGMyhtWq5HF4NdPhGjfo04bwipXTNXsUDtmGqzKiHWwej+OswRW3YmhrfNhdY3NglUMeAtYvZifBKuc3DFYfZtV6ceGdeTbDb8/0wNOfVhvB6u6F6lmh9oxS2F159r5sDpTJ8Cqo7kQWCvZux95g7CKddFKGDodVn0cHIRVnotF6QeHdfjUutGZlcJ8Cqyht6kdt53RixoQwkpZ7L6wUlXRickKVkWA6174cgBWl3kuhtXLMKKw2uoqyxWlJ8FaXs7qNHKflZ4dTpqzJoZVoBc1YCmsumrhsOrg7IFVxTZ1L3y5DFZ3HlwBq66+DlYxAUXBOvrthqNfrLYdrPLXIQArmy8ivgxYxQq6ZyBgnVRw/283nA+rDGhRIgGs03JWXRWwDsPan7NOhHUwZ9XVV+Wsi2D1YoFdswesa8WPQI0UnAqrmPW5sM7PWTUSo7De3AOOgDX0bOW9Ww6rnotBWKWpjnHAeq1xbNPWvuQ2B1hd1Y1gVRd4zpA8YRU95wPrxJx1Iax09fy4sHZPJo1eBDdfqNbltY8EK2+ZD6vKYPKHVQ1tM1j1wSyAlbti02fCKp4bSwgrA+fMzhFW+2TSdeQ5/u7WzeunGurlsKqEMntYaQW6H1a6KhZGid6nwkqzWgKsN/E9jNqzlbgjTnnrFFhNNR1WvjOQN6zU2LBmwXp330D67f3g17q4hyJe3nxdDKtb/jkQVq/FIVjlHXNT0RbOWd1S8kRYhXGJYRWNifWWKbDKlSBlZD+szgcjsJJrhKFbw7omZx2H1RwO6+snumEz/FWk9FDEy9uzwurCxJVaCquIwSisMqJ5vxmDtd0u7mRMgFWn4NLI7GClDUOw8iw8JKwTniK0coh+f+574v8ssNIFXuawGu55Pqw395U382ClnHUzWAdy1j5YXdTtBKuMCV0wom1gpUeYXj9llbNScHoBHjVgOqw3PoK7HXYLh2E8Z6UwLxRW59W+bzcczFnZr17Omg5WXb0PVvKMOOrOgVXtnQUrXVywywa0yWXwtOYGJWkz5HCzDla6Nk0Lq/K4iD+jYTWVHoXRX4M6CVYq4YqtgVXmrOOwRnNWMwKrbX0MVmkcNbMEVgc+eW09rNqHVCt3WCcvME1sbkgbwy1RcWoAACAASURBVCqLxH20FlYK8CistqW0sNIl1VRY2ZApsJrKqP7yhFXSdiys0lty9DvBOvXWzdTmBpQ7rHImBmAVOasljq8lTWJY2YwZsFK8cs+zYe3PWWX0Ss8aNnNTWPWMqDnsgdX7fumtYd0uZzWTH4qY2ly/NoHV+9orV3UtrJX0uAcrRYI1hIjqWlI5q4jIEVi5mDLiKFiNtNZ4W8Rmfyq50Xxg1RZvDqsz6+Z9E05UGX/J9wawKpdPhfUmVv8VrHw+VRGtoowiIQ6rtF9G5HGw+iMoDVaVs6aHtXKRlhZWsdY3oBPBqieDi8R9RNMkDtVdZzYa2MEZwWrk/oeBtfJKDcIamQ65EbDOV0JYbdq3F6zu/TJYdc4awOq6WgCruDU0BqsIUsBKXQPWpc1Nh9Ut0qSAlZd7eJoSwyoD1Piw8uKxIUOjsFIx2Re1D1j1dMiN28Aq/e4ZPQnWs+SsKWC1kOrF2Z6clWClr8KcCKsuuwpWdzF9GKyuKSP/XyisziPpYdVTTcOnY7ywzm0f1GPCqoJlHFbXRBxWf6JdzqpiQEeHrbw/rGJ8+8Lqx3h0sz+V0hfngdVOnrbOVCr64ioD1pscoPBF+79IzroprDzPy
WGVOSv31Qcrp78xWImnwmClJe2sYXUpJmA1Hqw3qzisruB6WKM560awypihVzTxw7B29oqF5U1h9UJ+Y1jFRIvB7w8rWR6F1QGnDFPDHYWVu7OvAeskWBlS9utxsN7sh6IB62PCKh4LdfNW6SjsU/6wVjvAavgxM9FISlgZpTFYRRwYIyq7OInDWsVhVaG3HFbtgn1gFf5NA6sKF4cJGbEjrCpGHazuunpIRcA6nLPKgrbufFj5YMCNTII1Gh2JYCUYIzmrMSpn1bCqbLYsWPUTSNvCKoiZCeuanFXHqIWVZn1IZcCq3J8GVnEk2w1WfjkJ1s6mGKyqJxeK3c7gnEtHOS6XNazKnRNhVSaPwRr27PbOgFV6DLBuB2tLgHTOjrDy7Ih4k71UImddAuuNZLeLgQHWnp7dXsC6rDkB6009J7Qa1s4tCtamwtKc1YsYDo3FsJKtBcJ6E1/v0QurPNerHtLBykkTN5oC1ht9S01iWB8kZ+V01Z+SpLAKZ7tGjoa1P2cdgVXmrDvBelM3qgdhVRapHpLByl2khdUePmfDepO3BHWMunCgLgd0VlhVzporrF5lvkLuh5WvENyeXXJWS8f+sPK1hyy9DFb2lOG/KWB1bhERBlip7hCsFWM6Aqty7CpY3RSvhFVcDvTCKtznWcbl0sLa/DkKVrrg15GxCFZhlyvlJtJtAaxRcTR7OWsfrCIgfFhtATFHvNPu2RbWLqROAOvuOWsPrIty1kFYbcFlOeuJYBUBpKeEg8fNmKtLsNJVUipYOXJzgFX44FhYg68u8v5RsEqLVA/pYJUjSQmrc/o8WM+Ts06A1V6dcF3nVp7Lh4A1yFnzgVWEKb/xN0vTOFjXwNqTsy6CVZzxDf9NAqsYP2BdBav05zawpspZmTPDg0oIqx6BDuiNYHXvpHFytq3nemGVZWlAC2HVB23AOrm5UVhloG4H6817TnUhrK7/M8BKU2H/0SnKTFjdNckorDd5l6sUWN1qzKlg1QFRMayTctYBWN1RYBtYhXGyVbZVxFcxsHJe3f3jHUg3glVOVDGw0n2Oc8FKPuMaHEMJYZWRuxpWedqXrbKtgLV0WHn5SwzflTk9rG5uA1jlxKSFVZnHRdVbcxCswvz5sLJ79dsdYeWclS9qQlirDGDVXrCw0uABawRWAUAKWP2cdQas+q05BlZ5Z2NzWG/0Y6vClytzVonfEKzH56wLYD1pzko+4+i80e35ebC6W2Li4bNEsBoP1h1y1j1hlYEpYaU2tGm62dWwCqdPhZWnU/lvR1hdT2eHtXOTRULDWnHb9l8xSa6evEW+FFa5Jwar2ClbNSowVScGsHadFwLrcM5K7nhMWMXFThxWylkJVqZ2U1iFmTKkORhpSveEdc+cNQqrwM9o03SzDwir0cPhh7zk/D0wrGq9Jw6rm5UDYaUjylpYeSF0MawyboVNW8AqzyJTYO2Wjs4Dqwgonr/HgZXD0cyGlXLWmbAuzVnZTDZyJax0DZsYVkFoUliFI3thdU6lmzIzYbWkHgarswGwSq2Gld7Oy1mFN9WE8HwD1uWwUsDuAqudh7xhpS4eDdbhnJXdS9Mhw4b8DVgTwaoclh+szn+ZwNqTs1IXDwerjMAlsHZu3RbW9ogiwkSEdQir4kt03hbeKGfNA9ZlOesUWF0zucEq/gGs7F6ajkNgNfIbBSgYb+5jy9Nh5Qjj+BqE1e3MFlaRs1o8soD1Rh9gTAOrWxYHrBwRJcHKW9zsTIb1Rt8nPQwrucxQt9nBKi3aGtbJOaulOhmsoj3ZpfgHsLJ7aToSwarmW8TOLrC6qXejAKxTYaVwcGfYbWClayYXUYDVCx8Fq/ZIBrCGOStgVaPcFVby/SawipkFrHvCyrvk7CyA1RnCb/n4WwastKQtvVc2rNoXDOu6nDWE9Zw5Kx0Q9fS5oS2FVcykrSH2sTfTwto1ugDWeM4qD2XrYXXRxf+J+0aPD6szNBWsFMAngtXeiiO/bAlrxV3wP
oGUjvRdYdURtA5WDn8jzKfrtjxgFVQlgdXlrElhZbODnBWwbgOreH4hP1h5jCKCtoWVTrEPBattfCqsdOSaBquYDh64DJHzwCr8mx5WDsmKQD0zrEzt7jkrTcHhsNKTG3400W+TbAKr9FKPcoa1ooO7nnjA6opMgZUsk57SsNIJVcAqmwCs3f/dHsCqpWCcAWsFWLmo3GOouwisrgpgXQmraPCssFZ64tPBqnLWx4NVWyY91QMr56wnhtXPWbn8Q8L6/flD/e/9crn8+GVhczvBarjNEmAVcA3DSo6aCSu9WAqrF5vbwyorJIJVu1zAOp6zigaLgvX65mvz6uOy5jKFVdz5PSmsboF+P1i1zSlgdU+Z+LCqGYvBymWl2eKPaLAkWC2mLbILmssTVnkzKQariEVv6u2oiofVu6UT1PJiM0tY7dXsAbDe3I8ACpfkAOu39y2s954L4dSwVnFYPfAAqwp8VWURrO7pKsA6Amu73XXMtTkcDoV15zNrprDKp1vMY8HaDM2tHgv/AlYavW2TnkbsgbXSDQ5oI1gvtd4at9S0oLlMYJVz2PwdyFlF/FNl9dyoeQRYKWe9kUZgFf5WDpoIq29zAlhlzurs3w5Wd4DPFVbT8vrDZ3O/9LCaBlZ5k3QfWMmuNbDKoNgfVrZoEayuxQFYXRiL6TgeVrGZBsXfWJscVudlBWs8Zz0e1pXN0RlsCFaLwlxY5UwabrNwWKXtZgBWjh3dHvc+DKszqhtaNGcFrOxlDSt7ogRYL6TBYpwblg1rPGd1e9hWUVWVmwqrtJ9b2xTW6ua+rG8HWJ07FsGqJliMbidYRc5aGKwTm8sQVm55DqxkeCduyC3UjMDKEWhEd1RERvJKWINz5BishgvbxirqpBhYt8pZJazk5VPDujBnHYa1UnUeD9aenDXMPreElabzcFglcwlhFTkrB0mesHarwZ2W3Wfty1l56ozeL2GlEmthpTB4JFi5kUNhtQlCD6w39ZWlBcIq2qQgyRNW8/qp96Hgic35MJoHg3VizsoRaER3VCRTWIXX+2B1S29xWG0qmRmssuwDwVrT+nZdcw8PK3UqYXW34eQA9oF1cs5KPuZ5ch1OgrVKCCvN2Vaw0myLCRiBle/ODMLKU3U8rOZ+6XmCf2JzbhY2hVXMXSpYKbKXwKrWJneGVWJne3VLAsJRbnU7JaxkTRJYxcIO+28urM7982GlKeQ284d1bXPTYe0KzYHVuzV9LKwSr0NglR7nF5VYbOeu6b7xaljFl5iJbrsq1dScNQKru5zvgVUCBFhTNbchrMKfUVhFTQWrwDQCq5g6jm9huDHeEI6AVcbOfrCqXoSZA7AKzyWA1WsFsM7UOKyVOR2svTmrewBhGawy9CbDSleqmcEqonsKrHJmp8Eq42c+rEXmrGub2wzW2217WAVm82DV5agVL4XjIpvCGuas/ETWIlh5SYkCfA6solQ/rH7OujusHo+A1SyH1bIqXJsCVmeihJVjOltYmYH2hbsG51FxXBtRdg6sxB496bIVrIzCIKw0rH5Ytcd6YVWUAtatYBUzKX5ibAGsyixj5JbVsPKlr4X1Rr92RE0ZOey1sLorjrPCenPP2/fDKsGcCGtFIzstrDJ1Wgcrp2F5wSoA7c557vR6Klh7Sm0BK11xjcIaIRew9sNq530BrPTlo92Ow2EVsERgdWs8jY6AVVDjwUqhNh3WJTlrTynAmqhgmubmwOoCZBKsrvHuxTpYtVlGvDFkihHzsjOsZh6skZzVOc2FMTe+BNau3VFY1TECsAqdBVZ9QhWwGmrJzYw7v5hDYRWXvnb77Jx1LqyMnQcrn3Op8ceDdXrOOgKrmu05sI6xWgisCgQXyW7sU2CNLAJ3TQpq7MxIQEJY5UxvC2srmWBLzh4UVm58BFbhp0Ww8qTJiXOjBqzzmhuBtSsyAivRyReV3Hj3wofVVGlhFUcMfwiTYFXbAet2sKrpmwWrnrFhWD3TlRceA1Y1mBmwM
p2AdR2su+Ws3PhqWD0WjoTVSBPOCyvjU6mnawJYIw8udU2GsNp2vP3nhlWwY0sAVsDKqnphNcJlAlZe1o3BaqLfAxmFVcQlL0AVDCuH/ePBqoe+CFYJHWBd1tx6WGXOSpFq34ew2ugOYOUV/ZJgvblfLaQu1NuDYSXvbgerhGIAVkfmxrA6lwLWXlgpzouDVb6UYEyF1d2TnQOr8FYIqwfALFhlmU1hdQfnnGFVO5UXTgNrZeeKqhnAOg6rGKg0YBmsHONHwUppD2DNH1YOxpSwHpKz5gbrUyPAClh7lAmsur1SYOWclS1bDOuTEJWMwarYWgmr8u8UWKPfs6E8sw+sPo+AtStBbyWskpFKx0ElZ9D+fVBYDSEyG1bxVGPb5NNTQKuClZbcd4NVkKj2Z5GzAtayYPVisR9WqubM5YLaE6LE1rDKzws0/1hG6aUtSTbJ89kArHRZviGsRk5CAlh5EQSwDguwqkpUYmdYic/2vX03H1a3VK/ByxtWcXsBsA5rN1ipSrawqhL7wtrQKQFwC02AdQGs4U7pBcAKWOfCqnLW7gqYLe8uhRWs03JWwApYAWt6WEWJqj2vagACWEW49sPqvsr/MFjFLE6FNXnOClgB6y6wUtmFsNp9RcHqXKp+xFWXBaytVsFq90+AlSbkLLDKcmOwPsVhfXp0WN1grav0DVxVFrC2IkyXwyqwTQ+rETPxwLC6rZfLxZY9AFbjlwKsiQomau5ssPIH+HSVPGBtfxf7nLAawDodVqMG86iwio/G6yqbwMp+AazjsC7NWWWIANZdYBXxHYVVIwFYHxBW1bAqmwZW3Xq/AGtKWN3lkqx8DKxqD/tgHqxVPGelGWHXckfZwSqmz68xCCsHA2AdK7AIVnEXew6sMqbWwOqIWwDr2pzVsKG8PwGsT5Uq4c62nlFuuHLUB8CqAoWL7AErewGw2i0jsHYfFA9hdbFyOKwy+jxYhXW6il9iNazc5klhlV3sDKvwG2A9G6wiBjeC9UnD+hTAKnxaGKw847RjR1ilAx8DVunPqbByLOwIa/dnLGctAlYxZH6Q35agb4yYC2tVIKwVYJ3Y3BJY+Vc1uvIq/FLAakZgVXQ+Aqw3/nqI5j19RG4irBUFauGwksFeWcDaahGsai4Ph1W/9Wx2jecMa3OtIr4gQnz4HLD2wqo8ClgBKw9hc1hv0a91AayAVShbWGW87wurisINYJXecZZ7tJJFS2CthL+ViarRBbBKKjw3JYGVVoy3gNWLnSHlDCvNDGAVZg/B6pVNAGtnVvBVpFnDqsblT5/oYh2s6l85LQrD88D6pL6XoB9WkwTWdv8OsIqWVNRvBKvqVBqvg1pj58HqWaLMFrXcqBRX82Fl7nwY9QSH+wHrzIKJmlNfgSnmgKdKvO2BlcZvb38GEebBWhk/NGQ+5Bkgpt53OMeVeEs298CqJkvOmwdLpUqHSIrWvD2VX64KanBVEc2BJdLsw2GVGBYNq5r0qLKFNb6uMRfWrqT8EjC3i4p0HpWwhrzL2ZbxLl/wbO8Cqw7xytupYlx0qstVQQ2uKqI5sESaDViVb04J6xOnSfKOQS+saoKZtK7kclhl0HgGiKkXkaZROCGsBrAqf8mZflBY5ZMy9OW1ihX11ptgcq0tqR8DNMLrlawDWEVVEc2BJdLsUVhdvLtoFeBFMAzmcldYvbkErFOas3kq/ceLkYtg1Y8BGuF1wBrpX8EqzZIAxGopEgCrhrXSO1WpytvUoyxhbVlVc01PlC+DtfK8AVjZg2H/7OPKKLMkALFaldqpJrAL1yWwmkgp5QOxH7DOLJigOffjKuK/jWBVdSSHlb8fsGoAYrVC5wnXhrDGMAzmMlpK+UDs74HVb2UxrL7rg22A1QhYxVQJdLwJtmMHrHpP5ZerghqqqgLA6zpwpU+F7hGwPiqsT3FYn3QsFAGr7jaIGxn1gDU7WMn4x4b19VP7FVuXH78saE58Dst+TVdlv6fLiKFFYnAPWFW8y
xcVN6yjD7AaAavsVL3zqZL+9UspH4j9+cLq/l8ZXd5z2LC2gfV6+dC9uLsXc5pjWFvgu8GUAauHQjAJXtzIqN8YVvpYvrYQsE6BNQpmbFuBsL5+IkSvb77Obm57WGW0CG6i8UYe93bKYMofVvdciG9hYJyoqgDwuuag66FCT01usIq/O8BKwZUjrN+fP7qX954L4XxgFa1E440CIZxp9UJ0oaMv4EFhJGyKcBOxOogYzzjRmtgDWD126O+esEq/yVJBnMSV45nVxHNWLxYUZnqC7dgBK+/ZH9Yn9RSaW2JRnap3PlU+WrKU8oHYD1hnFmx0vdhT66Kc1ehv5Wpf9Ny64cmTE2zrAVaxZzxnVd4SBQKv6aCLUqE+hOHivZLmauP9+mY/WAP0aO8iWP35oo686ValgjiJa6PV4O/P3Wpwz3l1H1h1+BnfjaJIJf3pNUGBEM60eiG68Pr3eVAYCZsi3ESsDiLGM060Fg89VTkwTlRVAHhdc9DFqNAfmXLxXklztfFefddu1VNK+UDsB6wzCyZojo7H7Ts3+x5Inr/lBHuw6BreBtmKN7G00eeB4l2+EA73+vd5UBgJmyLcRKwOIsYzTrQWDz1VOTBOVFUAeF1z0EWoYEbpYng2rAFaspTygdgPWGcWHG6FNFiMH+Q3NPs+SJ6/aa5kvdCTwQbZCr11vztTmWJhDfZUMlzCoJYtK8vnw8ofkmr+c4+4AFbtN1kqiJO4sjyzKjrV7MsBa3/TXMmxh54MNshW3Fv+QU4dOL3B5Dnc69/nQWEkbIpwE7E6iBjPOFFYh95GsHr90ZcL0/jsxzL2gDUyripeqvIbUzWtxzgu9ET6NcRfPaOyQ2+6VakgTuLKFFb68LnxPnwuBqz9TXElxx56MtggW3FvAassEHgtHnRuo/xij+4/d+StpFHaRD2EwL9+KeWDSpf1xlXFS1V+Y6pmEFp6Iv0a4q+eUdmhN92qVBAncW10n5UvdxfcZ23V87UuYsCBRwGr15XeMxlWbfkCWD2uKsBq8oXVvH7qfSh4cnPeeqLxHB3xqFfAn9Qg7IN5obcyZ3XFh4LJc7jXv8+DwkjYFOMmtDqIGM84UVjvAawm7MKbpHCiwzJhDfFXz6js0JtuVSqIk7g2e5D/7drmKv4qUg8E9ovnTl3An9Qg7IN50YGt9w8GU2hjsC2w2Yu7SLngdSXfqk4qb2dsz0JYQ6/Fg85ufIrDan+VuVKdqndD/vVLKR94k++Nq4qXqvzG6F/6ifdK/2e8bqPb9IyqDr3pVqWCOIlrq5z1fvk4uH8SrLGgr6RfPHfaLXat2Z/USNh5s0ZvK8AadhgCEKtE99js9Uk7HQXB6p5w1aGlJpLjy3imeDOqOvSmW5UK4iSuXBeY9Bh06LBfPHd2W9ydIX9SI2HnzRq9XQCrnMGY3YHNXtxFynUSt7kUO6qTytsZ27M3rF3m307HDFjliS3EzHNaDENvXFW8VOU35v4dh1XEl/FM8WZUdehNtyoVxElc+cNKs6cd7fubHXMArHzvWIdMMAle3FThTv+lvCmt2FGdVN7OYE8TgZVfrgpqaLf4bfoAeJVaQ1fCqlgJMevekEdiGHrjquKlrE8Aa5LmHKPsvBis7O9ZsNrZvuiiy2Dlhe+LbZhOsup6KRjC5rA2Fa2DxMeXJsEaXid4XUdg7Sy1sLbLDW3nFlbVfxxWa+8orOwSEQI94/IyUPla+ETW9IOChyujxviTe2ZYadIGYXWQ6Jno3HjhOZVzS40zZrZR98rQXXD/kqyKbBOsXvQDWvKo4bQKVt3aKKxky3xYleXzYG0+HqUX8ltYyT8hrC7qL9LgcEJ7YKUK0XGp5qbBSi73YaVuOXiC6VgGq8d9r8qAVR/nA1gvfBYhMAU1ugETwnpxZwDHYcWzU3FXKjbcfEpWt4CVr/k4hriIiP7VsFJHZLlyqFFdD8Ea3CJ/srBqsPgYKd3F7iN3+VNJ8zkRVtsce28erM7H3K0oNAYrF
brIa27ld+XtQWUPK/ubxh3C6klUmwJr11e3UdLENlRkhdt2URucn6UJnt1k81xYtUt0viYjlY4gorRvjQtZ4Y9KN981rYemz+xyFUERbl8To1V7+du86oGVDFLuEqxcBMVRi3xYg3FxcYm88FY/rG4vdSD6Fe+GYbVN2aQgCqsaW2TahXKFVUaEGgxdNMgp9kSV1OxrB3EZO9/dazGJ/mGZJQ8T+oDojNKsUgnaHYNVNtQLK7cmYb2I074q7DOgnKStF5Ur37mqgOs0qNRt4QfOGpe686vvPM8geQFlXxDGEVilm3jsgnv/Aku1YvQSlcbU/XG2MqzeqY8GoYZfGWXeRbjAPTSgZ1iPLTLtssvh3QsKpmtOR4k3qXputbjWRTUUQ8BQ9IVx48cIvxaw9huvYs83VVxjeRV6LVVTyk6wisCqXgVO8oYQuivmUHaXb7Q9lfHjoVVrkc9qcC4Vb1SbdFqKmOy7SR5M4+OKtOJaYFhl23aUtmxP5BjVELtMTrdOCvpg9U/c8d6Gdy8omKw5L3ouUn1nEf/wpzXgcv/keek5Ehh/9ketDw21FavI1I5ZqonxW4y2pmoEpUWnfgXjOVQ1FMBq/4pMtYP1olg1JoQ1GL0qJ3qlQRhV0C8bG5c/eLf7crmIc6pXKyjdPx3yAloNSR6s+Cl348GqR9OrImCl98Ib4exOabB/5xisXH+ie2lnTwCoLMpMHpBvhGpS1AsbE0NhD0bisDc2dUORbcamp/YkUhkHr+5a+DNsSHZGM97rCu2D/nHJKYxZQNWVA0WzPTZIE4wanH3T8tlsoaTA1vRTsikqCFbaJt0zp8v+aVezaDyP+xUH2xk1gcfgwepDPXEwF6Fp/at3kVpzeg+2dGT6K8Ju3F44D0yiHt9wGX/4kXF5ffrOEyd7v1IkALyGRfvKkMtF0llV6v2iEMoY1uiAFnMyHIRi3iPdhDO/1AbRwgBiU8iTMTGxyhzrRkvEYXUnksj9m76aYz1MmrTp7lKvh3Gk+RnvwI8fu1GeSXm1aaLJkV6SFzykuUldToB1Z2vGLq7mtbfeqBmd+d1d3InVbWBI18E6XGhOm33dhOdU3f4E5/aY4OeoHa2LTT4RrEMuPwjWgZ172jJffQZydIrxSVgnD22OC1a5K42v460EC0qANYV2x2P/w8MeGof1TIqt/vJ18Pz2AOtRekRWAauSGDjP9gpvAFYopeLheZAxRyv1oQuwQikVCcXTnlgBK5S1AKsQYIVy1lMQi+GW8yhxUgBYoaTy2Twzq6mvMwArlFaazlOzClihzCUeMHw6N6upkwLACqVWz2PBZ1TapACwQskFVklJkwLACm0hgGqVMikArBC0pRImBYAVgjZVuqQAsELQ1kqUFABWCCpEgBWCCtFhsEIQNFMHwZpNX3OUq13ZGga7ZiqRYYA1X7uyNQx2zRRgTaZc7crWMNg1U4A1mXK1K1vDYNdMAdZkytWubA2DXTMFWJMpV7uyNQx2zRRgTaZc7crWMNg1U4A1mXK1K1vDYNdMAdZkytWubA2DXTNVIKwQBK0QYIWgQgRYIagQAVYIKkSAFYIKEWCFoEIEWCGoEAFWCCpEgBWCChFghaBCBFghqBABVggqRIAVggrRbrDeL5cfPu/V2RR9+6MvzR+yKw8DXz9dLpcPzavMDLsG5uRhV62XN19NdnZ9f26+ZPStSWnYXrDeazPvGfiQ9P35xwZWsisPA18/1SZcmznOzLBr7S1tTh52mYaBBtbc7Pr2c2tCQsN2gvX1U3O2eHm7T28TVB/lGljJrkwM/Pb+o2nJyMyw788fGme9zc5h7SmshjU7u+7tuSBthO0EKwXhPt2N6n750HqT7MrKwPr4m6NhDaz52XV989c1rNnZdbVUpjRsL1jbi4L74S4U6mB1dmVl4IuwJyPDrvU1XHZ21XY0OWt2dr38Sbf4kNKwnWDtrtUzSCVYrdvIrpwMrE/7GRp2b2MvN7uai8sG1tzs+v7cJ
NIvaScSsJqc5rjR3a0v5WbY66c3X3Oz61ozkSOsneoYKxDWXC5OhHK9DL63d24yNKzLpfOyq7Uiy8vgVnWeWuBlcC5pv1CmC0zX7i5rfoaZLvbysutqfzQxN7usakALXGDKZUFd6J7jrZs6+j62fzMzrIu0e3a3lFq9ZHjrZhOHnfahCHtBktm99G/vP9hXmRlGQGRmV6OXHB+KaKl8Seuw3R43vGbxEJiQzR7IriwMtJd1jSF5GWZemovN5kVmdhn3uGFudm3gMDzID0GFCLBCUCECrBBUiAArBBUiwApBhQiwQlAhAqwQAQaiOQAABKVJREFUVIgAKwQVIsAKQYUIsEJQIQKsEFSIACsEFSLACkGFCLBCUCECrBBUiAArBBUiwApBhQiwQlAhAqwQVIgAKwQVIsAKQYUIsEJQIQKsEFSIACsEFSLACkGFCLBCUCECrBBUiADrSXT/eLQF0FoB1nPo+zNgLV6A9RwCrA8gwHoKfXt/ubQ/jAgVLMB6DuHM+gACrOcQYH0AAdZzCLA+gADrOQRYH0CA9RwCrA8gwHoOAdYHEGA9h74/fzjaBGitAOtJ9IL7rMULsEJQIQKsEFSIACsEFSLACkGFCLBCUCECrBBUiAArBBUiwApBhQiwQlAhAqwQVIgAKwQVIsAKQYUIsEJQIQKsEFSIACsEFSLACkGFCLBCUCECrBBUiAArBBUiwApBhQiwQlAhAqwQVIgAKwQVIsAKQYUIsEJQIQKsEFSIACsEFSLACkGFCLBCUCECrBBUiAArBBUiwApBhQiwQlAhAqwQVIgAKwQVIsAKQYWodFhfP10ub4359tPXkYLf/+LreKE+XS4xP93rzR+bhqkH2d+//LKwM0+3221e39YlKfTu3bt5fZuXZteWakb3MTrf9bZ6579We1bNenbKHtYqJtr7+ulDHSBvJ8C6Zs4ulxit3/7oi/n288/UsOrh+/OPaWC93WK0DvV9fVu7JQkx797FaB3q+/527VHqKSra3Uz4919+7pnNYPMDkWryhzXKKsPaTkYdHt9+8avL2/ao+6E98tf4/uJXP/5N/eb6wVzr9/WuN7/76eu1KfDtT5/nnXp6YG0DoWn4q+vh692e1F7/6n8lOrP2wDrQd6N61AnUA+tw3+tgjbPKtLad/8Pvv7VT6Wa8+/fbT//4fPnx7376+u39pT5S9s86xcff1f//0u1sXq0wex/lDuuI7o37X//y87f3H+uzSXdcb+bz5WN97G8vjP6yOQp3W7/99Ls/+tIclut/Z0VUHNb6kq+Z4LZh6qHput2Z6jI4Dutw36k6j8M63Pd15WXwyJn13vFWT2XbdzfjNO/tf//4y8/11UX/rIv4aP7f7axf5a9HgbW+NLt+qA+pb7uU6vKhPQT/7edvf1bPzvvLD5/bafttvbuep5bhOd3Ec9buYrfpyPXQdd3t2jZnHe77l4lCL56zDvZt6IixiQjWr13f7Yzbfy2sv+sufXtnneOj+f/d7dzQ6FQqHFa6DG5hbWfoYzef7Z772zpy7j9+6bKc5bD269rMOvVw56vrZLDO73uP00TvuFNdgvfIUlVPZdt3N+PdvxrW/lnn+ACsu4oWmNrL4Gvj+A91rNavO4z//M+/NNNxt8dYe0GUBtY6HEzXEffQdt3u3RbWob6bA9eWGuq7C/0NO28XmJ4/tATWfXczbuddXAbf3/y2d9Y5Ptr/u50bGp1KhcNKt27aBabmxsGbr3YBoXV/zXE9t5d/+vzx+7NdavhoUp1Zr+21X93wP7oexELLxmfWgb5f5DXpzn03nW/ZdTubbjGp7tvOePuvg7VbYBqYdYqPNkaufbeC8lPpsELQaQRYIagQAVYIKkSAFYIKEWCFoEIEWCGoEAFWCCpEgBWCChFghaBC9P8B/sCDFFlZZDIAAAAASUVORK5CYII=" width="\textwidth" /></p>
</div>
</div>
<div id="the-input-output-hidden-markov-model" class="section level1">
<h1><span class="header-section-number">2</span> The Input-Output Hidden Markov Model</h1>
<p>The IOHMM is an architecture proposed by <span class="citation">Bengio and Frasconi (1994)</span> to map input sequences, sometimes called the control signal, to output sequences. It is a probabilistic framework that can deal with general sequence processing tasks such as production, classification and prediction. The main difference from classical HMMs, which are unsupervised learners, is the capability to learn the output sequence itself instead of only the distribution of the output sequence.</p>
<div id="definitions" class="section level2">
<h2><span class="header-section-number">2.1</span> Definitions</h2>
<p>As with HMM, IOHMM involves two interconnected models,</p>
<span class="math display">\[\begin{align*}
z_{t} &= f(z_{t-1}, \mat{u}_{t}) \\
\mat{y}_{t} &= g(z_{t }, \mat{u}_{t}).
\end{align*}\]</span>
<p>The first line corresponds to the state model, which consists of discrete-time, discrete-state hidden states <span class="math inline">\(z_t \in \{1, \dots, K\}\)</span> whose transition depends on the previous hidden state <span class="math inline">\(z_{t-1}\)</span> and the input vector <span class="math inline">\(\mat{u}_{t} \in \RR^M\)</span>. Additionally, the observation model is governed by <span class="math inline">\(g(z_{t}, \mat{u}_{t})\)</span>, where <span class="math inline">\(\mat{y}_t \in \RR^R\)</span> is the vector of observations, emissions or output. The corresponding joint distribution is</p>
<p><span class="math display">\[
p(\mat{z}_{1:T}, \mat{y}_{1:T} | \mat{u}_{1:T}).
\]</span></p>
<p>In the proposed parameterization with continuous inputs and outputs, the state model involves a multinomial regression whose parameters depend on the previous state taking the value <span class="math inline">\(i\)</span>,</p>
<p><span class="math display">\[
p(z_t | \mat{u}_{t}, z_{t-1} = i) = \text{softmax}(\mat{u}_{t} \mat{w}_i),
\]</span></p>
<p>and the observation model is built upon a linear regression with Gaussian error and parameters depending on the current state taking the value <span class="math inline">\(j\)</span>,</p>
<p><span class="math display">\[
p(\mat{y}_t | \mat{u}_{t}, z_{t} = j) = \mathcal{N}(\mat{u}_t \mat{b}_j, \mat{\Sigma}_j).
\]</span></p>
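<p>To make the two models concrete, a single step of the generative process can be sketched numerically. Everything below (the sizes <code>K</code>, <code>M</code>, <code>R</code>, the weight arrays and the noise level) is an illustrative assumption, not part of the specification above:</p>

```python
import numpy as np

rng = np.random.default_rng(0)

K, M, R = 2, 3, 1                 # hypothetical sizes: states, inputs, outputs
w = rng.normal(size=(K, M, K))    # w[i]: multinomial-regression weights given z_{t-1} = i
b = rng.normal(size=(K, M, R))    # b[j]: linear-regression coefficients for state j
sigma = np.full((K, R), 0.5)      # per-state residual standard deviations

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def iohmm_step(z_prev, u):
    """Sample (z_t, y_t) given z_{t-1} and the input vector u_t."""
    p_z = softmax(u @ w[z_prev])                   # p(z_t | z_{t-1}, u_t)
    z = rng.choice(K, p=p_z)                       # draw the next hidden state
    y = u @ b[z] + sigma[z] * rng.normal(size=R)   # y_t ~ N(u_t b_j, Sigma_j)
    return z, y, p_z

z, y, p_z = iohmm_step(z_prev=0, u=rng.normal(size=M))
```

<p>Note that both the transition probabilities and the emission mean are recomputed from the current input <code>u</code>, which is exactly what distinguishes the IOHMM from a plain HMM.</p>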
<p>IOHMM adapts the logic of HMM to allow for input and output vectors, retaining its fully probabilistic quality. Hidden states are assumed to follow a multinomial distribution that depends on the input sequence. The transition probabilities <span class="math inline">\(\Psi_t(i, j) = p(z_t = j | z_{t-1} = i, \mat{u}_{t})\)</span>, which govern the state dynamics, are driven by the control signal as well.</p>
<p>As for the output sequence, the local evidence at time <span class="math inline">\(t\)</span> now becomes <span class="math inline">\(\psi_t(j) = p(\mat{y}_t | z_t = j, \mat{\eta}_t)\)</span>, where <span class="math inline">\(\mat{\eta}_t = \ev{\mat{y}_t | z_t, \mat{u}_t}\)</span> can be interpreted as the expected location parameter for the probability distribution of the emission <span class="math inline">\(\mat{y}_{t}\)</span> conditional on the input vector <span class="math inline">\(\mat{u}_t\)</span> and the hidden state <span class="math inline">\(z_t\)</span>.</p>
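<p>Combining the two quantities, the forward recursion of the classical HMM carries over essentially unchanged, except that the transition matrix <span class="math inline">\(\Psi_t\)</span> is rebuilt at every step from the input. A minimal filtering sketch, with all sizes and parameter values invented for illustration and a scalar Gaussian emission:</p>

```python
import numpy as np

rng = np.random.default_rng(1)
K, M, T = 2, 3, 5                     # hypothetical sizes: states, inputs, time steps

w = rng.normal(size=(K, M, K))        # transition weights, one set per previous state
b = rng.normal(size=(K, M))           # emission coefficients (scalar output per state)
sigma = 0.5                           # shared residual standard deviation
u = rng.normal(size=(T, M))           # input (control) sequence
y = rng.normal(size=T)                # observed output sequence

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def forward(u, y):
    """Filtered state probabilities p(z_t | y_{1:t}, u_{1:t})."""
    alpha = np.full(K, 1.0 / K)       # assume a uniform initial distribution
    for t in range(len(y)):
        Psi = softmax(u[t] @ w)       # Psi[i, j] = p(z_t = j | z_{t-1} = i, u_t)
        # local evidence psi_t(j), up to a constant that cancels after normalization
        psi = np.exp(-0.5 * ((y[t] - u[t] @ b.T) / sigma) ** 2)
        alpha = psi * (alpha @ Psi)
        alpha /= alpha.sum()          # normalize to keep a proper distribution
    return alpha

alpha_T = forward(u, y)
```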
<p>The actual form of the emission density <span class="math inline">\(p(\mat{y}_t | \mat{\eta}_t)\)</span> can be discrete or continuous. In the case of sequence classification, or symbolic mutually exclusive emissions, it is possible to set up a multinomial distribution by running the softmax function over the estimated outputs of all possible states. If we instead approximate continuous observations with a Gaussian density, the target is estimated as a linear combination of these outputs.</p>
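<p>The two emission choices can be contrasted in a few lines; the per-state outputs and state probabilities below are made-up numbers chosen only for illustration:</p>

```python
import numpy as np

eta = np.array([0.2, 1.5, -0.7])      # hypothetical estimated output per hidden state
p_state = np.array([0.6, 0.3, 0.1])   # hypothetical state probabilities p(z_t = j | ...)

# discrete / classification case: softmax over the per-state outputs
e = np.exp(eta - eta.max())
p_class = e / e.sum()                 # a proper multinomial over the symbols

# continuous case: the target is a linear combination of the per-state outputs
y_hat = p_state @ eta                 # 0.6*0.2 + 0.3*1.5 + 0.1*(-0.7) = 0.5
```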
<p>The adaptation of the data and parameters blocks is straightforward: we add the number of input variables <code>M</code>, the array of input vectors <code>u</code>, the regressors <code>b</code> and the residual standard deviation <code>sigma</code>.</p>
<pre class="stan"><code>data {