Outline

- Introduction
- Bayesian networks: a review
- Parameter learning: Complete data
- Parameter learning: Incomplete data
- Structure learning: Complete data
- Application: classification
- Learning causal relationships
- Structure learning: Incomplete data
- Conclusion

Learning (in this context)

- Process
  - Input: dataset and prior information
  - Output: Bayesian network
- Prior information: background knowledge, such as
  - a Bayesian network (or fragments of it)
  - time ordering
  - prior probabilities
  - ...

Why learning?

- Feasibility of learning
  - Availability of data and computational power
- Need for learning
  - Characteristics of current systems and processes:
    - Defy closed-form analysis: we need a data-driven approach for characterization
    - Scale and change fast: we need continuous automatic adaptation
  - Examples: communication networks, economic markets, illegal activities, the brain...

Why learn a Bayesian network?

- Combines knowledge engineering and statistical induction
  - Covers the whole spectrum, from knowledge-intensive model construction to data-intensive model induction
- More than a learning black box
  - Explanation of outputs
  - Interpretability and modifiability
  - Algorithms for decision making, value of information, diagnosis, and repair
- Causal representation, reasoning, and discovery
  - Does smoking cause cancer?

What will I get out of this tutorial?

- An understanding of the basic concepts behind the process of learning Bayesian networks from data, so that you can
  - Read advanced papers on the subject
  - Jump-start possible applications
  - Implement the necessary algorithms
  - Advance the state of the art

Outline

- Introduction
- Bayesian networks: a review
  - Probability 101
  - What are Bayesian networks?
  - What can we do with Bayesian networks?
  - The learning problem...
- Parameter learning: Complete data
- Parameter learning: Incomplete data
- Structure learning: Complete data
- Application: classification
- Learning causal relationships
- Structure learning: Incomplete data
- Conclusion

Probability 101

- Bayes rule: P(A | B) = P(B | A) P(A) / P(B)
- Chain rule: P(X_1, ..., X_n) = P(X_1) P(X_2 | X_1) ... P(X_n | X_1, ..., X_{n-1})
- Introduction of a variable (reasoning by cases): P(X) = Σ_y P(X | Y = y) P(Y = y)

Representing the Uncertainty in a Domain

- A story with five random variables: Burglary, Earthquake, Alarm, Neighbor Call, Radio Announcement
  - Specify a joint distribution with 2^5 - 1 = 31 parameters... maybe
- An expert system for monitoring intensive-care patients
  - Specify a joint distribution over 37 variables with (at least) 2^37 parameters... no way!!!

Probabilistic Independence: a Key for Representation and Reasoning

- Recall that if X and Y are independent given Z, then P(X, Y | Z) = P(X | Z) P(Y | Z)
- In our story... if
  - burglary and earthquake are independent, and
  - burglary and radio are independent given earthquake,
  then we can reduce the number of probabilities needed

Probabilistic Independence: a Key for Representation and Reasoning

- In our story... if
  - burglary and earthquake are independent, and
  - burglary and radio are independent given earthquake,
  then instead of 15 parameters we need 8

Bayesian Networks

- Qualitative part: statistical independence statements (causality!)
  - Directed acyclic graph (DAG)
  - Nodes: random variables of interest (exhaustive and mutually exclusive states)
  - Edges: direct (causal) influence

Bayesian Network Semantics

- Compact & efficient representation:
  - Nodes have at most k parents: O(2^k n) vs. O(2^n) parameters
  - Parameters pertain to local interactions
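The parameter savings behind these slides can be checked with a quick count. This is a sketch under my own assumptions: all five story variables are binary and the network has the commonly used structure Burglary -> Alarm <- Earthquake, Alarm -> Call, Earthquake -> Radio. The slides' 15-vs-8 figure refers to their own sub-example; the 31-vs-10 count here illustrates the same idea on the full five-variable story.

```python
# Parameter count for the five-variable burglary story, assuming binary
# variables and the structure B -> A <- E, A -> C, E -> R (an assumption;
# the slides show their own graph).

def free_params(num_parents: int) -> int:
    """Free parameters in a binary node's conditional table: one
    probability per configuration of its parents."""
    return 2 ** num_parents

parents = {"Burglary": 0, "Earthquake": 0,
           "Alarm": 2, "Call": 1, "Radio": 1}

factored = sum(free_params(k) for k in parents.values())
full_joint = 2 ** len(parents) - 1   # unconstrained joint over 5 binary vars

print(full_joint, factored)   # 31 unconstrained vs. 10 in the network
```

With at most k = 2 parents, the factored form needs 1 + 1 + 4 + 2 + 2 = 10 numbers instead of 31, which is the O(2^k n) vs. O(2^n) contrast from the semantics slide.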
Monitoring Intensive-Care Patients

- The "alarm" network
- 37 variables, 509 parameters (instead of 2^37)

Qualitative Part

- Nodes are independent of non-descendants given their parents
  - P(R | E=y, A) = P(R | E=y) for all values of R, A, E
  - Given that there is an earthquake, I can predict a radio announcement regardless of whether the alarm sounds
- d-separation: a graph-theoretic criterion for reading independence statements
  - Can be computed in linear time (in the number of edges)

d-separation

- Two variables are independent if all paths between them are blocked by evidence
- Three cases:
  - Common cause
  - Intermediate cause
  - Common effect

Example

- I(X, Y | Z) denotes "X and Y are independent given Z"
- I(R, B)
- ~I(R, B | A)
- I(R, B | E, A)
- ~I(R, C | B)

I-Equivalent Bayesian Networks

- Networks are I-equivalent if their structures encode the same independence statements
- Theorem: networks are I-equivalent iff they have the same skeleton and the same "V" structures

Quantitative Part

- Associated with each node X_i there is a set of conditional probability distributions P(X_i | Pa_i : Θ)
- If variables are discrete, Θ is usually multinomial
- Variables can be continuous; Θ can be a linear Gaussian
- Combinations of discrete and continuous variables are constrained only by the available inference mechanisms

What Can We Do with Bayesian Networks?

- Probabilistic inference: belief update, e.g. P(E = Y | R = Y, C = Y)
- Probabilistic inference: belief revision, e.g. argmax_{E,B} P(e, b | C = Y)
- Qualitative inference: I(R, C | A)
- Complex inference:
  - rational decision making (influence diagrams)
  - value of information
  - sensitivity analysis
- Causality (analysis under interventions)

Bayesian Networks: Summary

- Bayesian networks: an efficient and effective representation of probability distributions
- Efficient:
  - Local models
  - Independence (d-separation)
- Effective: algorithms take advantage of structure to
  - Compute posterior probabilities
  - Compute the most probable instantiation
  - Decision making
- But there is more: statistical induction, i.e. LEARNING
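The "common effect" case of d-separation, and ~I(B, E | A)-style statements like those on the Example slide, can be verified numerically on a toy collider. All CPT values below are made up for illustration, not taken from the tutorial:

```python
from itertools import product

# Toy collider B -> A <- E: B and E are marginally independent, but become
# dependent once the common effect A is observed ("explaining away").

p_b = {0: 0.9, 1: 0.1}                 # P(Burglary)
p_e = {0: 0.8, 1: 0.2}                 # P(Earthquake)
p_a1 = {(0, 0): 0.01, (0, 1): 0.4,     # P(Alarm = 1 | B, E)
        (1, 0): 0.9,  (1, 1): 0.99}

def joint(b, e, a):
    """P(B=b, E=e, A=a) from the factored form P(b) P(e) P(a | b, e)."""
    pa = p_a1[(b, e)]
    return p_b[b] * p_e[e] * (pa if a == 1 else 1.0 - pa)

# I(B, E): marginalizing out A recovers P(b) P(e) exactly.
for b, e in product((0, 1), repeat=2):
    p_be = sum(joint(b, e, a) for a in (0, 1))
    assert abs(p_be - p_b[b] * p_e[e]) < 1e-12

# ~I(B, E | A): conditioning on A = 1 couples B and E.
z = sum(joint(b, e, 1) for b, e in product((0, 1), repeat=2))

def post(b, e):
    """P(B=b, E=e | A=1)."""
    return joint(b, e, 1) / z

p_b1_given_a = sum(post(1, e) for e in (0, 1))
p_e1_given_a = sum(post(b, 1) for b in (0, 1))
print(post(1, 1), p_b1_given_a * p_e1_given_a)   # noticeably different
```

The final two numbers differ, so P(B, E | A=1) does not factor into P(B | A=1) P(E | A=1): the path B - A - E is unblocked once A is observed, exactly the third case on the d-separation slide.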
probabilities
- Compute the most probable instantiation
- Decision making
- But there is more: statistical induction - LEARNING

Learning Bayesian networks (reminder)

The Learning Problem

Outline
- Introduction
- Bayesian networks: a review
- Parameter learning: Complete data (statistical parametric fitting, maximum likelihood estimation, Bayesian inference)
- Parameter learning: Incomplete data
- Structure learning: Complete data
- Application: classification
- Learning causal relationships
- Structure learning: Incomplete data
- Conclusion

Example: Binomial Experiment (Statistics 101)
- When tossed, a thumbtack can land in one of two positions: Head or Tail
- We denote by θ the (unknown) probability P(H)
- Estimation task: given a sequence of toss samples x[1], x[2], ..., x[M], we want to estimate the probabilities P(H) = θ and P(T) = 1 - θ

Statistical parameter fitting
- Consider instances x[1], x[2], ..., x[M] such that: the set of values that x can take is known; each is sampled from the same distribution; each is sampled independently of the rest
- The task is to find a parameter Θ so that the data can be summarized by a probability P(x[j] | Θ)
- The parameters depend on the given family of probability distributions: multinomial, Gaussian, Poisson, etc.
- We will focus on multinomial distributions; the main ideas generalize to other distribution families

The Likelihood Function
- How good is a particular θ?
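The slides answer this by evaluating the likelihood of the observed tosses; a minimal numeric sketch (the sequence H,T,T,H,H is the one used here, and the function names are mine):

```python
def likelihood(theta, tosses):
    """L(theta : D) = theta^NH * (1 - theta)^NT for i.i.d. tosses."""
    nh = tosses.count("H")
    nt = tosses.count("T")
    return theta ** nh * (1 - theta) ** nt

def mle(tosses):
    """Maximum likelihood estimate of P(H): NH / (NH + NT)."""
    return tosses.count("H") / len(tosses)

D = list("HTTHH")
print(likelihood(0.6, D))   # theta^3 (1-theta)^2 at theta = 0.6, about 0.03456
print(mle(D))               # 3/5 = 0.6
```

Note that the likelihood is maximized exactly at the empirical frequency 3/5, as the MLE slides below derive.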
- It depends on how likely it is to generate the observed data
- Thus, the likelihood for the sequence H, T, T, H, H is L(θ : D) = θ (1-θ) (1-θ) θ θ = θ^3 (1-θ)^2

Sufficient Statistics
- To compute the likelihood in the thumbtack example we only require NH and NT (the numbers of heads and of tails)
- NH and NT are sufficient statistics for the binomial distribution
- A sufficient statistic is a function that summarizes, from the data, the relevant information for the likelihood
- If s(D) = s(D'), then L(θ | D) = L(θ | D')

Maximum Likelihood Estimation
- MLE principle: learn parameters that maximize the likelihood function
- This is one of the most commonly used estimators in statistics
- Intuitively appealing

Maximum Likelihood Estimation (cont.)
- Consistent: the estimate converges to the best possible value as the number of examples grows
- Asymptotic efficiency: the estimate is as close to the true value as possible given a particular training set
- Representation invariant: a transformation of the parameter representation does not change the estimated probability distribution

Example: MLE in Binomial Data
- Applying the MLE principle we get θ_MLE = NH / (NH + NT), which coincides with what one would expect

Learning Parameters for the Burglary Story

General Bayesian Networks
- We can define the likelihood for a Bayesian network: L(Θ : D) = Π_m P(x1[m], ..., xn[m] : Θ) = Π_i Π_m P(xi[m] | Pai[m] : Θi)
- The likelihood decomposes according to the structure of the network

General Bayesian Networks (cont.)
- Decomposition ⇒ independent estimation problems
- If the parameters for each family are not related, then they can be estimated independently of each other

From Binomial to Multinomial
- For example, suppose X can have the values 1, 2, ..., K
- We want to learn the parameters θ1, θ2,
..., θK
- Sufficient statistics: N1, N2, ..., NK - the number of times each outcome is observed
- Likelihood function: L(θ : D) = Π_k θk^Nk
- MLE: θk = Nk / Σ_l Nl

Likelihood for Multinomial Networks
- When we assume that P(Xi | Pai) is multinomial, we get further decomposition
- For each value pai of the parents of Xi we get an independent multinomial problem
- The MLE is θ_{xi|pai} = N(xi, pai) / N(pai)

Is MLE all we need?
- Suppose that after 10 observations, the MLE is P(H) = 0.7 for the thumbtack
- Would you bet on heads for the next toss?

Bayesian Inference
- MLE commits to a specific value of the unknown parameter(s)
- MLE is the same in both cases; confidence in the prediction is clearly different

Bayesian Inference (cont.)
- Frequentist approach: assumes there is an unknown but fixed parameter θ; estimates θ with some confidence; predicts using the estimated parameter value
- Bayesian approach: represents uncertainty about the unknown parameter; uses probability to quantify this uncertainty, treating unknown parameters as random variables; prediction follows from the rules of probability, as an expectation over the unknown parameters

Bayesian Inference (cont.)
- We can represent our uncertainty about the sampling process using a Bayesian network
- The observed values of X are independent given θ
- The conditional probabilities, P(x[m] | θ), are the parameters in the model
- Prediction is now inference in this network

Bayesian Inference (cont.)
- Prediction as inference in this network: P(x[M+1] | x[1], ..., x[M]) = ∫ P(x[M+1] | θ) P(θ | x[1], ..., x[M]) dθ

Example: Binomial Data Revisited
- Suppose that we choose a uniform prior P(θ) = 1 for θ in [0,1]
- Then P(θ | D) is proportional to the likelihood L(θ : D)
- With (NH, NT) = (4, 1), the MLE for P(X = H) is 4/5 = 0.8, while the Bayesian prediction is (NH + 1) / (NH + NT + 2) = 5/7

Bayesian Inference and MLE
- In our example, MLE and Bayesian prediction differ
- But... if the prior is well-behaved - that is, does not assign 0 density to any "feasible" parameter value -
- then both MLE and Bayesian prediction
converge to the same value
- Both converge to the "true" underlying distribution (almost surely)

Dirichlet Priors
- Recall that the likelihood function is L(θ : D) = Π_k θk^Nk
- A Dirichlet prior with hyperparameters α1, ..., αK is defined as P(θ) ∝ Π_k θk^(αk - 1) for legal θ1, ..., θK
- Then the posterior has the same form, with hyperparameters α1 + N1, ..., αK + NK

Dirichlet Priors (cont.)
- We can compute the prediction on a new event in closed form
- If P(Θ) is Dirichlet with hyperparameters α1, ..., αK, then P(X[1] = k) = ∫ θk P(Θ) dΘ = αk / Σ_l αl
- Since the posterior is also Dirichlet, we get P(X[M+1] = k | D) = (αk + Nk) / Σ_l (αl + Nl)

Priors Intuition
- The hyperparameters α1, ..., αK can be thought of as "imaginary counts" from our prior experience
- Equivalent sample size = α1 + ... + αK
- The larger the equivalent sample size, the more confident we are in our prior

Effect of Priors
- Prediction of P(X=H) after seeing data with NH = 0.25 * NT, for different sample sizes

Effect of Priors (cont.)
- In real data, Bayesian estimates are less sensitive to noise in the data

Conjugate Families
- The property that the posterior distribution follows the same parametric form as the prior distribution is called conjugacy
- The Dirichlet prior is a conjugate family for the multinomial likelihood
- Conjugate families are useful since: many distributions can be represented with hyperparameters; they allow sequential updating within the same representation; in many cases there is a closed-form solution for prediction

Bayesian Networks and Bayesian Prediction
- Priors for each parameter group are independent
- Data instances are independent given the unknown parameters

Bayesian Networks and Bayesian Prediction (cont.)
- We can also "read" from the network: complete data ⇒ posteriors on parameters are independent

Bayesian Prediction (cont.)
- Since posteriors on parameters for each family are independent, we can compute them separately
- Posteriors for parameters within families are also independent
- Complete data ⇒ the posteriors on
θY|X=0 and θY|X=1 are independent

Bayesian Prediction (cont.)
- Given these observations, we can compute the posterior for each multinomial θ_{Xi|pai} independently
- The posterior is Dirichlet with parameters α(Xi=1|pai) + N(Xi=1|pai), ..., α(Xi=k|pai) + N(Xi=k|pai)
- The predictive distribution is then represented by the parameters θ_{xi|pai} = (α(xi, pai) + N(xi, pai)) / (α(pai) + N(pai))
- ...which is what we expected! The Bayesian analysis just made the assumptions explicit

Assessing Priors for Bayesian Networks
- We need the α(xi, pai) for each node Xi
- We can use initial parameters Θ0 as prior information
- We also need an equivalent sample size parameter M0
- Then we let α(xi, pai) = M0 * P(xi, pai | Θ0)
- This allows updating a network using new data

Learning Parameters: Case Study (cont.)
- Experiment: sample a stream of instances from the alarm network
- Learn parameters using the MLE estimator, and Bayesian estimators with uniform priors of different strengths

Learning Parameters: Case Study (cont.)
- Comparing two distributions, P(x) (true model) vs. Q(x) (learned distribution): measure their KL divergence
- If the KL divergence is 1 (with logs in base 2), the probability Q assigns to an instance will be, on average, half the probability P assigns to it
- KL(P||Q) ≥ 0
- KL(P||Q) = 0 iff P and Q are equal

Learning Parameters: Case Study (cont.)
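A toy, single-binary-variable version of this case study can be run in a few lines (the slides use the full alarm network; the sampling setup below is my own illustration):

```python
import math, random

def kl(p, q, eps=1e-12):
    """KL(P || Q) in bits between two Bernoulli distributions, given P(H), Q(H)."""
    total = 0.0
    for ph, qh in ((p, q), (1.0 - p, 1.0 - q)):
        qh = min(max(qh, eps), 1.0 - eps)   # guard against log of zero
        if ph > 0.0:
            total += ph * math.log2(ph / qh)
    return total

random.seed(1)
true_p = 0.25
for m in (10, 100, 1000):
    heads = sum(random.random() < true_p for _ in range(m))
    p_mle = heads / m                  # MLE estimate
    p_bayes = (heads + 1) / (m + 2)    # Bayesian prediction, uniform prior
    print(m, round(kl(true_p, p_mle), 4), round(kl(true_p, p_bayes), 4))
```

As on the slides, both estimators approach the true model as the sample grows, and the smoothed Bayesian estimate avoids the infinite penalty the MLE can incur at small samples.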
Learning Parameters: Summary
- Estimation relies on sufficient statistics; for multinomials these are of the form N(xi, pai)
- Parameter estimation has closed form for both MLE and Bayesian methods
- Bayesian methods also require a choice of priors
- MLE and Bayesian estimates are asymptotically equivalent and consistent
- Both can be implemented in an on-line manner by accumulating sufficient statistics

Outline
- Introduction
- Bayesian networks: a review
- Parameter learning: Complete data
- Parameter learning: Incomplete data
- Structure learning: Complete data
- Application: classification
- Learning causal relationships
- Structure learning: Incomplete data
- Conclusion

Incomplete Data
- Data is often incomplete: some variables of interest are not assigned values
- This phenomenon happens when we have missing values or hidden variables

Missing Values
- Examples: survey data; medical records (not all patients undergo all possible tests)

Missing Values (cont.)
- Complicating issue: the fact that a value is missing might be indicative of its value
- "The patient did not undergo X-ray since she complained about fever and not about broken bones..."
- To learn from incomplete data we need the following assumption
- Missing at Random (MAR): the probability that the value of Xi is missing is independent of its actual value, given the other observed values

Missing Values (cont.)
- If the MAR assumption does not hold, we can create new variables that ensure that it does
- We can then predict new examples (with the same pattern of omissions)
- We might not be able to learn about the underlying process

Hidden (Latent) Variables
- Attempt to learn a model with variables we never observe; in this case, MAR always holds
- Why should we care about unobserved variables?
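One classic answer can be made concrete by counting parameters: a hidden class variable can drastically shrink the model for a joint over the observables. A sketch under an assumed topology (attributes independent given a hidden class, in the spirit of the Autoclass model discussed next):

```python
def full_joint_params(n):
    """Free parameters for an unrestricted joint over n binary attributes."""
    return 2 ** n - 1

def hidden_class_params(n, k):
    """Hidden class C with k values, attributes independent given C:
       (k - 1) class probabilities plus k * n conditional Bernoullis."""
    return (k - 1) + k * n

print(full_joint_params(10))        # 1023
print(hidden_class_params(10, 3))   # 2 + 30 = 32
```

The hidden variable buys a compact model at the price of never observing C, which is exactly the learning problem the following slides address.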
Hidden Variables (cont.)
- Hidden variables also appear in clustering
- Autoclass model: a hidden variable assigns class labels; observed attributes are independent given the class

Learning Parameters from Incomplete Data
- Complete data: independent posteriors for θX, θY|X=H and θY|X=T
- Incomplete data: posteriors can be interdependent
- Consequence: ML parameters cannot be computed separately for each multinomial, and the posterior is not a product of independent posteriors

Example
- Simple network: P(X) assumed to be known; the likelihood is a function of 2 parameters: P(Y=H|X=H) and P(Y=H|X=T)
- Contour plots of the log likelihood for different numbers of missing values of X (M = 8)

Learning Parameters from Incomplete Data (cont.)
- In the presence of incomplete data, the likelihood can have multiple global maxima
- Example: we can rename the values of a hidden variable H; if H has two values, the likelihood has two global maxima
- Similarly, local maxima are also replicated
- Many hidden variables ⇒ a serious problem

MLE from Incomplete Data
- Finding MLE parameters: a nonlinear optimization problem

Gradient Ascent
- Main result: the gradient can be written in terms of posterior probabilities
- Requires computation of P(xi, Pai | o[m], Θ) for all i, m
- Pros: flexible; closely related to methods in neural network training
- Cons: need to project the gradient onto the space of legal parameters; to get reasonable convergence we need to combine it with "smart" optimization techniques

Expectation Maximization (EM)
- A general-purpose method for learning from incomplete data
- Intuition: if we had access to the counts, we could estimate the parameters; however, missing values do not allow us to perform the counts
- So: "complete" the counts using the current parameter assignment

EM (cont.)
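The EM iteration for the two-parameter example on these slides fits in a short sketch (the data format and known value of P(X=H) below are my own illustration):

```python
# EM for a toy version of the two-parameter example: X -> Y, both binary,
# P(X='H') assumed known, and X missing on some instances.
# We estimate theta_h = P(Y=H | X=H) and theta_t = P(Y=H | X=T).

P_X_H = 0.6  # assumed known, as on the slide

def em(data, theta_h, theta_t, iters=50):
    """data: list of (x, y) pairs, x in {'H', 'T', None}, y in {'H', 'T'}."""
    for _ in range(iters):
        # E-step: expected counts N(x, y), soft-completing each missing X
        n = {(x, y): 0.0 for x in "HT" for y in "HT"}
        for x, y in data:
            if x is not None:
                n[(x, y)] += 1.0
            else:
                ph = theta_h if y == "H" else 1.0 - theta_h
                pt = theta_t if y == "H" else 1.0 - theta_t
                w = P_X_H * ph / (P_X_H * ph + (1.0 - P_X_H) * pt)
                n[("H", y)] += w
                n[("T", y)] += 1.0 - w
        # M-step: MLE from the expected counts
        theta_h = n[("H", "H")] / (n[("H", "H")] + n[("H", "T")])
        theta_t = n[("T", "H")] / (n[("T", "H")] + n[("T", "T")])
    return theta_h, theta_t
```

On fully observed data one iteration reduces to the ordinary MLE; with missing X values the E-step spreads each instance across both completions in proportion to its posterior, exactly the "complete the counts" intuition above.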
EM (cont.)
- Formal guarantees: L(Θ1 : D) ≥ L(Θ0 : D) - each iteration improves the likelihood
- If Θ1 = Θ0, then Θ0 is a stationary point of L(Θ : D); usually, this means a local maximum
- Main cost: the computation of expected counts in the E-step
- Requires a computation pass for each instance in the training set - exactly the same as for gradient ascent!

Example: EM in clustering
- Consider the clustering example
- E-step: compute P(C[m] | X1[m], ..., Xn[m], Θ); this corresponds to a "soft assignment" to clusters
- M-step: re-estimate P(Xi | C); for each cluster, this is a weighted sum over examples

EM in Practice
- Initial parameters: random parameter setting; "best" guess from another source
- Stopping criteria: small change in the likelihood of the data; small change in the parameter values
- Avoiding bad local maxima: multiple restarts; early "pruning" of unpromising ones

Bayesian Inference with Incomplete Data
- Recall Bayesian estimation: P(x[M+1] | D) = ∫ P(x[M+1] | θ) P(θ | D) dθ
- Complete data: closed-form solution for the integral
- Incomplete data: no sufficient statistics (except the data itself); the posterior does not decompose; no closed-form solution; need to use approximations

MAP Approximation
- Simplest approximation: MAP parameters; MAP = Maximum A-Posteriori Probability
- P(x[M+1] | D) ≈ P(x[M+1] | θ_MAP), where θ_MAP = argmax_θ P(θ | D)
- Assumption: the posterior mass is dominated by the MAP parameters
- Finding MAP parameters: same techniques as finding ML parameters - maximize P(θ | D) instead of L(θ : D)

Stochastic Approximations
- Stochastic approximation: sample θ1, ..., θk from P(θ | D), and approximate P(x[M+1] | D) ≈ (1/k) Σ_i P(x[M+1] | θi)

Stochastic Approximations (cont.)
- How do we sample from P(θ | D)?
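One concrete answer is a random-walk Metropolis sampler, a basic member of the MCMC family; a minimal sketch for the binomial posterior with a uniform prior (the step size and counts below are my own choices):

```python
import random

# P(theta | D) is proportional to theta^NH * (1 - theta)^NT (uniform prior);
# the sample average of theta then approximates the Bayesian prediction.

def metropolis(nh, nt, steps=20000, seed=0):
    rng = random.Random(seed)
    theta = 0.5
    samples = []
    for _ in range(steps):
        prop = theta + rng.uniform(-0.1, 0.1)
        if 0.0 < prop < 1.0:
            # acceptance ratio of the unnormalized posterior
            ratio = (prop / theta) ** nh * ((1 - prop) / (1 - theta)) ** nt
            if rng.random() < ratio:
                theta = prop
        samples.append(theta)   # rejected moves repeat the current state
    return samples

samples = metropolis(nh=4, nt=1)
print(sum(samples) / len(samples))   # close to the exact prediction 5/7
```

For this toy case the posterior is available in closed form, which is what makes the sampler easy to sanity-check; in the incomplete-data settings above, no such closed form exists and sampling is the fallback.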
- Markov Chain Monte Carlo (MCMC) methods: find a Markov chain whose stationary distribution is P(θ | D); simulate the chain until convergence to stationary behavior; collect samples in the "stationary" regions
- Pros: a very flexible method - when other methods fail, this one usually works; the more samples collected, the better the approximation
- Cons: can be computationally expensive; how do we know when we are converging on the stationary distribution?

Stochastic Approximations: Gibbs Sampling
- Gibbs sampler: a simple method to construct an MCMC sampling process
- Start: choose (random) values for all unknown variables
- Iteration: choose an unknown variable (a missing-data variable or an unknown parameter), either at random or in round-robin visits, and sample a value for it given the current values of all other variables

Parameter Learning from Incomplete Data: Summary
- A nonlinear optimization problem
- Methods for learning: EM and gradient ascent; both exploit inference for learning
- Difficulties: exploration of a complex likelihood/posterior; more missing data ⇒ many more local maxima; cannot represent the posterior ⇒ must resort to approximations
- Inference is the main computational bottleneck for learning
- For learning large networks exact inference is infeasible ⇒ resort to stochastic simulation or approximate inference (e.g., see Jordan's tutorial)

Outline
- Introduction
- Bayesian networks: a review
- Parameter learning: Complete data
- Parameter learning: Incomplete data
- Structure learning: Complete data (scoring metrics, maximizing the score, learning local structure)
- Application: classification
- Learning causal relationships
- Structure learning: Incomplete data
- Conclusion

Benefits of Learning Structure
- Efficient learning - more accurate models with less data
- Compare: P(A) and P(B) vs. the joint P(A,B); the former requires less data!
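The data-efficiency point can be made concrete by counting free parameters; a sketch for binary variables (the burglary-story edges below are assumed, following the review section):

```python
def joint_params(n):
    """Free parameters of a full joint distribution over n binary variables."""
    return 2 ** n - 1

def network_params(parents):
    """Free parameters of a Bayesian network over binary variables:
       one Bernoulli per parent configuration of each node."""
    return sum(2 ** len(p) for p in parents.values())

# Assumed burglary-story structure: Alarm <- {Burglary, Earthquake},
# Call <- Alarm, Radio <- Earthquake
bn = {"B": [], "E": [], "A": ["B", "E"], "C": ["A"], "R": ["E"]}
print(joint_params(5))      # 31, as in the review section
print(network_params(bn))   # 1 + 1 + 4 + 2 + 2 = 10
```

Fewer free parameters means fewer quantities to estimate from the same data, which is the sense in which the factored model is more data-efficient.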
- Discover structural properties of the domain
- Identifying independencies in the domain helps to order events that occur sequentially, and supports sensitivity analysis and inference
- Predict the effect of actions
- Involves learning causal relationships among variables (deferred to a later part of the tutorial)

Approaches to Learning Structure
- Constraint based: perform tests of conditional independence; search for a network that is consistent with the observed dependencies and independencies
- Score based: define a score that evaluates how well the (in)dependencies in a structure match the observations; search for a structure that maximizes the score

Constraints versus Scores
- Constraint based: intuitive, follows closely the definition of BNs; separates structure construction from the form of the independence tests; sensitive to errors in individual tests
- Score based: statistically motivated; can make compromises
- Both are consistent - with sufficient amounts of data and computation, they learn the correct structure

Likelihood Score for Structures
- First-cut approach: use the likelihood function
- Recall, the likelihood score for a network structure and parameters is L(G, Θ_G : D)
- Since we know how to maximize the parameters, from now on we assume the parameters are the MLEs and use the log-likelihood l(G : D)

Likelihood Score for Structure (cont.)
- Rearranging terms: l(G : D) = M Σ_i I(Xi ; Pai) - M Σ_i H(Xi), where H(X) is the entropy of X and I(X;Y) is the mutual information between X and Y
- I(X;Y) measures how much "information" each variable provides about the other
- I(X;Y) ≥ 0
- I(X;Y) = 0 iff X and Y are independent
- I(X;Y) = H(X) iff X is totally predictable given Y

Likelihood Score for Structure (cont.)
- Good news - an intuitive explanation of the likelihood score: the larger the dependency of each variable on its parents, the higher the score
- Likelihood as a compromise among dependencies, based on their strength
- Bad news: adding arcs always helps, since I(X;Y) ≤ I(X;{Y,Z})
- The maximal score is attained by "complete" networks
- Such networks can overfit the data - the parameters they learn capture the noise in the data

Avoiding Overfitting
- A classic issue in learning. Standard approaches:
- Restricted hypotheses: limits the overfitting capability of the learner; example: restrict the number of parents or the number of parameters
- Minimum description length: description length measures complexity; choose the model that compactly describes the training data
- Bayesian methods: average over all possible parameter values; use prior knowledge

Avoiding Overfitting (cont.)
- Other approaches include:
- Holdout / cross-validation / leave-one-out: validate generalization on data withheld during training
- Structural risk minimization: penalize hypothesis subclasses based on their VC dimension

Minimum Description Length
- Rationale: prefer networks that facilitate compression of the data
- Compression ⇒ summarization ⇒ generalization

Minimum Description Length (cont.)
- Computing the description length of the data, we get a term of the form DL(G) + (dim(G)/2) log M - l(G : D)
- Minimizing this term is equivalent to maximizing l(G : D) - (dim(G)/2) log M

Minimum Description: Complexity Penalization
- The likelihood is (roughly) linear in M, while the penalty is logarithmic in M
- As we get more data, the penalty for complex structure is less harsh

Minimum Description: Example
- Idealized behavior

Minimum Description: Example (cont.)
- Real-data illustration with three networks: the "true" alarm network (509 parameters), a simplified network (359 parameters), and a tree (214 parameters)

Consistency of the MDL Score
- The MDL score is consistent
- As M → ∞, the "true" structure G* maximizes the score (almost surely)
- For sufficiently large M, the maximal-scoring structures are equivalent to G*
- Proof (outline): suppose G implies an independence statement not
in G*; then as M → ∞, l(G:D) ≤ l(G*:D) - εM (where ε depends on G), so MDL(G*:D) - MDL(G:D) ≥ εM - ((dim(G*) - dim(G))/2) log M
- Now suppose G* implies an independence statement not in G; then as M → ∞, l(G:D) → l(G*:D), so MDL(G:D) - MDL(G*:D) → ((dim(G) - dim(G*))/2) log M

Bayesian Inference
- Bayesian reasoning - compute the expectation over the unknown G: P(x[M+1] | D) = Σ_G P(G | D) P(x[M+1] | G, D), where P(G | D) ∝ P(G) P(D | G)
- Assumption: the Gs are mutually exclusive and exhaustive

Marginal Likelihood: Binomial Case
- Assume we observe a sequence of coin tosses...
- By the chain rule we have P(x[1], ..., x[M]) = Π_m P(x[m] | x[1], ..., x[m-1])
- Recall that with a uniform prior, P(x[m] = H | x[1], ..., x[m-1]) = (N_H^(m-1) + 1) / (m + 1), where N_H^(m-1) is the number of heads among the first m - 1 examples

Marginal Likelihood: Binomials (cont.)
- Simplifying this product, we get P(D) = (NH! NT!) / (M + 1)!

Binomial Likelihood: Example
- Idealized experiment with P(H) = 0.25

Marginal Likelihood: Example (cont.)
- Actual experiment with P(H) = 0.25

Marginal Likelihood: Multinomials
- The same argument generalizes to multinomials with a Dirichlet prior
- P(Θ) is Dirichlet with hyperparameters α1, ..., αK; D is a dataset with sufficient statistics N1, ..., NK
- Then P(D) = ( Γ(Σ_k αk) / Γ(Σ_k (αk + Nk)) ) Π_k ( Γ(αk + Nk) / Γ(αk) )

Marginal Likelihood: Bayesian Networks
- The network structure determines the form of the marginal likelihood

Marginal Likelihood (cont.)
- In general networks, the marginal likelihood has the form P(D | G) = Π_i Π_{pai} ( Γ(α(pai)) / Γ(α(pai) + N(pai)) ) Π_{xi} ( Γ(α(xi, pai) + N(xi, pai)) / Γ(α(xi, pai)) )
- where the N(..) are the counts from the data, and the α(..) are the hyperparameters for each family given G

Priors and the BDe Score
- We need prior counts α(..) for each network structure G; this can be a formidable task - there are exponentially many structures...
- Possible solution: the BDe prior
- Use a prior of the form M0, B0 = (G0, Θ0), corresponding to M0 prior examples distributed according to B0
- Set α(xi, pai^G) = M0 * P(xi, pai^G | G0, Θ0)
- Note that the pai^G are, in general, not the same as the parents of Xi in G0.
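The two routes to the marginal likelihood above (the closed-form Gamma expression and the chain-rule product of predictive probabilities) can be checked against each other; a sketch (function names are mine):

```python
import math

def log_marginal_likelihood(alphas, counts):
    """log P(D) for a multinomial with a Dirichlet(alphas) prior:
       Gamma(A)/Gamma(A+N) * prod_k Gamma(a_k+N_k)/Gamma(a_k)."""
    A, N = sum(alphas), sum(counts)
    out = math.lgamma(A) - math.lgamma(A + N)
    for a, n in zip(alphas, counts):
        out += math.lgamma(a + n) - math.lgamma(a)
    return out

def log_marginal_by_chain_rule(alphas, sequence):
    """Same quantity via the chain rule: multiply the predictive
       probabilities (a_k + N_k) / (A + m), one observation at a time."""
    counts = [0] * len(alphas)
    A = sum(alphas)
    out = 0.0
    for m, k in enumerate(sequence):
        out += math.log((alphas[k] + counts[k]) / (A + m))
        counts[k] += 1
    return out
```

For the sequence H,T,T,H,H with a uniform prior, both give NH! NT! / (M+1)! = 3! 2! / 6! = 1/60, matching the binomial formula on the slides.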
- We can compute this using standard BN tools
- This choice also has desirable theoretical properties: equivalent networks are assigned the same score

Bayesian Score: Asymptotic Behavior
- The Bayesian score seems quite different from the MDL score; however, the two scores are asymptotically equivalent
- Theorem: if the prior P(Θ | G) is "well-behaved", then log P(D | G) = l(G : D) - (dim(G)/2) log M + O(1)
- Proof (simple case): use Stirling's approximation to Γ( ); applies to Bayesian networks with Dirichlet priors
- Proof (general case): use properties of exponential models and Laplace's method for approximating integrals; applies to Bayesian networks with other parametric families

Bayesian Score: Asymptotic Behavior (cont.)
- Consequences:
- The Bayesian score is asymptotically equivalent to the MDL score; the terms log P(G) and the description length of G are constant, and thus negligible when M is large
- The Bayesian score is consistent; this follows immediately from the consistency of the MDL score
- Observed data eventually overrides prior information, assuming that the prior does not assign probability 0 to some parameter settings

Scores - Summary
- Likelihood, MDL and (log) BDe all decompose into a sum of per-family terms
- BDe requires assessing a prior network.
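The asymptotic equivalence can be checked numerically for a single binomial (dim = 1), comparing the exact uniform-prior marginal likelihood with the MDL/BIC form of the theorem; a toy sketch with idealized counts:

```python
import math

def exact_log_marginal(nh, nt):
    """log P(D) under a uniform prior: log( NH! NT! / (M+1)! )."""
    return (math.lgamma(nh + 1) + math.lgamma(nt + 1)
            - math.lgamma(nh + nt + 2))

def mdl_score(nh, nt):
    """l(G : D) at the MLE minus (dim/2) log M, with dim = 1."""
    m = nh + nt
    p = nh / m
    loglik = nh * math.log(p) + nt * math.log(1 - p)
    return loglik - 0.5 * math.log(m)

for m in (100, 10000, 1000000):
    nh = (3 * m) // 10   # idealized counts with P(H) = 0.3
    print(m, round(exact_log_marginal(nh, m - nh) - mdl_score(nh, m - nh), 3))
```

Both quantities grow linearly in M, while their gap stays O(1), which is the content of the theorem.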
- It can naturally incorporate prior knowledge and previous experience
- Both MDL and BDe are consistent and asymptotically equivalent (up to a constant)
- All three are score-equivalent - they assign the same score to equivalent networks

Outline
- Introduction
- Bayesian networks: a review
- Parameter learning: Complete data
- Parameter learning: Incomplete data
- Structure learning: Complete data (scoring metrics, maximizing the score, learning local structure)
- Application: classification
- Learning causal relationships
- Structure learning: Incomplete data
- Conclusion

Optimization Problem
- Input: training data; a scoring function (including priors, if needed); a set of possible structures (including prior knowledge about structure)
- Output: a network (or networks) that maximizes the score
- Key property - decomposability: the score of a network is a sum of terms

Learning Trees
- Trees: at most one parent per variable
- Why trees? Elegant math - we can solve the optimization problem exactly; sparse parameterization - avoids overfitting!

Learning Trees (cont.)
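The whole tree-learning procedure described on these slides fits in a short sketch. A minimal Chow-Liu-style implementation (the data format and union-find details are my own; with the likelihood score, the edge weights reduce to empirical mutual information):

```python
import math
from itertools import combinations

def mutual_information(data, i, j):
    """Empirical I(Xi ; Xj) in nats, from a list of binary tuples."""
    m = len(data)
    mi = 0.0
    for a in (0, 1):
        for b in (0, 1):
            p_ab = sum(r[i] == a and r[j] == b for r in data) / m
            p_a = sum(r[i] == a for r in data) / m
            p_b = sum(r[j] == b for r in data) / m
            if p_ab > 0:
                mi += p_ab * math.log(p_ab / (p_a * p_b))
    return mi

def chow_liu(data, n):
    """Maximum-weight spanning tree over MI weights (greedy Kruskal)."""
    edges = sorted(((mutual_information(data, i, j), i, j)
                    for i, j in combinations(range(n), 2)), reverse=True)
    parent = list(range(n))
    def find(x):                       # union-find with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    tree = []
    for w, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj
            tree.append((i, j))
    return tree
```

On data where X0 and X1 are perfectly correlated and X2 is independent of both, the tree keeps the (0,1) edge, as expected.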
- Let p(i) denote the parent of Xi, or 0 if Xi has no parents
- We can write the score as: Score = sum of edge scores + constant

Learning Trees (cont.)
- Algorithm: construct a graph with vertices 1, 2, ...; set w(i→j) = Score(Xj | Xi) - Score(Xj); find the tree (or forest) with maximal weight
- This can be done with standard algorithms in low-order polynomial time, building the tree in a greedy fashion (Kruskal's maximum spanning tree algorithm)
- Theorem: this procedure finds the tree with maximal score
- When the score is the likelihood, w(i→j) is proportional to I(Xi ; Xj); this is known as the Chow & Liu method

Heuristic Search
- We address the problem by using heuristic search
- Define a search space: nodes are possible structures; edges denote adjacency of structures
- Traverse this space looking for high-scoring structures
- Search techniques: greedy hill-climbing, best-first search, simulated annealing, ...

Heuristic Search (cont.)
- Typical operations: add an edge, remove an edge, reverse an edge

Exploiting Decomposability in Local Search
- Caching: to update the score after a local change, we only need to re-score the families that were changed in the last move

Greedy Hill-Climbing
- The simplest heuristic local search
- Start with a given network: the empty network, the best tree, or a random network
- At each iteration: evaluate all possible changes; apply the change that leads to the best improvement in score; reiterate
- Stop when no modification improves the score
- Each step requires evaluating approximately n new changes

Greedy Hill-Climbing (cont.)
- Greedy hill-climbing can get stuck in:
- Local maxima: all one-edge changes reduce the score
- Plateaus: some one-edge changes leave the score unchanged
- Both occur in the search space

Greedy Hill-Climbing (cont.)
- To avoid these problems, we can use:
- TABU search: keep a list of the K most recently visited structures, and apply the best move that does not lead to a structure in the list
- This escapes plateaus and local maxima whose "basin" is smaller than K
structures
- Random restarts: once stuck, apply some fixed number of random edge changes and restart the search; this can escape from the basin of one maximum to another

Greedy Hill-Climbing
- Greedy hill-climbing with TABU-list and random restarts on alarm

Other Local Search Heuristics
- Stochastic first-ascent hill-climbing: evaluate possible changes at random; apply the first one that leads "uphill"; stop after a fixed number of "unsuccessful" attempts to change the current candidate
- Simulated annealing: a similar idea, but also apply "downhill" changes with a probability that is proportional to the change in score; use a temperature to control the amount of random downhill steps; slowly "cool" the temperature to reach a regime where only strict uphill moves are performed

I-Equivalence Class Search
- So far, we have seen generic search methods; can we exploit the structure of our domain?
- Idea: search the space of I-equivalence classes
- Each I-equivalence class is represented by a PDAG (partially directed acyclic graph) - skeleton + v-structures
- Benefits: the space of PDAGs has fewer local maxima and plateaus, and there are fewer PDAGs than DAGs

I-Equivalence Class Search (cont.)
- Evaluating changes is more expensive
- These algorithms are more complex to implement

Search and Statistics
- Evaluating the score of a structure requires the corresponding counts (sufficient statistics)
- Significant computation is spent collecting these counts; each collection requires a pass over the training data
- Reduce the overhead by caching previously computed counts: avoid duplicated effort; marginalize counts, e.g. N(X,Y) → N(X)

Learning in Practice: Time & Statistics
- Using greedy hill-climbing on 10,000 instances from alarm

Learning in Practice: Alarm Domain

Model Averaging
- Recall, the Bayesian analysis started with P(x[M+1] | D) = Σ_G P(G | D) P(x[M+1] | G, D)
- This requires us to average over all possible models

Model Averaging (cont.)
- So far, we focused on a single model: find the best-scoring model and use it to predict the next example
- Implicit assumption: the best-scoring
model dominates the weighted sum
- Pros: we get a single structure, which allows for efficient use in our tasks
- Cons: we are committing to the independencies of a particular structure; other structures might be as probable given the data

Model Averaging (cont.)
- Can we do better?
- Full averaging: sum over all structures; usually intractable - there are exponentially many structures
- Approximate averaging: find the K largest-scoring structures and approximate the sum by averaging over their predictions, with the weight of each structure determined by the Bayes factor

Search: Summary
- A discrete optimization problem; in general, NP-hard
- Need to resort to heuristic search
- In practice, search is relatively fast (~100 vars in ~10 min), thanks to decomposability and sufficient statistics
- In some cases, we can reduce the search problem to an easy optimization problem; example: learning trees

Outline
- Introduction
- Bayesian networks: a review
- Parameter learning: Complete data
- Parameter learning: Incomplete data
- Structure learning: Complete data (scoring metrics, maximizing the score, learning local structure)
- Application: classification
- Learning causal relationships
- Structure learning: Incomplete data
- Conclusion

Learning Loop: Summary

Why Struggle for Accurate Structure?
- Adding an arc: increases the number of parameters to be fitted, and makes wrong assumptions about causality and domain structure
- Missing an arc: cannot be compensated by accurate fitting of parameters, and also misses causality and domain structure

Local and Global Structure

Context-Specific Independence
- Variable independence - structure of the graph: P(A|B,C) = P(A|C) for all values of A, B, C
- Helps knowledge acquisition, probabilistic inference, and learning
- CSI - local structure on the conditional probability distributions: P(A|B,C=c) = p for all values of A, B
- Also helps knowledge acquisition, probabilistic inference, and learning

Effects on Learning
- Global structure: enables decomposability of the score ⇒ search is feasible
- Local structure: reduces the number of
parameters to be fitted ⇒ better estimates and more accurate global structure

Local Structure ⇒ More Accurate Global Structure

Learning Loop with Local Structure
- What is needed: a Bayesian and/or MDL score for the local structures, and a control loop (search) for inducing the local structure

Scores
- MDL: go through the exercise of building the description length, plus maximum-likelihood parameter fitting
- BDe priors: from M0 and P(xi, pai) compute priors for the parameters of any structure
- Need two assumptions besides parameter independence: 1. semantic coherence; 2. composition

Experimental Evaluation
- Set-up: use Bayesian networks to generate data; measure error based on cross-entropy
- Results: tree-based representations learn better networks given the same amount of data
- For M > 8k in the alarm experiment, BDe tree-error is 70% of tabular-error, and MDL tree-error is 50% of tabular-error
- Tree-based representations also learn better global structure given the same amount of data
- Check the cross-entropy of the found structure (with optimal parameter fit) against the generating Bayesian network

Other Types of Local Structure
- Graphs; noisy-or, noisy-max (causal independence); regression; neural nets; continuous representations, such as Gaussians
- To "plug in" a different representation, we need the following: sufficient statistics, estimation of parameters, and the marginal likelihood

Learning with Complete Data: Summary

Outline
- Introduction
- Bayesian networks: a review
- Parameter learning: Complete data
- Parameter learning: Incomplete data
- Structure learning: Complete data
- Application: classification
- Learning causal relationships
- Structure learning: Incomplete data
- Conclusion

The Classification Problem
- From a data set describing objects by vectors of features and a class
- Find a function F: features → class, to classify a new object

Examples
- Predicting heart disease
- Features: cholesterol, chest pain, angina, age, etc.
- Class: {present, absent}
- Finding "lemons" in cars
- Features: make, brand, miles per gallon, acceleration, etc.
- Class: {normal, lemon}
- Digit recognition
- Features: matrix of pixel descriptors
- Class: {1, 2, 3, 4, 5, 6, 7, 8, 9, 0}
- Speech recognition
- Features: signal characteristics, language model
- Class: {pause/hesitation, retraction}

Approaches
- Memory based: define a distance between samples; nearest neighbor, support vector machines
- Decision surface: find the best partition of the space; CART, decision trees
- Generative models: induce a model and impose a decision rule; Bayesian networks

Generative Models
- Bayesian classifiers
- Induce a probability model describing the data: P(F1, ..., Fn, C)
- Impose a decision rule: given a new object <f1, ..., fn>, choose c = argmax_c P(C = c | f1, ..., fn)
- We have shifted the problem to learning P(F1, ..., Fn, C)
- Learn a Bayesian network representation for P(F1, ..., Fn, C)

Optimality of the Decision Rule
- Minimizing the error rate: let ci be the true class, and let lj be the class returned by the classifier
- A decision by the classifier is correct if ci = lj, and in error if ci ≠ lj
- The error incurred by choosing label lj is Σ_{ci ≠ lj} P(ci | f1, ..., fn) = 1 - P(lj | f1, ..., fn)
- Thus, had we had access to P, we would minimize the error rate by choosing the li with maximal P(li | f1, ..., fn), which is exactly the decision rule of the Bayesian classifier

Advantages of the Generative Model Approach
- Output: a ranking over the outcomes - the likelihood of present vs. absent
- Explanation: what is the profile of a "typical" person with a heart disease?
- Missing values: handled both in training and in testing
- Value of information: if the person has high cholesterol and high blood sugar, which other tests should be conducted?
- Validation: confidence measures over the model and its parameters
- Background knowledge: priors and structure

Advantages of Using a Bayesian Network
- Efficiency in learning and query answering
- Combines knowledge engineering and statistical induction
- Algorithms for decision making, value of information, diagnosis and repair

The Naïve Bayesian Classifier
- A fixed structure encoding the assumption that features are independent of each other given the class
- Learning amounts to estimating the parameters of P(Fi | C) for each Fi

The Naïve Bayesian Classifier (cont.)
- Common practice is to estimate θ_{fi|c} = N(fi, c) / N(c)
- These estimates are identical to MLE for multinomials
- The estimates are robust, consisting of low-order statistics that require few instances
- Has proven to be a powerful classifier

Improving Naïve Bayes
- Naïve Bayes encodes assumptions of independence that may be unreasonable: are pregnancy and age independent given diabetes?
- Problem: the same evidence may be incorporated multiple times
- The success of naïve Bayes is attributed to robust estimation; a decision may be correct even if the probabilities are inaccurate
- Idea: improve on naïve Bayes by weakening the independence assumptions
- Bayesian networks provide the appropriate mathematical language for this task
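The classifier described in this section can be sketched end to end; a minimal version (class and feature names are invented, and add-one smoothing stands in for the Dirichlet prior discussed earlier):

```python
from collections import defaultdict

class NaiveBayes:
    """Multinomial naive Bayes with add-one smoothing - a minimal sketch."""
    def fit(self, X, y):
        self.classes = sorted(set(y))
        self.n_feats = len(X[0])
        self.class_count = defaultdict(int)
        self.feat_count = defaultdict(int)   # keyed by (class, index, value)
        self.values = [sorted({row[i] for row in X}) for i in range(self.n_feats)]
        for row, c in zip(X, y):
            self.class_count[c] += 1
            for i, v in enumerate(row):
                self.feat_count[(c, i, v)] += 1
        return self

    def predict(self, row):
        # c = argmax_c P(c) * prod_i P(f_i | c), with add-one smoothing
        best, best_score = None, 0.0
        for c in self.classes:
            score = self.class_count[c] / sum(self.class_count.values())
            for i, v in enumerate(row):
                score *= ((self.feat_count[(c, i, v)] + 1)
                          / (self.class_count[c] + len(self.values[i])))
            if score > best_score:
                best, best_score = c, score
        return best

X = [(1, 1), (1, 0), (0, 1), (0, 0)]
y = ["pos", "pos", "neg", "neg"]
clf = NaiveBayes().fit(X, y)
print(clf.predict((1, 1)))   # pos
```

Each P(Fi | C) is a separate low-order statistic, which is exactly why the estimates stay robust with few instances, as noted above.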