On Tue, Jul 13, 2010 at 2:16 PM, Glen Jarvis <glen at glenjarvis.com> wrote:

> I was invited to this list for questions I have about Hidden Markov
> Models. Please forgive the extra chatter.

This is exactly what this list is for!

"A HMM can be considered as the simplest dynamic Bayesian network."

Eek!  That's an awful place to start from, unless you already know
Bayesian networks (if you do, I have a very dense 600-page book I'd love
to get an explanation of).  It's like saying a NOT gate is the simplest
quantum tunneling gate, or some similarly startling "this is the simplest
example of a [insert family of obscure and generally brain-fucking objects
here]."  Bayes nets are nasty, awful things to implement, and HMMs are far
simpler.

Stick to the "Concrete Example" on Wikipedia.  I did a presentation on it
in the last ML incarnation; it might have been one of the things we
programmed our way through.  Here's that presentation as a correspondence
course.

There are two big conceptual hurdles to understanding HMMs:

1) the hidden part (which is always poorly explained, because it's
impossible to explain in a straightforward and concise way, and academics
choose concise), and

2) the dynamic programming part (which is typically poorly explained,
because it's difficult to explain in a straightforward and concise way,
and academics choose concise).

Load up the "Concrete Example" section of Wikipedia's HMM page.  Do not
click any other links, especially not the Markov model page, as it's
obtuse and useless to a beginner.  Start by writing something that
generates random observations from the underlying model, to get a feel
for what an HMM is modeling.  It's not intuitive at first, and then it
suddenly clicks and you get it.  You will then forget, and repeat the
moment of enlightenment several times as you see different confusing
aspects of the model (unless you've been thinking probabilistically for a
long time).

Once you get the idea of how the HMM is "hiding" its state, and have an
intuitive feeling for how the combined probabilities work on the
generation side, you can step back to the conditional-probability bit.
At that point, make a big hidden-by-observed matrix on one piece of paper
and draw a toy model on another.  Make the probabilities friendly.  From
there, start at the very beginning of the piece of paper and do what your
software did above.  Do this for 4 or 5 columns, which is a lot of time
spent copying numbers and feeling how the cells interconnect.  As you
fill in each cell, draw arrows between cells if you need to, and say out
loud what's happening.  Do this in the privacy of your own home.  But
really, do talk yourself through it; it's very helpful.

You can implement this process (it's very computationally expensive, but
that doesn't matter for our toy problems), and you'll have one thing that
uses an HMM to explain a sequence of observations.  It's only useful for
pedagogy, so write it, commit it to your DVCS of choice, and forget about
it.

Go read up on Levenshtein distance
(http://en.wikipedia.org/wiki/Levenshtein_distance) -- this is dynamic
programming.  It's the class of things Viterbi fits into.  Once you have
implemented it and it works (there's a bare-bones sketch below if you
want a starting point), make sure you understand how the dynamic
programming worked there.

Next, go to the Wikipedia page for the Viterbi algorithm.  It's a
dynamic-programming algorithm with a bunch of cells, much like
Levenshtein, but this time you're keeping track of each transition (so
you have the whole path).  In Levenshtein, this is equivalent to
remembering the sequence of edit commands you used to transform the
string.  In the cells, instead of an edit distance, you have the
probability that you went into that state.  Unlike Levenshtein, this path
isn't necessarily a connected sequence of cells (you can transition to
any of the next states at each point; I'm pretty sure this is actually a
higher-dimensional walk and the cells are therefore adjacent in the next
dimension, but I haven't bothered to think it through).  Anyway, you end
up with one highlighted cell per column, which shows the most probable
explanation of how you got to that observation.

From here, you're ready for Baum-Welch, but that's not as amenable to a
correspondence course without a bit more math.
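
Since the Levenshtein grid is the easiest place to pick up the
dynamic-programming habit, here's the bare-bones sketch I mean (nothing
clever, just the standard fill-the-table edit distance):

def levenshtein(a, b):
    """Classic dynamic-programming edit distance: d[i][j] is the cost of
    turning a[:i] into b[:j]."""
    d = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i in range(len(a) + 1):
        d[i][0] = i                      # i deletions
    for j in range(len(b) + 1):
        d[0][j] = j                      # j insertions
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            sub = 0 if a[i - 1] == b[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # delete a[i-1]
                          d[i][j - 1] + 1,        # insert b[j-1]
                          d[i - 1][j - 1] + sub)  # match / substitute
    return d[len(a)][len(b)]

print(levenshtein("kitten", "sitting"))   # 3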


FWIW, in my toy models, I like 3 hidden states and 3 emitted states, with
the hidden states having even chances of transitioning to each other (1/3
all round), and having each hidden state correspond to an observed state at
.8, with .1 to the other observed states.
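
And since a concrete file beats a description, here's a minimal Python
sketch of exactly that toy model (the state and symbol names are made up;
the numbers are the 1/3 transitions and .8/.1/.1 emissions above).  It
generates a random observation sequence and then runs Viterbi over it, so
you can compare the decoded path against the hidden path that actually
produced the data:

import random

states = ["s0", "s1", "s2"]
symbols = ["a", "b", "c"]

start = {s: 1.0 / 3 for s in states}                       # uniform start
trans = {s: {t: 1.0 / 3 for t in states} for s in states}  # 1/3 all round
emit = {"s0": {"a": 0.8, "b": 0.1, "c": 0.1},              # each hidden state
        "s1": {"a": 0.1, "b": 0.8, "c": 0.1},              # "prefers" one symbol
        "s2": {"a": 0.1, "b": 0.1, "c": 0.8}}

def pick(dist):
    """Sample a key from a {key: probability} dict."""
    r, acc = random.random(), 0.0
    for k, p in dist.items():
        acc += p
        if r < acc:
            return k
    return k

def generate(n):
    """Generate n observations, plus the hidden path that produced them."""
    path, obs = [], []
    state = pick(start)
    for _ in range(n):
        path.append(state)
        obs.append(pick(emit[state]))
        state = pick(trans[state])
    return path, obs

def viterbi(obs):
    """Most probable hidden path for an observation sequence
    (no log-space tricks; fine at toy lengths)."""
    # v[t][s] = probability of the best path ending in state s at time t
    v = [{s: start[s] * emit[s][obs[0]] for s in states}]
    back = [{}]
    for t in range(1, len(obs)):
        v.append({})
        back.append({})
        for s in states:
            prev = max(states, key=lambda p: v[t - 1][p] * trans[p][s])
            v[t][s] = v[t - 1][prev] * trans[prev][s] * emit[s][obs[t]]
            back[t][s] = prev
    # trace back from the best final state
    last = max(states, key=lambda s: v[-1][s])
    path = [last]
    for t in range(len(obs) - 1, 0, -1):
        path.append(back[t][path[-1]])
    return list(reversed(path))

hidden, observed = generate(10)
print("hidden:  ", hidden)
print("observed:", observed)
print("viterbi: ", viterbi(observed))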

Happy hacking,
--
/jbm

> I work at a Phylogenomics lab and we do HMM stuff all the time -- with
> either the SAM software tool (http://compbio.soe.ucsc.edu/sam.html) or
> the newer tool HMMER (http://hmmer.janelia.org/). Conceptually I have
> 'enough' knowledge -- the way a PC user knows enough to point and click
> to read their email.
>
> But, the fundamental "could I write my own toy example" always escapes me.
>
> Here's what I know (from the ground up):
>
> Conditional Probability:
>     Although I never reviewed this Wiki page, it's probably a fairly good
> introduction:
>     http://en.wikipedia.org/wiki/Conditional_probability
>
>
> Bayes' Theorem:
>      This wiki page doesn't explain it the way I would to someone just
> starting off. I'd start with a conditional tree. A classic textbook
> example: the base decision (the root of the tree) is whether the owner
> purchased a warranty (yes is the left branch, no is the right branch).
> Then add a second condition -- say, the radio breaks (another fork of
> the left branch). Now we can ask things like "given that someone
> purchased a warranty, what's the probability that they have a broken
> radio?"  Obviously this would be much clearer with a whiteboard and a
> decision tree. Bayes' theorem has always twisted a mind or two in
> probability class - but it's actually not that hard once you get your
> mind around it (although I need a refresher).
>
>     http://en.wikipedia.org/wiki/Bayes'_theorem
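
(Putting made-up numbers on that warranty example helps: say 40% of owners
bought the warranty, 30% of warranty owners end up with a broken radio,
and 10% of non-warranty owners do.  Then P(warranty | broken radio) =
(0.3 * 0.4) / (0.3 * 0.4 + 0.1 * 0.6) = 0.12 / 0.18 = 2/3.  That's all
Bayes' theorem does: it flips P(broken radio | warranty), which you can
read off the tree, into P(warranty | broken radio), which is the thing
you actually want to know.)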
>
>
> The Simplest Bayesian Network
>     I'm obviously working myself up to the Wiki page for an HMM (
> http://en.wikipedia.org/wiki/Hidden_Markov_model), which explains that "A
> HMM can be considered as the simplest dynamic Bayesian network." [1]
>
>     But the dynamic Bayesian network Wiki page (
> http://en.wikipedia.org/wiki/Dynamic_Bayesian_network) used to talk about
> grass, water and sprinklers as an example. Although I didn't get it 100%,
> I always felt comfortable that there was a real-world example.
>
>     With all of that said, I still couldn't "do it." How is this different
> from a Bayesian decision tree? If I had a simple exercise that I could work
> through, with an 'answer to compare to', it would help.
>
>
> Best of all, I'd like to write a very small toy program. I'm most
> comfortable with Protein Multiple Sequence Alignments (MSAs), so if I could
> make a very basic HMM from a small MSA, it'd really help.
>
>     Can anyone explain to me, as if to a 6-year-old, how a simple
> Bayesian network (HMM) can be created? If you want an example, imagine
> the following:
>
>
>
>
> * There are 20 letters in the alphabet (all letters except J, O, B, Z, U,
> X -- an interesting way to remember it :)
> * An example of a set of sequences that have been aligned follows:
>
> DLITPLHTYMITGN-VCVHVIKKIIELGGDMDMKCV
> NLITPLHSYLRRDELISASVLKKVIELGADRNLRCC
> HLITPLHSYLRRDESISASVLKKVIELGADRNLRCC
> HLITPLHSYLRRDESISASVLKKVIELGADRNLRCC
> HLITPLHSYLRRDESISASVLKKVIELGADRNLRCC
> DLITPLHTYMITGN-VCVHVIKKIIELGGDMDMKCV
> NLITPLHTYMITGN-VCVHVIKKIIELGGDMDMKCI
> NLITPLHTYMITGN-VCVDVIKKIIELGGDMDMKCV
> DLITPLHTYTITGN-VCAYVIKKIIELGGDMDMKCV
> DLITPLHTYMITGN-VCVHVIKKIIELG--------
>
> So, in this example, the first column is more variable than the second or
> the fourteenth.
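
That column-to-column difference is exactly where the match-state emission
probabilities come from: you count.  Here's a minimal sketch using your
alignment (plain counting only; the real tools add pseudocounts or
Dirichlet-mixture priors so unseen letters don't end up with probability
zero):

from collections import Counter

msa = ["DLITPLHTYMITGN-VCVHVIKKIIELGGDMDMKCV",
       "NLITPLHSYLRRDELISASVLKKVIELGADRNLRCC",
       "HLITPLHSYLRRDESISASVLKKVIELGADRNLRCC",
       "HLITPLHSYLRRDESISASVLKKVIELGADRNLRCC",
       "HLITPLHSYLRRDESISASVLKKVIELGADRNLRCC",
       "DLITPLHTYMITGN-VCVHVIKKIIELGGDMDMKCV",
       "NLITPLHTYMITGN-VCVHVIKKIIELGGDMDMKCI",
       "NLITPLHTYMITGN-VCVDVIKKIIELGGDMDMKCV",
       "DLITPLHTYTITGN-VCAYVIKKIIELGGDMDMKCV",
       "DLITPLHTYMITGN-VCVHVIKKIIELG--------"]

def column_freqs(col):
    """Emission frequencies for one alignment column, ignoring gaps."""
    letters = [seq[col] for seq in msa if seq[col] != "-"]
    counts = Counter(letters)
    total = sum(counts.values())
    return {aa: n / total for aa, n in counts.items()}

print(column_freqs(0))   # {'D': 0.4, 'N': 0.3, 'H': 0.3} -- variable
print(column_freqs(1))   # {'L': 1.0}                     -- conserved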
>
> I should be able to score an HMM and see if it matches another sequence
> (notice lower case letters mean an HMM insert state):
>
> masltehaivnvrkliystcledfdnristnarinnydpddgycsdgdiysynhtvrykhikvfkkkyyg
> idnrqrqqytdsktalidiigsmilmlkadrknkslvdqykkfvkyiikdnksktanhvfdipnngdmdi
> lytyfnsprtrcikldlikymvdvgivnlnyvckktgygilhaylgnmnvdidilewlcnngvdvnlqns
> ......................................................................
> ...............NLITPLHTYMITGN.VCVDVIKKIIELGGDMDMKCVngmspimtymtnidnvnpe
> itnayiesldgdkvknipmilhsyitlarnidisvvysflqpgvklhykdsagrtclhqyilrhnistni
> ikllheygndvnepdnigntvlhtylsmlsvvhildpetdndirldviqcllslgaditavnclgytplt
> syictaqnymyydiidclisdkvlnmvkhrilqdllirvddtpciihhiiakyniptdlytdeyepydst
> dihdvyhcaiierynnavcetsgmtplhvsiishtnanivmdsfvyllsiqaniniptkngvdplmltme
> nnmlsghqwylvknildkrpnvdivisfldkcyaagkfpslllseddiikptlrlalmlagldycnkcie
> ymerdiaildnshamflafdklvsirdnidkltklhinsrsnisiydilvskcykediithrenhnlvac
> chgndplydiinkyitdarsmyyiandisryimdmypvmripvpllfsciigifrltyfkkiiidrhhds
> finarltdea
>
>
> I know we can use a substitution matrix (e.g., BLOSUM62;
> http://www.uky.edu/Classes/BIO/520/BIO520WWW/blosum62.htm) to say how
> likely it is for one letter to change to another, but how do I
> incorporate that likelihood given the likelihood of the previous letter
> (e.g., the likelihood that the second column is an L given that the first
> column is a D -- obviously pretty high in the above example)?
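
The thing to notice is that a profile HMM doesn't model "column 2 given
column 1" at all: each match state has its own emission table (estimated
from its column, as above), and the only memory carried from position to
position is which state (match/insert/delete) you're in.  A substitution
matrix, when it's used, comes in as a prior over those per-column
emissions, not as a column-to-column transition.  If you do want that
conditional from your alignment, it's again just counting -- continuing
the sketch above:

pairs = [(seq[0], seq[1]) for seq in msa]
d_rows = [b for a, b in pairs if a == "D"]
print(d_rows.count("L") / len(d_rows))   # 1.0: every D in column 1 is followed by L

(That's the flavor of number you were describing, but it isn't something
the profile HMM itself stores.)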
>
> Here's a snippet of what was generated for this example:
>
> HMM          A        C        D        E        F        G        H        I        K        L        M        N        P        Q        R        S        T        V        W        Y
>             m->m     m->i     m->d     i->m     i->i     d->m     d->d
>   COMPO   2.95757  3.22921  2.84828  2.93689  4.02506  2.84493  3.33255  2.33133  2.65067  2.12155  3.45566  3.32303  3.36052  3.85947  3.13260  2.99602  2.82345  2.43542  5.63061  3.48580
>           2.68618  4.42225  2.77519  2.73123  3.46354  2.40513  3.72494  3.29354  2.67741  2.69355  4.24690  2.90347  2.73739  3.18146  2.89801  2.37887  2.77519  2.98518  4.58477  3.61503
>           0.01686  4.48732  5.20967  0.61958  0.77255  0.00000        *
>       1   3.06925  5.62515  1.55218  2.36308  4.87352  3.48242  2.02657  4.44300  2.84967  3.92317  4.73346  1.82577  4.04654  3.07283  3.40453  2.97597  3.33762  4.00967  6.04161  4.55323      1 x -
>           2.68618  4.42225  2.77519  2.73123  3.46354  2.40513  3.72494  3.29354  2.67741  2.69355  4.24690  2.90347  2.73739  3.18146  2.89801  2.37887  2.77519  2.98518  4.58477  3.61503
>           0.01686  4.48732  5.20967  0.61958  0.77255  0.48576  0.95510
>
>
> Err... at this point I'm so confused about what the actual product of an
> HMM is that I forget everything I did know...
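
Those numbers are less mysterious than they look: if I'm remembering the
HMMER3 save-file format correctly, each value is a negative natural log of
a probability, so smaller means more likely.  For node 1 in your snippet:

import math
for aa, score in [("D", 1.55218), ("N", 1.82577), ("H", 2.02657), ("C", 5.62515)]:
    print(aa, round(math.exp(-score), 3))
# D 0.212, N 0.161, H 0.132, C 0.004

which says column 1 is mostly D/N/H -- your alignment, smoothed by the
prior.  As far as I recall, the three lines per node are the match
emissions, the insert emissions, and the seven transition probabilities
(the m->m ... d->d header), and that pile of per-position tables is the
whole "product" of building the HMM.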
>
> Has anyone worked with HMMs in a different context? Are there toy examples
> we could write?  Do you have any input on this?
>
>
>
> Cheers,
>
>
> Glen
> [1] http://en.wikipedia.org/wiki/Hidden_Markov_model
> --
> Whatever you can do or imagine, begin it;
> boldness has beauty, magic, and power in it.
>
> -- Goethe
>
> _______________________________________________
> ml mailing list
> ml at lists.noisebridge.net
> https://www.noisebridge.net/mailman/listinfo/ml
>
>


-- 
Josh Myer 650.248.3796
josh at joshisanerd.com


