author   jeanpouget-abadie <jean.pougetabadie@gmail.com> 2015-11-05 09:35:44 -0500
committer jeanpouget-abadie <jean.pougetabadie@gmail.com> 2015-11-05 09:35:44 -0500
commit   15e4871fc224e9e74c93b772b15aea7031f262ab (patch)
tree     0bb5d7f012d4029dedd179b58027bf45a8b2cc64 /notes/extensions.tex
parent   d4fff5add651e98a1ce2e7c7aa6a2223c5771ca9 (diff)
download cascades-15e4871fc224e9e74c93b772b15aea7031f262ab.tar.gz
adding simple bayes function
Diffstat (limited to 'notes/extensions.tex')
-rw-r--r--  notes/extensions.tex | 6
1 file changed, 5 insertions, 1 deletion
diff --git a/notes/extensions.tex b/notes/extensions.tex
index dd9933a..cc247ef 100644
--- a/notes/extensions.tex
+++ b/notes/extensions.tex
@@ -31,7 +31,11 @@ network learning however, we can place more informative priors. We can
\item Take into account common graph structures, such as triangles
\end{itemize}
-We can sample from the posterior by MCMC.
+We can sample from the posterior by MCMC, but this may be slow. In the
+case of the independent cascade model, there is an easier solution: we
+can use the EM algorithm to compute the posterior, treating the parent
+that \emph{does} infect us (if at all) as the latent variable in the
+model.
\subsection{Active Learning}
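
As a rough illustration of the EM idea in the added lines above, here is a minimal Python sketch (not part of this repository) that estimates the edge infection probabilities of an independent cascade model, treating the identity of the parent that actually infected each node as the latent variable. The cascade representation (dicts mapping node id to infection step), the initialization, and the iteration count are all assumptions made for illustration; the update shown is the plain maximum-likelihood EM step, and a prior on the edge probabilities could be folded in as pseudo-counts to obtain a posterior-style estimate instead.

    import numpy as np

    def em_independent_cascade(cascades, n_nodes, n_iter=50, eps=1e-12):
        """Illustrative EM sketch for edge probabilities p[j, i] in the
        independent cascade model. The latent variable is which active
        parent actually infected each newly infected node.

        cascades: list of dicts mapping node id -> infection step
                  (nodes missing from a dict were never infected).
        """
        p = np.full((n_nodes, n_nodes), 0.1)  # assumed initial guess

        for _ in range(n_iter):
            num = np.zeros_like(p)  # expected successful j -> i infections
            den = np.zeros_like(p)  # number of attempts j -> i

            for c in cascades:
                for i in range(n_nodes):
                    t_i = c.get(i, np.inf)
                    # every node infected strictly before i got one shot at i
                    attempts = [j for j, t_j in c.items() if t_j < t_i]
                    for j in attempts:
                        den[j, i] += 1.0
                    # candidate parents: infected exactly one step before i
                    parents = [j for j in attempts if c[j] == t_i - 1]
                    if parents:
                        # E-step: split credit for i's infection among its
                        # possible parents in proportion to p[j, i]
                        prob_any = 1.0 - np.prod([1.0 - p[j, i] for j in parents])
                        for j in parents:
                            num[j, i] += p[j, i] / max(prob_any, eps)

            # M-step: expected successes over attempts (pseudo-counts from a
            # Beta prior could be added to num and den here for a MAP estimate)
            p = np.where(den > 0, num / np.maximum(den, eps), p)

        return p

The E-step here computes, for each infected node, the responsibility of each candidate parent (those infected one step earlier), which is exactly the "parent that does infect us" latent variable mentioned in the diff; the M-step then re-estimates each edge probability as expected successes over attempts.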