% path: finale/sections/experiments.tex
In this section, we apply the framework of Sections~\ref{sec:bayes}
and~\ref{sec:active} to synthetic graphs and cascades, validating both the
Bayesian approach and the effectiveness of the active-learning heuristics.

We initially used the PyMC library to sample directly from the posterior
distribution. This method scales poorly with the number of nodes in the graph:
graphs of size $\geq 100$ could not be learned in a reasonable time. In
Section~\ref{sec:appendix}, we show the progressive convergence of the
posterior around the true edge weights for a graph of size $4$.
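The mechanism behind this convergence can be illustrated without a sampler:
each observed cascade step is, for a given edge, a Bernoulli trial, so a
conjugate Beta--Bernoulli update shows how the posterior tightens around the
true weight as observations accumulate. The sketch below is a stand-in for the
PyMC sampler; the true weight $0.3$ and the $\mathrm{Beta}(2,2)$ prior are
illustrative assumptions, not values from our experiments.

```python
import numpy as np

def beta_posterior(successes, failures, a0=2.0, b0=2.0):
    """Conjugate update: Beta(a0, b0) prior + Bernoulli cascade-step outcomes."""
    a, b = a0 + successes, b0 + failures
    mean = a / (a + b)
    var = a * b / ((a + b) ** 2 * (a + b + 1))
    return mean, var

rng = np.random.default_rng(0)
true_w = 0.3  # hypothetical true edge weight
for n in (10, 100, 10_000):
    obs = rng.random(n) < true_w  # simulated activation outcomes for one edge
    mean, var = beta_posterior(obs.sum(), n - obs.sum())
    print(f"n={n:6d}  posterior mean={mean:.3f}  sd={var**0.5:.3f}")
```

The posterior standard deviation shrinks as $O(1/\sqrt{n})$, which is the
behaviour the appendix figures display for the full sampler.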

To demonstrate the effect of the active-learning policies, we needed to scale
the experiments to graphs of size $\geq 1000$, which required the variational
inference procedure. A graph of size $1000$ has $10^6$ edge-weight parameters
to be learned ($2 \times 10^6$ under the product prior of
Eq.~\ref{eq:gaussianprior}). The maximum-likelihood estimator converges to an
$\ell_\infty$-error of $0.05$ for most graphs after having observed at least
$10^8$ distinct cascade steps.
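The maximum-likelihood baseline amounts to frequency counting: under an
independent-cascade model, the MLE for an edge weight is the fraction of steps
in which the target activated while the source was active. The sketch below
checks this on a small simulated graph; the graph size, weight range, and
step count are illustrative assumptions chosen so the script runs quickly,
not the settings of the reported experiments.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5
W = rng.uniform(0.1, 0.5, size=(n, n))  # hypothetical true edge weights
np.fill_diagonal(W, 0.0)

# Simulate independent cascade steps: in each step one source node i is
# active, and each neighbour j activates independently with prob W[i, j].
trials = np.zeros((n, n))
successes = np.zeros((n, n))
for _ in range(100_000):
    i = rng.integers(n)
    fired = rng.random(n) < W[i]
    trials[i] += 1
    successes[i] += fired

# MLE: empirical activation frequency per (source, target) pair.
W_hat = np.divide(successes, trials, out=np.zeros_like(W), where=trials > 0)
print("l_inf error:", np.abs(W_hat - W).max())
```

Per-edge error shrinks as $O(1/\sqrt{m})$ in the number of observed steps
$m$ for that source, which is why dense graphs need on the order of $10^8$
cascade steps before the $\ell_\infty$ error drops below $0.05$.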


% TODO: baseline
% TODO: fair comparison of online learning
% TODO: graphs/datasets
