The experimental results obtained in Section 5 are encouraging and confirm the
relevance of a Bayesian approach to the Network Inference problem.
However, we believe that many other aspects of Bayesian inference could, and
should, be exploited in the context of Network Inference. We wish to explore
this in future work and highlight only a few possible directions here:
\begin{itemize}
\item obtain formal guarantees on the convergence of the posterior measure.
Similarly to the convergence rate results available for MLE estimation, we
believe that convergence results could also be obtained in the Bayesian
setting, at least in restricted settings or under certain assumptions on the
network being learned.
\item strengthen the experimental results by systematically studying how
different network properties affect the speedup induced by active learning.
\item finish formally deriving the update equations when using the B\"ohning
approximation for Variational Inference.
\item extend the combined Variational Inference and B\"ohning approximation
approach to Hawkes processes, obtaining a unified Bayesian framework for both
discrete-time and continuous-time models.
\item explore the impact of using more expressive (in particular,
non-factorized) variational posteriors on the speed of convergence, in both
offline and active online learning.
\end{itemize}
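For concreteness, we sketch the bound we have in mind for the B\"ohning
direction above. Assuming the standard B\"ohning quadratic bound on the
logistic log-partition function $f(\eta) = \log(1 + e^{\eta})$: since
$f''(\eta) = \sigma(\eta)\bigl(1 - \sigma(\eta)\bigr) \le \tfrac{1}{4}$, where
$\sigma$ denotes the logistic sigmoid, a Taylor expansion around any anchor
point $\psi$ yields, for all $\eta, \psi \in \mathbb{R}$,
\[
\log\bigl(1 + e^{\eta}\bigr)
\;\le\;
\log\bigl(1 + e^{\psi}\bigr)
+ \sigma(\psi)\,(\eta - \psi)
+ \tfrac{1}{8}\,(\eta - \psi)^2 .
\]
Because the right-hand side is quadratic in $\eta$, its expectation under a
Gaussian variational posterior is available in closed form, which is what
would make the variational update equations tractable.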
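Likewise, for the Hawkes-process direction, we would target the standard
univariate conditional intensity with exponential kernel (hypothetical
parameterization for illustration),
\[
\lambda(t) \;=\; \mu \;+\; \sum_{t_i < t} \alpha\, e^{-\beta\,(t - t_i)} ,
\]
where $\mu > 0$ is the base rate, $\alpha \ge 0$ the excitation weight, and
$\beta > 0$ the decay rate; the hope is that the same variational treatment
carries over to this continuous-time likelihood.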
|