Diffstat (limited to 'paper/sections/intro.tex')
-rw-r--r--  paper/sections/intro.tex | 23 +++++++++++++----------
 1 file changed, 13 insertions(+), 10 deletions(-)
diff --git a/paper/sections/intro.tex b/paper/sections/intro.tex
index a04b5f1..4d44395 100644
--- a/paper/sections/intro.tex
+++ b/paper/sections/intro.tex
@@ -12,9 +12,9 @@ of nodes which become `infected' over time without knowledge of who has
`infected' whom, can we recover the graph on which the process takes place? The
spread of a particular behavior through a network is known as an {\it Influence
Cascade}. In this context, the {\it Graph Inference}\ problem is the recovery of
-the graph's edges from the observation of few influence cascades. We propose to
-show how sparse recovery can be applied to solve this recently introduced graph
-inference problem.
+the graph's edges from the observation of few influence cascades. {\color{red}
+Cite references} We propose to show how sparse recovery can be applied to solve
+this recently introduced graph inference problem.
{\color{red} Graph inference to Network inference}
Tackling the graph inference problem means constructing a polynomial-time
@@ -70,8 +70,8 @@ as a ``sparse signal'' measured through influence cascades and then recovered.
The challenge is that influence cascade models typically lead to non-linear
inverse problems. The sparse recovery literature suggests that
$\Omega(s\log\frac{m}{s})$ cascade observations should be sufficient to recover
-the graph. However, the best known upper bound to this day is $\O(s^2\log
-m)$.{\color{red} Add reference}
+the graph.{\color{red} Add reference} However, the best known upper bound to
+this day is $\O(s^2\log m)$.{\color{red} Add reference}
The contributions of this paper are the following:
@@ -83,7 +83,8 @@ The contributions of this paper are the following:
\item we give an algorithm which recovers the graph's edges using $\O(s\log
m)$ cascades. Furthermore, we show that our algorithm is also able to
recover the edge weights (the parameters of the influence model),
- a problem which has been seemingly overlooked so far.
+ a problem which has been seemingly overlooked so far. {\color{red} NOT
+ TRUE}
\item we show that our algorithm is robust in cases where the signal to
recover is approximately $s$-sparse by proving guarantees in the
\emph{stable recovery} setting.
@@ -91,10 +92,10 @@ The contributions of this paper are the following:
observations required for sparse recovery.
\end{itemize}
-The organization of the paper is as follows: we conclude the introduction by
-a survey of the related work. In Section~\ref{sec:model} we present our model
-of Generalized Linear Cascades and the associated sparse recovery formulation.
-Its theoretical guarantees are presented for various recovery settings in
+The organization of the paper is as follows: we conclude the introduction by a
+survey of the related work. In Section~\ref{sec:model} we present our model of
+Generalized Linear Cascades and the associated sparse recovery formulation. Its
+theoretical guarantees are presented for various recovery settings in
Section~\ref{sec:results}. The lower bound is presented in
Section~\ref{sec:lowerbound}. Finally, we conclude with experiments in
Section~\ref{sec:experiments}.
@@ -120,6 +121,8 @@ of~\cite{Abrahao:13} studies the same continuous-model framework as
\cite{GomezRodriguez:2010} and obtains an ${\cal O}(s^9 \log^2 s \log m)$
support recovery algorithm, without the \emph{correlation decay} assumption.
+{\color{red} Add citation to Du et~al.}
+
\begin{comment}
They assume a single-source model, where only one source is selected at random,
which is less realistic in practice since `patient 0' is rarely known to us.
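
The diff above leans on a standard sparse-recovery fact: an $s$-sparse vector in dimension $m$ can be recovered from on the order of $s\log\frac{m}{s}$ random linear measurements via $\ell_1$ minimization. A minimal, self-contained sketch of that linear baseline (not the paper's cascade setting, whose measurements are non-linear; the Gaussian matrix and the constant in the measurement count are illustrative assumptions) using basis pursuit as a linear program:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
m, s = 200, 5                       # ambient dimension, sparsity
n = int(4 * s * np.log(m / s))      # ~ s log(m/s) measurements (constant is illustrative)

# Ground-truth s-sparse signal (stand-in for a sparse row of the graph)
x_true = np.zeros(m)
support = rng.choice(m, size=s, replace=False)
x_true[support] = rng.normal(size=s)

# Random Gaussian measurement matrix (stand-in for cascade observations)
A = rng.normal(size=(n, m)) / np.sqrt(n)
y = A @ x_true

# Basis pursuit: min ||x||_1 s.t. Ax = y, as an LP over x = x_pos - x_neg >= 0
c = np.ones(2 * m)
A_eq = np.hstack([A, -A])
res = linprog(c, A_eq=A_eq, b_eq=y,
              bounds=[(0, None)] * (2 * m), method="highs")
x_hat = res.x[:m] - res.x[m:]

print("recovered support:", np.flatnonzero(np.abs(x_hat) > 1e-6))
```

With $n \approx 4\,s\log\frac{m}{s}$ measurements, exact recovery holds with high probability; the non-linear cascade measurements in the paper are precisely what makes matching the $\Omega(s\log\frac{m}{s})$ bound harder than in this linear toy.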