authorjeanpouget-abadie <jean.pougetabadie@gmail.com>2015-02-01 20:01:59 -0500
committerjeanpouget-abadie <jean.pougetabadie@gmail.com>2015-02-01 20:02:06 -0500
commita4397760f744840005fd66dc87395493c521376b (patch)
treedfd2322879e0089c56d960cd07bf3fcd272646f7 /paper/sections
parent0482865b3fc128964584db1af66be9fb0783a4af (diff)
downloadcascades-a4397760f744840005fd66dc87395493c521376b.tar.gz
assumptions section+references
Diffstat (limited to 'paper/sections')
-rw-r--r--paper/sections/assumptions.tex38
1 files changed, 21 insertions, 17 deletions
diff --git a/paper/sections/assumptions.tex b/paper/sections/assumptions.tex
index 89113b4..7cf9345 100644
--- a/paper/sections/assumptions.tex
+++ b/paper/sections/assumptions.tex
@@ -1,4 +1,24 @@
-In this section, we discuss the main assumption of Theorem~\ref{thm:neghaban} namely the restricted eigenvalue condition. We begin by comparing to the irrepresentability condition considered in \cite{Daneshmand:2014}.
+In this section, we discuss the main assumption of Theorem~\ref{thm:neghaban}, namely the restricted eigenvalue condition. We then compare it to the irrepresentability condition considered in \cite{Daneshmand:2014}.
+
+\subsection{The Restricted Eigenvalue Condition}
+
+The restricted eigenvalue condition, introduced in \cite{bickel:2009}, is one of the weakest sufficient conditions on the design matrix for successful sparse recovery \cite{vandegeer:2009}. Several recent papers show that large classes of correlated designs obey the restricted eigenvalue property with high probability \cite{raskutti:10,rudelson:13}. Expressing the minimum restricted eigenvalue $\gamma$ as a function of the cascade model parameters is highly non-trivial. The restricted eigenvalue property is, however, well behaved in the following sense:
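+For reference, one standard statement of the condition is as follows (the notation here is generic, since this section does not fix the paper's notation for the design matrix): a matrix $X \in \mathbb{R}^{n \times p}$ satisfies the restricted eigenvalue condition at sparsity level $s$ with constant $\gamma > 0$ if
+\begin{equation*}
+\frac{\|X\Delta\|_2^2}{n\,\|\Delta_S\|_2^2} \geq \gamma
+\qquad \text{for all } \Delta \neq 0 \text{ with } \|\Delta_{S^c}\|_1 \leq 3\,\|\Delta_S\|_1,
+\end{equation*}
+for every index set $S \subseteq \{1, \dots, p\}$ with $|S| \leq s$, where $\Delta_S$ denotes the restriction of $\Delta$ to the coordinates in $S$.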
+
+\begin{lemma}
+\label{lem:expected_hessian}
+Expected hessian analysis!
+\end{lemma}
+
+This result follows by slightly adapting a result from \cite{vandegeer:2009} XXX. Similarly to the analysis conducted in \cite{Daneshmand:2014}, we can show that if the restricted eigenvalue condition holds for the `expected' Hessian, then it also holds for the Hessian itself. In particular:
+
+\begin{proposition}
+\label{prop:expected_hessian}
+If result holds for the expected hessian, then it holds for the hessian!
+\end{proposition}
+
+It is most likely possible to remove this extra $s$ factor; see the sub-Gaussian paper by ..., but the calculations are more involved.
+
+
\subsection{The Irrepresentability Condition}
@@ -32,20 +52,4 @@ If the irrepresentability condition holds with $\epsilon > \frac{2}{3}$, then th
\end{proposition}
-\subsection{The Restricted Eigenvalue Condition}
-
-In practical scenarios, such as in social networks, recovering only the `significant' edges is a reasonable assumption. This can be done under the less restrictive eigenvalue assumption. Expressing $\gamma$ as a function of the cascade model parameters process is non-trivial. The restricted eigenvalue property is however well behaved in the following sense:
-
-\begin{lemma}
-\label{lem:expected_hessian}
-Expected hessian analysis!
-\end{lemma}
-This result is easily proved by adapting slightly a result from \cite{vandegeer:2009} XXX. Similarly to the analysis conducted in \cite{Daneshmand:2014}, we can show that if the eigenvalue can be showed to hold for the `expected' hessian, it can be showed to hold for the hessian itself. It is easy to see that:
-
-\begin{proposition}
-\label{prop:expected_hessian}
-If result holds for the expected hessian, then it holds for the hessian!
-\end{proposition}
-
-It is most likely possible to remove this extra s factor. See sub-gaussian paper by ... but the calculations are more involved.