Diffstat (limited to 'paper/sections/lowerbound.tex')
 -rw-r--r--  paper/sections/lowerbound.tex | 64
 1 file changed, 64 insertions, 0 deletions
diff --git a/paper/sections/lowerbound.tex b/paper/sections/lowerbound.tex
new file mode 100644
index 0000000..bbd8566
--- /dev/null
+++ b/paper/sections/lowerbound.tex
@@ -0,0 +1,64 @@
+In this section, we consider the stable sparse recovery problem of
+Section XXX. Our goal is to obtain an information-theoretic lower bound on
+the number of measurements necessary to approximately recover the parameter
+$\theta$ of a cascade model from observed cascades. Similar lower bounds
+were obtained for sparse linear inverse problems in XXX.
+
+\begin{theorem}
+    Consider a cascade model of the form XXX and a recovery algorithm
+    $\mathcal{A}$ which takes as input $n$ random cascade measurements and
+    outputs $\hat{\theta}$ such that, with probability $1-\delta
+    > \frac{1}{2}$ (over the measurements):
+    \begin{displaymath}
+        \|\hat{\theta}-\theta^*\|_2\leq
+        C\min_{\|\theta\|_0\leq s}\|\theta-\theta^*\|_2
+    \end{displaymath}
+    where $\theta^*$ is the true parameter of the cascade model. Then
+    $n = \Omega(s\log\frac{m}{s}/\log C)$.
+\end{theorem}
+
+This theorem should be contrasted with Corollary XXX: up to an additive
+$s\log s$ term, the number of measurements required by our algorithm is
+tight.
+
+The proof of Theorem XXX follows an approach similar to XXX. Consider an
+algorithm $\mathcal{A}$ as in the theorem. Intuitively, $\mathcal{A}$
+allows Alice to transmit $\Omega(s\log\frac{m}{s})$ bits of information to
+Bob over a Gaussian channel. By the Shannon--Hartley theorem, the capacity
+of this channel limits the amount of information transmitted to
+$O(n\log C)$ bits. These two bounds together give the theorem.
+
+Formally, let $\mathcal{F}$ be a family of subsets of $\{1,\ldots,m\}$
+such that:
+\begin{itemize}
+    \item each $S\in\mathcal{F}$ has cardinality $s$,
+    \item $|S\cap S'|\leq\frac{s}{2}$ for all distinct $S,S'\in\mathcal{F}$,
+    \item $\log|\mathcal{F}| = \Omega(s\log\frac{m}{s})$,
+\end{itemize}
+and let $X = \big\{t\in\{-1,0,1\}^m\,|\,\mathrm{supp}(t)\in\mathcal{F}\big\}$.
+
+Consider the following communication game between Alice and Bob.
+\begin{itemize}
+    \item Alice chooses $S$ uniformly at random from $\mathcal{F}$ and $t$
+        uniformly at random from $X$ subject to $\mathrm{supp}(t) = S$.
+    \item Let $w\sim\mathcal{N}(0, \alpha\frac{s}{m}I_m)$ and $\theta
+        = t + w$. Since $\mathcal{A}$ succeeds on every $\theta$ with
+        probability $1-\delta$ over the measurements, there exists a fixed
+        set of measurements $x_1,\ldots,x_n$ such that, with probability
+        $1-\delta$ over $\theta$, $\mathcal{A}$ succeeds on $\theta$.
+        Alice sends those measurements to Bob.
+    \item Bob uses $\mathcal{A}$ to recover $\hat{\theta}$ from the
+        measurements, then computes $\hat{t}$, the best $\ell_2$
+        approximation of $\hat{\theta}$ in $X$.
+\end{itemize}
+
+We have the following two lemmas, from XXX and XXX respectively.
+
+\begin{lemma}
+    Let $S'=\mathrm{supp}(\hat{t})$. If $\alpha = \Omega(\frac{1}{C})$,
+    then $I(S, S') = O(n\log C)$.
+\end{lemma}
+
+\begin{lemma}
+    If $\alpha = \Omega(\frac{1}{C})$, then $I(S, S')
+    = \Omega(s\log\frac{m}{s})$.
+\end{lemma}
+
+Lemmas XXX and XXX together give Theorem XXX.
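+
+To spell out this last step (a heuristic sketch, suppressing the constants
+hidden in the asymptotic notation): when $\alpha = \Omega(\frac{1}{C})$,
+the per-coordinate signal-to-noise ratio of $\theta = t + w$ is
+$\frac{s/m}{\alpha s/m} = \frac{1}{\alpha} = O(C)$, so by the
+Shannon--Hartley theorem each measurement carries at most
+$\frac{1}{2}\log(1 + \frac{1}{\alpha}) = O(\log C)$ bits. Chaining the two
+lemmas then gives
+\begin{displaymath}
+    s\log\frac{m}{s} \lesssim I(S, S') \lesssim n\log C,
+    \qquad\text{hence}\qquad
+    n \gtrsim \frac{s\log\frac{m}{s}}{\log C},
+\end{displaymath}
+which is the bound of Theorem XXX.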
