Diffstat (limited to 'general.tex'):
 general.tex | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)
diff --git a/general.tex b/general.tex
index 4f9aaad..550d6b7 100644
--- a/general.tex
+++ b/general.tex
@@ -79,13 +79,13 @@ The value function given by the information gain \eqref{general} is submodular.
\end{lemma}
\begin{proof}
-The theorem is proved in a slightly different context in \cite{guestrin}; we
+The theorem is proved in a slightly different context in \cite{krause2005near}; we
repeat the proof here for the sake of completeness. Using the chain rule for
the conditional entropy we get:
-\begin{displaymath}\label{eq:chain-rule}
+\begin{equation}\label{eq:chain-rule}
V(S) = H(y_S) - H(y_S \mid \beta)
= H(y_S) - \sum_{i\in S} H(y_i \mid \beta)
-\end{displaymath}
+\end{equation}
where the second equality comes from the independence of the $y_i$'s
conditioned on $\beta$. Recall that the joint entropy of a set of random
variables is a submodular function. Thus, our value function is written in
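
As a quick numerical sanity check of the submodularity claim (a minimal sketch, not part of the patch or the paper): assume a Bayesian linear model with beta ~ N(0, I) and y_i = x_i^T beta plus independent Gaussian noise, so the y_i's are conditionally independent given beta as in the proof. Then V(S) = H(y_S) - H(y_S | beta) = (1/2) log det(I + X_S X_S^T / sigma^2), and the diminishing-returns inequality can be verified exhaustively on small instances. The names X, sigma2, and V below are illustrative, not from the paper.

import itertools
import numpy as np

rng = np.random.default_rng(0)
n, d, sigma2 = 5, 3, 0.5
X = rng.standard_normal((n, d))  # hypothetical measurement vectors x_i

def V(S):
    # information gain V(S) = H(y_S) - H(y_S | beta)
    #                       = 1/2 * log det(I + X_S X_S^T / sigma^2)
    if not S:
        return 0.0
    Xs = X[list(S)]
    return 0.5 * np.linalg.slogdet(np.eye(len(S)) + Xs @ Xs.T / sigma2)[1]

# diminishing returns: V(S + i) - V(S) >= V(T + i) - V(T)
# for every S subset of T and every i outside T
subsets = itertools.chain.from_iterable(
    itertools.combinations(range(n), k) for k in range(n))
for T in subsets:
    for r in range(len(T) + 1):
        for S in itertools.combinations(T, r):
            for i in set(range(n)) - set(T):
                assert V(S + (i,)) - V(S) >= V(T + (i,)) - V(T) - 1e-9
print("diminishing-returns inequality holds on all tested subsets")

In this Gaussian setting the subtracted term sum_{i in S} H(y_i | beta) is constant per element (each y_i has the same noise entropy given beta), so it is modular and the submodularity comes entirely from the joint entropy H(y_S), mirroring the decomposition in the proof.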