From 2a4664283998d5bf9c6615d251fd62c30001b73e Mon Sep 17 00:00:00 2001
From: Thibaut Horel
Date: Mon, 5 Nov 2012 02:18:05 +0100
Subject: Fix missing references

---
 general.tex | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/general.tex b/general.tex
index 4f9aaad..550d6b7 100644
--- a/general.tex
+++ b/general.tex
@@ -79,13 +79,13 @@ The value function given by the information gain \eqref{general} is submodular.
 \end{lemma}
 \begin{proof}
-The theorem is proved in a slightly different context in \cite{guestrin}; we
+The theorem is proved in a slightly different context in \cite{krause2005near}; we
 repeat the proof here for the sake of completeness. Using the chain rule for
 the conditional entropy we get:
-\begin{displaymath}\label{eq:chain-rule}
+\begin{equation}\label{eq:chain-rule}
 V(S) = H(y_S) - H(y_S \mid \beta) = H(y_S) - \sum_{i\in S} H(y_i \mid \beta)
-\end{displaymath}
+\end{equation}
 where the second equality comes from the independence of the $y_i$'s
 conditioned on $\beta$. Recall that the joint entropy of a set of random
 variables is a submodular function. Thus, our value function is written in
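
Aside, not part of the patch: the hunk above ends just as the proof invokes the submodularity of the joint entropy. Below is a minimal sketch of how that argument typically concludes, reusing $V$, $y_S$ and $\beta$ from the hunk; the decreasing-marginal-gains derivation is an assumed reconstruction of the remainder of the proof, not text recovered from general.tex.

% Sketch only: submodularity of V from the rewritten form
%   V(S) = H(y_S) - \sum_{i \in S} H(y_i \mid \beta),
% where H(y_S) is a joint entropy (submodular in S) and the sum is
% modular in S. Equivalently, marginal gains decrease: for sets
% S \subseteq T and an element j \notin T,
\begin{align*}
V(S \cup \{j\}) - V(S) &= H(y_j \mid y_S) - H(y_j \mid \beta)\\
  &\geq H(y_j \mid y_T) - H(y_j \mid \beta) = V(T \cup \{j\}) - V(T).
\end{align*}
% The first line uses the chain rule
% H(y_{S \cup \{j\}}) = H(y_S) + H(y_j \mid y_S); the inequality is
% "conditioning reduces entropy", since y_S is a subvector of y_T.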