| author | Thibaut Horel <thibaut.horel@gmail.com> | 2012-11-05 02:18:05 +0100 |
|---|---|---|
| committer | Thibaut Horel <thibaut.horel@gmail.com> | 2012-11-05 02:18:31 +0100 |
| commit | 2a4664283998d5bf9c6615d251fd62c30001b73e (patch) | |
| tree | 1554852eba4f43f1c6bc356d1df665ba6cd066f3 /general.tex | |
| parent | 86b8f967a12fe5870fe7c8d0f765149c003832c6 (diff) | |
| download | recommendation-2a4664283998d5bf9c6615d251fd62c30001b73e.tar.gz | |
Fix missing references
Diffstat (limited to 'general.tex')
| mode | file | lines changed |
|---|---|---|
| -rw-r--r-- | general.tex | 6 |

1 file changed, 3 insertions(+), 3 deletions(-)
```diff
diff --git a/general.tex b/general.tex
index 4f9aaad..550d6b7 100644
--- a/general.tex
+++ b/general.tex
@@ -79,13 +79,13 @@
 The value function given by the information gain \eqref{general} is
 submodular.
 \end{lemma}
 \begin{proof}
-The theorem is proved in a slightly different context in \cite{guestrin}; we
+The theorem is proved in a slightly different context in \cite{krause2005near}; we
 repeat the proof here for the sake of completeness. Using the chain rule for
 the conditional entropy we get:
-\begin{displaymath}\label{eq:chain-rule}
+\begin{equation}\label{eq:chain-rule}
 V(S) = H(y_S) - H(y_S \mid \beta) = H(y_S) - \sum_{i\in S} H(y_i \mid \beta)
-\end{displaymath}
+\end{equation}
 where the second equality comes from the independence of the $y_i$'s
 conditioned on $\beta$. Recall that the joint entropy of a set of random
 variables is a submodular function. Thus, our value function is written in
```
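The hunk ends just as the proof appeals to submodularity of the joint entropy. As an illustration (not part of the commit), here is a minimal sketch of how the relabeled equation \eqref{eq:chain-rule} yields submodularity, assuming only the notation visible in the hunk ($V$, $y_S$, $\beta$) and the standard fact that conditioning never increases entropy:

```latex
% Illustrative sketch only, not part of the commit: submodularity of V
% from \eqref{eq:chain-rule}. Take S \subseteq T and j \notin T.
% The chain rule gives H(y_{S \cup \{j\}}) = H(y_S) + H(y_j \mid y_S), so
\begin{align*}
V(S \cup \{j\}) - V(S)
  &= H(y_j \mid y_S) - H(y_j \mid \beta) \\
  &\geq H(y_j \mid y_T) - H(y_j \mid \beta) % conditioning reduces entropy
   = V(T \cup \{j\}) - V(T).
\end{align*}
```

The marginal gain $H(y_j \mid y_S) - H(y_j \mid \beta)$ is nonincreasing in $S$, which is exactly the diminishing-returns property the truncated proof text is about to invoke.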
