| author | Stratis Ioannidis <stratis@stratis-Latitude-E6320.(none)> | 2012-11-04 17:57:10 -0800 |
|---|---|---|
| committer | Stratis Ioannidis <stratis@stratis-Latitude-E6320.(none)> | 2012-11-04 17:57:10 -0800 |
| commit | 36e95cddab11a42e9e2893e534d2f74aed76b876 (patch) | |
| tree | 2c45ce9c8bf290f9d9020c397d2156ce5aa3988e /general.tex | |
| parent | 2befde8163f17a70c698a9ab099043ef2c76d8a0 (diff) | |
| parent | 2a4664283998d5bf9c6615d251fd62c30001b73e (diff) | |
| download | recommendation-36e95cddab11a42e9e2893e534d2f74aed76b876.tar.gz | |
Merge branch 'master' of ssh://74.95.195.229:1444/git/data_value
Diffstat (limited to 'general.tex')
| -rw-r--r-- | general.tex | 6 |
1 file changed, 3 insertions(+), 3 deletions(-)
diff --git a/general.tex b/general.tex
index 4f9aaad..550d6b7 100644
--- a/general.tex
+++ b/general.tex
@@ -79,13 +79,13 @@
 The value function given by the information gain \eqref{general} is
 submodular.
 \end{lemma}
 \begin{proof}
-The theorem is proved in a slightly different context in \cite{guestrin}; we
+The theorem is proved in a slightly different context in \cite{krause2005near}; we
 repeat the proof here for the sake of completeness. Using the chain rule for
 the conditional entropy we get:
-\begin{displaymath}\label{eq:chain-rule}
+\begin{equation}\label{eq:chain-rule}
 V(S) = H(y_S) - H(y_S \mid \beta) = H(y_S) - \sum_{i\in S} H(y_i \mid \beta)
-\end{displaymath}
+\end{equation}
 where the second equality comes from the independence of the $y_i$'s
 conditioned on $\beta$. Recall that the joint entropy of a set of random
 variables is a submodular function. Thus, our value function is written in
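The proof in the diff rests on the identity $V(S) = H(y_S) - \sum_{i\in S} H(y_i \mid \beta)$ and on the submodularity of joint entropy. The sketch below checks the resulting diminishing-returns property numerically on a hypothetical toy instance (the Gaussian linear model, feature matrix, prior covariance, and noise variance are all illustrative assumptions, not taken from the paper):

```python
import itertools
import numpy as np

# Hypothetical toy instance (not from the paper): a Bayesian linear model
# y_i = x_i @ beta + noise, with prior beta ~ N(0, A) and noise variance
# sigma2, so all entropies below are Gaussian differential entropies.
rng = np.random.default_rng(0)
n, d, sigma2 = 4, 2, 0.5
X = rng.normal(size=(n, d))   # rows are the feature vectors x_i
A = np.eye(d)                 # prior covariance of beta

def gaussian_entropy(cov):
    """Differential entropy 0.5 * log det(2*pi*e*cov) of a Gaussian."""
    return 0.5 * np.log(np.linalg.det(2 * np.pi * np.e * cov))

def V(S):
    """Value function V(S) = H(y_S) - sum_{i in S} H(y_i | beta)."""
    if not S:
        return 0.0
    Xs = X[sorted(S)]
    # Marginal covariance of y_S after integrating out beta.
    cov = Xs @ A @ Xs.T + sigma2 * np.eye(len(S))
    # Given beta, each y_i is Gaussian with variance sigma2.
    return gaussian_entropy(cov) - len(S) * 0.5 * np.log(2 * np.pi * np.e * sigma2)

def is_submodular(tol=1e-9):
    """Check V(S+a) - V(S) >= V(T+a) - V(T) for all S subset of T, a not in T."""
    subsets = [set(c) for r in range(n + 1)
               for c in itertools.combinations(range(n), r)]
    for S in subsets:
        for T in subsets:
            if not S <= T:
                continue
            for a in set(range(n)) - T:
                if V(S | {a}) - V(S) < V(T | {a}) - V(T) - tol:
                    return False
    return True

print(is_submodular())
```

A brute-force check over all nested subset pairs is feasible here only because the ground set is tiny; it is meant as a sanity check of the identity in the diff, not as part of the proof.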
