Diffstat (limited to 'general.tex')
-rw-r--r--  general.tex | 2 ++
 1 file changed, 2 insertions(+), 0 deletions(-)
diff --git a/general.tex b/general.tex
index 589b176..cb2154b 100644
--- a/general.tex
+++ b/general.tex
@@ -67,4 +67,6 @@ eigenvalue is larger than 1. Hence $\log\det R\geq 0$ and an approximation on
$\tilde{V}$ gives an approximation ratio on $V$ (see discussion above).
\subsection{Beyond Linear Models}
+Selecting experiments that maximize the information gain in the Bayesian setup leads to a natural generalization to learning problems beyond linear regression. In particular, suppose that the measurements
+
TODO: Independent noise model. Captures models such as logistic regression, classification, etc. Arbitrary prior. Show that change in the entropy is submodular (cite Krause, Guestrin).
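The TODO could be developed along the following lines. This is only a sketch under assumed notation: the symbols $\theta$, $y_S$, $F$, and the citation key KrauseGuestrin05 do not appear in the source and are placeholders.

```latex
% Sketch (assumed notation, not from the source): let $\theta$ be the model
% parameters with an arbitrary prior, and let $y_S$ denote the observations
% from a set $S$ of selected experiments.
\begin{align*}
F(S) &:= H(\theta) - H(\theta \mid y_S) = I(\theta; y_S).
\intertext{Under an independent noise model (the $y_i$ are conditionally
independent given $\theta$, which covers logistic regression,
classification, etc.), the information gain $F$ is monotone submodular
\citep{KrauseGuestrin05}:}
F(S \cup \{i\}) - F(S) &\geq F(T \cup \{i\}) - F(T)
\quad \text{for all } S \subseteq T,\ i \notin T,
\end{align*}
```

so the greedy selection rule would inherit the standard $(1-1/e)$-approximation guarantee for maximizing monotone submodular functions.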