author	Stratis Ioannidis <stratis@stratis-Latitude-E6320.(none)>	2012-11-03 23:47:39 -0700
committer	Stratis Ioannidis <stratis@stratis-Latitude-E6320.(none)>	2012-11-03 23:47:39 -0700
commit	722aa31fef933d37bbf14c27a4055a20a1df5b2f (patch)
tree	7d2b59b6cc61d80863eb6627fccbe58e2e7ca0a1 /general.tex
parent	bf077992f679691fcb498db922e11cf53e6415b6 (diff)
general
Diffstat (limited to 'general.tex')
 general.tex | 2 ++
 1 file changed, 2 insertions(+), 0 deletions(-)
diff --git a/general.tex b/general.tex
index 589b176..cb2154b 100644
--- a/general.tex
+++ b/general.tex
@@ -67,4 +67,6 @@ eigenvalue is larger than 1. Hence $\log\det R\geq 0$ and an approximation on
$\tilde{V}$ gives an approximation ration on $V$ (see discussion above).
\subsection{Beyond Linear Models}
+Selecting experiments that maximize the information gain in the Bayesian setup leads to a natural generalization to learning settings beyond linear regression. In particular, suppose that the measurements
+
TODO: Independent noise model. Captures models such as logistic regression, classification, etc. Arbitrary prior. Show that change in the entropy is submodular (cite Krause, Guestrin).
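The TODO above can be sketched in LaTeX as follows. This is a hypothetical outline under assumed notation ($\theta$ for the parameter, $y_S$ for the measurements indexed by a set $S$), not the author's worked-out section:

```latex
% Sketch of the planned argument (assumed notation, not the author's):
% let \theta have an arbitrary prior, and let the measurements
% y_S = (y_i)_{i \in S} be conditionally independent given \theta
% (the independent noise model; this covers logistic regression,
% classification, etc.). Define the information gain of a set S of
% experiments as
\[
  f(S) \;=\; H(\theta) - H(\theta \mid y_S) \;=\; I(\theta; y_S).
\]
% Under the conditional-independence assumption, f is monotone and
% submodular (Krause and Guestrin), so the greedy algorithm that
% repeatedly adds the experiment with the largest marginal gain
% achieves a (1 - 1/e)-approximation to the optimal set.
```

The conditional-independence assumption is what makes the entropy reduction submodular; without it, information gain need not be submodular.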