| author | Stratis Ioannidis <stratis@stratis-Latitude-E6320.(none)> | 2012-11-03 23:47:39 -0700 |
|---|---|---|
| committer | Stratis Ioannidis <stratis@stratis-Latitude-E6320.(none)> | 2012-11-03 23:47:39 -0700 |
| commit | 722aa31fef933d37bbf14c27a4055a20a1df5b2f | (patch) |
| tree | 7d2b59b6cc61d80863eb6627fccbe58e2e7ca0a1 | /general.tex |
| parent | bf077992f679691fcb498db922e11cf53e6415b6 | (diff) |
| download | recommendation-722aa31fef933d37bbf14c27a4055a20a1df5b2f.tar.gz | |
general
Diffstat (limited to 'general.tex')
| -rw-r--r-- | general.tex | 2 |
1 file changed, 2 insertions(+), 0 deletions(-)
diff --git a/general.tex b/general.tex
index 589b176..cb2154b 100644
--- a/general.tex
+++ b/general.tex
@@ -67,4 +67,6 @@ eigenvalue is larger than 1. Hence $\log\det R\geq 0$ and an approximation
 on $\tilde{V}$ gives an approximation ratio on $V$ (see discussion above).
 
 \subsection{Beyond Linear Models}
+Selecting experiments that maximize the information gain in the Bayesian setup leads to a natural generalization to other learning examples beyond linear regression. In particular, suppose that the measurements
+ TODO: Independent noise model. Captures models such as logistic regression, classification, etc. Arbitrary prior. Show that change in the entropy is submodular (cite Krause, Guestrin).
 
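The paragraph added by this commit points toward choosing experiments that maximize the Bayesian information gain, with submodularity of the entropy reduction (the cited Krause–Guestrin argument) giving the usual greedy guarantee. As a minimal illustration of that idea, and not code from this repository, the sketch below implements greedy experiment selection for the linear-Gaussian special case, where the information gain of a set $S$ is $\tfrac{1}{2}\log\det(I + \sigma^{-2} X_S \Sigma X_S^\top)$. The prior covariance, noise variance, and all function names here are illustrative assumptions.

```python
# Illustrative sketch (not from the paper): greedy selection of experiments
# maximizing Bayesian information gain in a linear-Gaussian model.
# Assumptions: theta ~ N(0, Sigma), measurements y_i = x_i^T theta + noise,
# noise ~ N(0, sigma^2), independent across experiments. Under these assumptions
#   IG(S) = (1/2) * log det(I + sigma^{-2} * X_S Sigma X_S^T),
# which is monotone submodular, so greedy selection achieves a (1 - 1/e)
# approximation (Nemhauser et al.; see also Krause & Guestrin).
import numpy as np


def information_gain(X, Sigma, sigma2, S):
    """Entropy reduction of the posterior after observing the experiments in S."""
    if not S:
        return 0.0
    Xs = X[list(S)]                       # design rows of the chosen experiments
    M = np.eye(len(S)) + (Xs @ Sigma @ Xs.T) / sigma2
    return 0.5 * np.linalg.slogdet(M)[1]  # 0.5 * log det, numerically stable


def greedy_select(X, Sigma, sigma2, budget):
    """Greedily add the experiment with the largest marginal information gain."""
    S, remaining = [], set(range(X.shape[0]))
    for _ in range(budget):
        gains = {i: information_gain(X, Sigma, sigma2, S + [i]) for i in remaining}
        best = max(gains, key=gains.get)
        S.append(best)
        remaining.remove(best)
    return S


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(50, 5))   # 50 candidate experiments, 5 features (assumed)
    Sigma = np.eye(5)              # prior covariance (assumed)
    chosen = greedy_select(X, Sigma, sigma2=0.1, budget=10)
    print("selected experiments:", chosen)
```

For models beyond linear regression (e.g. the logistic-regression and classification cases mentioned in the TODO), the log-det expression no longer applies directly, but the same greedy loop can be reused with any information-gain oracle once submodularity of the entropy change is established.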
