\subsection{Bayesian Experimental Design}
TODO: Introduce a prior on the unknown parameter with covariance $\sigma^2 R$. The change in entropy, equivalently the mutual information between the parameter and the selected observations, is then ... (see the first sketch below). Our scheme can then be seen as the special case of a Bayesian prior with $R = I_d$. State the corresponding extension of our main theorem.

\subsection{Beyond Linear Models}
TODO: Consider an observation model with independent noise; this captures models such as logistic regression and classification, and allows an arbitrary prior. Show that the change in entropy is submodular (cite Krause and Guestrin); see the second sketch below.
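
A minimal sketch of the quantity intended in the first subsection, assuming a linear-Gaussian model: observations $y_i = x_i^\top \theta + \varepsilon_i$ with i.i.d.\ noise $\varepsilon_i \sim \mathcal{N}(0, \sigma^2)$ and prior $\theta \sim \mathcal{N}(0, \sigma^2 R)$. The symbols $\theta$, $x_i$, $y_i$, and the selected index set $S$ are placeholders and may differ from the paper's notation, as is the convention that the prior scale matches the noise variance. Under these assumptions, the entropy reduction equals the mutual information
\[
  I(\theta; y_S)
  = H(\theta) - H(\theta \mid y_S)
  = \frac{1}{2}\log\det\Bigl(R^{-1} + \sum_{i \in S} x_i x_i^\top\Bigr)
    - \frac{1}{2}\log\det\bigl(R^{-1}\bigr)
  = \frac{1}{2}\log\det\Bigl(I_d + R \sum_{i \in S} x_i x_i^\top\Bigr),
\]
so taking $R = I_d$ gives $\tfrac{1}{2}\log\det\bigl(I_d + \sum_{i \in S} x_i x_i^\top\bigr)$, matching the note above that the scheme can be seen as a Bayesian prior with $R = I_d$.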
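
For the second subsection, a hedged sketch of the submodularity claim, in the form usually attributed to Krause and Guestrin: assume the observations $\{y_i\}$ are conditionally independent given the parameter $\theta$ (the independent-noise model) and $\theta$ has an arbitrary prior; the set function $F(S) = H(\theta) - H(\theta \mid y_S) = I(\theta; y_S)$ is then monotone and submodular. The key step is that for $A \subseteq B$ and $i \notin B$,
\[
  F(A \cup \{i\}) - F(A)
  = I(\theta; y_i \mid y_A)
  = H(y_i \mid y_A) - H(y_i \mid \theta)
  \;\ge\;
  H(y_i \mid y_B) - H(y_i \mid \theta)
  = F(B \cup \{i\}) - F(B),
\]
where the middle identity uses $H(y_i \mid \theta, y_A) = H(y_i \mid \theta)$ (conditional independence given $\theta$) and the inequality holds because conditioning on more observations cannot increase entropy.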