author    Thibaut Horel <thibaut.horel@gmail.com> 2013-12-12 17:48:03 -0500
committer Thibaut Horel <thibaut.horel@gmail.com> 2013-12-12 17:48:03 -0500
commit    7f6b960702e38cf2e34d5594ba2b37a45f8a9520 (patch)
tree      0959b7bb483fd3af4568941b95f53778479ac80d /intro.tex
parent    184fa7504c3f2b4c1ee797e91fe8b919eff51ae9 (diff)
download  recommendation-7f6b960702e38cf2e34d5594ba2b37a45f8a9520.tar.gz
Reimport things from the appendix to the main part for the camera-ready version
Diffstat (limited to 'intro.tex')
-rwxr-xr-x  intro.tex  2
1 file changed, 1 insertion, 1 deletion
diff --git a/intro.tex b/intro.tex
index da38f8f..8e684a0 100755
--- a/intro.tex
+++ b/intro.tex
@@ -61,7 +61,7 @@ Our convex relaxation of \EDP{} involves maximizing a self-concordant function s
%Our approach to mechanisms for experimental design --- by
% optimizing the information gain in parameters like $\beta$ which are estimated through the data analysis process --- is general. We give examples of this approach beyond linear regression to a general class that includes logistic regression and learning binary functions, and show that the corresponding budgeted mechanism design problem is also expressed through a submodular optimization. Hence, prior work \cite{chen,singer-mechanisms} immediately applies, and gives randomized, universally truthful, polynomial time, constant factor approximation mechanisms for problems in this class. Getting deterministic, truthful, polynomial time mechanisms with a constant approximation factor for this class or specific problems in it, like we did for \EDP, remains an open problem.
-In what follows, we describe related work in Section~\ref{sec:related}. We briefly review experimental design and budget feasible mechanisms in Section~\ref{sec:peel} and define \SEDP\ formally. We present our convex relaxation to \EDP{} in Section~\ref{sec:approximation} and use it to construct our mechanism in Section~\ref{sec:main}. We conclude in Section~\ref{sec:concl}. All proofs of our technical results are provided in the appendix. %we present our mechanism for \SEDP\ and state our main results. %A generalization of our framework to machine learning tasks beyond linear regression is presented in Section~\ref{sec:ext}.
+In what follows, we describe related work in Section~\ref{sec:related}. We briefly review experimental design and budget feasible mechanisms in Section~\ref{sec:peel} and define \SEDP\ formally. We present our convex relaxation to \EDP{} in Section~\ref{sec:approximation} and use it to construct our mechanism in Section~\ref{sec:main}. We conclude in Section~\ref{sec:concl}. All proofs of our technical results are provided in the full version of this paper \cite{arxiv}. %we present our mechanism for \SEDP\ and state our main results. %A generalization of our framework to machine learning tasks beyond linear regression is presented in Section~\ref{sec:ext}.
\junk{