From 4d665f8693faf85ee20556508db27b86bb62f219 Mon Sep 17 00:00:00 2001
From: Thibaut Horel
Date: Fri, 2 Nov 2012 15:15:50 +0100
Subject: Compute the constant in the upper bound of the general theorem

---
 general.tex | 9 ++-------
 1 file changed, 2 insertions(+), 7 deletions(-)

diff --git a/general.tex b/general.tex
index 530b8d3..12e01a5 100644
--- a/general.tex
+++ b/general.tex
@@ -57,19 +57,14 @@ For two reasons:
 For the function $\tilde{V}$ defined above, there is a truthful, budget
 feasible mechanism which achieves an approximation ratio of:
 \begin{displaymath}
-  \frac{5e-1}{e-1}\frac{\log(1+\mu)}{\mu} + A
+  \frac{5e-1}{e-1}\frac{2\mu}{\log(1+\mu)} + 5.1
 \end{displaymath}
-where $\mu$ is the smallest eigenvalue of $R$ (and $A$ is a constant that
-I will compute tomorrow, it should be roughly around 10).
+where $\mu$ is the smallest eigenvalue of $R$.
 \end{theorem}
 Note that everything becomes nice when $R \geq I_d$. In this case, the
 smallest eigenvalue is larger than 1. Hence $\log\det R\geq 0$ and an
 approximation on $\tilde{V}$ gives an approximation ratio on $V$ (see
 discussion above).
-Furthermore, we can bound $\log(1+\mu)/\mu$ by 1 and I think we fall back on
-the approximation ratio of section 2.
-
-Can we motivate that $R\geq 1$ ?
 \subsection{Beyond Linear Models}
 TODO: Independent noise model. Captures models such as logistic regression, classification, etc.
 Arbitrary prior. Show that change in the entropy is submodular (cite Krause, Guestrin).
-- 
cgit v1.2.3-70-g09d2
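
A sanity check on the revised expression (an added note, not part of the patch itself): the factor $2\mu/\log(1+\mu)$ stays bounded as $\mu \to 0$, since $\log(1+\mu) \sim \mu$, so the approximation ratio does not blow up for poorly conditioned $R$ in that limit:

```latex
% Small-mu behavior of the new bound. Using log(1+mu) = mu + O(mu^2):
%   lim_{mu -> 0+} 2*mu / log(1+mu) = 2,
% so the bound tends to 2(5e-1)/(e-1) + 5.1, roughly 19.8.
\begin{displaymath}
  \lim_{\mu \to 0^{+}}
  \left( \frac{5e-1}{e-1}\,\frac{2\mu}{\log(1+\mu)} + 5.1 \right)
  = \frac{2(5e-1)}{e-1} + 5.1 \approx 19.76
\end{displaymath}
```

Conversely, $2\mu/\log(1+\mu)$ is increasing in $\mu$, so under the assumption $R \geq I_d$ discussed in the file (i.e. $\mu \geq 1$) the constant in front is at least $2/\log 2 \approx 2.89$ times the $\frac{5e-1}{e-1}$ factor.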