 general.tex | 9 ++-------
 1 file changed, 2 insertions(+), 7 deletions(-)
diff --git a/general.tex b/general.tex
index 530b8d3..12e01a5 100644
--- a/general.tex
+++ b/general.tex
@@ -57,19 +57,14 @@ For two reasons:
For the function $\tilde{V}$ defined above, there is a truthful, budget
feasible mechanism which achieves an approximation ratio of:
\begin{displaymath}
- \frac{5e-1}{e-1}\frac{\log(1+\mu)}{\mu} + A
+ \frac{5e-1}{e-1}\frac{2\mu}{\log(1+\mu)} + 5.1
\end{displaymath}
-where $\mu$ is the smallest eigenvalue of $R$ (and $A$ is a constant that
-I will compute tomorrow, it should be roughly around 10).
+where $\mu$ is the smallest eigenvalue of $R$.
\end{theorem}
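As a quick numerical sanity check, the ratio in the theorem can be evaluated for a given $\mu$ (a minimal sketch; the function name is mine, and only the formula in the theorem statement is assumed):

```python
import math

def approximation_ratio(mu: float) -> float:
    """Evaluate (5e-1)/(e-1) * 2*mu/log(1+mu) + 5.1 for mu > 0,
    where mu is the smallest eigenvalue of R (formula from the theorem)."""
    e = math.e
    return (5 * e - 1) / (e - 1) * (2 * mu / math.log(1 + mu)) + 5.1

# The factor 2*mu/log(1+mu) is increasing in mu, so the bound is
# tightest for small mu and degrades as R becomes better conditioned
# away from the identity; at mu = 1 the ratio is roughly 26.
print(approximation_ratio(1.0))
```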
Note that everything becomes nice when $R \geq I_d$. In this case, the smallest
eigenvalue is larger than 1. Hence $\log\det R\geq 0$ and an approximation on
$\tilde{V}$ gives an approximation ratio on $V$ (see discussion above).
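The step from $R \geq I_d$ to $\log\det R \geq 0$ is just the spectral decomposition of the determinant: writing $\lambda_1, \dots, \lambda_d$ for the eigenvalues of $R$,
\begin{displaymath}
  \log\det R = \sum_{i=1}^{d} \log \lambda_i \geq d \log \mu \geq 0
  \quad \text{whenever } \mu = \lambda_{\min}(R) \geq 1.
\end{displaymath}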
-Furthermore, we can bound $\log(1+\mu)/\mu$ by 1 and I think we fall back on
-the approximation ratio of section 2.
-
-Can we motivate that $R\geq 1$ ?
\subsection{Beyond Linear Models}
TODO: Independent noise model. Captures models such as logistic regression, classification, etc. Arbitrary prior. Show that change in the entropy is submodular (cite Krause, Guestrin).