Posted by: jonkatz | June 19, 2009

## Computing vs. verifying

Recently, I have been thinking on and off about the following problem and variants thereof. Fix a function $f: \{0,1\}^n \rightarrow \{0,1\}$. When is it easier to verify that a pair $(x, b)$ satisfies $f(x)=b$ than to compute $f(x)$ from scratch? (One can always verify by computing $f(x)$, so verification can never be harder.) Without being given any additional information along with $(x, b)$, verification can never be much easier than computation (proof: given a verification procedure $V$, we can compute $f$ as follows: if $V(x, 0)=1$ then output 0, else output 1), so to make things interesting we allow verification to additionally take as input a “proof string” $\pi$.
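The parenthetical reduction can be sketched in a few lines of Python. This is just an illustration; the function names and the choice of parity as a stand-in for $f$ are mine, not anything canonical:

```python
def compute_from_verifier(V, x):
    """Given a (proof-free) verifier V(x, b) that accepts exactly when
    f(x) = b, compute f(x): if V accepts (x, 0) then f(x) = 0, else f(x) = 1."""
    return 0 if V(x, 0) else 1

# Toy instantiation with parity playing the role of f.
def parity(x):
    p = 0
    for bit in x:
        p ^= bit
    return p

V = lambda x, b: parity(x) == b  # the trivial verifier: just recompute f

print(compute_from_verifier(V, [1, 0, 1, 1]))  # prints 1, i.e., parity(x)
```

The point is that a single call to $V$ already pins down $f(x)$, so a proof-free verifier can never be much cheaper than computing $f$ itself.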

Let’s define what we want more formally. There are various possibilities here, but to keep things clean let’s say we want an algorithm $V$ such that:

• For all $x$ there exists a $\pi$ such that $V(x,f(x), \pi)=1$.
• For all $x$ and all $\pi$, we have $V(x,\overline{f(x)}, \pi)=0$.

The reader can easily see that this is nothing more than a question about when non-deterministic computation is “better” than deterministic computation. Assuming $P \neq NP$, there are several interesting functions $f$ (e.g., the characteristic function for 3SAT) for which verification is faster than computation.

All very well and good, but here are some things I really want to know:

1. For what functions $f$ can we prove, without assumptions, that verification is “better” than computation?
2. Even allowing reasonable complexity-theoretic assumptions, is verification faster than computation for every poly-time computable function $f$? For a concrete question, is $DTIME(n^k) \subseteq NTIME(o(n^k))$ for every $k$?

Related to the first question, Lance has a recent post that shows, in particular, that any function in $P$ can be verified by an $AC^0$ verifier. Rather than reduce to 3SAT, let’s prove this directly. Fix a function $f$ and a machine $M$ that computes it. Given an input $x$, the proof $\pi$ will be the complete tableau of $M$'s computation on the input $x$. A tableau can be checked for correctness by some polynomial number of local computations, each looking at only a constant number of bits. An unbounded fan-in AND gate at the bottom can thus be used to express correctness of all the local checks.

Since parity is in $P$ but not in $AC^0$, this gives a provable example where verification is easier than computation.
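To make the local-check idea concrete for parity, here is a minimal Python sketch (the names and proof format are my own choices, not from the post): the proof $\pi$ lists the prefix parities of $x$, every check reads only a constant number of bits, and the verifier outputs the AND of all checks, mirroring the single unbounded fan-in AND gate:

```python
def honest_proof(x):
    """The honest prover's proof: the running prefix parities of x."""
    pi, p = [], 0
    for bit in x:
        p ^= bit
        pi.append(p)
    return pi

def parity_verifier(x, b, pi):
    """AC^0-style verifier for parity. Each check below reads O(1) bits;
    the final answer is the AND of all checks. If every local step is
    consistent then pi[n-1] equals parity(x), so accepting forces
    b = parity(x), which gives soundness."""
    n = len(x)
    if n == 0:
        return b == 0          # parity of the empty string is 0
    if len(pi) != n:
        return False
    checks = [pi[0] == x[0]]                                     # base case
    checks += [pi[i] == pi[i - 1] ^ x[i] for i in range(1, n)]   # local steps
    checks.append(pi[n - 1] == b)                                # claimed output
    return all(checks)                                           # one big AND
```

So the verifier is constant-depth even though parity itself is not in $AC^0$.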

What’s the crypto connection? Well, say we have a low-power client that wants to compute $f(x)$. One thing it can do is outsource the computation of $f$ to a more powerful server. But since the client may not trust the server, it will also ask the server to provide a proof that the computation was done correctly. Of course, we would like the process of verification to be cheaper than simply having the client compute $f$ itself.

See the work of Goldwasser et al. and Hohenberger-Lysyanskaya for some related work in this area.

## Responses

1. Welcome to the blog world.

By the way, Michael Rabin has an old paper on computing vs. checking that is probably one of the oldest on this issue.

2. If you have a reference, I’d appreciate it.

PS: I am, of course, thrilled that you’re the first commenter on my blog. =)

3. Looking forward to reading the blog.

4. It’s great that you have decided to start a blog. I have been thinking about doing that for a long time. Anyway, I hope you will keep it alive and well, as opposed to another blog that was recently started in our community 😉

5. Here is a randomized “algorithm” that I’ve been bringing up in various places. For some reason nobody but me thinks it is interesting :(. It builds on an example given by Leonid Levin.

Let R = an n-bit random string, for some sufficiently large n (say n=2 million).

Let P be the proposition “R cannot be compressed to less than half its length” (Levin). This asserts that R’s Kolmogorov complexity satisfies K(R) > |R|/2, which is almost certainly true if R is truly random. (For any epsilon there is a large enough n such that P holds with probability > 1 − epsilon.)

By a theorem of Chaitin, if P is true, it cannot be proved in any reasonable axiom system (Chaitin’s version of Gödel’s incompleteness theorem). To be more precise, for any recursive axiom system there is a constant c such that the system cannot prove any statement of the form K(x) > c.
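The theorem the comment appeals to can be stated a bit more formally (my paraphrase of the standard statement, not the commenter’s wording):

```latex
\textbf{Theorem (Chaitin).} Let $T$ be a consistent, recursively axiomatizable
theory in which statements about Kolmogorov complexity $K(\cdot)$ can be
expressed. Then there is a constant $c_T$, depending only on $T$, such that
$T$ proves no statement of the form
\[
    K(x) > c_T ,
\]
even though $K(x) > c_T$ is true for all but finitely many strings $x$.
```

So once |R|/2 exceeds the constant $c_T$, the (almost certainly true) statement K(R) > |R|/2 is unprovable in $T$, which is exactly the commenter’s point.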

So here you’ve got a function for which randomized “computation” is trivial, but verification is undecidable.

I guess that is sort of a trick answer, but I felt I had to bring it up.

6. I think the definition implies that the language is in co-NP as well.