Reviews

Reviews for the paper “Plain stopping time and conditional complexities revisited”:

First reviewer

In their paper, Andreev, Posobin and Shen study the notion of "stopping time complexity", corresponding to the amount of information needed to specify when to stop while reading an infinite sequence. They provide various simple characterizations of the notion, thereby showing the robustness and in some sense the naturalness of the notion. They also prove that some of the characterizations cannot be extended to a more general setting.

Overall, the paper is clear and very well written. The proofs are correct, and the exposition is of interest to a general audience. The notion of stopping time complexity is clearly motivated. For all these reasons, I recommend accepting this paper at STACS.

I found only one typo: p23, last paragraph: "desciption" should be "description".

Second reviewer

The submission is about stopping time complexity and related notions of Kolmogorov complexity. The main technical results are several equivalent characterizations of stopping time complexity. Similar equivalences for a related but more general notion are investigated, and some partially negative results are obtained.

The submission contains new and interesting results about a variant of Kolmogorov complexity that is worth investigating. The authors present a neat classification scheme for variants of Kolmogorov complexity that comprises several standard variants plus the variant considered here, thus setting the latter variant in context. The exposition features some innovative ideas and interesting proof methods.

On the downside, it must be said that at least some of the technical contributions can be viewed as neither that surprising nor particularly difficult.

The submission contains solid work of a somewhat specialized and partially technical nature. The submission can be accepted for a competitive conference such as STACS if there is enough space.

The numerical overall evaluation is meant as 1.8 rather than 2.4.

Third reviewer

Plain stopping time and conditional complexities revisited - Referee report

The paper looks at a recent new notion in the field of algorithmic information theory (AIT) - stopping time complexity.

The plain stopping-time complexity of $x$ (let's call it $S(x)$) is the length of the smallest program which, when given $x$ written on a one-way input tape (and some working memory), succeeds in stopping after reading the last symbol of $x$.

The paper offers various different characterizations of S(x). It is easy to see that $S(x)$, up to additive constant terms, equals the length of the smallest program enumerating a prefix-free set containing $x$.
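To illustrate this characterization concretely (this sketch is my own, not from the paper; the function name and the toy prefix-free set are invented for illustration), here is how a prefix-free set induces a stopping rule: the reader halts exactly when the symbols consumed so far form an element of the set, and prefix-freeness guarantees the rule never has to stop twice on one stream.

```python
def stop_on_prefix_free(stream, prefix_free_set):
    """Read symbols one at a time from a one-way stream, halting
    exactly when the string read so far lies in the prefix-free set."""
    read = ""
    for symbol in stream:
        read += symbol
        if read in prefix_free_set:
            return read  # stopped right after the last symbol
    raise RuntimeError("stream never entered the prefix-free set")

# Example: {"01", "10", "110"} is prefix-free; on a stream beginning
# "10...", the rule stops after reading "10".
```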

From this it follows, also easily, that $S(x)$ is the complexity of a program which outputs $x$ when given access to *any* infinite string $X$ prefixed by $x$; that is, the same program must output $x$ for every infinite extension $X$ of $x$. Oddly, this isn't mentioned in the paper (see proof [1] below).

The key result in the paper (in my view) is that for each $x$ there is a "hardest infinite extension" $X$ of $x$ such that $S(x) = C^X(x)$. That is, there is an $X$ such that any program which outputs $x$ when given $X$ must have size $\ge S(x)$.

So infinite extensions of $x$ are adversaries that lower-bound $S(x)$, and we get the following:

  • $S(x) \le l$ iff there exists a length-$l$ program $p$ such that for every $X$ extending $x$, $p$ outputs $x$ on input $X$, and
  • $S(x) > l$ iff there exists an extension $X$ of $x$ such that no length-$l$ program $p$ outputs $x$ when given $X$.
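In symbols, the duality above could be rendered as follows (my own formulation, with $X \sqsupset x$ meaning "$X$ is an infinite extension of $x$" and equalities holding up to additive constants):

```latex
S(x) \;\stackrel{+}{=}\; \min\{\, |p| : p(X) = x \text{ for every infinite } X \sqsupset x \,\}
     \;\stackrel{+}{=}\; \max_{X \sqsupset x} C^X(x).
```

The first equality is the "easy" characterization via programs that work on every extension; the second is the hardest-extension result, which makes the max attained.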

This minimax theorem seems really interesting to me. It is unfortunate that the result isn't presented in this form in the paper itself. To me it jumps off the page as a very striking duality result. I would expect researchers in the area to find this result interesting, given that even I found it interesting, and I am not easily excited by results in this area.

The proof of this statement is not easy, but one should say that the core of the proof (Theorem 8) already appeared in a previous paper (the full version of the paper where stopping time complexity was first defined). The authors point this out, but provide a proof nonetheless, which makes me think that maybe the two proofs (in the full version of the previous paper, and in this submission) were found independently.

Other results in the paper try to understand what it would mean to have stopping time complexity of $x$ conditioned on $y$. They do not paint a very elegant picture of that situation, suggesting that either the notion is not very robust, or that the right definition is yet to be found. In the first case, the technical reasons of why the notion is not robust would be relevant for experts in AIT, but they would be uninteresting to a general audience; in the second case, one is left wondering if perhaps further thought on the matter would produce a more suitable definition. So in some sense the results seem secondary to the one above.

In summary, I see one result which I find very interesting, and whose proof is non-trivial. This alone would seem to me like it should be reason for recommending acceptance, but alas the most non-trivial part of the proof already appeared in a previous paper (perhaps independently?). I also see a not completely satisfactory attempt at extending the notion of stopping time complexity to a conditional notion.

To offer further criticism, I would say that such an interesting theorem is obviously begging for an application outside of AIT. The mathematics may be beautiful, but a rejection from STACS could be justified on the basis of the paper being disconnected from topics of broader interest. This particular issue would not arise in a more computability-focused conference, such as CiE.

Considering all these things, my overall sense is that this is a borderline paper for STACS. I am surely not an expert in AIT and could be missing something vital from the full picture, so my recommendation comes with only medium confidence.

[1] One direction of this is trivial. For the other direction, argue as follows: suppose we had a machine $M$ which outputs $x$ when given _any_ infinite string $X$ prefixed by $x$. Suppose now that we had $x = x_1 \ldots x_n$ written on a one-way input tape. Consider all the prefixes $z_i = x_1 \ldots x_i$ of $x$. Then for $i = 1$ up to $n$: dovetail $M$ on all possible extensions of $z_i$; if $M$ outputs $z_i y$ for non-empty $y$, then stop dovetailing and increment $i$; if $i < n$ this will always happen, since $z_i y = x$ for some non-empty $y$. When $i = n$ is reached, $M$ will halt after finitely many steps on all extensions; i.e. it will halt at some finite depth in the tree of extensions of $x$; indeed, if it did not, then by König's lemma there would be some infinite $X$ causing $M$ to never halt.
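The dovetailing procedure above can be simulated on a toy model (entirely my own construction, for illustration only): model $M$ as a map from finite input prefixes to either an output string or `None` ("still reading"), and search the tree of extensions breadth-first. The toy $M$ below halts with output $x$ on every stream extending $x$, which is the hypothesis of the proof.

```python
from itertools import product

def make_M(x):
    """Toy machine: on any input stream extending x, M halts after
    reading x and outputs x. Modeled as a map from finite prefixes
    to an output string, or None if M is still reading."""
    def M(prefix):
        return x if prefix.startswith(x) else None
    return M

def stopping_rule(stream, M):
    """Read symbols one by one; after each symbol, dovetail M over
    all finite binary extensions of what has been read so far."""
    stream = iter(stream)
    read = ""
    while True:
        depth = 0
        while True:
            exts = ["".join(w) for w in product("01", repeat=depth)]
            outs = [M(read + w) for w in exts]
            if any(o is not None and o != read for o in outs):
                # M output a proper extension of `read`: we have not
                # yet read all of x, so consume one more input symbol.
                read += next(stream)
                break
            if all(o == read for o in outs):
                # M halts with output `read` on every extension at
                # this depth: we have read exactly x, so stop here.
                return read
            depth += 1
```

For instance, `stopping_rule("10111000", make_M("101"))` returns `"101"`, i.e. the rule stops right after the last symbol of $x$, having consumed nothing beyond it.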

On first reading the result, I tried to prove it myself to assess how easy it was; I came up with the proof above and thought the result was a triviality. Only then did I realize that what was being proven was a kind of dual of the above.