Concepedia

Publication | Open Access

Kolmogorov's Contributions to Information Theory and Algorithmic Complexity

Citations: 94
References: 23
Year: 1989

Abstract

Briefly, information theory says a random object X ~ p(x) has complexity (entropy) H = −∑ p(x) log p(x), with the attendant interpretation that H bits are sufficient to describe X on the average. Algorithmic complexity says an object x has a complexity K(x) equal to the length of the shortest (binary) program that describes x. It is a beautiful fact that these ideas are much the same. In fact, it is roughly true that EK(X) ≈ H. Moreover, if we let P_U(x) = Pr{U prints x} be the probability that a given computer U prints x when given a random program, it can be shown that log(1/P_U(x)) ≈ K(x) for all x, thus establishing a vital link between the universal probability measure P_U and the universal complexity K. More on this later. The relationship of these ideas to probability theory was summed up in Kolmogorov's 1983 paper, which was based on his 1970 talk in Nice. Perhaps only
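As a small illustration of the entropy formula in the abstract, the following Python sketch (illustrative only, not from the paper) computes H = −∑ p(x) log p(x) in bits for a finite distribution:

```python
import math

def entropy(p):
    """Shannon entropy H = -sum p(x) log2 p(x), in bits.

    p is a list of probabilities; terms with p(x) = 0 contribute
    nothing, by the convention 0 log 0 = 0.
    """
    return -sum(px * math.log2(px) for px in p if px > 0)

# A fair coin needs 1 bit on average; a biased coin needs less,
# and a uniform distribution over 4 outcomes needs 2 bits.
print(entropy([0.5, 0.5]))        # 1.0
print(entropy([0.9, 0.1]))
print(entropy([0.25] * 4))        # 2.0
```

This is the "H bits suffice on average" quantity that the abstract says is roughly matched by the expected Kolmogorov complexity EK(X); K(x) itself is uncomputable, so no analogous routine exists for it.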
