Can randomness---or more technically, ``information''---be effectively
extracted from a semi-random source? A special case of this question
was answered by von Neumann in 1951. He described a simple method for
extracting an unbiased random sequence from the flips of a biased
coin: flip the coin twice, output the first result if the two flips
differ, and discard the pair otherwise. A more general form of the
question was posed by Reimann and
Terwijn in the context of algorithmic randomness, so we will start
with an introduction to Kolmogorov complexity, effective Hausdorff
dimension, and Martin-L{\"o}f randomness. Kolmogorov complexity
measures the information content of a finite binary string.
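In symbols, fixing a universal prefix-free machine $U$ (the machine
model customary in this setting; a sketch rather than a full
definition), one sets
\[
K(x) = \min\{\, |p| : U(p) = x \,\}.
\]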
Informally, the complexity of a string is defined to be the length of
the shortest program that generates it. A closely related notion,
effective (Hausdorff) dimension, measures the information density of
an infinite binary sequence. We can now formulate the question in
terms of effective dimension: is there a sequence that has effective
Hausdorff dimension 1/2---in other words, a half-random
sequence---that does not compute a sequence of higher effective
dimension? As it turns out, such sequences exist. We will discuss this
and related results.
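For reference, effective Hausdorff dimension can be expressed in
terms of Kolmogorov complexity: by a theorem of Mayordomo, for every
sequence $X$,
\[
\dim_H(X) = \liminf_{n\to\infty} \frac{K(X \restriction n)}{n},
\]
where $X \restriction n$ denotes the length-$n$ prefix of $X$. Under
this characterization, a half-random sequence is one whose $n$-bit
prefixes have complexity roughly $n/2$.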