nonuniform complexity
in contrast to the standard (or uniform) TM model, where a single TM is used on inputs of all (infinitely many) lengths, a nonuniform model allows a different algorithm to be used for each input length.
[cite:;taken2 from @computability_arora_2009 chapter 6 boolean circuits]
let $\mathcal{C}$ be a class of sets. let $\mathcal{F}$ be a class of functions from $\mathbb{N}$ to $\Sigma^*$. the class $\mathcal{C}/\mathcal{F}$ is defined to consist of all sets $A$ for which there exist a set $B \in \mathcal{C}$ and a function $h \in \mathcal{F}$ such that for all $x \in \Sigma^*$,

$$x \in A \iff \langle x, h(|x|) \rangle \in B.$$
[cite:;taken2 from @complexity_vollmer_1999 definition 2.11]
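as a concrete (if toy) illustration of this scheme, here is a minimal Python sketch with hypothetical names of my own choosing (`decide_B`, `advice`): a decider for $A$ is assembled from a decider for some $B \in \mathcal{C}$ and an advice function $h \in \mathcal{F}$ that sees only the input length.

#+begin_src python
# sketch (not from the cited text): deciding A via  x in A  <=>  <x, h(|x|)> in B.
# decide_B stands for a decider of some set B in the underlying class C;
# advice stands for a function h in F that depends only on the input length.

def advice(n: int) -> str:
    """hypothetical advice function h: N -> Sigma*; here it just records the parity of n."""
    return "1" if n % 2 == 0 else "0"

def decide_B(x: str, a: str) -> bool:
    """hypothetical decider for B: accept <x, a> iff a says |x| is even and x starts with '1'."""
    return a == "1" and x.startswith("1")

def decide_A(x: str) -> bool:
    """x is in A  iff  the pair <x, h(|x|)> is in B."""
    return decide_B(x, advice(len(x)))

assert decide_A("10")        # length 2 (even), starts with '1'
assert not decide_A("101")   # length 3 (odd)
#+end_src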
let $h\colon \mathbb{N} \to \Sigma^*$. then $|h|\colon \mathbb{N} \to \mathbb{N}$ is defined by $|h|(n) = |h(n)|$. if $F$ is a class of functions from $\mathbb{N}$ to $\mathbb{N}$ then $\mathcal{F}(F) = \{ h\colon \mathbb{N} \to \Sigma^* \mid |h| \in F \}$. set $\mathrm{Poly} = \mathcal{F}(n^{O(1)})$, i.e., the set of all polynomially length-bounded functions.
[cite:;taken2 from @complexity_vollmer_1999 definition 2.12]
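putting definitions 2.11 and 2.12 together, the most commonly used advice class can be spelled out explicitly (a restatement in the notation above, not a quotation from the book):

$$A \in \mathrm{P}/\mathrm{Poly} \iff \exists B \in \mathrm{P}\; \exists h\colon \mathbb{N} \to \Sigma^* \;\Bigl( |h(n)| \le n^{O(1)} \ \text{and}\ \forall x \in \Sigma^* \bigl( x \in A \iff \langle x, h(|x|) \rangle \in B \bigr) \Bigr).$$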
there are non-recursive languages in $\mathrm{P}/\mathrm{Poly}$: the advice may be an arbitrary, even uncomputable, function of the input length, so for example every tally language $L \subseteq \{1\}^*$, including undecidable ones such as a unary encoding of the halting problem, belongs to $\mathrm{P}/\mathrm{Poly}$.
[cite:;taken2 from @complexity_vollmer_1999 chapter 2.3 non-uniform turing machines]
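a minimal Python sketch of the standard argument: any tally language $L \subseteq \{1\}^*$, decidable or not, is in $\mathrm{P}/\mathrm{Poly}$, because the advice for length $n$ can be the single bit recording whether $1^n \in L$. the hard-coded table below is a stand-in, since for an undecidable $L$ no program can actually produce the advice, which is exactly the point.

#+begin_src python
# sketch: any tally language L over {1}* is in P/Poly -- the advice for length n
# is the single bit [1^n in L].  for an undecidable L this advice function is
# uncomputable, but P/Poly places no computability requirement on it.

def advice_bit(n: int) -> str:
    """stand-in for the (possibly uncomputable) advice: '1' iff 1^n is in L.
    here a toy table is hard-coded; for, e.g., a unary encoding of the halting
    problem no computable table exists, yet the definition is still satisfied."""
    toy_table = {0: "0", 1: "1", 2: "0", 3: "1"}
    return toy_table.get(n, "0")

def decide_L(x: str) -> bool:
    """polynomial-time decider using one advice bit per input length."""
    if set(x) - {"1"}:          # reject anything not of the form 1^n
        return False
    return advice_bit(len(x)) == "1"

assert decide_L("1")       # advice bit for length 1 is '1'
assert not decide_L("11")  # advice bit for length 2 is '0'
assert not decide_L("10")  # not a tally string
#+end_src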
let $f\colon \mathbb{N} \to \mathbb{N}$. an $f(n)$ advice sequence is a sequence of binary strings $A = (a_1, a_2, \dots)$, where $|a_n| \le f(n)$. for a language $B \subseteq \{0,1\}^*$ let $B/A = \{ x : \langle x, a_{|x|} \rangle \in B \}$. let $\mathrm{P}/f(n) = \{ B/A : B \in \mathrm{P} \text{ and } A \text{ is an } f(n) \text{ advice sequence} \}$. let $\mathrm{P}/\mathrm{poly} = \bigcup_{k \ge 1} \mathrm{P}/n^k$. apply this notation to other complexity classes, e.g., $\mathrm{NP}/\mathrm{poly}$. these classes are sometimes referred to as "nonuniform P" or "nonuniform NP".
[cite:;taken2 from @tcs_leeuwen_1990 chapter 14 definition 2.2]
there is a way to make the circuit model of computation a uniform one: we restrict the families of circuits to those which can be output by a Turing machine. so we obtain an alternative characterization of $\mathrm{P}$: a language is in $\mathrm{P}$ if and only if it is decided by a P-uniform family of polynomial-size circuits, i.e., a family $\{C_n\}_{n \in \mathbb{N}}$ such that some polynomial-time Turing machine outputs a description of $C_n$ on input $1^n$.
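as a toy illustration of uniformity (my own encoding, not the book's), the following Python sketch has one procedure play the role of the Turing machine that, on input $n$, prints the $n$-th circuit of a family computing PARITY, and a second procedure evaluate that circuit on a concrete input.

#+begin_src python
# sketch: a "uniform" circuit family for PARITY.  generate_circuit plays the role
# of the Turing machine that outputs the n-th circuit; evaluate then runs it.

def generate_circuit(n: int) -> list[tuple[str, int, int]]:
    """output a circuit for PARITY on n >= 1 bits as a list of XOR gates.
    wires 0..n-1 are the inputs; each gate ('XOR', i, j) appends a new wire."""
    gates = []
    acc = 0                                  # wire holding the running parity
    for i in range(1, n):
        gates.append(("XOR", acc, i))
        acc = n + len(gates) - 1             # index of the wire just created
    return gates

def evaluate(gates: list[tuple[str, int, int]], x: str) -> bool:
    """evaluate the circuit on the input string x."""
    wires = [int(b) for b in x]              # input wires
    for op, i, j in gates:
        assert op == "XOR"
        wires.append(wires[i] ^ wires[j])
    return bool(wires[-1])

circuit = generate_circuit(5)
assert evaluate(circuit, "10110") == ("10110".count("1") % 2 == 1)
#+end_src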
we can also make the usual Turing machine model of computation nonuniform. to do this, we give the machine a polynomial-size advice string which depends only on the length of the input. there is no constraint on what kind of function maps the input length to the advice; it may even be uncomputable.
the class of languages decided by polynomial-time Turing machines equipped with such advice strings is exactly $\mathrm{P}/\mathrm{poly}$, the class of languages decided by (nonuniform) families of polynomial-size circuits.
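one direction of this equivalence can be sketched directly: if $L$ has polynomial-size circuits $C_1, C_2, \dots$, take the advice for length $n$ to be an encoding of $C_n$; the uniform, polynomial-time part of the computation is then just a circuit evaluator. the encoding and the example circuit (OR of all input bits) below are my own toy choices.

#+begin_src python
# sketch of "circuit family => advice machine": the advice for length n is a
# serialized description of the n-th circuit, and the decider merely parses and
# evaluates it, which takes time polynomial in |x| + |advice|.

def advice_for_length(n: int) -> str:
    """advice = serialized circuit for inputs of length n (here: OR of all n bits).
    gates are 'OR i j' separated by ';'; wires 0..n-1 are the inputs."""
    gates, acc = [], 0
    for i in range(1, n):
        gates.append(f"OR {acc} {i}")
        acc = n + len(gates) - 1
    return ";".join(gates)

def decide(x: str) -> bool:
    """the uniform part: look up the advice for |x| and evaluate it on x."""
    wires = [int(b) for b in x]
    adv = advice_for_length(len(x))          # supplied "for free" in the advice model
    for gate in filter(None, adv.split(";")):
        _, i, j = gate.split()
        wires.append(wires[int(i)] | wires[int(j)])
    return bool(wires[-1])

assert decide("0010")      # some bit is 1
assert not decide("000")   # all bits are 0
#+end_src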