The dependence of computability on numerical notations


Research output: Journal Article (refereed), peer-reviewed



Which function a Turing machine computes depends on how the symbols it manipulates are interpreted. Moreover, by invoking bizarre systems of notation (i.e., interpretation schemes mapping symbols to their denotations), it is easy to define Turing machines that compute textbook examples of uncomputable functions, such as the solution to the decision problem for first-order logic. Thus, the distinction between computable and uncomputable functions depends on the system of notation used. This raises the question: which systems of notation are the relevant ones for determining whether a function is computable? These are the acceptable notations. I argue for a use criterion of acceptability: the acceptable notations for a domain of objects D are the notations that we can use for the usual D-activities. Acceptable notations for natural numbers are the ones we can count with. Acceptable notations for logical formulas are the ones we can use in inference and logical analysis. And so on. This criterion makes computability a relative notion: whether a function is computable depends on which notations are acceptable, which is in turn relative to our interests and abilities. I argue that this is not a problem, however, since there is independent reason to take the extension of "computable function" to be relative to contingent facts. Similarly, the fact that the use criterion makes a mathematical distinction depend on practical concerns is not a problem either, because it dovetails with similar phenomena elsewhere in computability theory, namely the roles of notation in computation over the real numbers and of Extended Church's Thesis in complexity theory.
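The abstract's point about deviant notations can be sketched as a toy program. This is a minimal illustration, not anything from the paper itself: all names are hypothetical, and a computable stand-in replaces a genuinely uncomputable target function, since the trick works for any target. The same symbol-manipulating machine "computes" different functions depending on how its output strings are interpreted:

```python
# Illustrative sketch (hypothetical names throughout): one and the same
# symbol-manipulating "machine" computes different functions relative to
# different notations, i.e., different symbol-to-denotation mappings.

def copy_machine(tape: str) -> str:
    """A trivial Turing-machine stand-in: it merely copies its input."""
    return tape

def standard_denote(s: str) -> int:
    """Standard notation: a binary string denotes the number it spells."""
    return int(s, 2)

# Pick a target function g. In the paper's argument g could be genuinely
# uncomputable (e.g., a solution to the halting problem); here a computable
# stand-in is used purely so the sketch runs.
def g(n: int) -> int:
    return n * n

def deviant_denote(s: str) -> int:
    """Deviant output notation: string s denotes g of its standard value."""
    return g(standard_denote(s))

# Relative to standard input notation and deviant output notation, the
# copy machine "computes" g, even though it does no real work: the
# computation has been smuggled into the interpretation scheme.
n = 6
tape_in = format(n, "b")           # standard numeral for 6, i.e., "110"
tape_out = copy_machine(tape_in)   # the machine just copies the string
assert deviant_denote(tape_out) == g(n)
```

The sketch shows why the computable/uncomputable distinction is only fixed once the admissible notations are: restricting attention to acceptable notations is what blocks this kind of smuggling.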
Original language: English
Pages (from-to): 10485-10511
Number of pages: 27
Issue number: 11
Early online date: 17 Jun 2020
Publication status: Published - Nov 2021


Keywords:
  • Acceptable notation
  • Church–Turing thesis
  • Computability
  • Computable semantics
  • Deviant encoding


