Theory of Computation is defined as “the branch that deals with how efficiently problems can be solved on a model of computation, using an algorithm”. Its purpose is to create mathematical models of computation that reflect real-world computers. Some pioneers of the theory of computation were Alonzo Church, Kurt Gödel, Alan Turing, Stephen Kleene, John von Neumann and Claude Shannon.

Theory of Computation (TOC) has advanced rapidly over a relatively short period of time. It initially evolved as a branch of mathematics concerned with computational complexity, until it “took a foundational role in addressing challenges arising in computer systems and networks, such as error-free communication, cryptography, routing, and search”.

Theory of Computation branches into three areas: automata theory, computability theory and computational complexity theory.

Automata, literally meaning “something doing something by itself”, are abstract computing machines that are also used in computability proofs. Automata theory studies how these machines work, drawing on theoretical computer science and discrete mathematics. An automaton can be in various states: one state is designated the start state, and a subset of states are designated final (accepting) states. Some of the models are:

i. Finite Automata: These are used in text processing, compilers, and hardware design.

ii. Context-Free Grammars: These are used to define programming languages and in Artificial Intelligence.

iii. Turing machines: Abstract models of general-purpose computers, such as the PCs we have at home.
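To make the first model concrete, here is a minimal sketch of a finite automaton in Python. The machine itself is a hypothetical example (not from the text): a DFA that accepts binary strings containing an even number of 0s, with a start state, transitions, and a set of accepting states as described above.

```python
def run_dfa(transitions, start, accepting, string):
    """Simulate a deterministic finite automaton: begin in the start state,
    follow exactly one transition per input symbol, and accept if the
    machine ends in an accepting (final) state."""
    state = start
    for symbol in string:
        state = transitions[(state, symbol)]
    return state in accepting

# Hypothetical DFA: two states tracking the parity of 0s seen so far.
transitions = {
    ("even", "0"): "odd",
    ("even", "1"): "even",
    ("odd", "0"): "even",
    ("odd", "1"): "odd",
}

print(run_dfa(transitions, "even", {"even"}, "1001"))  # two 0s -> True
print(run_dfa(transitions, "even", {"even"}, "000"))   # three 0s -> False
```

Because the machine has only finitely many states and no extra memory, it can recognize exactly the regular languages, which is why finite automata suit text processing and hardware design.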

Computability theory, also known as recursion theory, deals with the extent to which a computer can solve a problem. Problems that a computer can solve are called computable problems.

Fundamental questions answered by this theory are: "What does it mean for a function on the natural numbers to be computable?" and "How can noncomputable functions be classified into a hierarchy based on their level of noncomputability?".
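A small sketch (my own illustration, not from the text) of why the halting problem is only semi-decidable: we can simulate a program for a bounded number of steps and report "halts" if it stops within the budget, but an exhausted budget tells us nothing, since the program might halt one step later. The `bounded_run` helper and the two toy programs below are hypothetical names chosen for this example.

```python
def bounded_run(step, state, is_done, max_steps):
    """Simulate an abstract machine: `step` computes the next state,
    `is_done` tests for a halting state. Returns 'halts' if the machine
    stops within max_steps, otherwise 'unknown'."""
    for _ in range(max_steps):
        if is_done(state):
            return "halts"
        state = step(state)
    return "unknown"

# A program that counts down to zero: halts after finitely many steps.
print(bounded_run(lambda n: n - 1, 5, lambda n: n == 0, 100))  # halts
# A program that counts up forever: every finite budget is exhausted.
print(bounded_run(lambda n: n + 1, 0, lambda n: False, 100))   # unknown
```

Turing's theorem shows no amount of cleverness removes the "unknown" case in general: no algorithm decides halting for all programs, which is the canonical example of a noncomputable problem.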

Computational Complexity Theory deals with the efficiency with which a computer can solve a problem. Time and space are the two principal resources against which a problem’s difficulty is measured. When a problem requires significant resources regardless of the algorithm used, it is regarded as a difficult problem.

This theory mainly asks a general question about all possible algorithms that could solve a problem, whereas analysis of algorithms studies the amount of resources needed by a particular algorithm.

1. https://en.wikipedia.org/wiki/Theory_of_computation

2. http://toc.csail.mit.edu/

3. https://en.wikipedia.org/wiki/Computational_complexity_theory

4. https://en.wikipedia.org/wiki/Computability_theory

5. https://en.wikipedia.org/wiki/Automata_theory

6. http://gradestack.com/gate-exam/computer-science/theory-of-computation/introduction-of-theory-of-computation/

7. http://www.ioenotes.edu.np/media/notes/theory-of-computation-toc/Chapter1-introduction.pdf

1. What exactly are the factors that distinguish computable problems from noncomputable ones?

2. What do you think is the future of Theory of Computation?

3. To what extent can the Turing machine help us in solving computable problems?