What is Gödel's Theorem?
Alpha Nernst
Melbourne, Australia
Melvin Henriksen, Professor of Mathematics Emeritus at Harvey Mudd College, offers this explanation:

KURT GÖDEL achieved fame in 1931 with the publication of his Incompleteness Theorem.

Giving a mathematically precise statement of Gödel's Incompleteness Theorem would only obscure its important intuitive content from almost anyone who is not a specialist in mathematical logic. So instead, I will rephrase and simplify it in the language of computers.

Imagine that we have access to a very powerful computer called Oracle. Like the computers with which we are familiar, Oracle asks that the user "input" instructions that follow precise rules, and it supplies the "output," or answer, in a way that also follows these rules. The same input will always produce the same output. The input and output are written as integers (or whole numbers), and Oracle performs only the usual operations of addition, subtraction, multiplication and division (when possible). Unlike ordinary computers, Oracle has no concerns about efficiency or time: it will carry out properly given instructions no matter how long they take, and it will stop only when they have been executed, even if that takes more than a million years.
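
To make this model concrete, here is one way Oracle's contract could be sketched in Python. The name run_on_oracle and its interface are my own invention for illustration; the article defines no programming interface.

```python
def run_on_oracle(instructions, *inputs):
    """Run a program on the idealized Oracle described above.

    'instructions' is a deterministic function built only from integer
    addition, subtraction, multiplication and (exact) division, so the
    same inputs always produce the same output. There is no time limit:
    the call returns only when the instructions have finished executing,
    however long that takes.
    """
    return instructions(*inputs)

# Determinism: two runs with the same instructions and inputs must agree.
double = lambda n: n + n
assert run_on_oracle(double, 21) == run_on_oracle(double, 21) == 42
```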

Let's consider a simple example. Recall that a positive integer (call it N) that is bigger than 1 is called a prime number if it is not divisible by any positive integer besides 1 and N. How would you ask Oracle to decide whether N is prime? Tell it to divide N by every integer from 2 through N-1, and to stop when a division comes out evenly or it reaches N-1. (Actually, it can stop once it reaches the square root of N: if no division has come out evenly by that point, then N is prime.)
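
Here is a minimal sketch of that procedure in Python (the article names no programming language, and the function name is mine):

```python
import math

def is_prime(n):
    """Trial division, as described above: look for a divisor of n
    between 2 and the square root of n."""
    if n < 2:
        return False                    # 0, 1 and negatives are not prime
    for d in range(2, math.isqrt(n) + 1):
        if n % d == 0:                  # a division "comes out evenly"
            return False                # so n is composite
    return True                         # no divisor found: n is prime
```

For example, is_prime(97) returns True, while is_prime(91) returns False, since 91 = 7 × 13.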

What Gödel's theorem says is that there are properly posed questions involving only the arithmetic of integers that Oracle cannot answer. In other words, there are statements that, although input properly, Oracle cannot evaluate to decide whether they are true or false. Such assertions are called undecidable, and they are very complicated. And if you were to bring one to Dr. Gödel, he would explain to you that such assertions will always exist.

Even if you were given an "improved" model of Oracle, call it OracleT, in which a particular undecidable statement, UD, is decreed true, another undecidable statement would be generated to take its place. More puzzling yet, you might also be given another "improved" model of Oracle, call it OracleF, in which UD would be decreed false. Regardless, this model too would generate other undecidable statements, and might yield results that differed from OracleT's, but were equally valid.
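
In the standard vocabulary of logic, that situation can be restated compactly as follows. The notation is mine, with U standing in for the article's UD; this is a sketch of the standard formulation, not the article's own wording.

```latex
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
Let $T$ be a consistent, effectively axiomatized theory containing
arithmetic, and let $U$ be a statement undecidable in $T$:
\[
  T \nvdash U \qquad\text{and}\qquad T \nvdash \lnot U .
\]
Then both extensions
\[
  T_{\mathrm{T}} = T + U
  \qquad\text{and}\qquad
  T_{\mathrm{F}} = T + \lnot U
\]
are consistent, and G\"{o}del's theorem applies to each of them in
turn, so each has new undecidable statements of its own.
\end{document}
```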

Do you find this shocking and close to paradoxical? It was even more shocking to the mathematical world in 1931, when Gödel unveiled his incompleteness theorem. Gödel did not phrase his result in the language of computers. He worked in a definite logical system, and mathematicians hoped that his result depended on the peculiarities of that system. But in the next decade or so, a number of mathematicians, including Stephen C. Kleene, Emil Post, J. B. Rosser and Alan Turing, showed that it did not.

Research on the consequences of this great theorem continues to this day. Anyone with Internet access can use a search engine such as AltaVista to find several hundred articles of highly varying quality on Gödel's Theorem. Among the best things to read, though, is Gödel's Proof, by Ernest Nagel and James R. Newman, published in 1958 and released in paperback by New York University Press in 1983.

Answer posted on January 25, 1999