British mathematician who in 1937 formulated a precise mathematical concept for a theoretical computing machine, a key step in the development of the first computer. After the war he designed computers for the British government and helped develop the concept of artificial intelligence.
Our Living Language: Alan Turing—father of computer science, codebreaker, cognitive scientist, theoretician in artificial intelligence—achieved fame in 1936 at the age of 24 with a paper in which he showed that no universal algorithm exists that can determine whether a proposition in a given mathematical system is true or false. In the course of his proof he invented what has been called the Turing machine, an imaginary idealized computer that can compute any calculable mathematical function. The essentials of this machine (an input/output device, a memory, and a central processing unit) formed the basis for the design of all digital computers.

After World War II broke out, he worked as a cryptanalyst at Bletchley Park, where he put his extraordinary talents to work on breaking the famous Enigma cipher used by the German military. By 1940, Turing had been instrumental in designing a machine that broke the German code, allowing the Allies to secretly decipher intercepted German messages for the rest of the war.

At war's end, Turing was hired to help develop the world's first electronic computers, and he ultimately designed the programming system of the Ferranti Mark 1, the first commercially available digital computer, in 1951. His guiding principle that the brain is simply a computer was an important founding assumption for the new fields of cognitive science and artificial intelligence. He was making advances in modeling the chemical mechanisms by which genes control the structural development of organisms when he suddenly died, just before his forty-second birthday.
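The Turing machine described above (a tape for input and output, a read/write head, and a table of state transitions) is simple enough to sketch in a few lines. The following is a minimal illustration, not anything from Turing's paper; all names and the example "bit-flipping" machine are invented for this sketch.

```python
# Minimal Turing machine sketch. The transition table maps
# (state, symbol) -> (symbol_to_write, head_move, next_state).
def run_turing_machine(transitions, tape, state="start", accept="halt",
                       max_steps=1000):
    cells = dict(enumerate(tape))  # sparse tape; unwritten cells are blank "_"
    head = 0
    for _ in range(max_steps):
        if state == accept:
            break
        symbol = cells.get(head, "_")
        write, move, state = transitions[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    lo, hi = min(cells), max(cells)
    return "".join(cells.get(i, "_") for i in range(lo, hi + 1)).strip("_")

# Example machine: flip every bit, moving right, halting at the first blank.
flip = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

print(run_turing_machine(flip, "1011"))  # -> 0100
```

Despite its simplicity, this table-plus-tape scheme is all a universal computer needs: any calculable function can, in principle, be expressed as such a transition table.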