History Files
 


 

 


Alan Turing's Processing Machine Invention



The computer that can perform any conceivable operation

In 1936, Alan Turing devised a computing machine which could, in theory, match the capabilities of even the most powerful computers of today. Modern innovations, from advanced processors to intelligent tools like EssayWriter.com, an AI essay writer, all trace their roots back to the foundational ideas of Turing and his contemporaries.

One of the few things that possesses the unique ability to be almost universally humbling to all humanity is the incredible speed with which the power of personal computers seems to progress. According to a prophetic 1965 article by Intel co-founder Gordon E. Moore, the number of components that can be fitted onto a chip doubles at a regular interval, roughly every two years in its revised form (this observation came to be known as “Moore’s Law”), moving the world’s technology swiftly forward at an exponential rate. Especially for those old enough to remember when personal computers were either non-existent or a mere novelty, it is surely staggering how much things have changed.
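As a rough illustration (a sketch, not from the article), that doubling can be turned into simple arithmetic. The starting figure below assumes the commonly cited count of roughly 2,300 transistors for the Intel 4004 of 1971:

```python
# A back-of-the-envelope projection of Moore's Law growth: the component
# count doubles once every 'doubling_period' years.
def projected_transistors(start_count, start_year, year, doubling_period=2):
    doublings = (year - start_year) / doubling_period
    return start_count * 2 ** doublings

# From ~2,300 transistors in 1971, 27 doublings take us to 2025:
print(f"{projected_transistors(2300, 1971, 2025):,.0f}")  # hundreds of billions
```

The projected figure lands in the hundreds of billions of transistors, which is the right order of magnitude for today's largest chips, a striking confirmation of how well the trend has held.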

This being said, it is especially surprising that in 1936, when the idea of even basic computing was barely a blip on the radar of the world’s consciousness, a man named Alan Turing conceived of a machine that could, in principle, do anything that our current computers can do.

Turing's grand theory

His invention was called, appropriately, a “Turing Machine,” but it was not so much a physical invention as a theoretical construct. In essence, the Turing Machine is a computer that operates by reading and writing a finite set of symbols on a tape, much in the same way a binary system of 1s and 0s operates.
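To make the idea concrete, here is a minimal simulator (an illustrative sketch, not anything from Turing's paper): the machine is just a table of transitions over a finite alphabet, with a tape that grows on demand standing in for the unbounded tape.

```python
# Minimal Turing Machine simulator. A machine is a transition table:
# (state, symbol) -> (new_state, symbol_to_write, head_move).
from collections import defaultdict

def run_turing_machine(transitions, tape, start, accept, max_steps=10_000):
    """Run until the accept state is reached (or max_steps elapse).
    head_move is -1 (left) or +1 (right). Returns (final_state, tape)."""
    cells = defaultdict(lambda: "_", enumerate(tape))  # "_" marks a blank cell
    state, head = start, 0
    for _ in range(max_steps):
        if state == accept:
            break
        state, cells[head], move = transitions[(state, cells[head])]
        head += move
    written = "".join(cells[i] for i in sorted(cells))
    return state, written.strip("_")

# Example machine: flip every bit of a binary string, halting at the blank.
flip = {
    ("scan", "0"): ("scan", "1", +1),
    ("scan", "1"): ("scan", "0", +1),
    ("scan", "_"): ("done", "_", +1),
}
print(run_turing_machine(flip, "1011", "scan", "done"))  # → ('done', '0100')
```

Even this toy example shows the essential shape of the device: a finite control, a finite alphabet, and an arbitrarily extendable tape, which is all the thesis below requires.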

It is a purely hypothetical machine, yet it has proven to be a very valuable tool in learning about the philosophy behind basic computing. For instance, it was stated that “Every ‘function which would naturally be regarded as computable’ can be computed by a Turing machine.” This is the crux of what is called the Church-Turing Thesis, named for Alan Turing and Alonzo Church, who arrived at the idea independently in 1936.

What this means is that any computation, even those behind the most modern video games and computer animation, should theoretically be possible using a Turing Machine as conceived in 1936, given, and this is the important part, an unlimited amount of time and an unbounded tape on which to store combinations of symbols.

In theory, the importance of the Turing Machine remains the realization that the basic tenets of computation have not changed at all over the many years of the computer’s development, from hulking goliaths that took up entire buildings to far more powerful machines that fit in the palm of your hand.

The principles remain the same; the difference lies merely in the technology behind them (the materials used and the physics employed) and, thus, the speed at which they can operate. The transformation from giant machines to personal computers did not come with the advent of any brand new method of computation, which has remained essentially unchanged, but rather with the advent of solid-state physics, which made it possible to manipulate electrical impulses at far smaller scales, using densely packed silicon chips rather than bulky vacuum tubes and discrete transistors.

This leaves an important question unanswered, I suppose: What is meant in the Church-Turing thesis by ‘function which would naturally be regarded as computable’? Would it be possible for computer scientists to create a function that is not naturally regarded as computable? Probably not, but that doesn’t stop theorists from trying.

The possibility of hypercomputers

Functions that extend beyond the scope of Turing Machines and “natural computability” are considered hypercomputable, computable only by theoretical hypercomputers. For instance, physicists have suggested exploiting time dilation, in keeping with Einstein’s theory of Special Relativity: if an observer could be made to move quickly enough relative to a computer, time would pass more slowly for that observer than for the machine.

This way, from the observer’s point of view, the computer could perform an almost unlimited number of calculations nearly instantaneously. Of course, the only way this would succeed in circumventing the Turing thesis would be to reach the speed of light itself, which relativity forbids for anything with mass. It would require an infinite amount of energy, and even if such energy were provided, the traveller would have to withstand the immense forces involved.

Furthermore, quantum computers, once purely theoretical, now exist in early prototype form. They use quantum mechanics to speed up certain calculations dramatically, although in the standard model of quantum computation they can still compute only what a Turing Machine can, and so do not amount to true hypercomputers.

So, as you can see, computer scientists are still very far away from any sort of true hypercomputer, but does that really matter? After all, things seem to be going pretty well with normal computers: as long as one can listen to music, watch a movie, play a game, download videos, and surf the internet all at the same time, the world seems perfectly content with where the Church-Turing thesis has brought humanity thus far.


 

 

     
Images and text copyright © 2025. Content supplied by an external professional marketing service. The History Files accepts no responsibility for any external links on this page.