
The Enigma

Entry 1748, on 2015-11-04 at 21:11:06 (Rating 1, Computers)

I seem to have had a theme of blogging about people recently. First it was Grace Hopper, then Richard Feynman, and today I'm discussing Alan Turing, the famous computer pioneer, mathematician, and World War II code breaker.

I am currently listening to an audiobook biography of his life called "Alan Turing: The Enigma" (a reference both to his enigmatic personality and to Enigma, the German code he broke), and the thing I have found most interesting is the way he advocated for, and in some cases invented, many of the ideas and technologies we have today. He did this work after the code breaking he is probably best known for.

So I'll list a few of his innovations here. I should say that he can't claim sole credit for some of these, because he often worked by himself and "reinvented" ideas (often in a better form) which other early theoreticians and inventors had already found. For example, some ideas go back to Charles Babbage, who didn't have the theory or the technology to exploit them at the time.

Anyway, here's the list...

Instructions and data should both be stored in main memory.

Many people saw these two as quite separate: on many machines the instructions were read linearly from paper tape or cards, while the data was stored in fast, random access memory. Putting the code in memory too meant it could be accessed much more quickly, and it brought two other benefits: any instruction could be reached at any time, so conditional jumps and loops became possible, and instructions could be modified by other instructions (see below for details).

It's just taken for granted today that code is loaded into working memory (RAM). That's (mainly) what's happening when you launch a program (its instructions are copied from the disk into main memory) or boot your system (the operating system is loaded into memory), but in the early days (the 1940s and 1950s) this wasn't obvious.
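As a concrete (if anachronistic) illustration, here is a toy stored-program machine sketched in Python. The opcodes and layout are entirely my own invention, not a model of any real 1940s design, but they show code and data sitting side by side in one memory:

```python
# A toy stored-program machine: instructions and data share one memory.
memory = [
    ("LOAD",  5),   # 0: acc = memory[5]
    ("ADD",   6),   # 1: acc = acc + memory[6]
    ("STORE", 5),   # 2: memory[5] = acc
    ("HALT",  0),   # 3: stop
    ("NOP",   0),   # 4: unused padding
    10,             # 5: data: a running total
    32,             # 6: data: an amount to add
]

acc = 0             # the accumulator register
pc = 0              # the program counter

while True:
    op, arg = memory[pc]    # fetch the next instruction from the same memory
    pc += 1
    if op == "LOAD":
        acc = memory[arg]
    elif op == "ADD":
        acc = acc + memory[arg]
    elif op == "STORE":
        memory[arg] = acc
    elif op == "HALT":
        break

print(memory[5])    # 42: the result sits in memory right next to the code
```

Slots 0 to 4 hold code and slots 5 and 6 hold data, but the machine makes no distinction: it's all just memory.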

Programs stored in main memory allow conditional jumps and loops.

Conditional statements and loops allow a lot of efficiency and flexibility. A conditional statement lets the computer run a set of instructions only if a certain condition occurs. For example, it could test whether a bank balance is less than zero and show a warning if it is. Loops allow a chunk of code to be executed multiple times. For example, survey results for 100 participants could be analysed one at a time by jumping back to the start of the analysis code 100 times.

Any modern programmer would find it bizarre not to have access to conditional statements and loops, but some early machines didn't have these abilities.
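In the toy machine above, a loop only needs one extra ability: an instruction that overwrites the program counter, perhaps conditionally. This sketch (invented opcodes again) "analyses" 100 participants by jumping back to the top until a counter reaches zero:

```python
# Conditional jumps turn straight-line code into a loop.
memory = [
    ("LOAD",  7),    # 0: acc = participants still to analyse
    ("JZERO", 5),    # 1: if acc == 0, jump past the loop to HALT
    ("SUB",   8),    # 2: acc = acc - 1 (one participant analysed)
    ("STORE", 7),    # 3: write the count back to memory
    ("JUMP",  0),    # 4: jump back to the top unconditionally
    ("HALT",  0),    # 5: done
    ("NOP",   0),    # 6: padding
    100,             # 7: data: the loop counter
    1,               # 8: data: the constant 1
]

acc, pc = 0, 0
while True:
    op, arg = memory[pc]
    pc += 1
    if op == "LOAD":
        acc = memory[arg]
    elif op == "STORE":
        memory[arg] = acc
    elif op == "SUB":
        acc -= memory[arg]
    elif op == "JUMP":
        pc = arg            # a jump just overwrites the program counter
    elif op == "JZERO":
        if acc == 0:        # a conditional jump is only taken if acc is 0
            pc = arg
    elif op == "HALT":
        break

print(memory[7])    # 0: the loop body ran 100 times
```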

Code in memory allows self modifying code.

If the program code is in main memory it can be read and written freely, just like any other data. This allows instructions to be modified by other instructions. Turing used this for incrementing memory locations and other simple tasks, but potentially it can be used for far more complex things.

I can remember, when I did computer science, being told that self-modifying code was a bad thing because it made code hard to understand and debug, but it has its place and I use it a lot in modern interpreted languages.
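Here is the toy machine one last time (opcodes still invented), doing the sort of self-modification described: a BUMP instruction rewrites the address field of the LOAD at slot 0, so each pass of the loop reads the next array element. Something like this was one way of stepping through an array on machines without index registers:

```python
# Self-modifying code: instruction 3 rewrites instruction 0 as it runs.
memory = [
    ("LOAD",  10),   # 0: acc = array element -- this address gets rewritten!
    ("ADD",   14),   # 1: acc = acc + total
    ("STORE", 14),   # 2: total = acc
    ("BUMP",   0),   # 3: add 1 to the address field of instruction 0
    ("LOAD",  13),   # 4: acc = counter
    ("SUB",   15),   # 5: acc = acc - 1
    ("STORE", 13),   # 6: counter = acc
    ("JZERO",  9),   # 7: if the counter hit zero, exit the loop
    ("JUMP",   0),   # 8: otherwise go round again
    ("HALT",   0),   # 9: done
    3,               # 10: data: array[0]
    14,              # 11: data: array[1]
    25,              # 12: data: array[2]
    3,               # 13: data: loop counter
    0,               # 14: data: running total
    1,               # 15: data: the constant 1
]

acc, pc = 0, 0
while True:
    op, arg = memory[pc]
    pc += 1
    if op == "LOAD":    acc = memory[arg]
    elif op == "STORE": memory[arg] = acc
    elif op == "ADD":   acc += memory[arg]
    elif op == "SUB":   acc -= memory[arg]
    elif op == "JUMP":  pc = arg
    elif op == "JZERO": pc = arg if acc == 0 else pc
    elif op == "BUMP":                 # rewrite another instruction in place
        o, a = memory[arg]
        memory[arg] = (o, a + 1)
    elif op == "HALT":  break

print(memory[14])   # 42: the sum of the three array elements
```

In a modern interpreted language the equivalent trick is generating or rewriting code at run time (eval and exec in Python, say): less hair-raising, but the same basic idea.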

Simple, fast processing units and more memory are the best strategy.

Some early American computer designs tried to build a lot of complex operations into the main processing unit. This made them more complicated and used more valves (this was before transistors or integrated circuits, of course) in the processor, leaving fewer for memory. Turing advocated simpler instruction sets, which allowed for more memory and more efficient execution; the programmer could then write the complex operations using the simpler instructions.

This sounds very much like the modern concept of RISC (reduced instruction set computing) processors, which provide a limited range of very fast, efficient instructions and use the extra space on the CPU for cache memory. The more complex operations are generated by the compiler combining the simpler instructions.
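To show how a "complex" operation can be composed from simple ones, here is multiplication built from nothing but shifts and adds in Python: the kind of expansion a compiler (or a 1940s programmer) would produce for a processor with no multiply instruction. A sketch, not any particular compiler's output:

```python
def multiply(a: int, b: int) -> int:
    """Multiply two non-negative integers using only shifts and adds."""
    result = 0
    while b:
        if b & 1:        # is the lowest bit of b set?
            result += a  # then add the current shifted value of a
        a <<= 1          # a = a * 2 (shift left one bit)
        b >>= 1          # move on to the next bit of b
    return result

print(multiply(6, 7))    # 42
```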

Microcode and pipelines.

Turing's computer, the ACE, ran at 1 MHz (one million cycles per second), making it the fastest machine of its time. But interpreting each instruction (figuring out what it meant, such as where the data should come from) took several cycles, and actually carrying out the operation took several more. To make things go faster, the machine interpreted the next instruction while the current one was being executed.

Modern processors have a "pipeline" in which several stages of processing are performed simultaneously. Today we also have deep, multiple pipelines (many steps of several streams of code being processed at once) and branch prediction (figuring out which instruction will be needed next), but the basic idea is the same.
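A back-of-the-envelope calculation shows why the overlap pays off. The stage lengths below are invented for illustration (I don't know the ACE's real timings):

```python
DECODE_CYCLES = 3     # cycles to interpret an instruction (assumed)
EXECUTE_CYCLES = 4    # cycles to carry it out (assumed)
N = 1000              # instructions in the program

# Sequential: decode, then execute, one instruction at a time.
sequential = N * (DECODE_CYCLES + EXECUTE_CYCLES)

# Overlapped: decode instruction n+1 while executing instruction n,
# so each instruction after the first costs only the longer stage.
overlapped = DECODE_CYCLES + N * max(DECODE_CYCLES, EXECUTE_CYCLES)

print(sequential, overlapped)   # 7000 versus 4003 cycles
```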

Subroutines and libraries.

Most early CPUs could only do very basic things. For example, they could add whole numbers (like 42) but couldn't multiply at all, or work with real numbers (those with decimals, like 3.33). But many programs needed these operations, so instead of reinventing them over and over for each new program, Turing created libraries of subroutines.

A library is a collection of useful chunks of code for doing particular things, like working with real numbers. Modern processors have those particular functions built in, but more complex tasks, like reading a packet of data from the internet, still require long sequences of code.

Today computers typically have hundreds of libraries containing thousands of subroutines (a rather old term for a chunk of code which performs a task and then returns to whatever the computer was doing before it ran), and in many ways that is mostly what a modern operating system is: a collection of useful libraries.
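As a flavour of what such a library provides, here is a tiny sketch of "real number" arithmetic built from integer operations only. The fixed-point scheme (three decimal places) is my own choice for illustration; genuine floating-point subroutines of the era were far more involved:

```python
SCALE = 1000    # store x as the integer round(x * 1000)

def to_fixed(x: float) -> int:
    return round(x * SCALE)

def from_fixed(n: int) -> float:
    return n / SCALE

def fx_add(a: int, b: int) -> int:
    return a + b                # same scale, so a plain integer add works

def fx_mul(a: int, b: int) -> int:
    return (a * b) // SCALE     # integer multiply, then correct the scale

a, b = to_fixed(3.33), to_fixed(2.0)
print(from_fixed(fx_add(a, b)))    # 5.33
print(from_fixed(fx_mul(a, b)))    # 6.66
```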

Computers could be accessed remotely.

Turing thought that, since there were so few computers around, it made sense to let people access a computer remotely when they needed one. He thought this could be done with special devices attached to the phone system.

Simple modems and other serial interfaces made this possible, and now we have the internet. Even though computers are no longer rare (I have 14 conventional computers at home, plus many other devices, like iPads and iPhones, which are effectively computers), it is still useful to be able to access other computers easily.
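The modern descendant of a "special device attached to the phone system" is, in software terms, just a socket. A minimal Python sketch (the host and port are placeholders for a hypothetical echo service, not anything real):

```python
import socket

HOST, PORT = "remote.example.com", 7    # hypothetical echo service

# Connect to the remote machine, send a line, and read the reply.
with socket.create_connection((HOST, PORT), timeout=5) as conn:
    conn.sendall(b"hello, remote computer\n")
    reply = conn.recv(1024)
    print(reply.decode())
```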

Computers for entertainment.

Turing thought that "ladies would take their computers to the park and say 'my little computer said something so funny this morning'" (or something similar to this).

I couldn't help but think of this when my daughter showed me an amusing cat video on her iPhone today. Yes, ladies carry their computers everywhere and are constantly entertained by the funny little things they say.

No one's perfect.

So what did he get wrong? Well, a few things, actually. For example, he wasn't a great enthusiast for making the computer easy to use. Input had to be entered, and output read, in base 32, expressed using a series of obscure symbols; and he used binary, but with the least significant bit first.

Perhaps the most important change in the last thirty years has been making computers easier to use, and Turing can't claim much credit for that trend. Still, I see where he was coming from: if it's hard to build a computer and hard to write the code, it should be hard to use them too!
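For the curious, here is what "base 32, least significant digit first" looks like, sketched in Python with plain numbers standing in for the obscure symbols the real machines used:

```python
def to_base32_lsb_first(n: int) -> list[int]:
    """Return n as base-32 digits, least significant digit first."""
    digits = []
    while n:
        digits.append(n % 32)   # peel off the lowest digit first
        n //= 32
    return digits or [0]

print(to_base32_lsb_first(1748))   # [20, 22, 1]: 20 + 22*32 + 1*32*32
```

So this entry's number, 1748, would be written as the digits 20, 22, 1: "backwards" by modern conventions.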


Comment 1 (4440) by SM on 2015-11-04 at 22:10:50: Fascinating read, thanks.

Comment 2 (4442) by OJB on 2015-11-04 at 23:16:35:

Thanks for the positive feedback. I tried to translate my enthusiasm for some of the technicalities of computing into easy-to-understand terms, so there are a few inaccuracies.

Comment 3 (4443) by SM on 2015-11-06 at 08:54:31:

For someone who's pretty much computer illiterate, it was understandable and interesting. That quote re: computers for entertainment - classic!

Comment 4 (4445) by OJB on 2015-11-06 at 10:00:30:

Excellent! I succeeded then. You mean "ladies would take their computers to the park and say 'my little computer said something so funny this morning'". Yes, the way the Brits spoke back in the 40s and 50s was classic.


