COLOSSUS AND THE BREAKING OF THE WARTIME "FISH" CODES*
Michie, Donald
ADDRESS: Medina Apartments, 63-65 St. Marks Road,
Randwick, Sydney NSW 2031 AUSTRALIA.
ABSTRACT: One of the authors of the recently released
"General Report on Tunny," here describes his threeyear experience as a founder member of the "Testery"
and "Newmanry" teams. Their combined use of innovative
methods and machines led from the breaking of the
German Lorenz military traffic to its large-scale daily
decipherment.
KEYWORDS: Bletchley, Lorenz, Tunny, Fish, Heath
Robinson, Colossus, Tiltman, Tutte, Turing, Tester,
Newman, Flowers, high-speed electronic computing.
PERSONAL PREHISTORY
In late 1941, following my 18th birthday, a normal
next phase would have been two further terms at
boarding school, with an option for scholarship holders
to proceed to a shortened University degree course
before joining up. But over that Christmas my teenage
imagination was fired by a tale from my father
concerning a mysterious establishment at Bedford. He
had it on the authority of the then War Minister, Sir
James Grigg, that as preparation for doing something
unspecified but romantic behind enemy lines there were
opportunities to sign up for a Japanese course starting
in a couple of months' time. I duly journeyed to
Bedford and presented myself at the address given.
Sorry, wrong info
My request to enroll elicited from the Intelligence
Corps officer who saw me a somewhat puzzled reply: "Who
told you that we have a Japanese course now? That
particular exercise is planned for the Autumn." Noting
my confusion he added: "But we have courses on codebreaking. There's a new intake just starting. Would
that interest you instead? I'll have someone find you a
billet nearby. Make sure to be back here at 9 a.m.
Monday."
In World War II one did not mess about. Returning to
the London suburbs just long enough to pack a suitcase,
I was back and signed in to the School of Codes and
Ciphers, Official Secrets Act and all, on the Monday
morning. With the rest of the new class I was soon held
in thrall by our instructor, a certain Captain Cheadle,
and by the black arts of codes and ciphers.
With nothing to occupy my evenings, I arranged to have
my own key to the building and classroom. My habit
became to return after hours to the texts and
exercises. The resulting accelerated learning curve
made my selection inevitable when a Colonel Pritchard
arrived from Bletchley. He was on a mission to recruit
for the new section that was being formed by Ralph
Tester to follow up John Tiltman's and William Tutte's
successive coups. The hope was that breaking and
reading Fish traffic could be placed on a regular
basis. The Pritchard interview lasted no more than a
few minutes. I was to present myself within 48 hours at
the entrance to Bletchley Park with a sealed letter.
After admission and a visit to the billeting office, I
was parked in the Mansion House. My first task was to
memorise teleprinter code until I could fluently sight-read punched paper tape. Pending completion of the Hut
assigned to Major Tester's new section I sat as an ugly
duckling in a large room filled to capacity by members
of the Women's Auxiliary Air Force. What were they
doing? Who knows? New arrivals were imprinted with a
draconian DON'T ASK DON'T TELL principle in regard to
anyone's immediate business but their own. I did,
however, discover that those whose boy friends were on
active service felt only contempt for an apparently fit
young male in civilian attire. Some of them had lost
boyfriends in the RAF, and many had boyfriends still
alive but in daily peril.
Charm of a Second Lieutenant
The experience did nothing to ease my sense of
disorientation in the new surroundings. Relief appeared
in the person of a uniformed and exquisitely charming
Intelligence Corps officer, Second Lieutenant Roy
Jenkins. My task was to bring him up to my own recently
acquired sight-reading skills. Roy's post-war career
was to include Cabinet Minister and Chancellor of
Oxford University. In my isolation, his company was
rescue and balm. We departed to swell the ranks of
Tester's new section, in my case via a most curious
diversion.
Forty men and a teenager
On reporting to Ralph Tester I was immediately
dispatched to take charge of a room like a small
aircraft hangar. It was located at some distance from
his new Hut. Within it there sat at tables several
dozen uniformed men who remain in my memory as being
all of the rank of Lance Corporal. What I can attest
beyond error is that I quickly became convinced of the
infeasibility of the operation which it was now my job
to supervise.
As later explained, once the offset had been determined
of two intercepts known to constitute a "depth" (the
same message retransmitted with the same key, but with
the plain-language message at a different offset with
respect to that key) they were added so as to cancel
out their common key. The resulting "depth-sum" text
must logically then consist of the addition to itself
at that offset, or "stagger", of a German plain
language message. Given the text of such a depth, if
one guessed that some character-sequence, say
"GESELLSCHAFT" was likely to appear somewhere in the
plaintext, then the experiment could be tried of adding
that 12-character sequence (a "crib") to the depth's
first 12 characters, inspecting the result, then to
characters 2-13, 3-14, ... etc. in tedious progression
through the text, a procedure known as "dragging". To
keep track of what follows, the reader needs only to
keep two things in mind: (1) that in international
teleprint code "9" stands for "space" (as between
words), and (2) that "add" refers to an operation known
as "modulo-2 addition" which makes addition and
subtraction indistinguishable. In consequence, if, say,
HONES + OBDZE = NEST9 then NEST9 + OBDZE = HONES.
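To make the arithmetic concrete, here is a minimal sketch (modern Python, not part of the original account) of modulo-2 addition over the 5-bit international teleprinter alphabet, using the Bletchley convention of writing "9" for word-space and "/" for the all-zero character; it reproduces the HONES example just given.

```python
# Modulo-2 addition of teleprinter characters, assuming the standard ITA2
# letter codes, with "9" = space and "/" = null (Bletchley notation).
ITA2 = {
    'A': 0b11000, 'B': 0b10011, 'C': 0b01110, 'D': 0b10010, 'E': 0b10000,
    'F': 0b10110, 'G': 0b01011, 'H': 0b00101, 'I': 0b01100, 'J': 0b11010,
    'K': 0b11110, 'L': 0b01001, 'M': 0b00111, 'N': 0b00110, 'O': 0b00011,
    'P': 0b01101, 'Q': 0b11101, 'R': 0b01010, 'S': 0b10100, 'T': 0b00001,
    'U': 0b11100, 'V': 0b01111, 'W': 0b11001, 'X': 0b10111, 'Y': 0b10101,
    'Z': 0b10001, '9': 0b00100, '/': 0b00000,
}
REV = {v: k for k, v in ITA2.items()}

def add(a: str, b: str) -> str:
    """Character-wise modulo-2 addition (XOR); codes not in the table show as '?'."""
    return ''.join(REV.get(ITA2[x] ^ ITA2[y], '?') for x, y in zip(a, b))

print(add("HONES", "OBDZE"))   # -> NEST9
print(add("NEST9", "OBDZE"))   # -> HONES  (addition and subtraction coincide)
```

Because every character is its own modulo-2 negative, the same routine serves equally for enciphering and deciphering.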
Let us return to the Lance Corporal dragging the crib
GESELLSCHAFT step by step through the text of a depth-sum, pausing at each step to see what resulted from
each successive trial addition. He stopped only if the
result of his addition at any stage yielded, say,
"SELLSCHAFT9U" (the symbol "9" denotes characterspace),
at once concluding from such a local break that the
offset was 2 and that the plaintext contained the
sequence "GESELLSCHAFT9U". In the hands of a
cryptanalyst the immediate next step would be to extend
by two characters the "crib" that had been dragged,
yielding, perhaps, "SELLSCHAFT9UNT", which would
strongly suggest some further extension, say
"GESELLSCHAFT9 UNTER", which might possibly be rewarded
by "SELLSCHAFT9UNTER9A" and so forth.
Misplaced task-decomposition
So what was wrong with the reasonable-seeming thought
that the task could be decomposed into a brute force
(crib-dragging only) component and a skilled (extending
the breaks) component? Why not first throw brute force
at it and then pass the text on to the cryptanalysts
with candidate breaks already found and flagged? Take a
few dozen Intelligence Corps clerks each equipped with
a list of cribs to be dragged, together with rule-sheets for the boolean addition of teleprint characters
and for the recognition of common fragments of military
German. Let them do the dragging, marking all local
breaks found or suspected. Marked-up texts could then
be sent on to the Testery proper, to receive the
attention of cryptanalysts whose time would thus be
conserved by prior delegation of the drag-work.
It sounded good. Experience soon convinced me
otherwise. But my conviction had to be validated in the
eyes of others. My only course was to drive the project
along until its futility became evident, not to the
band of massed Lance Corporals, but to the authors of
the original proposal, whoever they were (this I never
knew).
The flaw lay in the non-decomposability of a task once
talent and much practice has melded it into a fluent
unity. The cognitive psychologists speak of
"automatization". To the eye of an observer from Mars,
delivery of the serve at tennis might appear to be a
sufficiently separate and stereotyped task to suggest a
change of rules. The expert player might be allowed to
employ a brute-force server (crib-dragging posse of
Lance Corporals) who would on delivery of his service
instantly quit the court, leaving the tactically highly
skilled tennis professional (cryptanalyst) to continue
the rally.
Trade-offs can be debated for each separate athletic or
intellectual skill, but can only be quantified
empirically, case by case. In tennis, as in depth-breaking, each opening move (the serve) flows smoothly
and subliminally into the move-sequence (the rally) that
follows. The gains from continuity of the single-agent
scheme probably outweigh in tennis the sacrifice of
sheer serving speed. The same principles were
eventually shown to dominate the depth-breaking case.
To the Testery
The dogged endeavours of my well-drilled force of crib-draggers in due course generated sufficient
documentation for me to report that the "human wave"
assault was unlikely to contribute effectively and was
best disbanded. After this interlude, depressing for
all concerned, I gained the long-sought shore of the
Testery proper. I was turned over to a young graduate,
now the internationally distinguished mathematician P.
J. Hilton, for instruction in the earlier mentioned
method known as "Turingery".
Peter knew all the Testery hand-procedures backward and
forward, and played a massive part in perfecting them.
My first and vivid memory was that, although only a
year or two older than me, he smoked a pipe. My second
was of his didactic strictures on my fetish of tidiness
and aesthetics in paper-and-pencil work. I should say
"my then fetish". With efficiency and speed at an
unimaginable premium, not to mention justified awe of
my new mentor, I was cured of this kind of
perfectionism for life!
Other vivid images of my first encounter with the
Testery are first and foremost of Major (later Colonel)
Ralph Tester himself. I recall his mesmeric impact on
female spectators in the lunch break as he leapt,
daemonic and glowing about the tennis court with an
animality that I had only ever envisaged as radiating
from the great god Pan. Yet a year later when I was
already in the Newmanry, engaged in a machine-based
attack on the same Fish ciphers, the same man was ashen
under his tan. He had had to summon me (presumably at
Newman's request) to reprove my conduct. Why had I been
canvassing the cryptographic staff of both Newmanry and
Testery for signatures to a petition for the
administrative merging of the two sections? With the
naivety of a nineteen-year-old I was oblivious of such
facts as that, even if a Foreign Office section and a
War Office section could have been merged, one or other
of Tester and Newman would have had to be dumped, and
that it would not have been Newman. An ingenious
administrative compromise resulted. A fictional "Mr. X"
appeared on Newman's books whose fake identity four
selected Testery staff assumed for periods in rotation,
acting as a species of internal consultant. This gave
good technical liaison, previously absent.
Tester had the sense of purpose and personal humility
of an outstanding leader. At the time of Rommel's
retreat to Tunisia, we suddenly found that some
mysterious change in the system had locked us out of
the Berlin-Tunis channel. A group of us offered to go
flat out round the clock. Ralph's cryptographic skills
were really too unpractised to be of material help, as
he and we knew. But he sat among us, bolt upright as
was normal for him, unflagging as the hours raced by.
In the end the hours were not racing, and we young
Turks were drooping and nodding. Ralph, focussed and
refulgent as ever, saw this: "You know," he said
tactfully, "it's easy for me. Most things go downhill
with age. Stamina for some reason goes the other way.
So you're no good at this sort of thing until you're at
least forty. Another coffee, anyone?"
During the glory days of the American space programme,
when the mean age of space vehicle commanders seemed to
be getting more and more venerable, I recalled Tester's
words.
Strange incident, best forgotten
The Testery's machine operators were ATS girls
("Auxiliary Territorial Service" I think). One of them,
Helen Pollard (now Currie, see References) in her
reminiscence of the Testery speaks not only of the
thrill of it all but briefly hints at a romantic
attachment. That attachment outlasted the war. If there
is to be a dedication of this memoir then let it be to
her.
For all the attractions of the new life, or perhaps
because of them, I could not drop from my mind the
initial "white feather" impact of that roomful of WAAF
girls. While on leave visiting my home in Weybridge, I
learned from my father of questions from his peers at
the St. Georges Hill golf club about what his son was
doing for the war effort. Apart from knowing that I was
not after all learning Japanese, his mind was
unavoidably blank on what I was now up to. It was out
of the question to give information of any kind to any
person outside the wire beyond "sort of clerical work"
or the like. He asked me whether I had ever considered
active service.
Back in BP I asked for an interview with Colonel
Pritchard, and requested a transfer to the North
African desert. Pritchard let me finish. Then he said:
"Who's been getting at you?" Taken off guard, I
waffled. "No-one?" he enquired politely, and let his
question hang in the air.
Eventually I blurted out that my father had mentioned
such a possibility, but had applied no pressure.
Anyway, I maintained, it had nothing to do with my
decision. There was another uncomfortable pause. Then:
"I have to instruct you to return to duty. You see, Mr
Michie, we have a war on our hands. Inconvenient, but
unfortunately true. Unless you have further questions,
you are free to return at once to your Section." Pause.
"And by the way, I do not expect you to raise such
matters again." Pause. "Either with me or with anyone
else." Longer pause. "As for your father, I do not
anticipate that he will raise them either."
I returned to the Testery and I confess I felt
relieved. I don't believe I gave it a further thought.
But many years later my mother told me that my father
had received a visit at his place of work in the City
of London from an army colonel, who presented himself
as my superior officer. Did I know anything about it? I
shook my head. For a decade or two after the war, to
reveal anything whatsoever about Bletchley Park and its
activities continued to be embargoed under the Official
Secrets Act. Inevitably its subjective restraints
weakened over time. None the less, 25 years passed
before any mention of British use in 1943-45 of
electronic computers for a cryptanalytic purpose
appeared in the open literature [7].
Chess, Turing, and "thinking machines"
It was through needing to consult the originator of
Turingery on some point that I first met Turing. We
soon discovered a common interest in chess, and also
the fact that we were both sufficiently poor players to
be able to give each other a level game. At BP a person
was either a chess-master, having been recruited for
that reason (similarly with winners of national
crossword competitions) or he did not count chess among
his interests. We formed the habit of meeting once a
week for a game of chess in a pub in Wolverton. On the
pervasive need-to-know principle we never discussed his
work in the Naval section. When I was demobilized I
still knew nothing about Enigma, except possibly its
name. Our shared topics of interest were (a) the
possible mechanization of chess-playing and (b)
learning machines. These interests were inspired in me
by him, and were shared with Jack Good, now
internationally renowned in mathematical statistics. In
the post-war years, "thinking machines" continued to
occupy the three of us in occasional correspondence and
meetings until Turing's death.
Good! Now for some backwards extension, the
cryptanalyst would say. How about 9WISSENSCHAFT? ...
9GESELLSCHAFT? ...and so forth. If none of one's stock
cliches leapt to mind either at plaintext or delta
level, then one might ask the sole German-speaking
worker in the Hut, Peter Ericsson. His linguistic
gifts, matched with an even more fluent associative
imagination would kick in. If his first verbal
suggestion didn't hit the mark, he'd be over to your
desk with pencil and eraser. In minutes he would have
extended your small break in both directions in a
lightning succession of trials, retractions and
consolidations.
The mental style of another notable young colleague,
Peter Benenson, was an almost unnerving contrast. Sheer
concentrated doggedness was applied systematically,
obsessively, hour after hour. That too, I noted, could
garner miracles, less swiftly but perhaps more surely,
sometimes reducing jobs that others had tossed onto the
heap as intractable. After the war he went on to found
Amnesty, culminating in a colourful exit with
psychiatric overtones. I did not follow the details.
It was essentially by such generate-and-test cycles of
conjecture that Brigadier Tiltman had got out the very
first depth, intercepted on August 30, 1941. In his
case, the displacement, or "stagger", was three
characters rather than one. This is more adverse in
terms of guessing candidate extensions, but partly
compensated by a gain in the length of each individual
successful extension.
Peter Ericsson and I shared digs for the last two
years. He opened the world of visual arts to me,
including everything, such as it is, that I know about
filmmaking. He was also spellbinding on both the
comical and the crazy aspects of urban cultures of many
lands. After the war he and Lindsay Anderson, destined
to become one of Britain's innovative film directors
(remember "If launched an avant-garde film magazine
Sequence. They worked on the floor of a virtually
unfurnished London apartment. Occasionally I visited
from Oxford and helped where I could.
TURING'S SEQUENTIAL BAYES RULE
A major statistical contribution which Turing had
earlier developed for the Enigma work was of such
generality that it was quickly taken up as a staple support for certain critical operations of our own work on
Fish. After the war "Turing's sequential Bayes rule"
became the foundation stone of an entire new branch of
applied statistics. It was expounded with further
extensions by I. J. Good in his 1950 book Probability
and the Weighing of Evidence. When Jack Good joined me
shortly after my recruitment by Newman to found the new
machine section, he brought with him everything of
Enigma methods which were generalizable. Good came
straight from close collaboration with Turing in C. H.
O'D Alexander's Naval Enigma section, of which Turing
had been the original nucleus and first Head.
A digression here will illustrate certain unique
qualities of Turing himself and of the organization in
which history had embedded him.
Turing style and BP style: both unusual
The Naval section had originally been founded to
support Turing's great Enigma breakthrough. Hence it
was natural that he be asked to head it. Unfortunately
his uncanny intellectual gifts were tightly interwoven
with an at least equally uncanny lack of what are
ordinarily called "social skills". The predictable
result was administrative chaos. Rapid ad hoc
extemporizations came to the rescue from one of his
brightest lieutenants, the one-time British Chess
Champion Hugh Alexander. He had made his pre-war living
(there was in those days no money in chess) as an
experienced and fast-thinking manager of the John Lewis
London department store. So a little job like quietly
and tactfully reorganizing Turing's bewildered section
was to him an interesting challenge. In short order
Alexander flowed into the de facto headship. Turing
continued happily as de jure Head, no longer distracted
by these matters.
One day Turing arrived at the gate of the Park late for
work. On such occasions one signed oneself into a book
with ruled columns and headings that included "Name of
Head of Section". Turing unexpectedly wrote "Mr.
Alexander", and proceeded in to work. Nothing was said.
But somewhere wheels turned silently. Records were
updated. Alexander continued his miracles of inspired
and often unorthodox deployment of human and material
resources, but now as the official Head of Naval
section.
I had this from a third party, and never asked Turing
about it. I think he would have found my question
uninteresting.
Post-war development
Jack Good was Alan Turing's close friend and colleague,
and also mine. Postwar secrecy at the time of Good's
publication, and for many years thereafter, constrained
those who had worked at BP to eliminate from their
writings any clue, however miniscule and indirect, that
might lead to public discovery of the very existence of
any such organization. Hence the true authorship of a
new statistical methodology that Turing developed
for application to the Enigma work, and all other
conceivably traceable links to its origin, had to be
concealed. As far as I am aware, the present article
constitutes the first disclosure on public record that
the originator of what is today called sequential
analysis (following Abraham Wald's coinage of this term
in his post-war publication of a book under that name)
was in fact A. M. Turing. His development of a Bayes-derived weights-of-evidence approach to rational
belief-revision was more general and more far-reaching
than Wald's non-Bayes contribution.
Asked in the 1975-76 taped interview to name Turing's
greatest intellectual contribution during the war,
Newman had this to say:
NEWMAN: The main ... the real contribution was the
statistical theory of Turing which he didn't invent for
us [the Newmanry] but for another problem which he was
concerned with [Enigma] which afterwards became a very
important advance in the statistical method.
EVANS: Could you say something about that?
NEWMAN: Well, it's not my field but it's... something
called sequential analysis.
Turing used the dual formulation of probability of the
Bayesian school, which gives two distinct
interpretations according as we wish to speak of the
probability, under some hypothesis (e. g. that "this
shuffled pack contains only one of the red suits") of
the outcome of an observation (e. g. "I drew a club"),
or of the probability of the hypothesis itself.
The first is characterized as a limiting frequency (of
an event E) and is often called "objective
probability".
The second is characterized as a degree of rational
belief (in a hypothesis H) and is often called
"subjective probability"
The wartime problem to be solved was of the following
type:
A hypothesis initially judged to be true with
probability P(H) is subjected to sequential repetitions
of a test with a view to deciding either to accept or
to reject it. For example, we may wish to decide
whether the above-stated hypothesis is true of a given
shuffled pack.
We proceed to withdraw cards one at a time, note the
colour, and then replace it at a randomly chosen place
in the pack. We don't want to commit ourselves to
accepting H if it is going to turn out false. Equally
we don't want to reject H if it is going to turn out to
be true. Each draw of a card costs us x dollars, so we
also don't want to go on sampling for too long.
Before Turing, all statistical methods for calculating
risks of error were based on fixed sample-size
statistics. But can a stopping rule be devised that
controls the risks of the two kinds of error to exactly
the same extent as the best fixed sample-size procedure
and at the cost of substantially fewer repetitions?
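The flavour of such a stopping rule can be conveyed by a small sketch (an illustration only: the stopping thresholds follow Wald's later published approximation, and the probability model for the card example, P(red) = 1/3 if one red suit is missing against 1/2 for a full pack, is an assumption made here for concreteness rather than anything in Turing's classified treatment).

```python
# Sequential accept/reject test driven by an accumulated weight of evidence.
import math, random

def sequential_test(draw_is_red, p_red_H=1/3, p_red_alt=1/2,
                    alpha=0.01, beta=0.01, max_draws=10_000):
    upper = math.log((1 - beta) / alpha)    # accept H once the score climbs this high
    lower = math.log(beta / (1 - alpha))    # reject H once it falls this low
    score = 0.0
    for n in range(1, max_draws + 1):
        red = draw_is_red()
        # weight of evidence contributed by this single draw
        score += math.log((p_red_H if red else 1 - p_red_H) /
                          (p_red_alt if red else 1 - p_red_alt))
        if score >= upper:
            return "accept H", n
        if score <= lower:
            return "reject H", n
    return "undecided", max_draws

random.seed(0)
print(sequential_test(lambda: random.random() < 1/3))   # pack really missing a red suit
```

Turing and Good tallied such weights of evidence in logarithmic units, the ban and the deciban, which is why the score above accumulates as a simple sum.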
SECURE CONDUCT: ENFORCEMENT OR INTERLOCK?
The German operators were, not surprisingly, forbidden
to retransmit a message without changing the wheel
settings. Someone doubtless had figured out the danger
of generating a depth through disregarding this
precaution. An analogous peace-time interdict is
supposed to deter drivers from taking vehicles onto the
road without fastening seat belts. Reflection suggests
a safer expedient. Road vehicles could by law
incorporate interlocks in their manufacture, enabling
the ignition to fire when and only when the driver's
seat-belt is fastened.
As we know, that is not the way that the minds of civil
transport authorities work. Nor was it the way of wartime Germany's military authorities with the Lorenz
machine. Rather than commission a re-design, they
banned undesired operator behaviours. The curative
effect proved patchy at best. By degrees the stable
door was in the end closed. By then the horse had
bolted. If German cryptanalysts had been given
opportunity to vet the Lorenz for design flaws, then
the British project to crack it could never have got
off the ground. In Britain the code-makers were the
Royal Corps of Signals. The code-breakers were the
mixed-services signals intelligence organization built
up at BP during World War II. Similar
compartmentalization prevailed, so some of my
colleagues ascertained, both in America and in Russia.
Interlock, filtering, etc. precautions would of course
at once occur to a codebreaker. In like manner, the
only dependable way of protecting corporate and
governmental computer networks today from the criminal
trespasses of hackers is to hire from their top echelon
on the principle of fighting fire with fire.
Generating depths was but the first and most extreme of
many human foibles that introduced unwanted
regularities into messages. The habit, for example, of
hitting shift-up and shift-down keys twice in quick
succession, just to make sure, could have been rendered
harmless electromechanically. A simple logical filter
placed between the stream of characters from the
keyboard and the wireless transmitter could have done
the trick. Examples could be multiplied.
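A sketch of the kind of filter meant here (an illustration, with "FIGS" and "LTRS" as stand-in tokens for the shift-up and shift-down characters):

```python
def dedupe_shifts(chars, shifts=("FIGS", "LTRS")):
    """Drop the second of any pair of identical consecutive shift characters
    before the stream reaches the transmitter."""
    out = []
    for c in chars:
        if out and c in shifts and c == out[-1]:
            continue                      # swallow the nervous double keystroke
        out.append(c)
    return out
```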
At all events, one thing is sure. The day that the
world's nations break the hermetic seal between code-makers and code-breakers will see the end of the
military cryptology game as we know it.
GOOD COMPANIONS
A common enterprise of comrades: many people of both
sexes and of every rank and degree have spoken or
written of their WW-II experiences, and of the
exhilaration of "Each for all and all for each". At BP,
friendship and mutual enjoyment continued in the more
than half of each twenty-four hour cycle that was
passed outside the wire, at dances with Wrens or ATS,
sing-songs in the transport coaches, group expeditions
to cinemas and pubs, daytime discussion-walks in the
countryside on coming off a night shift.
Far from being shouldered aside by the urgency of work-time preoccupations, in wartime Britain every kind of
cultural interest, educational activity and
entertainment blossomed. Twenty-first-century leisure
notions of dumbed down amusements, of mindless hanging
out, ganging up or freaking out, would have seemed like
bad news from another planet. Work-place politics, turf
wars and petty spites lay in the future. Later, often
much later, people of my generation came upon them for
the first time and made belated adjustment. I am not
alone in the impression that our new world seems
sometimes locked in joyless pursuit of the transient,
or of the unattainable. We and our juniors elbow each
other under ever more unpredictable competitive
conditions. Yet memory tells us that today's gathering
ills do not necessarily spring predestinate from
unalterable flaws. There are other modes of living and
working together. We know. We were there.
RELEASE OF THE NEWMANRY REPORT
On 29th September 2000, the 505-page "General Report on
Tunny" was released to the Public Records Office by the
British Government Communications Headquarters (GCHQ),
Cheltenham. The document does not give the identities
of its authors. They were I. J. Good, D. Michie and G.
A. Timms.
The Report details how the "Colossus" high-speed
electronic computers were used to break the German
wartime enciphering by Lorenz machines of the traffic
between Hitler's Berlin HQ and the various Army Groups
in occupied Europe and North Africa, including
collateral links such as Vienna-Athens. The first
manual break in December 1941 of a 4000-character
message intercepted on one of these links was achieved
by Brigadier John Tiltman at the secret code-breaking
centre at Bletchley. The entire structure and intrinsic
logic of the Lorenz machines was then reconstructed by
William Tutte from the sample of 4000 characters of
pure key obtained by Tiltman. By the late spring of
1942, Major (later Colonel) Tester formed a section
able to exploit Tutte's feat so as in favourable cases
to recover by hand both the wheel-patterns that defined
the current intrinsic logic for a given month on a
given link, and also the start positions, or
"settings", of the 12 cipher wheels used to encipher
each message. As earlier related, I joined the
"Testery" soon after its creation.
In December 1942, the mathematician M. H. A. Newman was
given the job of developing and testing his bold idea
for mechanizing the discovery of cipher-wheel settings
of Lorenz-generated messages. He at once commissioned
an electronic machine, the "Heath Robinson" (American
equivalent: Rube Goldberg), of which the main component
was put together by the Post Office at Dollis Hill, and
the output printer together with its interface by TRE
at Malvern.
Early in 1943 he formed his "Newmanry" section with one
cryptanalyst (myself), two engineers and 16 Wrens. The
prototype prefabricated parts of the pilot Heath
Robinson were delivered soon afterwards, on-site
assembly and testing began, and by April 1943 the
machine was operational. Wheel-setting trials were
started, but ancillary use of the machine was needed
for amassing statistics on German military plaintext
before operational use could start in earnest.
Meanwhile Newman had already followed up with
conceptual specifications for the Colossus. He selected
the Dollis Hill Post Office research engineer T. H.
Flowers, originally recommended to him by Alan Turing,
to convert his requirements into detailed
implementations. As soon as initial trials using the
somewhat temperamental pilot Heath Robinson had
indicated feasibility of method, he was able to
commission construction of the first of the Colossus
machines.
The Colossus design possessed radically new properties.
Once the data-tape containing the intercepted
ciphertext of a message had been scanned in via the
high-speed optical tape reader, the whole business of
storing the data and of applying to it trial-and-error
combinations of settings of simulated Lorenz-machine
wheel-patterns was performed internally. In addition,
hand-switches allowed the user to apply specified
boolean constraints to the machine's search of the
stored data for statistical regularities. Constraints
could be varied in the light of intermediate results
read from the on-line typewriter. Colossus 1 was
assembled over Christmas 1943 and was successfully
installed in late January 1944, when it passed its
first test against a real enciphered message tape.
Operation on a serious scale began in February.
Starting with monthly changes, by the closing stages of
the war the Germans were changing the intrinsic logic,
that is the wheel patterns, of the Lorenz cipher every
day on every link. The German machines at each end of a
link acted both in encipher-and-send mode and in
receive-and-decipher modes. Provided that the operators
at each end ensured that the same patterns and the same
settings of the 12 wheels were used for a given
transmission, the plaintext message input on punched
paper tape at one end was output as plaintext at the
other end, having made the journey over the airwaves as
ciphertext.
To read the intercepted messages sent on a given link
on a given day it was necessary both to break the day's
new logic (represented as patterns set up on the
Lorenz machine's 12 wheels) and then, for each of the
day's messages intercepted on that link, to discover
the starting positions to which each of the 12 wheels
had been set for that particular message-transmission.
We named the intercepted traffic on the different
links, Berlin-Paris, Berlin-Rome, Berlin-Tunis, etc.
after different species of fish. The first of these to
be broken was called Tunny, and it became a common
practice to use "Tunny" as a generic label. This
explains the title borne by the Good-Michie-Timms
report.
By the end of the war the closely interlaced operations
of Colonel Tester's hand-cryptanalysis section (the
"Testery") and of Max Newman's machine section (the
"Newmanry"), were breaking about 25 new sets of wheel
patterns weekly and the complete settings of about 150
transmissions. On grounds of the relative cost-effectiveness of anticipated further effort, a rather
larger number of intercepts were abandoned with some
settings still not found. Under the coordinated control
of 22 cryptographers and 273 Wrens on a 3-shift regime,
nine Colossus machines, each weighing over a ton, were
working continuously round the clock, with one still
under completion. Peak decryption reached in excess of 600 messages per month.
To convey a broad appreciation of the combinatorial
magnitudes of the two tasks, of finding the patterns
and finding the settings: the number of possible sets
of 12 wheel patterns was of the order of 10^151. The number of possible settings of the 12 wheels was (41 x 31 x 29 x 26 x 23) x (43 x 47 x 51 x 53 x 59) x (61 x 37), i.e. approximately 1.6 x 10^19. The use of parentheses
above designates the three kinds of wheels, namely five
"chi" wheels corresponding to the five teleprinter
channels of 5-hole punched paper tape, five "psi"
wheels, and two "mu" wheels, or "motor" wheels. The
words in quotation marks are the names of letters that
were taken arbitrarily from the Greek alphabet. The mu
wheels were of a different character from the others
and exercised a control function. Their task was to
govern step by step whether the psi wheels were to move
on en bloc, in concert with the chi wheels, or whether
they should stand still for the current step.
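As a rough cross-check of these magnitudes (a back-of-envelope reconstruction assuming unrestricted cam patterns, not figures quoted from the Report), the counts follow directly from the wheel lengths:

```python
# Combinatorial magnitudes implied by the Lorenz wheel lengths.
from math import prod, log10

chi   = [41, 31, 29, 26, 23]   # chi-wheel lengths
psi   = [43, 47, 51, 53, 59]   # psi-wheel lengths
motor = [61, 37]               # motor ("mu") wheel lengths

wheels = chi + psi + motor
print(f"patterns ~ 10^{sum(wheels) * log10(2):.0f}")      # ~ 10^151 (2^501 cam choices)
print(f"settings = {prod(wheels):.2e}")                   # ~ 1.6e19 start positions

# With the two motor wheels left out (chi and psi wheels only):
print(f"patterns ~ 10^{sum(chi + psi) * log10(2):.0f}")   # ~ 10^121
print(f"settings = {prod(chi + psi):.2e}")                # ~ 7.1e15
```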
It is worth noting at this point that this further
elaboration of the Lorenz machine design had the object
of further complicating the task of would-be
codebreakers. In actuality it had the opposite effect,
introducing a subtle element of regularity. If the
motor wheels had been omitted from the German designs,
it is overwhelmingly probable that the Fish codes would
never have been broken. Pure key, of which Brigadier
Tiltman's initial 1941 break provided a sample of some
4000 successive characters, could readily have been so
constructed as to destroy any practical possibility of
reverse engineering to recover the logic that generated
it. Admittedly, with just the 10 chi-wheels plus the 10
psi-wheels, the above combinatorics get knocked down to about 10^121 and 10^16 respectively. But given reasonably
prudent restrictions on the properties of the wheel
patterns themselves, these reductions still guarantee a
sufficient semblance of randomness in the generated chi
+ psi stream of characters. But as it was, the extra
and gratuitous touch of having all psi wheels
intermittently stutter in synchrony allowed entry of a
serious and systematic departure from randomness, as
will later be illustrated.
The 1500-valve Colossus 1 became fully operational in
February 1944 with the job of substantively mechanizing
the finding of wheel settings. Note that this could
only be done for the tiny subset of all traffic for
which the wheel patterns for the given link and day had
already been broken by laborious, slow and chancy hand
processes.
In April 1944, as briefly described by I. J. Good [8],
I made a technical proposal which was simple enough to
allow the two of us to validate its correctness in a
couple of hours' experimentation. Using the machine in
ways for which it had not been designed, we
demonstrated the feasibility of going beyond the
automated finding of wheel settings, to machine-aided
breaking of the wheelpatterns themselves - the
"intrinsic logic" referred to earlier. Technical
aspects of this part of the story will form a main
topic of my chapter in B. J. Copeland's forthcoming
book [3].
A "crash programme" was immediately authorized at War
Cabinet level, targeted on having a working Colossus of
the new design in time for D-day on 6th June. This was
achieved, with a few days to spare, by a team of Post
Office research engineers led by T. H. Flowers'
lieutenant A. W. M. Coombs. The task was achieved by
equipping the 2500-valve Colossus 2, already under
construction since January, with a "special
attachment". The design, engineering and patching-in of
this attachment was led by a team member, Harry Fensom.
The remaining eight new-design Colossi, each further
enhanced over its predecessor, joined the others in
rapid succession over the ensuing nine months. Colossus
10 was under completion at the end of hostilities, by
which time more than 63 million characters of high
grade German messages had been decrypted.
What follows is based on an invited talk delivered by
me at the 24th June 2000 meeting of the British Society for the History of Mathematics, entitled "History of
Cryptography."
The Newmanry Report
By good fortune I was able to announce at that meeting
my receipt a few days earlier of the following memorandum
from the Government Communications Headquarters (GCHQ):
MEMORANDUM
To: PROFESSOR DONALD MICHIE
From: ANN THOMPSON, GCHQ
Date: 21 June 2000
RELEASE OF 'NEWMANRY'
This confirms that the two volumes of 'NEWMANRY' are to
be released to the PRO. We expect them to be available
for public scrutiny in the next few months.
I reproduce below the Table of Contents. The document
as a whole is in preparation for publication on the Web
by Whit Diffie, known to many as the originator of
public-key encryption, now pervasive in commercial
computing. A hard copy version will be published by MIT
Press. Costs of conversion of the original typescript
into suitable forms have been covered by grants to Mr
Diffie's co-editor Dr. J. V. Field, the British
historian of mathematics, made by the Royal Society of
London, the London Mathematical Society and the Royal
Statistical Society, with active facilitation from the
Public Records Office.
The technical facts concerning the manual breaking of
the Fish codes, starting with Tiltman's initial break
in December 1941, can be put together in moderate
detail from a variety of sources, some listed at the
end of this paper. The release of the General Report on
Tunny now allows the level of detail as regards machine
procedures (Newman's section) to be successfully
extended and refined.
The Testery report
Still to reach the public domain is a technical report
on the work of Colonel Tester's section. On cessation
of hostilities, I was assigned to prepare the latter
document, and completed it in a couple of months before
joining Good on the "Newmanry" report. Although the
Testery report remains classified, I was recently
enabled by special arrangement of the Director to visit
the Government Communications Headquarters (GCHQ) at
Cheltenham and to refresh my memory of it. I also
received guidance from GCHQ's Chief Mathematician as to
which details in it cannot yet be disclosed.
A good deal of knowledge of these hand procedures is
directly inferrable from such sources as those
previously referenced, including the "Newmanry" Report.
The full "Testery" Report amplifies this knowledge.
The Testery's changing roles
Before the establishment of the Newmanry, the finding
of all wheel settings, and of virtually all wheel
patterns, was done by hand in the Testery on samples of
key, the raw gobbledegook with which the German
enciphering machines operated upon the text of plaintext messages to generate transmitted ciphertext, an even more impenetrable form of gobbledegook. On
rare occasions, such as that which Tiltman had seized
upon, samples of key could be reconstructed from
careless mistakes made by the German operators
("depths", see later). On even rarer occasions a free
present of pure key was made by the even more careless
error of omitting properly to insert the plaintext
message tape into the Lorenz machine's tape reader when
starting the transmission. A message consisting of all
blanks could result, or if the operator was
inadvertently leaning on some arbitrary part of the
keyboard, then of continual repetition of the same
character.
After the successful launch of the Newmanry, but before
the new machine methods were developed to full
capability, the Testery retained sole responsibility
for breaking wheel patterns. But settings of the chi
wheels were now done at speed in the Newmanry from the
ciphertext intercept without need of key. Machine
methods for additionally breaking chi patterns were
subsequently developed and made operational by an
extension of Colossus design, the earlier mentioned
"special attachment".
It would be a mistake to think that the Testery's role
somehow lost its criticality when it proved possible by
use of the Newmanry's Colossus machines to break both
chi-wheel patterns and settings at rates unthinkable by
hand methods. In actuality this development created a
new and even more critical role for the Testery.
The Newmanry stage of the operation simply yielded a
first intermediate product, in the form of the original
intercepted message with the chi-wheel component of the
total key stripped out. These partially solved texts
from which the effects of the five chi wheels had been
removed, were called "de-chi's". A further chain of
cryptanalytic successes in the Testery had then to take
place before an end-product could finally be mounted on
the Testery's electromechanical "Tunny machine" and
translated into plaintext. Each further step in the
chain required separate elucidation from de-chi text
(not key), whether in the form of pattern breaking, or
of the much commoner task of finding the start-settings
(a) of the five psi-wheels and (b) of the two motor-wheels.
On arrival from the Newmanry of the swelling flood of
de-chi's, components (a) and (b) of the original
obscuring key, thus remained to be manually broken and
mechanically stripped out to yield plaintext. This task
was possible, and to the war's end only possible,
through the techniques, experience and ingenuity of the
Testery's complement of cryptanalysts and ATS
operators.
Post-war loss of log books
As was also necessary with the "Newmanry" report,
declassification requires that a decision be taken in
concert by GCHQ and the US National Security Agency.
Eventual declassification of the "Testery report" can
hardly be in doubt. In a different category is the
irreparable loss of no fewer than 500 hand-written log
books, including six exercise-books numbered R0 to R5,
in which we recorded and initialled research ideas and
points of discussion as they arose. Existence of an
additional book labelled R41 is also mentioned in the
Report. The GCHQ archivists have made painstaking
searches at my request without result. It should be
borne in mind that Bletchley Park's successor
organization was twice physically uprooted after the
war en route to its permanent home at Cheltenham. In
such repeated house-cleaning it would not be too
surprising if the much-thumbed, ragged and informal
appearance of the documents caused them at some stage
to be taken for junk, and thrown out.
TOPICS STILL TO BE MENTIONED
The audience at my June 2000 talk to the British
Society for the History of Mathematics covered so broad
a spectrum of backgrounds that I concentrated solely on
expounding some basics and telling a story.
Colossus
Recapitulating the story so far in something like this
spirit, I shall at the same time expand on selected
aspects of the Colossus series of machines, then relate
some extensions of their later scope, and finally
comment on their wartime impact on military
intelligence.
Three years before Colossus, the Fish story began with
the breaking by hand of what was called a "depth". I
earlier sketched an explanation of the phenomenon and
of its exploitation in the Testery to recover complete
lengths of pure key.
Treatment of depths involves a look at some
cryptological basics, with special reference to certain
properties of the Vernam teleprinter code on which the
logic of the German Lorenz ciphers was based. These
are exhibited, including an all-important property
whereby the "differencing" of a string of teleprint
code produces what we called its "delta" form, in which
each successive repetition of any character in the
undifferenced text is automatically flagged in the
delta text as a "/".
Turingery
Once William Tutte had disinterred the overall logical
structure common to all Fish ciphers, Turingery could
be applied, whenever pure key was available, to any
sufficiently long sample. With this method, recovery was
possible both for the particular wheel-patterns
employed to encipher all traffic transmitted over a
particular link for the duration of that pattern set,
and for the wheel-settings of just those rare
messages intercepted on that link that formed a depth.
A brief description of the method will be given. But
powerful though it was, its domain of applicability was
limited to comparatively rare gifts of fortune when
German operators strayed from their prescribed path.
Newman's proposal
Newman's proposal for a mechanized attempt on wheel-setting comes next. His idea was to recover the wheel
settings from ciphertext messages instead of just those
where the underlying key had been handed over on a
plate (as depths, see earlier) by insufficiently
disciplined German operators. These rare free gifts,
which were all that we could so far decipher, were
rapidly becoming rarer. Clearly large-scale statistical
processing, impossible by hand methods, offered the
only chance for breaking and reading the overwhelming
majority of intercepts.
Demonstration of feasibility of machine wheel-setting
Demonstration of his concept in the newly formed
Newmanry was obtained by Newman's complement of two
cryptanalysts. Having been recruited from the Testery I
was shortly joined by I. J. Good, from the Enigma Naval
Section. Our capabilities-analysis and use of the
custom-built "Heath Robinson", forerunner of Colossus,
enabled Newman to commission the Colossus programme and
to recruit a new influx of cryptanalysts. There
followed a rising curve of collective innovation both
cryptanalytic and organizational. Leading roles were
played by David Rees and Shaun Wylie among others in
effecting a headlong transformation of the reading of
the German military traffic, eventually to a factory
production-line scale as earlier outlined.
I have already described other material that had an
impact on "Fish". The "weights of evidence" method,
devised by A. M. Turing for his work on the German
naval cipher Enigma, was later employed also both in
the Testery and the Newmanry. The widespread notion
that his war-time cryptanalytic contribution was
limited to Enigma falls far short of the facts.
Incompleteness of coverage
From the topics covered, there are several conspicuous
omissions. No description is given here of the
"rectangling" procedure with which the breaking of chi
wheel patterns was kicked off, and no technical account
of how this and the subsequent steps of breaking the
chi wheel patterns were turned into a Colossus-aided
procedure by special extension of Colossus
capabilities. Nor are the Testery's manual procedures
described by which the psi-wheel and motor-wheel
patterns and settings were obtained from the "de-chi's"
which eventually streamed over from the Newmanry in
bulk. In what is already a lengthy treatment, it did
not seem reasonable to plunge into such depth and
detail. Much of the material is already well expounded
by Frank Carter [2] in technical reports available from
the Bletchley Park Trust. Whatever of interest that is
not to be found there will be treated in my chapter in
B. J. Copeland's forthcoming book [3].
DIKTAT VERSUS INTERLOCK
I interject here a return to the seemingly baffling
question: how could it be that, with the means of an
operationally unbreakable code at their command, so
efficient an opponent could yet hand us on a plate the
"depths" that permitted the initial breaks?
The major cause, I believe, is to be sought in a human
tendency, common throughout all technology. We prefer
to protect a device from misuse by prohibitions issued
to its users, rather than to design the device so that
it cannot be misused in this particular way. We issue
dire penalties for omission of aircraft tire
inspections. To administrators this is a quicker, and
hence preferable, fix than installing worn-tread
detectors that could automatically abort attempted
take-off (subject to pilot-override). The trait is not
peculiar to any particular time or place. Bureaucracies
everywhere prefer immediately applicable measures to
those requiring additional planning, particularly if
the latter may entail budgetary outlays or new
administrative arrangements. It follows that active
arms-race, rather than mutually assured
impenetrability, will be the continuing pattern of
cryptology into the foreseeable future.
WHAT WAS TURINGERY?
Given a length of key the first step is to difference
it and to work with delta key. Then it becomes possible
directly to exploit the following facts concerning raw
undifferenced key.
KEY = CHI-STREAM + PSI-STREAM
CHI-STREAM = CHARACTER-STRING GENERATED BY ALLOWING THE
CHI WHEELS, ONCE SET, EACH TO STEP ROUND ITS OWN PROPER
CYCLE of 41, 31, 29, 26, and 23, repeating the stream
only after every 41 x 31 x 29 x 26 x 23 steps. But note
that when each of the five teleprinter channels is
isolated, we can observe that channel 1 of the chi-stream repeats every 41 steps, channel 2 every 31
steps, and so on. It is therefore a trivial matter,
given a stretch of chi-stream exceeding 41 characters in length, to
recover the patterns and settings of the five chi
wheels. In the case that we have a similar length of
delta chi-stream, exactly the same reasoning applies.
PSI-STREAM = CHARACTER-STRING GENERATED BY ALLOWING THE
PSI WHEELS, ONCE SET, EACH TO STEP ROUND ITS OWN PROPER
CYCLE WITH IRREGULARLY PLACED PAUSES SIMULTANEOUSLY
AFFECTING ALL FIVE WHEELS
The placement of the pauses is determined by the occurrence of 0's in a MOTOR STREAM of 0's and 1's in which the 1's predominate. Its precise construction by the action of the two motor wheels will not concern us.
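Putting these definitions together, the generation of one channel of key can be caricatured as follows (a toy sketch: the wheel and motor patterns are arbitrary stand-ins, and the real machine derived its motor stream from the two mu wheels rather than from a fixed list).

```python
def key_channel(chi_wheel, psi_wheel, motor, length):
    """One channel of toy Lorenz-style key: the chi bit steps every cycle,
    while the psi bit steps only when the motor stream shows a 1 (all five
    psi wheels pausing together on a 0)."""
    out, psi_pos = [], 0
    for step in range(length):
        chi_bit = chi_wheel[step % len(chi_wheel)]      # strict period
        psi_bit = psi_wheel[psi_pos % len(psi_wheel)]   # irregular stepping
        out.append(chi_bit ^ psi_bit)                   # KEY = CHI + PSI
        psi_pos += motor[step % len(motor)]             # pause whenever motor = 0
    return out

print(key_channel([1, 0, 0, 1, 1], [1, 1, 0], [1, 1, 0, 1], 20))
```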
If we had a sufficient length of psi-stream, we could
perform the same recovery trick as before illustrated
for the chi-stream, except that in the case of
undifferenced psi stream we must first delete all the
character-repetitions, and in the case of delta psi-stream we must delete all the /'s before proceeding. Of
course occasionally a repetition or a / respectively
may occur by random chance and not through the action
of the motor stream. In that case checks will fail and
the cryptanalyst must back off and reconstruct what has
happened and exactly where. This he must do by
processes of successive conjecture and refutation
before continuing and completing the job.
But what we actually have is neither delta chi-stream
nor delta psi-stream but their sum, i. e. delta key.
Next, therefore, we need to apply some way of first
extracting from this the delta chi. What follows is
overly condensed and technical, but is included for the
benefit of sufficiently interested professional
cryptologists. Otherwise the reader is invited to skip
to the next paragraph.
Channel by channel we do the following. Assume a value
for a given bit in the delta chi wheel, either 0 or 1,
and repeat this assumption at the appropriate chi
period throughout the differenced key. From the
agreement or disagreement of these assumptions with the
values found at these periods, inferences are made as
to whether the true value or its reverse occurs at that
point, and by correlating the results of these
inferences the delta chi patterns (and from these the
chi patterns, and from these the psi and motor wheel
patterns) are obtained.
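A statistical caricature of that inference step is sketched below (an illustration only: it replaces the hand method's careful propagation from a single assumed bit with a blunt majority vote, but it exploits the same bias, namely that delta psi is zero more often than not, so that delta key leans towards delta chi).

```python
def guess_delta_chi(delta_key_channel, period):
    """Fold one channel of delta key at the chi-wheel period and take a
    majority vote at each wheel position; the bias of delta psi towards
    zero makes the winning bit the likely delta chi bit."""
    votes = [[0, 0] for _ in range(period)]
    for i, bit in enumerate(delta_key_channel):
        votes[i % period][bit] += 1
    return [0 if zeros >= ones else 1 for zeros, ones in votes]
```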
NEWMAN'S PROPOSALS
The limitation on Turingery and on almost everything
else done in the Testery was the requirement to have
keytext to work on, obtainable only from a small and
diminishing number of intercepted depths. Newman based
his proposal on a frontal assault on the ciphertext itself, hoping for sufficient occurrence of non-random
features of the enciphered plaintext.
A small example of regularity in the plaintext was
given above, using the keystroke sequence that
implements the full stop on a teleprinter.
Operationally, it was not actually a "small" example.
Military German was crammed with abbreviations, marked
by full stops, or by other punctuation signs whose
keystroke profiles showed similar statistical
regularities. These were found to be most marked in
plaintext when the first and second delta channels were
combined by addition: by cancelling out to zeros, these provided frequent "windows" for viewing snatches of
underlying key. As for the delta key itself, a
component of it (delta psi) was also sprinkled with
blanks resulting from character-repetitions in the non-delta psi-stream. This provided a low but exploitable
frequency of peep-holes through which to catch glimpses
of delta-chi.
In exploitation of this behaviour, delta-ing the
keytext in order to recover delta chi wheels was how
Tutte got in in the first place. As we have seen,
interception of rare depths made it possible to recover
and work on pure key. The Testery used delta key for
"Turingery", both to break wheel patterns in the first
place, and then, less laboriously, the wheel settings
of each individual message subsequently intercepted
during the period (initially a month) for which that
particular set of wheel patterns was operative for the
given link. But as the practice of German operators
became more disciplined, so the already sparse
availability of pure key became sparser, for lack of
new depths.
Enter Heath Robinson, engineers, Wrens and Good
Funds for Newman's proposal were granted. Two rooms,
comprising Hut 11, were provided. The first machine
("Heath Robinson") was conceptually specified, and then
designed and built off-site. A few electronic engineers
and some operators from the Wrens, plus one
cryptanalyst (myself, transferred from the Testery)
were assembled. A floorful of pre-fabricated parts was
delivered and put under assembly-and-test by the
engineers, and the Newmanry sprang into energetic if
disjointed activity. I was one day sitting at my table
in solitary wonderment when I saw in the further corner
a smallish figure seemingly frozen in meditative yet
enquiring reverie. He slowly approached with a
deliberative step, right arm and hand in semi-extension. I rose and waited for the hand to come into
range. At that point he stopped, and gazed composedly
upon my bafflement. After what seemed a long while, he
made an announcement in tones of quiet precision: "I am
Good!"
In April 1987 an international symposium on the
Foundations and Philosophy of Probability and
Statistics was held to honour Good's 70th birthday. In
prefacing my own contribution I remarked that my
lifetime experience (continued today) had confirmed
that he was indeed Good then, and has been getting
better ever since.
PROOF OF CONCEPT: THE HEATH ROBINSON
Exhaustive search of the combinations of all five chi-wheel settings at once was of course not remotely
possible even for electronic machines. Therefore the
strategy that Newman had proposed was, as mentioned
above, to find combinations of channels in German
plaintext messages that were so productive of
statistical regularities that the rest could initially
be disregarded without loss in the size of the
statistical excesses over chance. In the event, the
systematic studies that I helped Jack Good then to
conduct, using the first electronic machine (the "Heath
Robinson") as a statistician's slave, confirmed
Newman's suspicion that adequate relative excesses
("proportional bulges" in our terminology) could be got
even after disregarding channels 3, 4 and 5, leaving
only 31 x 41 = 1271 combinations of possible settings
of the chi-1 and chi-2 wheels to be tried. Once these
two chi wheels were set, matters became more
problematical. Tackling channels 4 and 5 in like manner
sometimes found a marginally sufficient bulge,
sometimes not. So more sophisticated statistical
properties of plaintext had to be pressed into service.
But first such properties had to be prospected for and
discovered, perhaps properties that could be forced
into the open by exploiting more complex relations
between channels.
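The shape of the resulting chi-1/chi-2 setting search can be sketched as follows (a schematic reconstruction, not Colossus's actual plug-board logic: for each of the 41 x 31 = 1271 candidate settings it counts how often the trial delta-chi contribution cancels the delta ciphertext on channels 1 and 2, and reports the largest such bulge).

```python
def set_chi12(dz1, dz2, dchi1, dchi2):
    """dz1, dz2: delta-ciphertext bits on channels 1 and 2.
    dchi1, dchi2: trial delta patterns of the 41- and 31-position chi wheels.
    Returns (zero-count, setting1, setting2) for the best-scoring settings."""
    best = None
    for s1 in range(len(dchi1)):                    # 41 candidate chi-1 settings
        for s2 in range(len(dchi2)):                # 31 candidate chi-2 settings
            zeros = sum(
                (z1 ^ z2 ^ dchi1[(s1 + i) % len(dchi1)]
                         ^ dchi2[(s2 + i) % len(dchi2)]) == 0
                for i, (z1, z2) in enumerate(zip(dz1, dz2)))
            if best is None or zeros > best[0]:
                best = (zeros, s1, s2)
    return best
```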
Advance statistical reconnaissance
The hope, therefore, was that by amassing such
statistics, we might eventually operate on raw
intercepts as input, without having to rely on depths.
Depths were dwindling and were to become vanishingly
rare. The heaven-sent gift to the Testery of pure key obtainable from depths can be appreciated when one realizes that key can be seen as a ciphertext (key + plaintext) in the special case that the plaintext consists of the message: 000000000000000000000000000000, etc. Clearly, here the character repetition frequency is 100%. In other words the delta plaintext gives all zeroes, so that the delta ciphertext (delta plaintext + delta key) is simply the delta key, just as it would be if the plaintext message were 11111111111111111, say, or GGGGGGGGGGGGGGGGGGGGGGG, etc. Sometimes, by the way, it was, at least for long stretches, even in the absence of a depth. My best
guess was that the enciphering operator had either
fallen asleep in mid key-stroke or had carelessly
leaned with his elbow on the keyboard, thus keeping
some key depressed. In that case the corresponding
character could simply repeat in every machine cycle
and be transmitted.
If however the plaintext takes the form of a message in military German, then only where this plaintext has a repeated character does the delta plaintext contain a "blank", a little window in the delta ciphertext through which a character of the underlying delta key can be seen. Of course, if the operator "leans on the keyboard" for any length of time, the delta text over the corresponding segment will have a correspondingly extended peep-hole.
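A toy numerical illustration (my own made-up data, with characters again taken as 5-bit integers) of the peep-hole effect: wherever the plaintext repeats a character, the delta plaintext is zero there, and the delta ciphertext at that position is exactly the delta key.

    # Toy illustration with invented data; cipher = plain XOR key.
    def delta(stream):
        return [a ^ b for a, b in zip(stream, stream[1:])]

    plain  = [9, 9, 3, 20, 20, 20, 7]          # repeats at positions 0-1 and 3-5
    key    = [5, 12, 30, 1, 18, 27, 8]
    cipher = [p ^ k for p, k in zip(plain, key)]

    d_plain, d_key, d_cipher = delta(plain), delta(key), delta(cipher)
    for i, dp in enumerate(d_plain):
        if dp == 0:                            # a plaintext repetition ...
            assert d_cipher[i] == d_key[i]     # ... lets delta key show through
            print("peep-hole at", i, "reveals delta key character", d_key[i])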
Size and frequency of "peep-holes"
As earlier noted, it so happens that just as the sum of the first two channels of the delta plaintext gives a zero at each of these peep-holes, so also does the sum of channels 4 and 5. On channels 1 and 2 this
regularity was usually enough, along with other channel
1 and 2 features of delta German plaintext, to find by
machine the settings of chi-1 and chi-2. The features
of plaintext on channels 4 and 5 were not always so
favourable as to allow the same trick to be applied.
Use of the plug-board to cause the machine to look at channels 4 and 5 only at those points which satisfied some condition on 1 and 2 (e.g. that delta of 1 + 2 = 0) was found to increase the 4+5 bulge to a degree that sometimes more than compensated for the loss in effective
sample size. Other more sophisticated statistical
interactions were required to cope with all channels in
all cases. First, however, far-reaching knowledge of
the intricate statistical characteristics of plaintext
was required in order to discover what precisely these
statistical interactions might be.
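In modern terms the conditioned count might be sketched as follows (an illustration of the principle only; the plug-board of course realized it in hardware, and all names here are mine):

    # Sketch of a conditioned count (my own formulation of the principle).
    # d1, d2: delta-ed channels 1 and 2 of the de-chi'd ciphertext (chi-1
    # and chi-2 already set); d4, d5: the same for a trial setting of
    # chi-4 and chi-5.  All are equal-length lists of bits.
    def conditioned_45_count(d1, d2, d4, d5):
        hits = total = 0
        for b1, b2, b4, b5 in zip(d1, d2, d4, d5):
            if b1 ^ b2 == 0:          # condition on channels 1 and 2
                total += 1            # the effective sample shrinks ...
                if b4 ^ b5 == 0:
                    hits += 1         # ... but the 4+5 bulge is sharpened
        return hits, total            # compare hits/total with 1/2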
When the first Robinson became operational, Jack Good
and I spent our day shift in frontal assault, with Max pacing around, waiting for positive results to announce.
Although he agreed in theory with our argument that
pure reconnaissance of the problem should be the first
use of the newly operational machine, the need for
credibility, with high ranking military and others
dropping in to see what results were being got, pressed
sorely on him. Once he had laid it down, Max Newman was
not someone that a person in his senses would continue
to oppose. From nine to six each day Jack and I
accordingly went through the motions.
GHOST SHIFTS
But many evenings were spent in a clandestine ghost
shift, with one or two volunteer Wrens and an engineer.
Our purpose was to use the new instrument to gather
massive delta plaintext statistics, including in
particular the frequencies of zeroes (i.e. repetitions
in the original plaintext) in the boolean sum of
selected pairs of channels, conditional on what was
happening on other channels.
I say above "conditional on what was happening . . . "
This was all-important after the first two chi wheels
had been got, yielding knowledge of the first two
channels of delta-psi + delta-plain. To get knowledge
of further, and possibly more recalcitrant, chi wheel
settings, we needed somehow to sharpen the statistical
"bulges" characterizing other channels of delta
plaintext. On the Heath Robinson we could only screen
out channels unwanted in a given run by concocting
tapes in which the unwanted channels were left all
blank. In the Colossus, this conditionalizing was later
done at the flick of a set of hand-switches, together
with plug-board programming for forming arbitrary
boolean combinations of selected channels.
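The sort of tabulation we were after can be indicated by a sketch of this kind (my own reconstruction of the principle; the channel-numbering convention and all names are assumptions of the illustration):

    # Sketch only: tabulate, over the delta of a decrypted plaintext, how
    # often the sum of channels a and b is zero, split by the value of a
    # conditioning channel c.  Characters are 5-bit integers, channel 1
    # taken as the top bit (an assumption of this illustration).
    from collections import Counter

    def channel(x, ch):
        return (x >> (5 - ch)) & 1

    def conditional_zero_counts(plain, a, b, c):
        d = [p ^ q for p, q in zip(plain, plain[1:])]     # delta plaintext
        table = Counter()
        for x in d:
            cond = channel(x, c)
            zero = (channel(x, a) ^ channel(x, b)) == 0
            table[(cond, zero)] += 1
        return table    # frequencies of zeroes on a+b, conditional on c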
With the aid of Heath Robinson and our volunteer
assistants we systematically extracted from Testery
decrypts batteries of general rules governing the
statistico-logical structure of military German, with
and without delta-psi streams superadded. Owing to the action of the motor stream on the psi's, this latter component was guaranteed to supply a stream of extra zeroes.
Armed with these tabulations, statistical summaries and
empirical rules we were now in a position to make
frontal assaults in earnest. This yielded sufficient
operational success for Max Newman to announce
feasibility. I doubt if he ever knew of the clandestine
operation. If he had, his forward-pressing propensity,
and preference for focussing on the next big thing,
would I believe have led him to smile and dismiss it
from mind.
The next big thing, of course, was Colossus.
COLOSSUS, THE FIRST HIGH-SPEED ELECTRONIC COMPUTER
In the BBC television programme "Station X",
subsequently published in book form, Michael Smith
remarked:
Colossus was the first practical application of a
large-scale program-controlled computer and as such the
forerunner of the post-war digital computer.
Note that it was not a stored-program machine, and
hence not a general-purpose computer. In this sense its logic was more primitive than Charles Babbage's
nineteenth-century designs for his uncompleted
Analytical Engine. The world's first general-purpose
digital computer became operational in June 1948, when
the Manchester "Baby" ran its first non-trivial
program. Kilburn and Williams, who led the team, had been
appointed by Max Newman at the end of the war, using
money from the Royal Society specifically for the
purpose of developing such a machine.
In the light of Newman's wartime part in conceptual
design and practical use, the forerunner role of the
Colossus machines was rooted not only in their logical
affinities to the post-war digital computer but also in
the qualities and consequences of a great mathematician
and an extraordinary man, my wartime boss from early
1943 until the summer of 1945, Max Newman.
Ten years earlier, in 1935, as a young Cambridge
undergraduate Alan Turing had attended Newman's
lectures on the logical foundations of mathematics. The
experience led him directly to formulate in a now
famous paper a weird-seeming mathematical construction
known today as the Universal Turing Machine (UTM). We
no longer see it as weird. It formed the theoretical
base from which not only Newman at Manchester, but also
Womersley at the UK's National Physical Laboratory and
von Neumann in the USA, launched their immediate post-war machine-building initiatives. Today's multiplying
varieties of computing machine, from the smallest handhelds to the largest supercomputers, are still formally
describable as engineering approximations to a single
invariant abstract specification, the UTM of 1936. As
earlier stated, the Colossus machines were not general-
purpose, hence not UTM's. The Colossus 1, however,
marked a small step in that direction, and the later
Colossi marked a further step in programmability.
The Colossus 2, 3,..., 9 crash programme
Colossus 1 had facilities for plug-board programming.
By manipulation of connections and hand-switches, up to
100 boolean functions of selected channels of a running
tape could be simultaneously evaluated on the fly.
Hard-wired branching was also possible, for example, to
effect conditional printing, conditional, that is, on
current values of intermediate computed results.
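The style of computation involved can be suggested by a sketch of this kind (mine, and greatly simplified; the real machine held its counts in electronic counters and its "functions" in plugged wiring, and all names below are assumptions): several boolean functions of selected channels are evaluated at every character position of a running tape, each feeding its own counter, with printing made conditional on a count reaching a set total.

    # Greatly simplified sketch (my own) of plug-board style evaluation:
    # each "function" maps the 5 channel bits of one tape character to 0 or
    # 1, and is accumulated in its own counter as the tape runs through.
    def run_tape(tape, functions, set_total):
        counters = [0] * len(functions)
        for ch in tape:                      # one pass of the looped tape
            bits = [(ch >> (4 - i)) & 1 for i in range(5)]   # channels 1..5
            for k, f in enumerate(functions):
                counters[k] += f(bits)
        for k, c in enumerate(counters):
            if c >= set_total:               # conditional printing
                print("counter", k, ":", c)
        return counters

    # Example function: score 1 whenever channels 1 and 2 agree.
    # run_tape(tape, [lambda b: 1 if b[0] == b[1] else 0], set_total=2000)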
As earlier mentioned, I. J. Good [8] helped me to test a
proposal for an unorthodox use of the machine which
conferred an extension of functionality far beyond that
of Colossus 1. The resulting engineering crash
programme leading to the Colossus 2, 3, ..., 9 machines
nudged the design further in the direction of
"programmability" in the modern sense. In a taped
interview with Christopher Evans (undated) in the mid-1970's, Newman speaks in guarded language (some elements
of wartime secrecy were still in force) about
... new problems for, I suppose, reasons perhaps I
shouldn't mention, but which had to be dealt with by
doing something new with the Colossus itself and it was
a great tribute to Flowers' design of this thing that
he made it a bit more general than we had asked him to
[in Colossus 1] in such a way that when we had these
new problems we often found that it could be done on
the machine without any modification of it, just as it
is. This involved a lot of work by various people. By
Good and Michie particularly and all of us ... but it
was very satisfactory to find that this machine could
do this and this is perhaps ... one of the things which
makes it justifiable to say that it is really at least
a forerunner and perhaps the first germ of a computing
machine; a general purpose computing machine.
In partial corroboration of this, as a "fun" project
after the German surrender on May 8th 1945, Geoffrey
Timms programmed one of the later Colossi to multiply
whole numbers. In practice, only very elementary
multiplications could be done; otherwise the machine
cycle interrupted the process before the calculation
had completed. None the less, Timms thus demonstrated
the in-principle applicability of the later Colossi to
problem domains far beyond the original.
Spanning
Among enhancements to later machines was a design extension originated by I. J. Good, called "spanning". It enabled the user to screen off selected segments of the data tape from the machine's current inspection, e.g.
if suspicion arose during processing that sections of a
transmitted message might be offset with respect to
other sections. This could occur as a result of losses
during interception of individual characters, or of
interpolations of spurious characters. Spanning allowed
statistical computations to be repeatedly performed on
different selected subsets of a given intercepted
message suspected of having been corrupted in one or
another of these ways. The practical usefulness of spanning for exploratory analysis was very great.
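A sketch of the idea (mine; the representation of spans as index ranges is an assumption of the illustration) is simply to restrict whatever count is being made to the selected segments of the message:

    # Sketch only: restrict a counting run to selected segments ("spans")
    # of the message, so that stretches suspected of slips or spurious
    # insertions are left out of the statistics.
    def spanned_zero_count(bits, spans):
        return sum(
            1
            for start, stop in spans         # spans given as index ranges
            for b in bits[start:stop]
            if b == 0
        )

    # e.g. ignore a suspect stretch between positions 400 and 460:
    # spanned_zero_count(stream, [(0, 400), (460, len(stream))])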
Comparison with ENIAC
The Colossi, then, were special-purpose in practice,
but in principle not entirely. They were probably
roughly comparable in functionality to the US post-war
electronic computer ENIAC, operational in 1946. As
sheer performance greyhounds, however, there is no
comparison. The later Colossi, for example, could read
5-channel paper tape at the astonishing rate of 25,000
characters per second, and were fully operational,
round the clock, a full two years before ENIAC's debut.
* This paper was one of several presented at the
"History of Cryptography Conference" which took place
at Cambridge University, 24 June 2000. The sponsor of
the conference was the British Society for the History
of Mathematics.
REFERENCES, SOURCES, AND FURTHER READINGS
1. Carter, Frank. 1996. Codebreaking with the Colossus
Computer, Report No. 1, The Bletchley Park Trust
Reports. Milton Keynes: BP Trust.
2. Carter, Frank. 1997. Codebreaking with the Colossus Computer: Finding the K-Wheel Patterns, Report No. 4, June 1997, The Bletchley Park Trust Reports. Milton Keynes: BP Trust.
3. Copeland, B. J. (In preparation.) Colossus: The
First Electronic Computer. Oxford UK: Oxford University
Press.
4. Currie, Helen. (Undated.) An ATS girl's memories of Bletchley Park at war and after. Apply to Tony Sale, c/o BP Trust, The Mansion, Bletchley Park, Bletchley, Milton Keynes MK3 6EF UK.
5. Evans, Christopher. (Undated: internal evidence
indicates approximately 1976). Pioneers of Computing
number 15. Interview in an oral history of computing
compiled with the support of the Science Museum in
London and the National Physical Laboratory at
Teddington, UK.
6. Fensom, Harry. Nov. 1999. Mathematics of
Codebreaking (Cryptanalysis) with Colossus or What did
Colossus Really Do? Apply to Mr Tim Burslem, Editor,
RSS Newsletter, Honeysuckle Cottage, School Road,
Waldringfield, Woodbridge IP12 4QR UK.
7. Good, I. J. 1970. Some future social repercussions of computers. Intern. J. Environmental Studies. 1: 67-79.
8. Good, I. J. 1994. Enigma and Fish, Chapter 19 in
Codebreakers: the inside story of Bletchley Park. (Eds.
F. H. Hinsley and Alan Stripp). Oxford UK: Oxford
University Press.
9. Good, I. J. 1980. Pioneering work in computers at
Bletchley, Chapter 4 of A History of Computing in the
Twentieth Century. (Eds. N. Metropolis, J. Howlett, and
Gian-Carlo Rota). New York: Academic Press.
10. Randell, Brian. 1980. The COLOSSUS, Chapter 5 in A
History of Computing in the Twentieth Century. (Eds. N.
Metropolis, J. Howlett, and Gian-Carlo Rota). New York:
Academic Press.
11. Sale, Tony. (Undated typescript.) Lorenz and
Colossus. Apply to Tony Sale, c/o BP Trust, The
Mansion, Bletchley Park, Bletchley, Milton Keynes MK3
6EF UK.
12. Tutte, William. (Undated.) FISH and I. Apply to
Professor William Tutte, 15 Manderston Road, Newmarket
Suffolk CB8 0NS UK.
Donald Michie^
^ Professor Emeritus of Machine Intelligence, University of Edinburgh, UNITED KINGDOM, and Adjunct Professor of Computer Science and Engineering, University of New South Wales, AUSTRALIA.
BIOGRAPHICAL SKETCH
Donald Michie, MA, DPhil, DSc, is Professor Emeritus of
Machine Intelligence at Edinburgh University. Born in
1923 and educated at Rugby School and Balliol College
Oxford, he served at the Foreign Office in 1942-1945.
From 1952 he held research posts in genetics at London
University before moving in 1958 to Edinburgh as Senior
Lecturer and Reader in Surgical Science. During 1962-1965 he founded and built at Edinburgh the first
European centre for Artificial Intelligence research
and teaching, and received a Personal Chair in 1967.
Leaving Edinburgh in 1984, he founded the Turing
Institute in Glasgow and was its Chief Scientist until
1992. He held a number of visiting posts, including at
the University of New South Wales, where he is
currently Adjunct Professor of Computer Science and
Engineering. Professor Michie is the author of about 170 papers and five books in experimental biology, cognition, and computing.