Saturday, February 18, 2017

History of Computing I: The Colossi

The earliest notion of a "computer" that most people had in the nineteenth and twentieth centuries had one key feature: enormous size. Babbage's 1837 "Analytical Engine," widely regarded as the earliest ancestor of the computer, would -- had it ever been completed -- have filled a warehouse-sized room and weighed nearly 30,000 pounds. Babbage's designs used interlocking gears with various ratios to perform calculations, and his system contemplated a punch-card I/O unit, a calculating unit known as the "Mill" (a sort of CPU), and a storage unit he called the "Store." In his search for backers, he enlisted Lord Byron's daughter Ada Lovelace, who translated a French-language account of the Engine by the Italian engineer Luigi Menabrea and appended her own erudite notes on its operation; in 1983, a new computer language designed for the US Department of Defense was christened "Ada" in her honor. Two working models of his earlier Difference Engine No. 2 have been built in recent years; you can see one of them in action here.

Few significant advances in computing were made until the late 1930s and early 1940s, when military needs -- calculating target data and, most importantly, breaking secret codes such as Germany's ENIGMA -- provided both the impetus and the funding. In the UK, researchers at Bletchley Park, led by the young mathematician Alan Turing, constructed electromechanical machines they named "bombes," which used electrical relays and motors to run through hundreds of thousands of possible combinations of the wheels and wires of an Enigma machine. Later, they constructed a far more advanced, fully electronic machine, aptly named "Colossus," to attack the even more complex Lorenz teleprinter cipher. Advances in cryptography would eventually render all these computers obsolete -- indeed, a carefully executed "one-time pad" or "Vernam cipher" is unbreakable once the pad text is destroyed (as witness a message found on a skeletonized pigeon, though some have claimed to have deciphered it).
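The bombes' basic approach -- exhaustively testing machine settings against a "crib," a guessed fragment of the plaintext -- can be illustrated with a toy sketch in Python. This uses a simple Caesar shift, vastly weaker than Enigma, and the message and crib below are invented for illustration:

```python
# Toy illustration of exhaustive key search guided by a "crib"
# (a guessed plaintext word) -- loosely analogous to how the bombes
# tested Enigma wheel settings. A Caesar cipher has only 26 keys.

def caesar_encrypt(text, shift):
    """Shift each letter by `shift` positions; leave other characters alone."""
    return "".join(
        chr((ord(c) - 65 + shift) % 26 + 65) if c.isalpha() else c
        for c in text.upper()
    )

def crack_with_crib(ciphertext, crib):
    """Try every possible shift; keep those whose decryption contains the crib."""
    hits = []
    for shift in range(26):
        plain = caesar_encrypt(ciphertext, -shift)
        if crib in plain:
            hits.append((shift, plain))
    return hits

message = caesar_encrypt("WEATHER REPORT FOLLOWS", 7)
print(crack_with_crib(message, "WEATHER"))
```

With only 26 candidate keys the search is instant; the bombes faced the hundreds of thousands of wheel-and-wiring combinations mentioned above, which is exactly why electromechanical speed mattered.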

Yet other problems, such as calculating trajectories, remained. The first fully digital machine along these lines was ENIAC (short for Electronic Numerical Integrator And Computer), which used vacuum tubes -- more than 17,000 of them! -- as relays and switches. The machine had to be literally re-wired for each different kind of operation; the task was entrusted to a group of young women who, even though many of them had college degrees in mathematics and engineering, were regarded at first as little more than glorified switchboard operators. All of the units together weighed more than 60,000 pounds and consumed 150 kilowatts of power -- all to perform roughly 5,000 calculations per second. While this was many times faster than any earlier machine, it is equivalent to a rate of about 5 kHz -- by clock rate alone, some 600,000 times slower than a typical 3 GHz desktop computer of today, and the gap in real throughput is far larger still.
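The speed comparison can be checked with quick arithmetic; the 3 GHz desktop clock used below is an assumed round figure for a typical modern machine:

```python
# Comparing ENIAC's calculation rate to an assumed modern desktop clock.
eniac_ops_per_sec = 5_000           # ~5 kHz, as described above
modern_clock_hz = 3_000_000_000     # assumed ~3 GHz desktop CPU

ratio = modern_clock_hz / eniac_ops_per_sec
print(f"A 3 GHz clock ticks {ratio:,.0f} times per ENIAC calculation.")
```

That is a factor of 600,000 by clock rate alone; since a modern chip also completes several instructions per cycle across multiple cores, the true throughput gap runs well into the millions.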

The key invention which began the change from room- and building-sized machines to something that could actually fit in an office or a home was, of course, the transistor, developed at Bell Labs in 1947. The basic idea was to use a semiconductor sandwiched between more conductive materials; such a device, like a vacuum tube, could be used either as a signal amplifier or as a switch. There were several key advantages: transistors produced less heat, were cheaper to manufacture, and -- even in their early state -- were much smaller. Each of the women of ENIAC shown in the photo above is holding a unit with the same storage capacity; the first two decreases in size are due to smaller, specially made tubes, but the last is due to transistors. A typical smartphone today has nearly a million times the number of transistors in this smallest unit.

With the war over, business demands drove the computer market. The first commercial machine built for this market was the UNIVAC, introduced in the early 1950s. For around $750,000, you got a processing speed of 1.9 kHz (about 1,900 operations per second), about 1.5 Kb of memory, and tape drives, each the size of a small refrigerator, which held about 1.5 Kb per tape. A decade later, IBM introduced its 1401 system; with the top model, one could now have 16 Kb of memory and perform almost 23,000 calculations per second -- 23 kHz. IBM did not sell the 1401, but you could lease one for around $2,500 a month. Home computing on a practical scale was still far in the future; although SIMON and other kit computers were available throughout this period for home hobbyists, their tiny capacity -- just 8 binary switches -- made them useless for any but the most limited tasks.


  1. Wish I had come on time this day. All that I read was very interesting: how the first computers were made, how women were pretty much key to getting the work done, and all that spy technology they made, all of it leading to where we are now. I find something that Neil deGrasse Tyson said to be interesting, and I feel that it applies here. To sum it up real quick (you can find it in his most recent interview with Stephen Colbert), Neil pretty much said that we as a people have stopped advancing at such a fast rate because of the loss of competition between countries -- e.g. the space race, and how China now has the most powerful telescope. I saw that here with the competition over the code-breaking machines, and I was just surprised at all that was made within that time. We are still advancing, but not as much as before, in my opinion, mainly because it was a race in war, and we just don't have that urgency anymore. I just found all this interesting! Then there are the women and the machines they helped use and make. I find it similar to the Hidden Figures example, where, to my knowledge, these women weren't recognized. In all my time thinking of computers I thought of men with glasses and beards tinkering away at long nights and raising their chins at their accomplishments. But to find out that women were behind a lot of what happened -- it's interesting, something I should have already learned. Makes me wonder if people at that time knew that women were behind a lot of the great things being created.

  2. History of Computing I: The Colossi
    From Babbage to the Colossi to the iPad we have seen steady improvements in computing power and reductions in size. Moore's Law (1965) said computers would double in power (number of transistors) every year. He was right. In 1975 he revised that to every two years, and that has held true more or less for the past 40 years. Now the MIT Technology Review says that we are likely to see it fall off within the next 5 years. Things can only get so small. Rather than forecasting the end of advancement, we are likely to see advancement in the basic structure of computers next. People love a challenge! Reprogrammable chips (Intel paid 17 billion dollars for a company working on them) will be the new vanguard of Artificial Intelligence. Quantum computing is also waiting for a practical breakthrough. Indeed, we could see Moore's Law speed up again if either of those steps up to the working tech table. How will we get to Mars, or encode our souls into machines for immortality, exploration, hyper-shopping, and advanced game play without it? In the end, for my own prognostication, I believe the first realistic virtual reality porn machine will be the beginning of end times for the human race. The opium scourge in China (treated with martial law and street executions) will be a humanitarian speed bump by comparison! "Landru!"
    Tony Ricci

  3. This is a little off topic, but it does involve the computer... Speaking of the "porn machine," I recently went to a play called The Nether at the Gamm Theatre in Pawtucket. The play was set in the not-so-distant future, and its premise focused on a man brought up on criminal charges for repeatedly raping and killing a young girl in virtual reality. We may not be able to conceive of it today, but if virtual reality becomes part of our everyday existence, it stands to reason to ask where this might lead in terms of legal ramifications. Although one can never be totally certain of every conceivable use of a newfangled invention, I don't think WWW founder Tim Berners-Lee envisioned virtual porn as a possible offshoot of his research.

    To expound on what Edgar was blogging about: Hidden Figures was a great film. It's nice to let the world know how ordinary people, who otherwise might be lost to our past, made this world extraordinary. It certainly takes vision and intelligence to create skyscrapers, spaceships, and computers, but we also need the countless man- and womanpower to make them a reality.