Talk:History of computing hardware


History of computing hardware is a featured article; it (or a previous version of it) has been identified as one of the best articles produced by the Wikipedia community. Even so, if you can update or improve it, please do.
This article appeared on Wikipedia's Main Page as Today's featured article on June 23, 2004.
June 19, 2004: Featured article candidate promoted
This Engtech article has been selected for Version 0.5 and subsequent release versions of Wikipedia. It has been rated FA-Class on the assessment scale.
This article has been reviewed by the Version 1.0 Editorial Team.
This article is within the scope of the Computing WikiProject, an attempt to build a comprehensive and detailed guide to computers and computing. If you would like to participate, you can edit the article attached to this page, or visit the project page, where you can join the project and/or contribute to the discussion.
This article has been rated as FA-Class on the quality scale.
This article has been rated as Top-importance on the importance scale.

This is the talk page for discussing improvements to the History of computing hardware article.

To-do list for History of computing hardware:

Here are some tasks you can do:
This article is undergoing a featured article review to ensure that it meets the standards of a featured article. Please add a comment to assist the process and/or be bold and improve the article directly. If the article has been moved from its initial review period to the Featured Article Removal Candidate (FARC) section, you may support or contest its removal. When the review is complete, a bot will update the article talk page.


2001 talk

I doubt very much now that the Harvard Mark I was fully programmable. Later models could switch conditionally from one paper roll to another, but since I don't believe it could rewind the paper rolls, no actual loops are possible and it's not Turing complete. But I'm not sure. Anybody know details about the Mark I?

Oh, and I just read that the Manchester Mark I was actually the first functional von Neumann machine, even before EDVAC -- but of course based on EDVAC's ideas. --AxelBoldt


I believe you are correct about the IBM-Harvard Mark I. This machine, by the way, was not built at or by Harvard. It was built for Harvard (and the U.S. Navy) by IBM.

My understanding is that the first operational stored program computer was the "Manchester Baby Mark I", a test machine for the Williams-tube storage technology, not the Manchester Mark I itself. The EDSAC at Cambridge appears to have preceded the Manchester Mark I as the first "practical" stored program computer in operation.


Axel: An electromechanical computer necessarily uses some electronics. Thus "electro". But I understand what you mean about the electronics/electromechanical distinction.

Aiken directed the construction of the ASCC by IBM engineers at the IBM Endicott labs. Construction was completed in 1943. It was moved to Harvard, and operation began May 1944. [1]

As stated, the EDVAC was never completed--so all EDVAC-based computers were "before EDVAC". The "Baby" was the first EDVAC-based design to get a program running. --The Cunctator

I gotta say, the Wiki method really works--this entry has gotten amazingly better in a very short period of time. It's still a little too discursive (some of the specificity would be better in stand-alone entries), but it's highly informative and readable. --The Cunctator


Not to disagree, but there's still a whole lot missing. No mention of Whirlwind, SAGE, PLATO, to give just a few examples.

Is that a disagreement or not? SAGE is mentioned in the history of networking... --The Cunctator


It seems like the end of the article is based on the original timeline table visible at the top of this page, and reads very much like a timeline. Wouldn't more of an overview and synthesis be appropriate, considering we have the (very good IMHO) other timeline?


I agree, particularly the latter part of the article has too many dates, names and details obscuring the general flow of progress. --AxelBoldt

I don't want to be argumentative, but I thought the new article didn't tell much of anything before WWII or after 1970, let alone flow of progress. The flight control system of the F14, while interesting, was hardly a landmark computer.

Yes, there was a fair amount where I just went in and pasted missing stuff from the old page. However, I feel it is more important to have date-filled placeholders than nothing at all. Now that some base data is there, anyone can go in and rewrite/rearrange it. By all means, feel free to edit as you see appropriate. The power of Wiki :-) --Alan Millar


Names, dates, and details are good things; but they need to be pushed down into more detailed articles on more specific topics. At the same time, an overview/summary/synthesis needs to be presented at this level. But my guess is it's easier to do this bottom-up rather than top-down. In other words, collect all the detailed information first, then refactor into appropriate levels of detail.

Also, should this article cover software as well as hardware? -HWR

Of course, hardware w/o software is scrap metal. The question is whether it's tangible enough to produce records. --Yooden


Anyone can refactor (a basic design feature of Wiki), but only if there is some information to refactor, so I think the bottom-up approach is necessary.

But all the information is already on the Computing timeline page, so why repeat it here? I think this article should have a bird's eye view on Computing history, just outlining the developments, and not listing anecdotes such as ads bought by certain companies at certain sports games. --AxelBoldt


As to Swiss clocks: the essence of computing is not the addition and subtraction of numbers, although it grew out of it and is a necessary part of it. The essence of computing is the execution of a sequence of instructions, and in that respect modern computers have as much in common with Swiss clocks as the abacus. And no, I'm not recommending removing the reference to the abacus :-) --Alan Millar

Swiss clocks neither process information nor can be programmed. They are just fancy mechanical devices, like all mechanical clocks. I don't see any relation to the history of computing except maybe that some early mechanical calculators used similar mechanisms as mechanical clocks (why Swiss?). Also, why are they mentioned in the paragraph about programmability? --AxelBoldt

What about music boxes? They're programmed to play tunes. -HWR

They have a single sequence, as do player pianos, and player pianos can even use a different paper roll to play a different tune. In that respect, the music box mechanically is a predecessor to the Jacquard loom. The Swiss clocks had multiple sequences of actions, where a main cog would activate other cogs to order different actions. The first GOSUB? :-) --Alan Millar

Actually, there are music boxes that play tunes from interchangeable discs. I don't know the chronology of this however.

BTW, is this article restricted to the history of DIGITAL computers? Analog computers don't generally execute sequences of instructions. -HWR


"IBM decided to enter the PC market ..., with the IBM XT" is not correct -- the XT was their second machine, with the hard drive.

That's correct--I'll change it. The first one was simply called the "IBM PC". Some mention of Compaq and the beginnings of the clone market in that era seems appropriate too. --LDC


I'm afraid this entry is getting too timeline-y...but I see that others are aware of that. Looks like we need to start thinking about some more subentries...anyone have any suggestions? --The Cunctator


Unfortunately the timeline here has many inaccuracies and omissions of historical importance: 1965: IBM System 360 (first OS); 1968 first mouse/window system demo; 1973: CP/M first micro OS; 1969 Intel 4004; 1977 Commodore Pet & TRS 80; 1978 Atari 400/800; 1979 Motorola 68000 32 bit CPU (w. 16 bit data and 24 bit address bus); 1981 Commodore Vic20 & IBM PC & Xerox Star (w. GUI/Mouse/Ethernet...); 1982 Commodore 64 with 64k RAM $600 & Timex Sinclair 2K RAM $99; 1983 1 million Commodore Vic20s and 1 million Apple IIs sold; 1985 Commodore Amiga with multitasking/Color GUI/accelerated video/stereo sound/3.5" floppy $1200; 1988 7 million Commodore 64 and 128 computers sold.... --Jonathan--

Feel free to enter whatever you think is missing to Computing timeline, not to History of computing. --AxelBoldt


Ack! It's getting insanely more timeline-y! I'm thinking of paring. Please, everyone, notice Computing timeline. History of computing isn't supposed to list every computer, but to discuss the intellectual development of the engineering/science of computing. --The Cunctator


Moved from /Permission-subpage:

I have obscured the email addresses in the message below in an obvious way. --AxelBoldt

Received: from mail11.svr.pol.co.uk
        by mail.metrostate.edu; Tue, 21 Aug 2001 19:25:28 -0500
Received: from modem-88.bass.dialup.pol.co.uk ([217.134.8.88] helo=arthur.the-roost)
        by mail11.svr.pol.co.uk with esmtp (Exim 3.13 #0)
        id 15ZLpr-0001gy-00
        for Axel.Boldt@OBSCURED1.metrostate.edu; Wed, 22 Aug 2001 01:25:32 +0100
Received: from benji.the-roost
        ([10.0.0.5] helo=localhost ident=mail)
        by arthur.the-roost with esmtp (Exim 2.12 #1)
        id 15ZLpq-0003Te-00
        for Axel.Boldt@OBSCURED2.metrostate.edu; Wed, 22 Aug 2001 01:25:30 +0100
Received: from stephen by localhost with local (Exim 3.12 #1)
        id 15ZLpp-0000vx-00
        for Axel.Boldt@OBSCURED3.metrostate.edu; Wed, 22 Aug 2001 01:25:29 +0100
Date: Wed, 22 Aug 2001 01:25:29 +0100
From: Stephen White <swhite@OBSCURED4.ox.compsoc.net>
To: Axel Boldt <Axel.Boldt@OBSCURED5.metrostate.edu>
Subject: Re: Computing history timeline for GNU encyclopedia
Message-ID: <20010822012529.A3581@benji.the-roost>
References: <sb812be5.012@mail.metrostate.edu>
Mime-Version: 1.0
Content-Type: text/plain; charset=us-ascii
Content-Disposition: inline
User-Agent: Mutt/1.2.5i
In-Reply-To: <sb812be5.012@mail.metrostate.edu>; from Axel.Boldt@OBSCURED6.metrostate.edu on Mon, Aug 20, 2001 at 03:25:19PM -0500
Sender:  <stephen@OBSCURED7.trillian.earth.li>

---- Original Message ----
> From Axel Boldt <Axel.Boldt@OBSCURED8.metrostate.edu>
> Date: Monday, 20 Aug 2001, 21:25
>
> I noticed that you have the definite computing history timeline on
> your web site. Maybe you have heard about the GNU style
> encyclopedia at http://wikipedia.com ; we currently have only a weak
> entry about computing history (in fact some of it seems to be
> illegally copied from your site). Would you consider donating your
> timeline to the Wikipedia? You can enter and edit the article about
> computing history yourself, just go to
> http://wikipedia.com/wiki/History_of_computers and click on "edit this
> page right now".

Ok.  First I'll give you permission to use whatever you want from my
computing history site in the encyclopedia.  I'd appreciate it if the
link http://www.ox.compsoc.net/~swhite/history.html is retained for
people to get the most up-to-date version of my information, however
since the GPL doesn't allow for such provisos this will remain an
informal "Gentleman's agreement" and is not legally required for the
inclusion of material from my site in your encyclopedia or derived works.

On the second front I'm rather busy moving house at the end of the week
and I've been planning a bit of an update to my computing history pages
for a while - so I'm not sure when I'll have time to look closely at your
history of computing entry and possibly update it.  However I'll leave
this email in my pending folder in the hope that I'll have time to do so
in the not-too-distant future.

Good luck with the project,

-- 
Stephen White                    Oxford University Computing Society
System Administrator                  http://ox.compsoc.net/~swhite/
PGP Key ID: 0xC79E5B6A                       <swhite@OBSCURED9.ox.compsoc.net>

Fantastic!!! --User:LMS

See also: History of computing

2002 talk

Is there a reason for all the bold entries in the article? They don't seem consistent. I'd like to remove them. Aldie 15:53 Nov 29, 2002 (UTC)

It appears that the original authors were trying to break up the long, long blocks of type by bolding some of the names. Crossheads are the way to do this. Change all the existing === h3 heads to the correct == h2 heads, debold all the names, then go back and add some === h3 heads that say things like TV Typewriter, etc. Ortolan88
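For instance, in wiki markup (the heading text here is only an example), a bolded '''TV Typewriter''' paragraph lead would become a crosshead such as === TV Typewriter === sitting under a broader == h2 == section heading.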

2003 talk

Noyce and Kilby were independent inventors of the Integrated Circuit. Intel invented the Microprocessor, of course, but not Noyce. 169.207.117.23 21:48, 25 Nov 2003 (UTC)


This page and the timelines are incorrectly titled. They seem to be about the history of technology used in computing rather than the history of computing itself. Obviously, most computing until recently was done with pencil and paper, and that is not mentioned in these timelines. Would anyone object to moving this page to history of computing technology and starting a separate page that is about computing, not about machines used in computing? The statement that the "computing era" began only when computing machinery began is idiotic. Michael Hardy 21:48, 30 Nov 2003 (UTC)

Slide rules are not even mentioned on this page. Really, I'm beginning to think people trained in computer science should not be allowed in public places, in the interest of public safety. Michael Hardy 21:52, 30 Nov 2003 (UTC)

  • Done. 169.207.88.95 16:42, 1 Feb 2004 (UTC)

The article titled history of computing hardware is fairly long, but no one has attempted to write a history of computing itself on Wikipedia. Such an article would treat algorithms to be executed with pencil and paper, with or without the aid of tables, as well as computing with abaci, slide rules, or machines of any kind. Michael Hardy

I'd like that article (history of computing) to be renamed to "History of computing methods" for clarity. Tempshill 00:37, 4 Dec 2003 (UTC)

Why not move this article to History of computers, rather than the current and cumbersome History of computing hardware? Yes, before circa 1950, "computer" meant a person who did mathematical computation, and so one could argue that "History of computers" could refer to either computers (people) or the computers (machines)...but that would be a fairly trifling objection, I think. --Sewing 21:58, 17 Dec 2003 (UTC)

Hear, hear. A good suggestion for clarity and simplicity. --Wernher 22:05, 17 Dec 2003 (UTC)
Well, I want to change it but am reluctant to act, for 2 reasons: (1) There are a lot of pages that link to this one (which means manually changing them if I want to be a good Wikipedian); and (2) I may be wading into something I will later regret. I'll take a wait-and-see attitude for now... --Sewing 18:17, 18 Dec 2003 (UTC)

Title dispute

Originally listed at VfD

  • History of computers. It is currently a redirect to History of computing hardware. I couldn't move the 2nd article to the 1st, so I removed the redirect text in the 1st article, but I still couldn't do the move. "History of computing hardware" is a cumbersome attempt by a mathematician to distinguish the history of computers from the History of computing (the article's former title), which encompasses not only computers but pen and paper as well. His point is valid, but the new title he chose for the article is unnecessarily awkward. --Sewing 17:14, 21 Dec 2003 (UTC)
    • I am thinking whether History of computation is a better title than History of computing. btw There is a Timeline of computing, too. Optim 17:47, 21 Dec 2003 (UTC)
      • I agree History of computing is not ideal. But isn't History of computation also awkward? Anyhow, it goes back to Michael Hardy's argument that "computing" (and "computation") is not just about computers but about mathematical techniques that precede computers. I think History of computers is the best option: it is simple and unambiguous. --Sewing 18:08, 21 Dec 2003 (UTC)
      • History of computation still seems nice and more correct to me. Optim 19:01, 21 Dec 2003 (UTC)
      • I think the term computation is more often (academically) used for the theoretical side of things (algorithms, complexity,etc.), computers seems better for the practical side to me. --Imran 22:17, 21 Dec 2003 (UTC)
        • That's right. We can have a Computation article for the academic theoretical history and a Computers article for practical-business computing. What do you think? Optim 00:49, 22 Dec 2003 (UTC)
    • Keep, who wouldn't be interested in the history of computers? Lirath Q. Pynnor
    • Move to History of computers. Mathematics is as much a part of the history of computers as it is the history of their hardware. - Mark 06:58, 30 Dec 2003 (UTC)
    • I don't know why this was listed on VfD so long. It isn't really a VfD decision. It's more of a title dispute so I've listed it at Wikipedia:Current disputes over articles instead. Angela. 05:42, Jan 4, 2004 (UTC)
      • Well, it can't have been much of a dispute as no-one's discussed it for over two weeks, so I'm delisting it from the disputes page. Feel free to relist it if there really is a dispute. Angela. 01:41, Jan 22, 2004 (UTC)

Jack Kilby 1957

Even though I changed the date for the IC to 1958 to conform to the Nobel laureate article, I happen to know that Kilby thought of the IC during the mass vacation at TI (which would have been in late 1957). Kilby didn't have the vacation seniority, so he came to work at an empty Texas Instruments facility. The quietness of the work environment allowed Kilby to concentrate his thoughts and invent the IC. Thus the 1958 date must be the official publication date and not the actual date of conception. 169.207.115.129 01:41, 5 Jan 2004 (UTC)

Invention of the abacus

Some sources assert that the abacus was invented in China around 3000 BC; others that it was invented by the Romans or Babylonians around 1000-500 BC and traveled east to China. At present, this Wikipedia article says it was of Chinese invention. It would be nice to come up with an account of current opinion that is as complete, accurate, and NPOV as possible.


Turing Completion is not a good test for a computer

The article states that Turing completeness is "as good a test as any" for whether a machine is a computer. I fundamentally disagree. It is too easy to build a machine that is theoretically Turing complete. The Z3 has been shown to be theoretically Turing complete, yes. But so what! The Z3 had no conditional branching, and the proof that it was Turing complete relies on mathematical tricks defined in the 1990s. It was never intended to be used as a general purpose machine. Babbage's Analytical Engine was more flexible than the Z3. Furthermore, if the Z3 was Turing complete, I would lay money on a bet that the ABC was also Turing complete; it was functionally very similar. And what about the Colossi? The Mk II Colossi (of which 9, not 10, were built; the Mk I was later converted to a Mk II) at least had conditional branching. It too must have been "theoretically Turing complete".

It really isn't good enough to shy away from a hard definition by hiding behind the definition of Turing completeness. It has been shown that Conway's Game of Life is Turing complete. It is possible to build a universal Turing machine using only a carefully defined set of tiles and then applying Conway's rules. And what does this prove? It proves that Turing completeness is not a very difficult status to achieve.

Practical, as opposed to theoretical, Turing completeness is something very different. The first machine that could automatically exploit the fact that it was Turing complete, could do this in a practical way, and could solve real problems - that was the first computer. The ENIAC does not count; it was a serial single-purpose machine. Sure, you could rebuild it like so many Lego bricks, but that is hardly a practical general purpose computer. The Manchester Mk I was the first stored program machine, but its purpose was to prove that the Williams-Kilburn tube worked effectively as a memory, not to solve real problems. It was a research machine. The EDSAC at Cambridge was the first real computer in the modern sense. It was the first machine that could automatically exploit the fact that it was Turing complete, and it could do this in a practical way, not merely as a party trick or under laboratory conditions. It was the first machine to implement the von Neumann architecture and solve real problems. (The Manchester Mk I and maybe the BINAC preceded the EDSAC, but they never solved a real problem.)

A computer is a tool; it must be practically capable, not just theoretically capable. All the machines before EDSAC were theoretically general purpose but practically special purpose. A computer is a general purpose device. EDSAC was the first modern computer. (You may now rip me to pieces ;-) John R. Harris

A minor comment: contrary to the commonly held belief, the Colossus computer in fact did not have conditional branching. (Or, indeed, branching of any kind - or a program of any kind, for that matter!) So it definitely was not Turing-complete. See Talk:Colossus computer for more. Noel (talk) 05:05, 1 Mar 2005 (UTC)

The role of weather prediction in the development of computing

I am trying to work in Lewis Fry Richardson's use of differential equations for predicting weather. At the time he wrote his book in 1922, computing was not practical for predicting weather, and yet I believe Atanasoff was trying to solve some meteorological problems when he invented the ABC; thus there has been a meteorological application since the first electronic computer; to this day, supercomputers are used for predicting weather. Ancheta Wis 23:08, 5 May 2004 (UTC)

See: Navier-Stokes equations for the basic equation of weather prediction Ancheta Wis 18:08, 22 May 2004 (UTC) and also Wikipedia:WikiProject Fluid dynamics. Richardson's approach is listed in Numerical ordinary differential equations. Ancheta Wis 10:12, 25 May 2004 (UTC)

I am replying to a high school librarian's assessment of this article: upon repeated re-reading and editing of the statements in this article, I can state categorically that the edits are made in good faith. As a professional with decades spent on technology, I have learned and experienced items which not even a professional historian could possibly have learned. Since the field has expanded every decade since the 1880s, and since technologists have not had a venue for documenting their accomplishments until the advent of Wikipedia, their work has gone unsung until now. Ancheta Wis 16:58, 26 Aug 2004 (UTC)

Italics everywhere?

Why is it that seemingly every noun in the article is in italics? Did someone get confused about how to make Wiki links? Most of the italic portions would be (I think) most appropriately either deitalicised, or made into wiki links. (Italic emphasis gratuitously added to illustrate how tiresome it is to read something formatted like that.)

Unless there's some particular reason why it's like that, I'll try to change them around a bit at some point soon. PMcM 02:27, 3 Dec 2004 (UTC)

Michael Hardy puts the usage thus: When a noun is used in a sentence, then it is not italicized, unless the sentence is about that noun, in which case it is italicized. Here is a link to further use of italics. Ancheta Wis 02:47, 3 Dec 2004 (UTC) Thus when I refer to logarithms of numbers (which are about a transformation of the respective numbers), I italicize to emphasize the transformation of the operations of multiplication and division into the operations of addition and subtraction.


Speedy response! Just finished playing around with it.

Who is Michael Hardy? Going by the Wikipedia guidelines, I feel it more appropriate to have a lot (about 75%) of what is/was in italics in that article as wiki links.

Certainly if it was written on paper it would be more appropriate to have the visual cue of italic text, used sparingly here and there where it might be confusing otherwise, but I personally don't feel it's necessary in the majority of places it was present in the article. If you're really incredibly attached to them, please feel free to put them back in, but I think the article would be less well off without the inclusion of the links I added. Thanks. PMcM 03:06, 3 Dec 2004 (UTC)

Unrelated: Any idea why this talk page has no contents section? Is it likely to be something I have set wrong, or is it the same for others? PMcM 03:10, 3 Dec 2004 (UTC)

It does have a contents section, you just have to look hard for it ;-) The reason is, the top of the page is filled with comments divided using horizontal rules. The TOC doesn't appear until after those. — Matt 10:39, 3 Dec 2004 (UTC)

Also, apologies for the somewhat patronising tone I used to initially raise the issue. PMcM 03:13, 3 Dec 2004 (UTC)

What to do with this tale...

I removed this:

During World War II, Curt Herzstark's plans for a mechanical pocket calculator (see Curta) literally saved his life. In 1938, while he was technical manager of his father's company Rechenmaschinenwerk AUSTRIA Herzstark & Co., he had already completed the design, but could not manufacture it due to the Nazi annexation of Austria. Instead, the company was ordered to make measuring devices for the German army. In 1943, perhaps influenced by the fact that his father was a liberal Jew, the Nazis arrested him for "helping Jews and subversive elements" and "indecent contacts with Aryan women" and sent him to the Buchenwald concentration camp. However, the reports of the army about the precision production of the firm AUSTRIA, and especially about the technical expertise of Herzstark, led the Nazis to treat him as an "intelligence-slave". His stay at Buchenwald seriously threatened his health, but his condition improved when he was called to work in the Gustloff factory linked to the camp. There he was ordered to make a drawing of the construction of his calculator, so that the Nazis could ultimately give the machine to the Führer as a gift after the successful end of the war. The preferential treatment this allowed him ensured that he survived his stay at Buchenwald until the camp's liberation in 1945, by which time he had redrawn the complete construction from memory. See: Cliff Stoll, Scientific American 290, no. 1, pp. 92-99. (January 2004) Also see: [2].

While this is a fascinating tale, I'm not sure whether it's significant enough in terms of the history of computing hardware to deserve a long paragraph in an overview article on the topic. Mechanical calculators were commonplace by the 1930s, even if they weren't miniaturized. --Robert Merkel 23:38, 5 Dec 2004 (UTC)

Inserted the information into Curt Herzstark. Ancheta Wis 07:13, 6 Dec 2004 (UTC)

I just noticed that the DNA computing section in History of Computing was removed. Once one concedes that the travelling salesman problem is a true computation problem, then one must also concede that a computation using DNA is a computing hardware feat. If that is so, then the recognition that DNA can form the basis for a Turing tape is part of the history (and future) of computation; thus the recognition that DNA forms a code is part of the intellectual heritage of computing and part of its future. That is why Adleman actually solved a travelling salesman problem using DNA. But if that is truly a CS item, then Gamow deserves to be mentioned, as this was part of the work that occurred before 1960. Ancheta Wis 01:43, 14 Dec 2004 (UTC)

This is an overview, which means some editorial judgement needs to be made about what are the most essential points to be covered in the space available. There are any number of things this article omits or discusses only briefly. There is much that could be said about analog computers, for instance. DNA computing, while an interesting concept, has not seen wide practical adoption compared to the technologies descended from those covered on this page. Therefore, removal was IMO appropriate. --Robert Merkel 12:26, 14 Dec 2004 (UTC)

Colossus relays

I altered the following:

"The Colossus used only vacuum tubes and had no relays."

It seems Colossus did use relays, both for buffering output and as part of its counters: [3], [4] — Matt Crypto 09:30, 13 Dec 2004 (UTC)

American developments

Just curious; why is "American developments" a distinct section? --Khendon 14:37, 1 Mar 2005 (UTC)

Earlier versions indeed intermixed the various projects regardless of nation. Thus the current headings are a matter of preference by the contributors.

Hm. I think it makes much more sense to have a purely chronological article. --Khendon 16:46, 1 Mar 2005 (UTC)

I made the change in organization a while ago, to put Zuse and Colossus before the stuff on what was happening in America because it was the American work that led to ENIAC and the EDVAC design. Go back and read what it was like before and after the change was made and you'll hopefully see why I did it. Personally I think the article pays too much attention to Zuse and Colossus - they were both fascinating dead ends IMO - and if I was reorganizing the article further would considerably trim down the material on them. --Robert Merkel 23:13, 1 Mar 2005 (UTC)

According to http://www.scl.ameslab.gov/Projects/ABC/Trial.html, the role that the Atanasoff-Berry computer had in influencing the design of ENIAC may be grossly understated in this article.

Request for references

Hi, I am working to encourage implementation of the goals of the Wikipedia:Verifiability policy. Part of that is to make sure articles cite their sources. This is particularly important for featured articles, since they are a prominent part of Wikipedia. The Fact and Reference Check Project has more information. Thank you, and please leave me a message when a few references have been added to the article. - Taxman 19:33, Apr 22, 2005 (UTC)

Taxman, this is an overview article, summarising facts found across many other articles. Hence, there's likely not to be much direct referencing here. --Robert Merkel 04:19, 23 Apr 2005 (UTC)
Added W.J. Eckert's little orange book. It would be good to acknowledge Lewis Fry Richardson's work, but it was decades before computers arose which could implement his method. (Now it would be called a system analysis, but he invented a field here.) I don't see a suitable way to work it into the article, which is about hardware, after all. Ancheta Wis 07:46, 23 Apr 2005 (UTC)
Great, thanks for your work, that is much better. Certainly an overview article can have references that back up its facts too. - Taxman 13:49, Apr 23, 2005 (UTC)

Heron of Alexandria

I scanned the article page and couldn't find any reference to Heron. I thought it might be a good idea to put a reference to his automated theater in the beginning, right around Wilhelm Schickard. However, I'm not sure, since the concept of computing here seems to be more calculation-based; if no one has any problems, I think it would be a good addition. Heron's automated theater was a series of pegs with strings wrapped around them. Various weights were tied to the strings and controlled the movement of objects for the play. In the end it was a simple analog computer program. --Capi crimm 03:21, 24 May 2005 (UTC)

I propose that you start a new page, History of automata or History of automatons. Stanislaw Ulam, John Von Neumann, John H. Conway, ... Stephen Wolfram were/are quite aware that the computing paradigm has automata in it. The topic is called Category:Cellular automata. However the concept of computing is tied to Leibniz' notion of expression evaluation, which means, in the case of a computing machine, we are automating human computation. When we are automating the motion of a puppet, which is one of the things that computers can do, the subject is called Automatic control, or Cybernetics or Robotics. Computer used to be a job title. Perhaps someday computers will be called controllers, as well. Come to think of it, perhaps History of controllers or History of cybernetics would be a good page for Heron's automated theater. Ancheta Wis 08:52, 24 May 2005 (UTC)
When I followed the Cellular automata link myself, I found material on the history of cellular automata, which would work quite well on the future History of ... page to which I am referring, as well as Heron's automated theater. Would you like to start such a page? I could contribute to it as well. Ancheta Wis 08:57, 24 May 2005 (UTC) If History of Cybernetics were to be the page, then Norbert Wiener's concept of the steersman (the root meaning of cybernetics) or the pilot would come into play. Aeronautics would come into play as well, because the theory of control became pressing with the invention of the airplane. There is an extensive literature on automatic control in the IEEE Transactions. So we wouldn't just be flying blindly, to put a little pun in this. Machine vision could also be added, if the topic were to be Cybernetics.

List of books

I added a list of books for further reading. These are the ones I had on my shelf. The order may look haphazard, but I tried to put the more accessible ones at the top. I thought about ordering them by date or by author - if anyone thinks they should be that way, please feel free to change the order (and to add to the list, of course). --Bubba73 20:15, 7 Jun 2005 (UTC)

Thanks. If you had used them to fact check material in this or other articles, please consider listing them as actual references, or better yet, citing individual facts to them. Much better than taking a (potentially unknown) Wikipedia editor's word for the material is citing it to a reliable source. Use whatever format you like, but (Smith, 2003) is fine, or you can use some form of footnote system including the invisible one. Thanks again. - Taxman Talk 23:12, Jun 7, 2005 (UTC)
I haven't done much (if anything) to this page, but I've contributed a lot to articles on particular computers, mainly 1946-1954. I used 6 or 7 of those books, but mainly about 3 of them. I should have put a reference for each of the edits (I did on a few) at the time; now it is hard to know where I got what information without looking it up again.

Too much analog...

Maybe some of the new analog computer stuff should be trimmed (and placed in the appropriate article), as it leaves the article as a whole rather unbalanced. --Robert Merkel 04:15, 11 Jun 2005 (UTC)

Add more to the other sections then. Greg321 10:39, 11 Jun 2005 (UTC)

This article is supposed to be a readable summary of the history of computing hardware. At the moment, this is like a history of the automobile that spent half its content looking at steam-powered cars. The excessive information obscures the forest for the trees.--Robert Merkel 23:29, 11 Jun 2005 (UTC)
While I am not an advocate of the current balance in the content, the current information does highlight the fact that steam-powered computation was a vision which pioneers like Babbage and William Stanley Jevons were pursuing. It is not irrelevant, as it shows that technologies such as mechanical and electrical analog computation, electronic digital computation, DNA computation and quantum computing are all possible technologies, and certainly not the only possible forms for computing hardware. The period from 1945-1950 was important, but not the only possibility. It could have happened several other ways. The mechanical antecedents are very important, as they illuminate a path that could have been taken as early as the 1500s; only the requisite precision of manufacture of the computing devices was missing for large-scale computing hardware. Ancheta Wis 12:40, 13 Jun 2005 (UTC)
That's an interesting point to discuss a little bit, but too much speculation as to what might have happened if history had turned out differently is likely to get unencyclopedic very quickly. --Robert Merkel 01:25, 14 Jun 2005 (UTC)
William Stanley Jevons actually was one who dreamed of steam-powered computation, possibly as early as the time when he lived in Australia 150 years ago. His Logic Machine was exhibited in Sydney last year, BTW. Ancheta Wis 21:42, 14 Jun 2005 (UTC)

More detail on some sections

More detail could be added to the sections on electronic computation. We could add more on the role of Herman Goldstine, von Neumann, etc. for example. What I have in mind is the chance meeting of Herman Goldstine and von Neumann on the Princeton train, and how it turned into a doctoral examination on Computer Engineering for Goldstine. Another item might be how the Israelis got the von Neumann architecture first hand; that is how their first machine got built. Another item might be the use of Binary Coded Decimal in the first electronic computers. Perhaps we might sketch a little outline before actually adding in the text. Ancheta Wis 23:36, 14 Jun 2005 (UTC)

  • I agree. Some of this is in other articles. See IAS machine for how Israel got the von Neumann architecture - the plans for the IAS were made freely available, and about 15 computers were based on the design, WEIZAC was one of them. Bubba73 01:10, 15 Jun 2005 (UTC)

A comment from another point of view

I was active in leading edge electronics in about 1970. The article makes good sense and fills me in on a number of things I didn't know. It's a good article. It doesn't mention the driving force for miniaturization, nor where the money came from to do the leading edge research and its subsequent application. It doesn't talk about how transistors, once discovered, were worked into machines which made a few decisions by themselves, based on inputs from other machines. The theoretical developments (i.e. transistors, field effect transistors, storage devices) were implemented into hardware and miniaturized in a number of ways. While I don't know exactly, the USA military was a huge force in the area. Taxpayers in large measure paid for research (universities) and implementation of research (military projects). As an example of implementation, I worked for a company that contracted to the military for a megabuck, producing 4 radar receivers that could be installed on 4 aircraft. The point I'm making is that military money was a prime source of the energy that miniaturized computers; there is a trickle-down effect that goes on even today. The military spends money to have leading edge equipment, and then that expertise trickles down to the consumer. My impression is that today the trickle-down happens faster than in the 1970s. Anyway, it's a good and useful article as it stands, happy days. Terryeo 13:42, 4 February 2006 (UTC)

Overview

This article, along with ENIAC, Computer, Atanasoff–Berry Computer, Zuse and related pages, and others, are crying out (IMHO) for some sort of organizational overview. Does anyone know what Wikipedia policy is on such pages? -- Gnetwerker 18:36, 16 March 2006 (UTC)

So what do you have in mind? This article exists because User:Michael Hardy strongly differentiated the history of computing from its hardware. And there are the timelines of computing history from the Jargon files, which also strongly influenced the article. The computing article may be what you have in mind as a venue, as it is mostly lists with a veneer of prose. Perhaps you might place your overview there? --Ancheta Wis 18:52, 16 March 2006 (UTC)

A question about units

I was curious as to why metric units were used throughout the article, especially in the section on American developments. Could Imperial units be put in parenthetically? Yeah, I know, lazy Americans and all that… — ChardingLLNL 18:13, 20 July 2006 (UTC)

put 'before 1960' in the title

As it is, the article title is misleading and inaccurate. --Apantomimehorse 17:26, 17 August 2006 (UTC)

The article evolved. This is the original (or basic) article, implemented upon the suggestion of User:Michael Hardy, who wanted to clearly distinguish the history of computing from its hardware. I oppose turning a Featured Article upside down in favor of a renaming dispute. The article, if cast into the decades format, would be indistinguishable from the Timeline of computing. But the current article clearly shows this history before electronic computers, which will eventually be superseded, whereas computing will survive as long as mankind survives. The hardware started with bones and sticks. 1960 is an accident of history and of the 32K size convention for an article. --Ancheta Wis 19:43, 17 August 2006 (UTC)

Picture obscures text

The picture of Herman Hollerith is on top of some text when seen in Firefox at a resolution of 1280 x 1024 in full screen. Sdp1978 00:12, 5 January 2007 (UTC) virendra

Speculative sentence?

"This technology was lost, however, and over 1,600 years would elapse before similarly complex computing machines were again created." - This sentence about the Antikythera mechanism seems purely speculative to me. This is arguing from absence of evidence, and largely unnecessary in this particular article in any case. Since we have not much evidence of prior art either, one might just as well claim this was a totally unique object in it's times, a totally unreasonable inference. -- Cimon Avaro; on a pogostick. 16:10, 5 April 2007 (UTC)

I have moderated the above statement in the article. I have a further query though... Is it fair to say that Kepler truly revolutionized astronomy? To me it appears like a peacock term. -- Cimon Avaro; on a pogostick. 06:36, 15 April 2007 (UTC)

400 years ago, Kepler spent 20 years of his life discovering his three laws, based on Tycho Brahe's observations. Yes, he really did revolutionize Astronomy. He was the first to do what he did. --Ancheta Wis 06:45, 15 April 2007 (UTC)
That's okay then -- Cimon Avaro; on a pogostick. 18:16, 15 April 2007 (UTC)

A question

Why does the EC-130 link point not to the EC-130 computer, but to the EC-130 aircraft instead?

Punch card history

Punch cards actually had a predecessor, namely the play drums found in carillons. They were widely used from the sixteenth century on in the Low Countries. Play drums were linked to a clock to automatically play music every hour. A picture of a play drum can be found in the Dutch Wikipedia article on carillons. Essentially this is very similar to book music, just a little more primitive. So I think it should be mentioned in the article. Gespenster 19:07, 10 August 2007 (UTC)

Von Neumann

The unabashed credit given here to Von Neumann for the stored-program computer architecture is not reflected by most historians or the Wikipedia page on Von Neumann himself -- go look. The interpretation agreed on by historians and the people who worked on the EDVAC project was that Von Neumann was collecting notes on the group's presentations, and decided, unilaterally, to publish them under his name. Presper Eckert is less kind and basically says that Von Neumann clearly decided to grab the credit for himself.

In any case, I doubt it serves any purpose for Wikipedia to distribute this kind of misinformation. —Preceding unsigned comment added by 76.102.198.58 (talk) 08:01, 30 September 2007 (UTC)

This is now fixed in the article. --Ancheta Wis (talk) 08:11, 10 May 2008 (UTC)

Punched cards are still used and manufactured in the current century

This sentence automatically updates its meaning when the century changes, and it changed only a few years ago. What century was intended? tooold 08:04, 4 October 2007 (UTC)

Changed the sentence. Thank you for your note. --Ancheta Wis 10:00, 4 October 2007 (UTC)

Copyedits needed for 2nd generation section

In the interests of keeping this article a featured article, might we move the latest contribution on 2nd generation computers to the talk page and work on the English prose before re-instating it to the article page? --Ancheta Wis (talk) 14:17, 2 January 2008 (UTC)

Hi, that's my work! What is wrong with it? 92.1.67.188 (talk) 14:35, 2 January 2008 (UTC)
Hi, I replied on your talk page to redirect here.
  1. The expert you allude to in your first paragraph was Thomas J. Watson
  2. The Von Neumann pre-print of 1947 went all over the world. That is how Israel built its first computer, for example. Russia did the same. What the 1954 date on Italy's first computer shows is that they either built on Von Neumann's architecture or studied other documents. In any case, a citation would be good.
  3. Did the IBM 1401 use only transistors?
  4. Does tenths of thousands mean 10000 or 100?
In any case, I think you see what I mean. The English needs copyediting. I am not referring to your content, with the exception that we need citations. --Ancheta Wis (talk) 15:19, 2 January 2008 (UTC)
I have commented out the contribution, as the English needs copyediting. The sentences are disconnected, the timeline of development for second generation is nonsequential, there is no flow from one statement to the next. --Ancheta Wis (talk) 10:47, 17 February 2008 (UTC)

computers are an amazing creation of sensation. —Preceding unsigned comment added by Sckater (talkcontribs) 21:50, 8 March 2008 (UTC)

question

I was wondering whether the Antikythera mechanism is the first computer, because there are a lot of articles that make that claim. Tomasz Prochownik (talk) 21:05, 23 April 2008 (UTC)

Follow the link for the latest thinking on the matter. --Ancheta Wis (talk) 23:18, 23 April 2008 (UTC)

Citations needed

Fellow editors, User:Ragesoss has noted that we are building back up the citations for this FA. When this article was first formed, the rise in standards for Featured Articles had not yet occurred. Since I have been volunteered for this, there will be an American bias to the footnotes I am contributing; please feel free to contribute your own sources.

Please feel free to step up and add more citations in the form of the following markup: <ref>Your citation here</ref>. You can add this markup anywhere[1] in the article, and our wiki software will push it to the <references/> position in the article page, individually numbered and highlighted when you click on the ^. As an illustration, I placed this markup on the talk page so that new users can even practice on this talk page.
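For instance, a minimal sketch (the reference name and citation text here are made up for illustration): define a source once with <ref name="BellNewell1971">Gordon Bell and Allen Newell (1971), Computer Structures: readings and examples</ref>, then cite it again elsewhere with just <ref name="BellNewell1971" />, and the <references/> tag renders the combined, numbered list.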

In my opinion, the best source is Bell and Newell (1971)[2], which is already listed in the article. I do not have time to visit the local university library, so my own contributions are from sources which I have on my own bookshelves; this may be appropriate since the seminal period 1945-1950 will probably be viewed as the heyday of the first generation of electronic digital computers, which blossomed in the US.[3],[4],[5],[6],[7],[8],[9] I recognize that there will need to be more citations from the Association for Computing Machinery and the IEEE Transactions, but that will have to come from those editors who are in the Wikiproject on computing. In particular, the Radiation Laboratory of MIT published a series of books The M.I.T. Radiation Laboratory Series[10] which are the foundation for computing hardware, in tandem with the Manhattan Project; what is common to these projects is that they involved groups of cooperating contributors.[11] Before the howls of outrage begin, please note that the exact forms of computer hardware had not yet been selected in this period, but since the technologists were already in place for other purposes, it was a small step to the forms of hardware we see today.[12],[13],[14],[15],[16],[17],[18] The forms of hardware could easily have gone in other directions, and our current computers could have been different from what they are.[19],[20]

New users (especially those with a CS or EE background), please feel free to contribute your citations. Wikipedia:Five Pillars summarize the guidelines for editors, and your cheatsheet for markup can be found here. Users can append comments to the foot of this talk page, signed with the signature markup: --~~~~

Casual readers might note that the references which will be added to this article can be purchased quite cheaply on the Internet (typically for a few dollars), which in sum would amount to a nice education in this subject. --Ancheta Wis (talk) 09:31, 3 May 2008 (UTC)

We are up to 59 footnotes. You can examine the edit history to see how the citations were embedded in the article, as well as study this section, for examples on how to do it. --Ancheta Wis (talk) 10:01, 6 May 2008 (UTC)

User:SandyGeorgia has noted that the citations are expected to have a certain format. Everyone is welcome to improve the citations. --Ancheta Wis (talk) 01:42, 7 May 2008 (UTC)

It appears that the footnote macro is space-sensitive. For example <ref name=IBM_SMS/ > works, but <ref name=IBM_SMS/> causes error messages unless a space is added after the trailing slash. To see this, look at this diff --Ancheta Wis (talk) 09:42, 9 May 2008 (UTC)

Sample citation format from User:Wackymacs:[21]

  • This one was formatted incorrectly. There should be a "|" in between the url and the accessdate like this:[22]
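In template form, the corrected call might look something like this (the URL here is only a placeholder, since the original does not give one):

{{cite web | url=http://www.example.edu/cards/history.html | title=Punched Cards: A brief illustrated technical history | author=Jones, Douglas W. | publisher=The University of Iowa | accessdate=2008-05-15}}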

References sample illustration

  1. ^ Your citation here
  2. ^ Gordon Bell and Allen Newell (1971) Computer Structures: readings and examples ISBN 0-07-004357-4
  3. ^ Von Neumann's 1945 First Draft of a Report on the EDVAC, which Herman Goldstine mimeographed and distributed worldwide, had a global effect, producing von Neumann-architecture computer systems world-wide. For example, the first computer in Israel was built this way.
  4. ^ Federal Telephone and Radio Corporation (1943, 1946, 1949), Reference Data for Radio Engineers
  5. ^ The Jargon File, version 4.4.7 The Jargon file
  6. ^ Charles Belove, ed. (1986) Handbook of modern electronics and electrical engineering, ISBN 0-471-09754-3
  7. ^ Sybil P. Parker, ed. (1984) McGraw-Hill encyclopedia of electronics and computers ISBN 0-07-045487-6
  8. ^ Arthur B. Glaser and Gerald E. Subak-Sharpe (1977), Integrated Circuit Engineering ISBN 0-201-07427-3
  9. ^ Richard H. Eckhouse, Jr. and L. Robert Morris (1979), Minicomputer Systems: organization, programming, and applications (PDP-11) ISBN 0-13-583914-9
  10. ^ For example, John F. Blackburn (1947), Components Handbook, Volume 17, M.I.T. Radiation Laboratory Series, Lexington, MA: Boston Technical Publishers
  11. ^ "I must say that I did not design Windows NT -- I was merely one of the contributors to the design of the system. As you read this book, you will be introduced to some, but not all, of the other contributors. This has been a team effort and has involved several hundred person-years of effort." -- Dave Cutler, Director, Windows NT Development, in the foreword to Inside Windows NT, ISBN 1-55615-481-X, by Helen Custer, p. xix.
  12. ^ Ron White (1995), How Computers Work ISBN 1-56276-344-X
  13. ^ Scott Mueller (2002), Upgrading and repairing PCs ISBN 0-7897-2683-1 CHECK_THIS_ISBN
  14. ^ Harry Newton (1998), Newton's Telecom Dictionary ISBN 1-57820-023-7
  15. ^ George McDaniel, ed. (1993), IBM Dictionary of Computing ISBN 0-07-031489-6
  16. ^ Paul Horowitz & Winfield Hill (1989), The Art of Electronics ISBN 0-521-37095-7
  17. ^ David A. Patterson and John L. Hennessy (1998), Computer Organization and Design ISBN 1-55860-428-6
  18. ^ Alan V. Oppenheim and Ronald W. Shafer (1975), Digital Signal Processing ISBN 0-13-214635-5
  19. ^ W.J. Eckert (1940), Punched card methods in scientific computation, Lancaster, PA: Lancaster Press
  20. ^ Robert Noyce's Unitary circuit, US patent 2981877, "Semiconductor device-and-lead structure", granted 1961-04-25, assigned to Fairchild Semiconductor Corporation 
  21. ^ Jones, Douglas W. accessdate=2008-05-15 Punched Cards: A brief illustrated technical history. The University of Iowa.
  22. ^ Jones, Douglas W. Punched Cards: A brief illustrated technical history. The University of Iowa. Retrieved on 2008-05-15.

Zuse and Von Neumann

According to Hennessy and Patterson, Von Neumann knew about the details of Zuse's floating-point proposal. This suggests that the sentence 'Zuse was largely ignored' should be stricken. Any objections? --Ancheta Wis (talk) 10:30, 5 May 2008 (UTC)

Zuse did not implement the floating-point design he patented in 1939 before WWII ended. Von Neumann was aware of Zuse's patent and refused to include it in his Princeton machine, as documented in the seminal paper (Burks, Goldstine and von Neumann, 1946). -- Hennessy and Patterson p. 313, note "A decimal floating point unit was available for the IBM 650, and [binary floating-point hardware was available for] 704, 709, 7090, 7094, ... ". "As a result, everybody had floating point, but every implementation was different."

To this day, floating point operations are less convenient, less reliable, and more difficult to implement (in both hardware and software). -Ancheta Wis (talk) 08:07, 10 May 2008 (UTC)

'First electronic computer'?

This assertion is made about the Colossus in this article. It is also made about the ACE in that article. THERE CAN BE ONLY ONE! Twang (talk) 18:59, 10 May 2008 (UTC)

On the other hand, the article also states "Defining a single point in the series as the "first computer" misses many subtleties." thank you for BEING BOLD! You are welcome to contribute to the article and the talk page! --Ancheta Wis (talk) 20:34, 10 May 2008 (UTC)
Not to be too pedantic, but the article is an example of how a recurring need (in this case, the need to calculate) gets met multiple ways, at multiple times, by multiple people trying to solve a problem. For example, Pascal was trying to help his dad collect taxes; ENIAC was used to fight a war by calculating the trajectories of artillery shells; Zuse was trying to ease the burden of his engineering work; Colossus was trying to decode secret messages; IBM was trying to extend the use of its punch card machines for business purposes; Maurice Wilkes was excited about the possibilities of the First Draft of the Design for EDVAC. You get the idea: it's asking 'What does first mean?'. As we now know from spacetime, time depends on the observer - what does first mean in that case? It only has meaning in the context of a thread. Thus clearly, Maurice Wilkes came after ENIAC, but before the implementation of EDVAC. Colossus was secret, so it was part of a different thread, by definition. And in the article, there is evidence that von Neumann knew something of the ideas of Zuse, so the design and architecture of EDVAC is after Zuse. However, you cannot say that the implemented EDVAC is after Wilkes' machine implementation - they are parallel threads which branched after Wilkes was influenced by the First Draft. These ideas are part of Lee Smolin's book Three roads to quantum gravity ISBN 0-465-07835-4 pp.53-65. (As you can see, classical logic needs to be reformulated. The world is not monotonic.) I don't have Smolin's book in front of me so I can't give you a page number right now. And I can't put what I just wrote in the article because I don't have a citation other than Smolin, which isn't explicitly about computing hardware (it's about physical processes in general). --Ancheta Wis (talk) 21:07, 10 May 2008 (UTC)
Just following up about ACE, the Automatic Computing Engine. It's the same idea. Turing owed nothing to EDVAC. So there are other editors who have the same kind of reasoning as Smolin's work, stated above. However, just Turing's knowledge that EDVAC is possible said a lot to him -- the ACE solution also has to obey the laws of physics, like EDVAC; thus the ACE problem solvers had a lot less work to do when solving their specific issues on the way to a goal.
These kinds of problems, about priority and independence, are being solved with clean rooms, where developers work in isolation from other implementers. This is all faintly antique for anyone in the open source movement; all that has to be done in open source is to include the provenance of the code base, to keep it Open.
That's where Wikipedia can make its mark on world culture: we can keep everyone honest about who owes what to whom, by citing our sources. This article clearly states that von Neumann owed much of his First Draft to Zuse, Eckert/Mauchly (who owe something to Atanasoff/Berry) and the rest of the inventors who came before him. And Wilkes (and the rest of the world) owe much to von Neumann, etc. Since Turing's ACE does not have priority over Wilkes' machines, the ACE article should probably heavily qualify the meaning of first in its text. That brings us to Emil Post, the American logician who is independent of Turing, but who waited too long to publish. (He had his ideas 15 years before Turing's 1936 publication...) --Ancheta Wis (talk) 21:39, 10 May 2008 (UTC)

Contributions welcomed.

Fellow editors, you are welcome to make your contribution to this article. See the sections above for examples on adding citations. Be Bold.

--Ancheta Wis (talk) 10:43, 11 May 2008 (UTC)

ENIAC 1,000 times faster than its contemporaries

The article currently states "(Electronic Numerical Integrator and Computer) .... it was 1,000 times faster than its contemporaries." As it is stated that ENIAC was Turing complete, if it had been programmed to break "Tunny" would it have been 1,000 times faster than Colossus? If not then this sentence needs changing. --Philip Baird Shearer (talk) 10:08, 13 May 2008 (UTC)

If we are comparing electromechanical relays to vacuum tubes then the statement is correct. But Tunny came after ENIAC, so it is a descendant, and not a contemporary, which would have been Z1 (the only unclassified project).
You might change the article page, for example, replacing contemporaries with Z1 in the statement. Citations are welcomed. This page needs more contributors! --Ancheta Wis (talk) 03:35, 15 May 2008 (UTC)
The sentence has been changed. --Ancheta Wis (talk) 08:41, 19 May 2008 (UTC)

The number of pictures

Ancheta Wis, you're doing amazing work here - but don't you think the article should have fewer pictures? — Wackymacs (talk ~ edits) 06:23, 15 May 2008 (UTC)

Thank you for your kind words. I propose to comment out Herman Hollerith, the Jacquard loom, the Manchester Baby, and others.
Editors, you are welcome to contribute to this article and talk page. Be Bold. Citations wanted.
--Ancheta Wis (talk) 10:06, 15 May 2008 (UTC)
Good work. Still too many. Some images obscure section headings (in other words, push them out of order). Also, per WP:MOS, images should not be placed directly under a section heading on the left side. — Wackymacs (talk ~ edits) 10:10, 15 May 2008 (UTC)

Citations

It is no good adding lots of citations when half of them are not formatted properly with the citation templates provided. Please see Wikipedia:Citation templates. All web citations should use the Cite web template, and must have an access date. Also, a lot of the current citations look questionable, and some are useless (for example, the two citations in the lead explaining hardware and software - why? Wikipedia has articles on both of these). — Wackymacs (talk ~ edits) 10:45, 15 May 2008 (UTC)

So the next step is to revisit the citations, using the sample you have provided and reformat. As part of the history of this article, when we did this, the footnote software had not yet reached its current state. I hope it is stable enough to rely on for the future. I have no objection to go back and revisit the footnotes, as I am a believer in the spiral development process. --Ancheta Wis (talk) 08:06, 16 May 2008 (UTC)
The "Example 2 article text" appears to be a codification of the usage of ordinary wiki markup practices over the years. I propose reformatting the existing citations into that style. I must say that it appears to place human editors into the position of data entry clerks for the care and feeding of the citation monster. After reading Wikipedia:Citation templates, my reaction is that this article/policy? will evolve.
My personal preference is for "Example 2 article text", and my guess is that any of the items in Wikipedia:Citation templates is acceptable to the FA reviewers. True statement? --Ancheta Wis (talk) 08:29, 16 May 2008 (UTC)
You can either use {{Citation}} for everything, or a mixture of {{cite news}}, {{cite web}}, {{cite book}}, and so on. Both methods are acceptable at FA. — Wackymacs (talk ~ edits) 08:54, 16 May 2008 (UTC)
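As a sketch, a book footnote with a specific page might then be entered like this (parameter names per the cite book template of the time; the page number here is only a placeholder):

{{cite book | last=Bell | first=Gordon | coauthors=Allen Newell | title=Computer Structures: readings and examples | publisher=McGraw-Hill | year=1971 | isbn=0-07-004357-4 | pages=102}}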
My last re-format using the cite template ate the name of a co-author. I have to go now, and will return to this issue later. --Ancheta Wis (talk) 16:53, 17 May 2008 (UTC)
This diff shows 27119 net bytes (a 33% increase) have been added to the article since 29 April 2008. I have attempted to address the concerns of Wackymacs (1c) and SandyGeorgia (1a) in the meantime. --Ancheta Wis (talk) 10:50, 19 May 2008 (UTC)
All book footnotes should have specific page numbers. Ancheta Wis, can you start adding page numbers (assuming you have the books which are referenced in footnotes)? — Wackymacs (talk ~ edits) 16:50, 5 June 2008 (UTC)
My books are largely in my basement, with the exception of the 40-lb. box I dragged upstairs for the article. But some of the books I have not looked at since I left the semiconductor industry some decades ago, which does not mean I do not remember where I learned a given fact, and in which already-cited book title. I am thinking of Mead and Conway, to be specific. To avoid time pressure, because I cannot predict where I will unearth a given book (in what box; as is probably apparent, I own thousands of books, not to mention 3 editions of Britannica), I will simply comment out those book refs which lack page numbers. I will also try to conserve on bytes in the references for the sake of the page limit. --Ancheta Wis (talk) 00:12, 6 June 2008 (UTC)