Talk:History of computing hardware/Archive 1

From Wikipedia, the free encyclopedia

2001 talk

I doubt very much now that Harvard Mark I was fully programmable. Later models could switch conditionally from one paper roll to another, but since I don't believe it could rewind the paper rolls, no actual loops are possible and it's not Turing complete. But I'm not sure. Anybody know details about the Mark I?

Oh, and I just read that the Manchester Mark I was actually the first functional von Neumann machine, even before EDVAC -- but of course based on EDVAC's ideas. --AxelBoldt


I believe you are correct about the IBM-Harvard Mark I. This machine, by the way, was not built at or by Harvard. It was built for Harvard (and the U.S. Navy) by IBM.

My understanding is that the first operational stored program computer was the "Manchester Baby Mark I", a test machine for the Williams-tube storage technology, not the Manchester Mark I itself. The EDSAC at Cambridge appears to have preceded the Manchester Mark I as the first "practical" stored program computer in operation.


Axel: An electromechanical computer necessarily uses some electronics. Thus "electro". But I understand what you mean about the electronics/electromechanical distinction.

Aiken directed the construction of the ASCC by IBM engineers at the IBM Endicott labs. Construction was completed in 1943. It was moved to Harvard, and operation began May 1944. [1]

As stated, the EDVAC was never completed--so all EDVAC-based computers were "before EDVAC". The "Baby" was the first machine based on the EDVAC design that got a program running. --The Cunctator

I gotta say, the Wiki method really works--this entry has gotten amazingly better in a very short period of time. It's still a little too discursive (some of the specificity would be better in stand-alone entries), but it's highly informative and readable. --The Cunctator


Not to disagree, but there's still a whole lot missing. No mention of Whirlwind, SAGE, PLATO, to give just a few examples.

Is that a disagreement or not? SAGE is mentioned in the history of networking... --The Cunctator


It seems like the end of the article is the original timeline that was visible in the table at the top of this page, and it reads very much like a timeline. Wouldn't more of an overview and synthesis be appropriate, considering we have the (very good IMHO) other timeline?


I agree, particularly the latter part of the article has too many dates, names and details obscuring the general flow of progress. --AxelBoldt

I don't want to be argumentative, but I thought the new article didn't tell much of anything before WWII or after 1970, let alone flow of progress. The flight control system of the F14, while interesting, was hardly a landmark computer.

Yes, there was a fair amount where I just went in and pasted missing stuff from the old page. However, I feel it is more important to have date-filled placeholders than nothing at all. Now that some base data is there, anyone can go in and rewrite/rearrange it. By all means, feel free to edit as you see appropriate. The power of Wiki :-) --Alan Millar


Names, dates, and details are good things; but they need to be pushed down into more detailed articles on more specific topics. At the same time, an overview/summary/synthesis needs to be presented at this level. But my guess is it's easier to do this bottom-up rather than top-down. In other words, collect all the detailed information first, then refactor into appropriate levels of detail.

Also, should this article cover software as well as hardware? -HWR

Of course, hardware w/o software is scrap metal. The question is whether it's tangible enough to produce records. --Yooden


Anyone can refactor (a basic design feature of Wiki), but only if there is some information to refactor, so I think the bottom-up approach is necessary.

But all the information is already on the Computing timeline page, so why repeat it here? I think this article should have a bird's eye view on Computing history, just outlining the developments, and not listing anecdotes such as ads bought by certain companies at certain sports games. --AxelBoldt


As to Swiss clocks: the essence of computing is not the addition and subtraction of numbers, although it grew out of it and is a necessary part of it. The essence of computing is the execution of a sequence of instructions, and in that respect modern computers have as much in common with Swiss clocks as the abacus. And no, I'm not recommending removing the reference to the abacus :-) --Alan Millar

Swiss clocks neither process information nor can be programmed. They are just fancy mechanical devices, like all mechanical clocks. I don't see any relation to the history of computing except maybe that some early mechanical calculators used similar mechanisms as mechanical clocks (why Swiss?). Also, why are they mentioned in the paragraph about programmability? --AxelBoldt

What about music boxes? They're programmed to play tunes. -HWR

They have a single sequence, as do player pianos, and player pianos can even use a different paper roll to play a different tune. In that respect, the music box mechanically is a predecessor to the Jacquard loom. The Swiss clocks had multiple sequences of actions, where a main cog would activate other cogs to order different actions. The first GOSUB? :-) --Alan Millar

Actually, there are music boxes that play tunes from interchangeable discs. I don't know the chronology of this however.

BTW, is this article restricted to the history of DIGITAL computers? Analog computers don't generally execute sequences of instructions. -HWR


"IBM decided to enter the PC market ..., with the IBM XT" is not correct -- the XT was their second machine, with the hard drive.

That's correct--I'll change it. The first one was simply called the "IBM PC". Some mention of Compaq and the beginnings of the clone market in that era seems appropriate too. --LDC


I'm afraid this entry is getting too timeline-y...but I see that others are aware of that. Looks like we need to start thinking about some more subentries...anyone have any suggestions? --The Cunctator


Unfortunately the timeline here has many inaccuracies and omissions of historical importance: 1965: IBM System 360 (first OS); 1968 first mouse/window system demo; 1973: CPM first micro OS; 1969 Intel 4004; 1977 Commodore Pet & TRS 80; 1978 Atari 400/800; 1979 Motorola 68000 32 bit CPU (w. 16 bit data and 24 bit address bus); 1981 Commodore Vic20 & IBM PC & Xerox Star (w. GUI/Mouse/Ethernet...); 1982 Commodore 64 with 64k RAM $600 & Timex Sinclair 2K RAM $99; 1983 1 million Commodore Vic20s and 1 million Apple IIs sold; 1985 Commodore Amiga with multitasking/Color GUI/accelerated video/stereo sound/3.5" floppy $1200; 1988 7 million Commodore 64 and 128 computers sold.... --Jonathan--

Feel free to enter whatever you think is missing to Computing timeline, not to History of computing. --AxelBoldt


Ack! It's getting insanely more timeliney! I'm thinking of paring. Please, everyone, notice Computing timeline. History of computing isn't supposed to list every computer, but to discuss the intellectual development of the engineering/science of computing. --The Cunctator


Moved from /Permission-subpage:

I have obscured the email addresses in the message below in an obvious way. --AxelBoldt

Received: from mail11.svr.pol.co.uk
	by mail.metrostate.edu; Tue, 21 Aug 2001 19:25:28 -0500
Received: from modem-88.bass.dialup.pol.co.uk ([217.134.8.88] helo=arthur.the-roost)
	by mail11.svr.pol.co.uk with esmtp (Exim 3.13 #0)
	id 15ZLpr-0001gy-00
	for Axel.Boldt@OBSCURED1.metrostate.edu; Wed, 22 Aug 2001 01:25:32 +0100
Received: from benji.the-roost
	([10.0.0.5] helo=localhost ident=mail)
	by arthur.the-roost with esmtp (Exim 2.12 #1)
	id 15ZLpq-0003Te-00
	for Axel.Boldt@OBSCURED2.metrostate.edu; Wed, 22 Aug 2001 01:25:30 +0100
Received: from stephen by localhost with local (Exim 3.12 #1)
	id 15ZLpp-0000vx-00
	for Axel.Boldt@OBSCURED3.metrostate.edu; Wed, 22 Aug 2001 01:25:29 +0100
Date: Wed, 22 Aug 2001 01:25:29 +0100
From: Stephen White <swhite@OBSCURED4.ox.compsoc.net>
To: Axel Boldt <Axel.Boldt@OBSCURED5.metrostate.edu>
Subject: Re: Computing history timeline for GNU encyclopedia
Message-ID: <20010822012529.A3581@benji.the-roost>
References: <sb812be5.012@mail.metrostate.edu>
Mime-Version: 1.0
Content-Type: text/plain; charset=us-ascii
Content-Disposition: inline
User-Agent: Mutt/1.2.5i
In-Reply-To: <sb812be5.012@mail.metrostate.edu>; from Axel.Boldt@OBSCURED6.metrostate.edu on Mon, Aug 20, 2001 at 03:25:19PM -0500
Sender:  <stephen@OBSCURED7.trillian.earth.li>

---- Original Message ----
> From Axel Boldt <Axel.Boldt@OBSCURED8.metrostate.edu>
> Date: Monday, 20 Aug 2001, 21:25
>
> I noticed that you have the definite computing history timeline on
> your web site. Maybe you have heard about the GNU style
> encyclopedia at http://wikipedia.com ; we currently have only a weak
> entry about computing history (in fact some of it seems to be
> illegally copied from your site). Would you consider donating your
> timeline to the Wikipedia? You can enter and edit the article about
> computing history yourself, just go to
> http://wikipedia.com/wiki/History_of_computers and click on "edit this
> page right now".

Ok.  First I'll give you permission to use whatever you want from my
computing history site in the encyclopedia.  I'd appreciate it if the
link http://www.ox.compsoc.net/~swhite/history.html is retained for
people to get the most up-to-date version of my information, however
since the GPL doesn't allow for such provisos this will remain an
informal "Gentleman's agreement" and is not legally required for the
inclusion of material from my site in your encyclopedia or derived works.

On the second front I'm rather busy moving house at the end of the week
and I've been planning a bit of an update to my computing history pages
for a while - so I'm not sure when I'll have time to look closely at your
history of computing entry and possibly update it.  However I'll leave
this email in my pending folder in the hope that I'll have time to do so
in the not-too-distant future.

Good luck with the project,

-- 
Stephen White                    Oxford University Computing Society
System Administrator                  http://ox.compsoc.net/~swhite/
PGP Key ID: 0xC79E5B6A                       <swhite@OBSCURED9.ox.compsoc.net>

Fantastic!!! --User:LMS

See also : History of computing

2002 talk

Is there a reason for all the bold entries in the article? They don't seem consistent. I'd like to remove them. Aldie 15:53 Nov 29, 2002 (UTC)

It appears that the original authors were trying to break up the long, long blocks of type by bolding some of the names. Crossheads are the way to do this. Change all the existing === h3 heads to the correct == h2 heads, debold all the names, then go back and add some === h3 heads that say things like TV Typewriter, etc. Ortolan88

2003 talk

Noyce and Kilby were independent inventors of the Integrated Circuit. Intel invented the Microprocessor, of course, but not Noyce. 169.207.117.23 21:48, 25 Nov 2003 (UTC)


This page and the timelines are incorrectly titled. They seem to be about the history of technology used in computing rather than the history of computing itself. Obviously, most computing until recently was done with pencil and paper, and that is not mentioned in these timelines. Would anyone object to moving this page to history of computing technology and starting a separate page that is about computing, not about machines used in computing? The statement that the "computing era" began only when computing machinery began is idiotic. Michael Hardy 21:48, 30 Nov 2003 (UTC)

Slide rules are not even mentioned on this page. Really, I'm beginning to think people trained in computer science should not be allowed in public places, in the interest of public safety. Michael Hardy 21:52, 30 Nov 2003 (UTC)

The article titled history of computing hardware is fairly long, but no one has attempted to write a history of computing itself on Wikipedia. Such an article would treat algorithms to be executed with pencil and paper, with or without the aid of tables, as well as computing with abaci, slide rules, or machines of any kind. Michael Hardy

I'd like that article (history of computing) to be renamed to "History of computing methods" for clarity. Tempshill 00:37, 4 Dec 2003 (UTC)

Why not move this article to History of computers, rather than the current and cumbersome History of computing hardware? Yes, before circa 1950, "computer" meant a person who did mathematical computation, and so one could argue that "History of computers" could refer to either computers (people) or the computers (machines)...but that would be a fairly trifling objection, I think. --Sewing 21:58, 17 Dec 2003 (UTC)

Hear, hear. A good suggestion for clarity and simplicity. --Wernher 22:05, 17 Dec 2003 (UTC)
Well, I want to change it but am reluctant to act, for 2 reasons: (1) There are a lot of pages that link to this one (which means manually changing them if I want to be a good Wikipedian); and (2) I may be wading into something I will later regret. I'll take a wait-and-see attitude for now... --Sewing 18:17, 18 Dec 2003 (UTC)

Title dispute

Originally listed at VfD

  • History of computers. It is currently a redirect to History of computing hardware. I couldn't move the 2nd article to the 1st, so I removed the redirect text in the 1st article, but I still couldn't do the move. "History of computing hardware" is a cumbersome attempt by a mathematician to distinguish the history of computers from the History of computing (the article's former title), which encompasses not only computers but pen and paper as well. His point is valid, but the new title he chose for the article is unnecessarily awkward. --Sewing 17:14, 21 Dec 2003 (UTC)
    • I am thinking whether History of computation is a better title than History of computing. btw There is a Timeline of computing, too. Optim 17:47, 21 Dec 2003 (UTC)
      • I agree History of computing is not ideal. But isn't History of computation also awkward? Anyhow, it goes back to Michael Hardy's argument that "computing" (and "computation") is not just about computers but about mathematical techniques that precede computers. I think History of computers is the best option: it is simple and unambiguous. --Sewing 18:08, 21 Dec 2003 (UTC)
      • History of computation still seems nice and more correct to me. Optim 19:01, 21 Dec 2003 (UTC)
      • I think the term computation is more often (academically) used for the theoretical side of things (algorithms, complexity,etc.), computers seems better for the practical side to me. --Imran 22:17, 21 Dec 2003 (UTC)
        • That's right. We can have a Computation article for the academic theoretical history and a Computers article for practical-business computing. how do u think? Optim 00:49, 22 Dec 2003 (UTC)
    • Keep, who wouldn't be interested in the history of computers? Lirath Q. Pynnor
    • Move to History of computers. Mathematics is as much a part of the history of computers as it is the history of their hardware. - Mark 06:58, 30 Dec 2003 (UTC)
    • I don't know why this was listed on VfD so long. It isn't really a VfD decision. It's more of a title dispute so I've listed it at Wikipedia:Current disputes over articles instead. Angela. 05:42, Jan 4, 2004 (UTC)
      • Well, it can't have been much of a dispute as no-one's discussed it for over two weeks, so I'm delisting it from the disputes page. Feel free to relist it if there really is a dispute. Angela. 01:41, Jan 22, 2004 (UTC)

Jack Kilby 1957

Even though I changed the date for the IC to 1958 to conform to the Nobel laureate article, I happen to know that Kilby thought of the IC during the mass vacation at TI (which would have been in late 1957). Kilby didn't have the vacation seniority, so he came to work at an empty Texas Instruments facility. The quietness of the work environment allowed Kilby to concentrate his thoughts and invent the IC. 169.207.115.129 01:41, 5 Jan 2004 (UTC) Thus the 1958 date must be the official publication date and not the actual date of conception.

Invention of the abacus

Some sources assert that the abacus was invented in China around 3000 BC; others that it was invented by the Romans or Babylonians around 1000-500 BC and traveled east to China. At present, this Wikipedia article says it was of Chinese invention. It would be nice to come up with an account of the current opinion that is as complete, accurate, and NPOV as possible.


Turing Completion is not a good test for a computer

The article states that Turing Completion is "as good a test as any" for whether a machine is a computer. I fundamentally disagree. It is too easy to build a machine that is theoretically Turing Complete. The Z3 has been shown to be theoretically Turing Complete, yes. But so what! The Z3 had no conditional branching, and the proof that it was Turing Complete relies on mathematical tricks defined in the 1990s. It was never intended to be used as a general purpose machine. Babbage's Analytical Engine was more flexible than the Z3. Furthermore, if the Z3 was Turing Complete, I would lay money on a bet that the ABC was also Turing Complete; it was functionally very similar. And what about the Colossi? The Mk II Colossi (of which 9, not 10, were built; the Mk I was later converted to a Mk II) at least had conditional branching. It too must have been "theoretically Turing Complete".

It really isn't good enough to shy away from a hard definition by hiding behind the definition of Turing Completion. It has been shown that Conway's Game of Life is Turing Complete. It is possible to build a universal Turing machine using only a carefully defined set of tiles and then applying Conway's rules. And what does this prove? It proves that Turing Completion is not a very difficult status to achieve.
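As an illustrative aside (added here, not part of the original discussion): the rules of Conway's Game of Life referred to above are tiny, which is part of the point being made. A minimal sketch in Python, using one common convention of storing the live cells as a set of coordinates:

```python
# One generation of Conway's Game of Life.
# Live cells are stored as a set of (x, y) tuples.
from itertools import product

def step(live):
    """Apply Conway's rules once: a live cell with 2 or 3 live
    neighbours survives; a dead cell with exactly 3 is born."""
    counts = {}
    for (x, y) in live:
        for dx, dy in product((-1, 0, 1), repeat=2):
            if (dx, dy) != (0, 0):
                cell = (x + dx, y + dy)
                counts[cell] = counts.get(cell, 0) + 1
    return {c for c, n in counts.items()
            if n == 3 or (n == 2 and c in live)}

# A "blinker" (three cells in a row) oscillates with period 2.
blinker = {(0, 1), (1, 1), (2, 1)}
```

That such a short rule set suffices to host a universal machine is exactly why, as the comment argues, theoretical Turing completeness is a low bar.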

Practical as opposed to theoretical Turing completion is something very different. The first computer that could automatically exploit the fact that it was Turing Complete, and could do this in a practical way and solve real problems - that was the first computer. The ENIAC does not count; it was a serial single purpose machine. Sure, you could rebuild it like so many Lego bricks, but that is hardly a practical general purpose computer. The Manchester Mk I was the first stored program machine, but its purpose was to prove that the Williams-Kilburn tube worked effectively as a memory, not to solve real problems. It was a research machine. The EDSAC at Cambridge was the first real computer in the modern sense. It was the first machine that could automatically exploit the fact that it was Turing Complete, and it could do this in a practical way, not merely as a party trick or under laboratory conditions. It was the first machine to implement the von Neumann architecture and solve real problems. (The Manchester Mk I and maybe the BINAC preceded the EDSAC, but they never solved a real problem.)

A computer is a tool; it must be practically capable, not just theoretically capable. All the machines before EDSAC were theoretically general purpose but practically special purpose. A computer is a general purpose device. EDSAC was the first modern computer (You may now rip me to pieces ;-) John R.Harris

A minor comment: contrary to the commonly held belief, the Colossus computer in fact did not have conditional branching. (Or, indeed, branching of any kind - or a program of any kind, for that matter!) So it definitely was not Turing-complete. See Talk:Colossus computer for more. Noel (talk) 05:05, 1 Mar 2005 (UTC)

The role of weather prediction in the development of computing

I am trying to work in Lewis Fry Richardson's use of differential equations for predicting weather. At the time he wrote his book in 1922, computing was not practical for predicting weather, and yet I believe Atanasoff was trying to solve some meteorological problems when he invented the ABC; thus there has been a meteorological application since the first electronic computer; to this day, supercomputers are used for predicting weather. Ancheta Wis 23:08, 5 May 2004 (UTC)

See: Navier-Stokes equations for the basic equation of weather prediction Ancheta Wis 18:08, 22 May 2004 (UTC) and also Wikipedia:WikiProject Fluid dynamics. Richardson's approach is listed in Numerical ordinary differential equations. Ancheta Wis 10:12, 25 May 2004 (UTC)

I am replying to a high school librarian's assessment of this article: upon repeated re-reading and editing of the statements in this article, I can state categorically that the edits are made in good faith. As a professional with decades spent on technology, I have learned and experienced things which not even a professional historian could possibly have learned. Since the field has expanded every decade since the 1880s, and since technologists have not had a venue for documenting their accomplishments until the advent of Wikipedia, their work has gone unsung until now. Ancheta Wis 16:58, 26 Aug 2004 (UTC)

Italics everywhere?

Why is it that seemingly every noun in the article is in italics? Did someone get confused about how to make Wiki links? Most of the italic portions would be (I think) most appropriately either deitalicised, or made into wiki links. (Italic emphasis gratuitously added to illustrate how tiresome it is to read something formatted like that.)

Unless there's some particular reason why it's like that, I'll try to change them around a bit at some point soon. PMcM 02:27, 3 Dec 2004 (UTC)

Michael Hardy puts the usage thus: When a noun is used in a sentence, then it is not italicized, unless the sentence is about that noun, in which case it is italicized. Here is a link to further use of italics. Ancheta Wis 02:47, 3 Dec 2004 (UTC) Thus when I refer to logarithms of numbers (which are about a transformation of the respective numbers), I italicize to emphasize the transformation of the operations of multiplication and division into the operations of addition and subtraction.
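The transformation Ancheta Wis alludes to above, logarithms turning multiplication and division into addition and subtraction (the principle behind log tables and slide rules), can be checked with a short Python sketch (an illustration added here; the numbers are arbitrary examples):

```python
import math

# Logarithms map multiplication to addition and division to subtraction:
#   log(a * b) = log(a) + log(b)
#   log(a / b) = log(a) - log(b)
a, b = 37.0, 52.0
assert math.isclose(math.log(a * b), math.log(a) + math.log(b))
assert math.isclose(math.log(a / b), math.log(a) - math.log(b))

# Exponentiating the sum of logs recovers the product -- which is
# what reading a slide rule or an anti-log table does mechanically.
recovered = math.exp(math.log(a) + math.log(b))
assert math.isclose(recovered, a * b)
```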


Speedy response! Just finished playing around with it.

Who is Michael Hardy? Going by the Wikipedia guidelines, I feel it would be more appropriate to have a lot (about 75%) of what is/was in italics in that article as wiki links.

Certainly if it was written on paper it would be more appropriate to have the visual cue of italic text, used sparingly here and there where it might be confusing otherwise, but I personally don't feel it's necessary in the majority of places it was present in the article. If you're really incredibly attached to them, please feel free to put them back in, but I think the article would be less well off without the inclusion of the links I added. Thanks. PMcM 03:06, 3 Dec 2004 (UTC)

Unrelated: Any idea why this talk page has no contents section? Is it likely to be something I have set wrong, or is it the same for others? PMcM 03:10, 3 Dec 2004 (UTC)

It does have a contents section, you just have to look hard for it ;-) The reason is, the top of the page is filled with comments divided using horizontal rules. The TOC doesn't appear until after those. — Matt 10:39, 3 Dec 2004 (UTC)

Also, apologies for the somewhat patronising tone I used to initially raise the issue. PMcM 03:13, 3 Dec 2004 (UTC)

What to do with this tale...

I removed this:

During World War II, Curt Herzstark's plans for a mechanical pocket calculator (see Curta) literally saved his life. In 1938, while he was technical manager of his father's company, Rechenmaschinenwerk AUSTRIA Herzstark & Co., he had already completed the design, but could not manufacture it due to the Nazi annexation of Austria. Instead, the company was ordered to make measuring devices for the German army. In 1943, perhaps influenced by the fact that his father was a liberal Jew, the Nazis arrested him for "helping Jews and subversive elements" and "indecent contacts with Aryan women" and sent him to the Buchenwald concentration camp. However, the reports of the army about the precision-production of the firm AUSTRIA and especially about the technical expertise of Herzstark led the Nazis to treat him as an "intelligence-slave". His stay at Buchenwald seriously threatened his health, but his condition improved when he was called to work in the Gustloff factory linked to the camp. There he was ordered to make a drawing of the construction of his calculator, so that the Nazis could ultimately give the machine to the Führer as a gift after the successful end of the war. The preferential treatment this allowed him ensured that he survived his stay at Buchenwald until the camp's liberation in 1945, by which time he had redrawn the complete construction from memory. See: Cliff Stoll, Scientific American 290, no. 1, pp. 92-99. (January 2004) Also see: [2].

While this is a fascinating tale, I'm not sure whether it's significant enough in terms of the history of computing hardware to deserve a long paragraph in an overview article on the topic. Mechanical calculators were commonplace by the 1930s, even if they weren't miniaturized. --Robert Merkel 23:38, 5 Dec 2004 (UTC)

Inserted the information into Curt Herzstark Ancheta Wis 07:13, 6 Dec 2004 (UTC)

I just noticed that the DNA computing section in History of Computing was removed. Once one concedes that the travelling salesman problem is a true computation problem, then one must also concede that a computation using DNA is a computing hardware feat. If that is so, then the recognition that DNA can form the basis for a Turing tape is part of the history (and future) of computation; thus the recognition that DNA forms a code is part of the intellectual heritage of computing and part of its future. That is why Adleman actually solved a travelling salesman problem using DNA. But if that is truly a CS item, then Gamow deserves to be mentioned, as this was part of the work that occurred before 1960. Ancheta Wis 01:43, 14 Dec 2004 (UTC)

This is an overview, which means some editorial judgement needs to be made about what are the most essential points to be covered in the space available. There are any number of things this article omits or discusses only briefly. There is much that could be said about analog computers, for instance. DNA computing, while an interesting concept, has not seen wide practical adoption compared to the technologies descended from those covered on this page. Therefore, removal was IMO appropriate. --Robert Merkel 12:26, 14 Dec 2004 (UTC)

Colossus relays

I altered the following:

"The Colossus used only vacuum tubes and had no relays."

It seems Colossus did use relays, both for buffering output and as part of its counters: [3], [4] — Matt Crypto 09:30, 13 Dec 2004 (UTC)

Copyedits needed for 2nd generation section

In the interests of keeping this article a featured article, might we move the latest contribution on 2nd generation computers to the talk page and work on the English prose before re-instating it to the article page? --Ancheta Wis (talk) 14:17, 2 January 2008 (UTC)

Hi, that's my work! What is wrong with it? 92.1.67.188 (talk) 14:35, 2 January 2008 (UTC)
Hi, I replied on your talk page to redirect here.
  1. The expert you allude to on your first paragraph was Thomas J. Watson
  2. The Von Neumann pre-print of 1947 went all over the world. That is how Israel built its first computer, for example. Russia did the same. What the 1954 date on Italy's first computer shows is that they built either on Von Neumann's architecture or they studied other documents. In any case, a citation would be good.
  3. Did the IBM 1401 use only transistors?
  4. Does tenths of thousands mean 10000 or 100?
In any case, I think you see what I mean. The English needs copyediting. I am not referring to your content, with the exception that we need citations. --Ancheta Wis (talk) 15:19, 2 January 2008 (UTC)
I have commented out the contribution, as the English needs copyediting. The sentences are disconnected, the timeline of development for second generation is nonsequential, there is no flow from one statement to the next. --Ancheta Wis (talk) 10:47, 17 February 2008 (UTC)

computers are an amazing creation of sensation. —Preceding unsigned comment added by Sckater (talkcontribs) 21:50, 8 March 2008 (UTC)

question

I was wondering whether the Antikythera mechanism is the first computer, because there are a lot of articles that make that claim. Tomasz Prochownik (talk) 21:05, 23 April 2008 (UTC)

Follow the link for the latest thinking on the matter. --Ancheta Wis (talk) 23:18, 23 April 2008 (UTC)

Citations needed

Fellow editors, User:Ragesoss has noted that we are building back up the citations for this FA. When this article was first formed, the rise in standards for Featured Articles had not yet occurred. Since I have been volunteered for this, there will be an American bias to the footnotes I am contributing; please feel free to contribute your own sources.

Please feel free to step up and add more citations in the form of the following markup: <ref>Your citation here</ref>. You can add this markup anywhere[1] in the article, and our wiki software will push it to the <references/> position in the article page, individually numbered and highlighted when you click on the ^. As an illustration, I placed this markup on the talk page so that new users can even practice on this talk page.

In my opinion, the best source is Bell and Newell (1971)[2], which is already listed in the article. I do not have time to visit the local university library, so my own contributions are from sources which I have on my own bookshelves; this may be appropriate since the seminal period 1945-1950 will probably be viewed as the heyday of the first generation of electronic digital computers, which blossomed in the US.[3],[4],[5],[6],[7],[8],[9] I recognize that there will need to be more citations from the Association for Computing Machinery and the IEEE Transactions, but that will have to come from those editors who are in the Wikiproject on computing. In particular, the Radiation Laboratory of MIT published a series of books, The M.I.T. Radiation Laboratory Series,[10] which are the foundation for computing hardware, in tandem with the Manhattan Project; what is common to these projects is that they involved groups of cooperating contributors.[11] Before the howls of outrage subside, please note that the exact forms of computer hardware had not yet been selected in this period, but since the technologists were already in place for other purposes, it was a small step to the forms of hardware we see today.[12],[13],[14],[15],[16],[17],[18] The forms of hardware could easily have gone in other directions, and our current computers would have been different from what could have been.[19][20]

New users (especially those with a CS or EE background ), please feel free to contribute your citations. Wikipedia:Five Pillars summarize the guidelines for editors, and your cheatsheet for markup can be found here. Users can append comments to the foot of this talk page, signed with the signature markup: --~~~~

Casual readers might note that the references which will be added to this article can be purchased quite cheaply on the Internet (typically for a few dollars), which in sum would amount to a nice education in this subject. --Ancheta Wis (talk) 09:31, 3 May 2008 (UTC)

information processor

We are up to 59 footnotes. You can examine the edit history to see how the citations were embedded in the article, as well as study this section, for examples on how to do it. --Ancheta Wis (talk) 10:01, 6 May 2008 (UTC)

User:SandyGeorgia has noted that the citations are expected to have a certain format. Everyone is welcome to improve the citations. --Ancheta Wis (talk) 01:42, 7 May 2008 (UTC)

It appears that the footnote macro is space-sensitive. For example, <ref name=IBM_SMS/ > works, but <ref name=IBM_SMS/> causes error messages; a space must be added after the trailing slash. To see this, look at this diff. --Ancheta Wis (talk) 09:42, 9 May 2008 (UTC)
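For anyone trying this at home, the full named-reference pattern looks roughly like the sketch below. IBM_SMS is the ref name from the example above; the surrounding sentences and the citation text are hypothetical placeholders, not article content:

```wikitext
<!-- First use: define the named reference together with its content -->
Some statement in the article.<ref name=IBM_SMS>Full citation text goes here.</ref>

<!-- Later uses: reuse the reference by name only, with the space after the trailing slash -->
Another statement citing the same source.<ref name=IBM_SMS/ >
```

Both uses then point at the same numbered footnote in the rendered references list.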

Sample citation format from User:Wackymacs:[21]

  • This one was formatted incorrectly. There should be a "|" between the url and the accessdate, like this:[22]
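In wiki markup, the difference between footnotes [21] and [22] below comes down to that single pipe. A minimal sketch (author, title, publisher, and accessdate are taken from the footnote; the url is a made-up placeholder, since the original does not show it):

```wikitext
<!-- Broken: no "|" before accessdate, so it is parsed as part of the url value -->
<ref>{{cite web |author=Jones, Douglas W. |url=http://www.example.edu/cards/history.html accessdate=2008-05-15 |title=Punched Cards: A brief illustrated technical history |publisher=The University of Iowa}}</ref>

<!-- Fixed: the pipe separates the url and accessdate parameters -->
<ref>{{cite web |author=Jones, Douglas W. |url=http://www.example.edu/cards/history.html |accessdate=2008-05-15 |title=Punched Cards: A brief illustrated technical history |publisher=The University of Iowa}}</ref>
```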

References sample illustration

  1. ^ Your citation here
  2. ^ Gordon Bell and Allen Newell (1971) Computer Structures: readings and examples ISBN 0-07-004357-4
  3. ^ Herman Goldstine's 1947 First Draft of a Report on the EDVAC, which was mimeographed and distributed worldwide, had a global effect, producing von Neumann-architecture computer systems world-wide. For example, the first computer in Israel was built this way.
  4. ^ Federal Telephone and Radio Corporation (1943, 1946, 1949), Reference Data for Radio Engineers
  5. ^ The Jargon File, version 4.4.7 The Jargon file
  6. ^ Charles Belove, ed. (1986) Handbook of modern electronics and electrical engineering, ISBN 0-471-09754-3
  7. ^ Sybil P. Parker, ed. (1984) McGraw-Hill encyclopedia of electronics and computers ISBN 0-07-045487-6
  8. ^ Arthur B. Glaser and Gerald E. Subak-Sharpe (1977), Integrated Circuit Engineering ISBN 0-201-07427-3
  9. ^ Richard H. Eckhouse, Jr. and L. Robert Morris (1979), Minicomputer Systems: organization, programming, and applications (PDP-11) ISBN 0-13-583914-9
  10. ^ For example, John F. Blackburn (1947), Components Handbook, Volume 17, M.I.T. Radiation Laboratory Series, Lexington, MA: Boston Technical Publishers
  11. ^ "I must say that I did not design Windows NT -- I was merely one of the contributors to the design of the system. As you read this book, you will be introduced to some, but not all, of the other contributors. This has been a team effort and has involved several hundred person-years of effort." -- Dave Cutler, Director, Windows NT Development, in the foreword to Inside Windows NT, ISBN 1-55615-481-X, by Helen Custer, p. xix.
  12. ^ Ron White (1995), How Computers Work ISBN 1-56276-344-X
  13. ^ Scott Mueller (2002), Upgrading and repairing PCs ISBN 0-7897-2683-1
  14. ^ Harry Newton (1998), Newton's Telecom Dictionary ISBN 1-57820-023-7
  15. ^ George McDaniel, ed. (1993), IBM Dictionary of Computing ISBN 0-07-031489-6
  16. ^ Paul Horowitz & Winfield Hill (1989), The Art of Electronics ISBN 0-521-37095-7
  17. ^ David A. Patterson and John L. Hennessy (1998), Computer Organization and Design ISBN 1-55860-428-6
  18. ^ Alan V. Oppenheim and Ronald W. Shafer (1975), Digital Signal Processing ISBN 0-13-214635-5
  19. ^ W.J. Eckert (1940), Punched card methods in scientific computation, Lancaster, PA: Lancaster Press
  20. ^ Robert Noyce's Unitary circuit, US patent 2981877, "Semiconductor device-and-lead structure", issued 1961-04-25, assigned to Fairchild Semiconductor Corporation 
  21. ^ Jones, Douglas W. accessdate=2008-05-15 "Punched Cards: A brief illustrated technical history". The University of Iowa. {{cite web}}: Check |url= value (help); Missing pipe in: |url= (help)
  22. ^ Jones, Douglas W. "Punched Cards: A brief illustrated technical history". The University of Iowa. Retrieved 2008-05-15.

Zuse and Von Neumann

According to Hennessy and Patterson, von Neumann knew about the details of Zuse's floating-point proposal. This suggests that the sentence 'Zuse was largely ignored' should be stricken. Any objections? --Ancheta Wis (talk) 10:30, 5 May 2008 (UTC)

Zuse did not implement the floating-point design he patented in 1939 before WWII ended. Von Neumann was aware of Zuse's patent and refused to include it in his Princeton machine, as documented in the seminal paper (Burks, Goldstine and von Neumann, 1946). -- Hennessy and Patterson p. 313, note "A decimal floating point unit was available for the IBM 650, and [binary floating-point hardware was available for] 704, 709, 7090, 7094, ... ". "As a result, everybody had floating point, but every implementation was different."

To this day, floating point operations are less convenient, less reliable, and more difficult to implement (in both hardware and software). --Ancheta Wis (talk) 08:07, 10 May 2008 (UTC)

'First electronic computer'?

This assertion is made about the Colossus in this article. It is also made about the ACE in that article. THERE CAN BE ONLY ONE! Twang (talk) 18:59, 10 May 2008 (UTC)

On the other hand, the article also states "Defining a single point in the series as the "first computer" misses many subtleties." Thank you for BEING BOLD! You are welcome to contribute to the article and the talk page! --Ancheta Wis (talk) 20:34, 10 May 2008 (UTC)
Not to be too pedantic, but the article is an example of how a recurring need (in this case, the need to calculate) gets met multiple ways, at multiple times, by multiple people trying to solve a problem. For example, Pascal was trying to help his dad collect taxes; ENIAC was used to fight a war by calculating the trajectories of artillery shells; Zuse was trying to ease the burden of his engineering work; Colossus was trying to decode secret messages; IBM was trying to extend the use of its punch card machines for business purposes; Maurice Wilkes was excited about the possibilities of the First Draft of the Design for EDVAC. You get the idea: it's asking 'What does first mean?'. As we now know from spacetime, time depends on the observer - what does first mean in that case? It only has meaning in the context of a thread. Thus clearly, Maurice Wilkes came after ENIAC, but before the implementation of EDVAC. Colossus was secret, so it was part of a different thread, by definition. And in the article, there is evidence that von Neumann knew something of the ideas of Zuse, so the design and architecture of EDVAC is after Zuse. However, you cannot say that the implemented EDVAC is after Wilkes' machine implementation - they are parallel threads which branched after Wilkes was influenced by the First Draft. These ideas are part of Lee Smolin's book Three roads to quantum gravity ISBN 0-465-07835-4 pp. 53-65. (As you can see, classical logic needs to be reformulated. The world is not monotonic.) I don't have Smolin's book in front of me so I can't give you a page number right now. And I can't put what I just wrote in the article because I don't have a citation other than Smolin, which isn't explicitly about computing hardware (it's about physical processes in general). --Ancheta Wis (talk) 21:07, 10 May 2008 (UTC)
Just following up about ACE, the Automatic Computing Engine. It's the same idea. Turing owed nothing to EDVAC. So there are other editors who have the same kind of reasoning as Smolin's work, stated above. However, just Turing's knowledge that EDVAC is possible said a lot to him -- the ACE solution also has to obey the laws of physics, like EDVAC; thus the ACE problem solvers had a lot less work to do when solving their specific issues on the way to a goal.
These kinds of problems, about priority and independence, are being solved with clean rooms, where developers work in isolation from other implementers. This is all faintly antique for anyone in the open source movement; all that has to be done in open source is to include the provenance of the code base, to keep it Open.
That's where Wikipedia can make its mark on world culture: we can keep everyone honest about who owes what to whom, by citing our sources. This article clearly states that von Neumann owed much of his First Draft to Zuse, Eckert/Mauchly (who owe something to Atanasoff/Berry) and the rest of the inventors who came before him. And Wilkes (and the rest of the world) owe much to von Neumann, etc. Since Turing's ACE does not have priority over Wilkes' machines, the ACE article should probably heavily qualify the meaning of first in its text. That brings us to Emil Post, the American logician who is independent of Turing, but who waited too long to publish. (He had his ideas 15 years before Turing's 1936 publication...) --Ancheta Wis (talk) 21:39, 10 May 2008 (UTC)

Contributions welcomed.

Fellow editors, you are welcome to make your contribution to this article. See the sections above for examples on adding citations. Be Bold.

--Ancheta Wis (talk) 10:43, 11 May 2008 (UTC)

The number of pictures

Ancheta Wis, you're doing amazing work here - but don't you think the article should have fewer pictures? — Wackymacs (talk ~ edits) 06:23, 15 May 2008 (UTC)

Thank you for your kind words. I propose to comment out Herman Hollerith, the Jacquard loom, the Manchester Baby, and others.
Editors, you are welcome to contribute to this article and talk page. Be Bold. Citations wanted.
--Ancheta Wis (talk) 10:06, 15 May 2008 (UTC)
Good work. Still too many. Some images obscure section headings (in other words, push them out of order). Also, per WP:MOS, images should not be placed directly under a section heading on the left side. — Wackymacs (talk ~ edits) 10:10, 15 May 2008 (UTC)

ENIAC 1,000 times faster than its contemporaries

The article currently states "(Electronic Numerical Integrator and Computer) .... it was 1,000 times faster than its contemporaries." As it is stated that ENIAC was Turing complete, if it had been programmed to break "Tunny" would it have been 1,000 times faster than Colossus? If not then this sentence needs changing. --PBS (talk) 10:08, 13 May 2008 (UTC)

If we are comparing electromechanical relays to vacuum tubes then the statement is correct. But Tunny came after ENIAC, so it is a descendant, and not a contemporary, which would have been Z1 (the only unclassified project).
You might change the article page, for example, replacing contemporaries with Z1 in the statement. Citations are welcomed. This page needs more contributors! --Ancheta Wis (talk) 03:35, 15 May 2008 (UTC)
The sentence has been changed. --Ancheta Wis (talk) 08:41, 19 May 2008 (UTC)

First light

We need a name akin to the concept of first light of an observatory telescope; I propose the denotation first good run, and wish to apply it to Baby's first good run, June 21, 1948, 60 years ago. --Ancheta Wis (talk) 23:00, 21 June 2008 (UTC)

I am wary of defining such "firsts" in computing, bearing in mind the statement in this article that "Defining a single point in the series as the "first computer" misses many subtleties". TedColes (talk) 16:42, 22 June 2008 (UTC)
Thank you for your considered response. What I refer to is 'the comparison of an expectation to an observation', to use William Shockley's phrase. For example, there were 'screams of joy' when the first p-system for UCSD Pascal compiled itself (the expectation). In my mind, that qualifies as a first good run. Another might be the attainment of 1 peta-flop operation for IBM Roadrunner, just last month. For Baby, the resulting convergence of dots on the Williams tube to the expected location was the first good run. And since the phrase is ostensive, meaning relative to the situation, akin to 'baby's first word', I can see that what the proud parent might view as a triumph might be viewed as something more akin to Michael Faraday's response 'and what is the use of a new-born baby'. Might it be better to use a more prosaic phrase like 'first run'? --Ancheta Wis (talk) 19:20, 22 June 2008 (UTC)
Herbert Simon once said 'There is no substitute for a working program'. Maybe the phrase might be 'first working program' for Baby. --Ancheta Wis (talk) 19:39, 22 June 2008 (UTC)