Talk:History of computing hardware/Archive 2


American developments

Just curious; why is "American developments" a distinct section? --Khendon 14:37, 1 Mar 2005 (UTC)

Earlier versions indeed intermixed the various projects regardless of nation. Thus the current headings are a matter of preference by the contributors.

Hm. I think it makes much more sense to have a purely chronological article. --Khendon 16:46, 1 Mar 2005 (UTC)

I made the change in organization a while ago, to put Zuse and Colossus before the stuff on what was happening in America because it was the American work that led to ENIAC and the EDVAC design. Go back and read what it was like before and after the change was made and you'll hopefully see why I did it. Personally I think the article pays too much attention to Zuse and Colossus - they were both fascinating dead ends IMO - and if I was reorganizing the article further would considerably trim down the material on them. --Robert Merkel 23:13, 1 Mar 2005 (UTC)

According to http://www.scl.ameslab.gov/Projects/ABC/Trial.html, the role that the Atanasoff-Berry computer had in influencing the design of ENIAC may be grossly understated in this article.

Request for references

Hi, I am working to encourage implementation of the goals of the Wikipedia:Verifiability policy. Part of that is to make sure articles cite their sources. This is particularly important for featured articles, since they are a prominent part of Wikipedia. The Fact and Reference Check Project has more information. Thank you, and please leave me a message when a few references have been added to the article. - Taxman 19:33, Apr 22, 2005 (UTC)

Taxman, this is an overview article, summarising facts found across many other articles. Hence, there's likely not to be much direct referencing here. --Robert Merkel 04:19, 23 Apr 2005 (UTC)
Added W.J. Eckert's little orange book. It would be good to acknowledge Lewis Fry Richardson's work, but it was decades before computers arose which could implement his method. (Now it would be called a system analysis, but he invented a field here.) I don't see a suitable way to work it into the article, which is about hardware, after all. Ancheta Wis 07:46, 23 Apr 2005 (UTC)
Great, thanks for your work, that is much better. Certainly an overview article can have references that back up its facts too. - Taxman 13:49, Apr 23, 2005 (UTC)

Heron of Alexandria

I scanned the article page and couldn't find any reference to Heron. I thought it may be a good idea to put a reference to his automated theater in the beginning, right around Wilhelm Schickard. However, I'm not sure, since the concept of computing here seems to be more calculation-based; if no one has any problems I think it would be a good addition. Heron's automated theater was a series of pegs with strings wrapped around them. Various weights were tied to the strings and controlled the movement of objects for the play. In the end it was a simple analog computer program. --Capi crimm 03:21, 24 May 2005 (UTC)

I propose that you start a new page, History of automata or History of automatons. Stanislaw Ulam, John Von Neumann, John H. Conway, ... Stephen Wolfram were/are quite aware that the computing paradigm has automata in it. The topic is called Category:Cellular automata. However the concept of computing is tied to Leibniz' notion of expression evaluation, which means, in the case of a computing machine, we are automating human computation. When we are automating the motion of a puppet, which is one of the things that computers can do, the subject is called Automatic control, or Cybernetics or Robotics. Computer used to be a job title. Perhaps someday computers will be called controllers, as well. Come to think of it, perhaps History of controllers or History of cybernetics would be a good page for Heron's automated theater. Ancheta Wis 08:52, 24 May 2005 (UTC)
When I followed the Cellular automata link myself, I found material on the history of cellular automata, which would work quite well on the future History of ... page to which I am referring, as well as Heron's automated theater. Would you like to start such a page? I could contribute to it as well. Ancheta Wis 08:57, 24 May 2005 (UTC) If History of Cybernetics were to be the page, then Norbert Wiener's concept of the steersman (the root meaning of cybernetics) or the pilot would come into play. Aeronautics would come into play as well, because the theory of control became pressing with the invention of the airplane. There is an extensive literature on automatic control in the IEEE Transactions. So we wouldn't just be flying blindly, to put a little pun in this. Machine vision could also be added, if the topic were to be Cybernetics.

List of books

I added a list of books for further reading. These are the ones I had on my shelf. The order may look haphazard, but I tried to put the more accessible ones at the top. I thought about ordering them by date or by author - if anyone thinks they should be that way, please feel free to change the order (and to add to the list, of course). --Bubba73 20:15, 7 Jun 2005 (UTC)

Thanks. If you had used them to fact check material in this or other articles, please consider listing them as actual references, or better yet, citing individual facts to them. Much better than taking a (potentially unknown) Wikipedia editor's word for the material is citing it to a reliable source. Use whatever format you like, but (Smith, 2003) is fine, or you can use some form of footnote system including the invisible one. Thanks again. - Taxman Talk 23:12, Jun 7, 2005 (UTC)
I haven't done much (if anything) to this page, but I've contributed a lot to particular computers, mainly 1946-1954. I used 6 or 7 of those books, but mainly about 3 of them. I should have put a reference for each of the edits (I did on a few) at the time, since now it is hard to know where I got what information, w/o looking it up again.

Too much analog...

Maybe some of the new analog computer stuff should be trimmed (and placed in the appropriate article), as it leaves the article as a whole rather unbalanced. --Robert Merkel 04:15, 11 Jun 2005 (UTC)

Add more to the other sections then. Greg321 10:39, 11 Jun 2005 (UTC)

This article is supposed to be a readable summary of the history of computing hardware. At the moment, this is like a history of the automobile that spent half its content looking at steam-powered cars. The excessive information obscures the forest for the trees.--Robert Merkel 23:29, 11 Jun 2005 (UTC)
While not an advocate of the current balance in the content, the current information does highlight the fact that steam-powered computation was a vision which pioneers like Babbage and William Stanley Jevons were pursuing. It is not irrelevant, as it shows that mechanical and electrical analog computation, electronic digital computation, DNA computation and quantum computing are all possible technologies, and certainly not the only possible forms for computing hardware. The period from 1945-1950 was important, but not the only possibility. It could have happened several other ways. The mechanical antecedents are very important, as they illuminate a path that could have been taken as early as the 1500s; only the precision of manufacture required for large-scale computing hardware was missing. Ancheta Wis 12:40, 13 Jun 2005 (UTC)
That's an interesting point to discuss a little bit, but too much speculation as to what might have happened if history had turned out differently is likely to get unencyclopedic very quickly. --Robert Merkel 01:25, 14 Jun 2005 (UTC)
William Stanley Jevons actually was one who dreamed of steam-powered computation, possibly as early as the time when he lived in Australia 150 years ago. His Logic Machine was exhibited in Sydney last year, BTW. Ancheta Wis 21:42, 14 Jun 2005 (UTC)

More detail on some sections

More detail could be added to the sections on electronic computation. We could add more on the role of Herman Goldstine, von Neumann, etc. for example. What I have in mind is the chance meeting of Herman Goldstine and von Neumann on the Princeton train, and how it turned into a doctoral examination on Computer Engineering for Goldstine. Another item might be how the Israelis got the von Neumann architecture first hand; that is how their first machine got built. Another item might be the use of Binary Coded Decimal in the first electronic computers. Perhaps we might sketch a little outline before actually adding in the text. Ancheta Wis 23:36, 14 Jun 2005 (UTC)

  • I agree. Some of this is in other articles. See IAS machine for how Israel got the von Neumann architecture - the plans for the IAS were made freely available, and about 15 computers were based on the design, WEIZAC was one of them. Bubba73 01:10, 15 Jun 2005 (UTC)


A comment from another point of view

I was active in leading-edge electronics in about 1970. The article makes good sense and fills me in on a number of things I didn't know. It's a good article. It doesn't mention the driving force for miniaturization, nor where the money came from to do the leading-edge research and its subsequent application. It doesn't talk about how transistors, once discovered, were worked into machines which made a few decisions by themselves, based on inputs from other machines. The theoretical developments (i.e. transistors, field effect transistors, storage devices) were implemented into hardware and miniaturized in a number of ways. While I don't know exactly, the USA military was a huge force in the area. Taxpayers in large measure paid for research (universities) and implementation of research (military projects). As an example of implementation, I worked for a company that contracted to the military for a megabuck, producing 4 radar receivers that could be installed on 4 aircraft. The point I'm making is that military money was a prime source of the energy that miniaturized computers; there is a trickle-down effect that goes on even today. The military spends money to have leading-edge equipment and then that expertise trickles down to the consumer. My impression is that today the trickle-down happens faster than in the 1970s. Anyway, it's a good and useful article as it stands, happy days. Terryeo 13:42, 4 February 2006 (UTC)

Overview

This article, along with ENIAC, Computer, Atanasoff–Berry Computer, Zuse and related pages, and others, are crying out (IMHO) for some sort of organizational overview. Does anyone know what Wikipedia policy is on such pages? -- Gnetwerker 18:36, 16 March 2006 (UTC)

So what do you have in mind? This article exists because User:Michael Hardy strongly differentiated the history of computing from its hardware. And there are the timelines of computing history from the Jargon files, which also strongly influenced the article. The computing article may be what you have in mind as a venue as it is mostly lists with a veneer of prose. Perhaps you might place your overview there? --Ancheta Wis 18:52, 16 March 2006 (UTC)

A question about units

I was curious as to why metric units were used throughout the article, especially in the section on American Development. Could Imperial units be put in parenthetically? Yeah, I know, lazy Americans and all that… — ChardingLLNL 18:13, 20 July 2006 (UTC)

put 'before 1960' in the title

As it is, the article title is misleading and inaccurate. --Apantomimehorse 17:26, 17 August 2006 (UTC)

The article evolved. This is the original (or basic) article implemented upon the suggestion of User:Michael Hardy who wanted to clearly distinguish The History Of Computing from its hardware. I oppose turning a Featured Article upside down in favor of a renaming dispute. The article, if cast into the decades format, would then be indistinguishable from the Timeline of Computing. But the current article clearly shows this history before Electronic Computers, which will eventually be superseded, whereas Computing will survive as long as Mankind survives. The hardware started with bones and sticks. 1960 is an accident of history and of the 32K page-size convention for an article. --Ancheta Wis 19:43, 17 August 2006 (UTC)

Picture obscures text

The picture of Herman Hollerith is on top of some text when viewed in Firefox at a resolution of 1280 x 1024 in full screen. Sdp1978 00:12, 5 January 2007 (UTC) virendra

Speculative sentence?

"This technology was lost, however, and over 1,600 years would elapse before similarly complex computing machines were again created." - This sentence about the Antikythera mechanism seems purely speculative to me. This is arguing from absence of evidence, and largely unnecessary in this particular article in any case. Since we have not much evidence of prior art either, one might just as well claim this was a totally unique object in it's times, a totally unreasonable inference. -- Cimon Avaro; on a pogostick. 16:10, 5 April 2007 (UTC)

I have moderated the above statement in the article. I have a further query though... Is it fair to say that Kepler truly revolutionized astronomy? To me it appears like a peacock term. -- Cimon Avaro; on a pogostick. 06:36, 15 April 2007 (UTC)

400 years ago, Kepler spent 20 years of his life discovering his three laws, based on Tycho Brahe's observations. Yes, he really did revolutionize Astronomy. He was the first to do what he did. --Ancheta Wis 06:45, 15 April 2007 (UTC)
That's okay then -- Cimon Avaro; on a pogostick. 18:16, 15 April 2007 (UTC)

A question

Why does the link to EC-130 point not to the EC-130 computer, but to the EC-130 aircraft instead?

Punch card history

Punched cards actually had a predecessor, namely the play drums found in carillons. They were widely used from the sixteenth century on in the Low Countries. Play drums were linked to a clock to automatically play music every hour. A picture of a play drum can be found in the Dutch Wikipedia article on carillons. Essentially this is very similar to book music, just a little more primitive. So I think it should be mentioned in the article. Gespenster 19:07, 10 August 2007 (UTC)

Von Neumann

The unabashed credit given here to von Neumann for the stored-program computer architecture is not reflected by most historians or the Wikipedia page on von Neumann himself -- go look. The agreed-upon interpretation by historians and the people who worked on the EDVAC project was that von Neumann was collecting notes on the group's presentations, and decided, unilaterally, to publish them under his name. Presper Eckert is less kind and basically says that von Neumann clearly decided to grab the credit for himself.

In any case, I doubt it serves any purpose for Wikipedia to distribute this kind of misinformation. —Preceding unsigned comment added by 76.102.198.58 (talk) 08:01, 30 September 2007 (UTC)

This is now fixed in the article. --Ancheta Wis (talk) 08:11, 10 May 2008 (UTC)

Punched cards are still used and manufactured in the current century

This sentence automatically updates its meaning when the century changes, and it changed only a few years ago. What century was intended? tooold 08:04, 4 October 2007 (UTC)

Changed the sentence. Thank you for your note. --Ancheta Wis 10:00, 4 October 2007 (UTC)

Citations

It is no good adding lots of citations, when half of them are not formatted properly with the citation templates provided. Please see Wikipedia:Citation templates. All web citations should use the Cite web template, and must have an access date. Also, a lot of the current citations look questionable, and some are useless. (For example, the two citations in the lead explaining hardware and software) - Why? Wikipedia has articles on both of these. — Wackymacs (talk ~ edits) 10:45, 15 May 2008 (UTC)

So the next step is to revisit the citations, using the sample you have provided and reformat. As part of the history of this article, when we did this, the footnote software had not yet reached its current state. I hope it is stable enough to rely on for the future. I have no objection to go back and revisit the footnotes, as I am a believer in the spiral development process. --Ancheta Wis (talk) 08:06, 16 May 2008 (UTC)
The "Example 2 article text" appears to be a codification of the usage of ordinary wiki markup practices over the years. I propose reformatting the existing citations into that style. I must say that it appears to place human editors into the position of data entry clerks for the care and feeding of the citation monster. After reading Wikipedia:Citation templates, my reaction is that this article/policy? will evolve.
My personal preference is for "Example 2 article text", and my guess is that any of the items in Wikipedia:Citation templates is acceptable to the FA reviewers. True statement? --Ancheta Wis (talk) 08:29, 16 May 2008 (UTC)
You can either use {{Citation}} for everything, or a mixture of {{cite news}}, {{cite web}}, {{cite book}}, and so on. Both methods are acceptable at FA. — Wackymacs (talk ~ edits) 08:54, 16 May 2008 (UTC)
My last re-format using the cite template ate the name of a co-author. I have to go now, and will return to this issue later. --Ancheta Wis (talk) 16:53, 17 May 2008 (UTC)
This diff shows 27119 net bytes (a 33% increase) have been added to the article since 29 April 2008. I have attempted to address the concerns of Wackymacs (1c) and SandyGeorgia (1a) in the meantime. --Ancheta Wis (talk) 10:50, 19 May 2008 (UTC)
All book footnotes should have specific page numbers. Ancheta Wis, can you start adding page numbers (assuming you have the books which are referenced in footnotes)? — Wackymacs (talk ~ edits) 16:50, 5 June 2008 (UTC)
My books are largely in my basement with the exception of the 40-lb. box I dragged upstairs for the article. But some of the books I have not looked at since I left the semiconductor industry some decades ago, which does not mean I do not remember where I learned the fact, and which book title I have already cited. I am thinking of Mead and Conway, to be specific. To avoid time pressure, because I cannot predict where (in what box, as is probably apparent, I own thousands of books, not to mention 3 editions of Britannica) I will unearth the book, I will simply comment out those book refs which lack the page numbers. I will also try to conserve bytes in the references for the sake of the page limit. --Ancheta Wis (talk) 00:12, 6 June 2008 (UTC)

Replaced the {{cite}} with {{Citation}}. Retained {{Ref patent}} on the recommendation of the Citations people. The notes now use {{harvnb}} Harvard-style references. --Ancheta Wis (talk) 06:46, 19 June 2008 (UTC)

Looks good. Are you going to be adding page numbers to the books which are missing them? — Wackymacs (talk ~ edits) 07:37, 19 June 2008 (UTC)
Thank you. No book which is in the Notes is missing page numbers, as far as I know. But when I unearth such information I will augment the article. Some books in the References section are there for cultural reasons, such as Bell and Newell, which is the single most important source, in my opinion. --Ancheta Wis (talk) 02:11, 20 June 2008 (UTC)

For the record I am aware that Lord Bowden's first name is not Lord. But I am forced into this by the strictures of the Citation system while using Harvard references. The Ref patent template also does not appear to play well with the References section. That is the reason that I have the 3 patent citations in a hybrid, one style for the Notes, and the Last, First names preceding the Ref patent template in the References section. --Ancheta Wis (talk) 12:12, 19 June 2008 (UTC)

SandyGeorgia, the harvnb templates still need last|year, but I notice that the 'last=' was missing from the Intel and IEEE. I restored the Citation|last=IEEE and then noticed that the Citation|last=Intel was changed as well. How is the Harvard-style referencing method going to work, in this case? --Ancheta Wis (talk) 01:38, 2 July 2008 (UTC)

As it stands, this still doesn't meet the 2008 FA criteria standards. I just ran the link checker tool on this article, and found some broken links (many are used as references):

http://toolserver.org/~dispenser/cgi-bin/webchecklinks.py?page=History_of_computing_hardware

The broken links will need to be replaced with other reliable sources, preferably books. — Wackymacs (talk ~ edits) 07:53, 6 July 2008 (UTC)

Problems with References

At the moment, it seems page numbers are being given in the 'References' section instead of in the footnotes where they should be. — Wackymacs (talk ~ edits) 08:18, 6 July 2008 (UTC)

Why the special section?

Why is there a special section for 'American developments' and not one for 'British developments', or any other country? Are Americans special?

--Bias Detector-- 21st July 2008 —Preceding unsigned comment added by 86.9.138.200 (talk) 16:45, 21 July 2008 (UTC)

See the article: "There were three parallel streams of computer development in the World War II era; the first stream largely ignored, and the second stream deliberately kept secret." 1)=Zuse 2)=secret UK 3)=ENIAC etc. --Ancheta Wis (talk) 18:14, 21 July 2008 (UTC)

Harvard Mark I – IBM ASCC Turing Complete?

It seems like the table titled

"Defining characteristics of some early digital computers of the 1940s (See History of computing hardware)"

has a mistake. In the row about the Harvard Mark I – IBM ASCC in the column "Turing Complete" the link (1998) is clearly copied and pasted from the row about the Z3. I don't know if the Harvard Mark I was Turing complete, but the reference is wrong for sure. I am not familiar with the markup that references this table (obviously across multiple pages) and could not remove the information. Can someone else do it?

Stilgar (talk) 07:40, 25 June 2008 (UTC)

The same reference to Rojas applies to both electromechanical computers, which ran from tape of finite length, and whose programs are of finite length. Rojas shows that it is possible to simulate branching with pre-computed unconditional jumps. That would apply to both Z3 and Mark I. --Ancheta Wis (talk) 08:36, 25 June 2008 (UTC)
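
The construction can be sketched in present-day terms. The following Python fragment is only an illustration of the general idea, not Rojas's own encoding: if an upper bound on the number of iterations is known in advance (here the assumed max_steps), a data-dependent loop can be replaced by a fixed-length run of identical steps in which the condition is folded into the arithmetic as a 0/1 mask, so no conditional branch is ever taken.

```python
# Illustrative sketch only: Euclid's algorithm run as straight-line code of
# fixed, precomputed length. The condition b != 0 is turned into a 0/1 mask
# used arithmetically, so the control flow never branches on data.

def gcd_fixed_length(a, b, max_steps=32):
    # max_steps is an assumed upper bound on the number of Euclid steps;
    # once b reaches 0, every further step leaves (a, b) unchanged.
    for _ in range(max_steps):
        live = int(b != 0)            # the condition as data, not as a jump
        q = a % (b + (1 - live))      # divides by 1 when b == 0 (harmless)
        a, b = live * b + (1 - live) * a, live * q
    return a

print(gcd_fixed_length(48, 18))   # 6
```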

I don't agree with extending the Rojas conclusion to another machine. Isn't it more complicated? It sounds like a piece of original research that hasn't been published. Zebbie (talk) 23:30, 22 August 2008 (UTC)

Rojas wrong about Turing Complete?

As a separate issue, I think Rojas' conclusion was wrong. Turing's most important contribution to computer science was to postulate the halting problem. Simply put, you can't tell how long a program will take to finish. Therefore Turing defined his Turing machine with the conditional branch. Rojas' conclusion, again paraphrased, was: you can write a program without conditionals, but you have to make the tape as long as the program run time is.

1. Rojas is redefining a Turing machine to have no conditionals.  I'd argue that is no longer a Turing machine.
2. Rojas' new machine has to know in advance how long the program will run.  Turing would argue you cannot know this.

Zebbie (talk) 23:30, 22 August 2008 (UTC)

The Rojas conclusion applies to jobs which include a while wrapper (code with a loop). The branches were needed to halt the program (the job) in any case. Otherwise the program could only terminate when the program encountered a HALT. A conditional branch to a location containing HALT would do this also. Such a program would stay in the potentially infinite loop until the operator manually terminated the job.
Jump tables are a technique to accomplish branches.
The length of time needed to complete a program can be known only to the programmer. I have had associates who had to re-submit jobs because the nervous operator terminated one which ran over 24 hours. But the program was correct and terminated by itself the next time after the operator let it run to completion. --Ancheta Wis (talk) 17:52, 23 August 2008 (UTC)
On a related note, the 'carry' operation used in the most elementary calculators from centuries ago is a type of 'branch'. I learned this from Hennessy and Patterson's books on Computer Organization. --Ancheta Wis (talk) 13:44, 24 August 2008 (UTC)
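
For readers unfamiliar with the term, here is a small present-day sketch of a jump table; the opcode numbering and handler names are invented for the illustration. The multiway decision is made by a single indexed lookup rather than by a chain of conditional tests.

```python
# Illustrative jump table: one indirect transfer of control, no if/elif chain.

def op_add(a, b): return a + b
def op_sub(a, b): return a - b
def op_mul(a, b): return a * b

JUMP_TABLE = [op_add, op_sub, op_mul]   # index = opcode

def dispatch(opcode, a, b):
    return JUMP_TABLE[opcode](a, b)     # the "branch" is this indexed lookup

print(dispatch(0, 2, 3))   # 5
print(dispatch(2, 2, 3))   # 6
```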

Shannon's thesis

Claude Shannon founded digital design. Open any electrical engineering book and you will see what Shannon did. This is a link to his thesis. --Ancheta Wis (talk) 10:07, 27 January 2009 (UTC)

"In his 1937 MIT master's thesis, A Symbolic Analysis of Relay and Switching Circuits, Claude Elwood Shannon 'proved' that Boolean algebra and binary arithmetic could be used to simplify the arrangement of the electromechanical relays then used in telephone routing switches, then turned the concept upside down and also proved that it should be possible to use arrangements of relays to solve Boolean algebra problems."

This isn't the same as "implementing" a circuit. However ground-breaking his thesis, it provided a proof, not an implementation. Follow the wikilinks. All we have is words to communicate here; we do need to be able to understand what they mean to make progress on this issue. --TraceyR (talk) 10:42, 27 January 2009 (UTC)

Thank you for taking this to the talk page, which I propose be the venue for improving the article: "In 1937, Shannon produced his master's thesis[61] at MIT that implemented Boolean algebra using electronic relays and switches for the first time in history." In this sentence, implemented refers to George Boole's work, which Shannon reduced to practice. Proof was established in the nineteenth century, before Shannon, by Boole. In other words, Shannon implemented Boole, with Boolean logic gates. In turn, successive generations of engineers re-implemented these logic gates in successive, improved technologies, which computing hardware has taken to successively higher levels of abstraction.

As a metaphor, take Jimbo Wales' statement of principle for Wikipedia. All successive editors implement Wales' vision. In the same way, Shannon implemented Boole.

If you have improvements to the article, I propose we work through them on the talk page. --Ancheta Wis (talk) 11:18, 27 January 2009 (UTC)

I think I see the disconnect: some things can be viewed as purely academic and theoretical; Boole's system of logic might be viewed in this light. But when Shannon expressed Boole's concepts in hardware (which had been done in an ad-hoc way earlier) he showed AT&T that there was another way to build the PSTN, which at one time was completely composed of humans doing the switching of telephone conversations. Today of course, this is all automated. So Shannon's accomplishment was essentially to provide an alternative vocabulary for the existing practice and mindset of the telephone company which in 1937 was analog circuitry. --Ancheta Wis (talk) 11:34, 27 January 2009 (UTC)
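
To make the "Shannon implemented Boole" point concrete, here is a toy model of the correspondence his thesis formalized; it is an illustration written for this discussion, not something drawn from the thesis itself. Contacts in series behave as AND, contacts in parallel as OR, and a normally-closed contact as NOT.

```python
# Toy relay-contact algebra: series = AND, parallel = OR, normally-closed = NOT.

def series(*contacts):    # conducts only if every contact conducts
    return all(contacts)

def parallel(*contacts):  # conducts if any contact conducts
    return any(contacts)

def nc(contact):          # normally-closed contact: conducts when the coil is off
    return not contact

# A two-switch "hallway light" circuit, equivalent to exclusive-or
def hallway_light(x, y):
    return parallel(series(x, nc(y)), series(nc(x), y))

for x in (False, True):
    for y in (False, True):
        print(x, y, hallway_light(x, y))
```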

Here is a proposed sentence and reference:

--Ancheta Wis (talk) 12:54, 27 January 2009 (UTC)

That looks fine. Go with it. --TraceyR (talk) 10:26, 28 January 2009 (UTC)

Emil Post's machines

I need to put in a plug for Emil Post's work. His formulation of the Turing machine is simpler and Post was actually earlier than Turing, but he failed to publish early enough. That is actually the reason I left in the 'and others'. But, c'est la vie. Maybe the Post-Turing machine will gain currency in future Category:Computational models. --Ancheta Wis (talk) 18:36, 28 January 2009 (UTC)

Perhaps Post's "worker" can be regarded as a "machine" or perhps not. Either way, the evidence seems to point to Turing's 'On Computable Numbers' paper as having had considerable influence on subsequent developments. If you think it only fair to revise my edit, is 'others' (plural) the right word—maybe just refer to Post. I think there is a serious omission from the article in that it does not make any reference to Turing's Automatic Computing Engine design, which had important differences to von Neumann's 'First Draft' design. Incidentally, it is easy to underestimate the very close transatlantic co-operation during the second world war—Hodges says that Turing cited the von Neumann's paper in his own 1945/46 ACE paper.TedColes (talk) 23:06, 28 January 2009 (UTC)
I have No Problem with your edits as I respect your work. Perhaps we can also use the first-hand memoirs from First-Hand:History of the IEEE Global History Network to entice more editors to contribute here. --Ancheta Wis (talk) 01:47, 29 January 2009 (UTC)
I was not aware of Networking the History of Technology—I am not a member. But it looks like a potentially excellent and authoritative source. TedColes (talk) 06:56, 29 January 2009 (UTC)

Shannon and Stibitz

Since Stibitz is mentioned in the same paragraph as Shannon, there is a suggestion that Stibitz's work was based on Shannon's thesis. If this is the case, perhaps this should be stated explicitly (and mentioned in the Stibitz article too). If not, maybe a new paragraph is needed. --TraceyR (talk) 14:02, 29 January 2009 (UTC)

Or, since they both worked for Bell labs, connect with more text.
A new paragraph would be less work. --Ancheta Wis (talk) 15:03, 29 January 2009 (UTC)
If Stibitz knew of Shannon's thesis and used it in his work, the article ought to reflect this. Is there citable evidence to enable this link to be made? That both worked for Bell is certainly circumstantial evidence, but is it enough to make the link?--TraceyR (talk) 15:21, 29 January 2009 (UTC)

Hatnote mess

Many of the section hatnotes read like non sequiturs. Others "belong" in other sections. I don't have the time to sift through them all myself though. –OrangeDog (talkedits) 18:37, 29 January 2009 (UTC)

I appreciate ArnoldReinhold's edits; they show that the flat memory model is a definite advance on the delay line memory model that early programmers had to deal with; however the current style of programming did not arise from nothing. If the deleted edits were unclear, then we might have to give an example of the contortions that programmers had to go through when solving a problem in the early days. Hardware-independent programming did not exist in the early days. Even today, operating system-independent programming is not a given: the API is typically OS dependent. In the absence of contributions to the article in this vein, consider how one would have to program if the items in memory were to decay before they were reused -- one would be forced to refresh critical data before the delay time had elapsed. --Ancheta Wis (talk) 19:01, 24 May 2009 (UTC)

You seem to be implying that refreshing memory was the programmer's responsibility, which it wasn't. A better example might be the programming contortions required to access the early magnetic drums. --Malleus Fatuorum 19:07, 24 May 2009 (UTC)
That was the point of the deleted text (on accessing the magnetic drums). --Ancheta Wis (talk) 21:37, 24 May 2009 (UTC) rvv --Ancheta Wis (talk) 05:25, 1 September 2009 (UTC)
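
As a concrete illustration of those contortions, the sketch below uses invented drum parameters (50 words per track, 3 word-times per instruction); it is not modelled on any particular machine. The successor of each instruction had to be placed so that it arrived under the read head just as execution finished, otherwise the program waited for most of a revolution.

```python
# Rough sketch of drum-latency-aware instruction placement, with assumed geometry.

WORDS_PER_TRACK = 50   # assumed words per drum track
EXEC_TIME = 3          # assumed word-times to execute one instruction

def wait_time(current_addr, next_addr):
    """Word-times spent waiting for next_addr to rotate under the read head."""
    ready_at = (current_addr + EXEC_TIME) % WORDS_PER_TRACK
    return (next_addr - ready_at) % WORDS_PER_TRACK

print(wait_time(10, 13))   # 0  -- optimally placed successor
print(wait_time(10, 12))   # 49 -- just missed; wait nearly a full revolution
```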

Voltages ... were ... digitized

The lead summary states: "Eventually the voltages or currents were standardized, and then digitized". Could someone explain how voltages or currents were digitized? In what way(s) was this breakthrough made? I thought that my PC used 'analogue' power. Many thanks. --TraceyR (talk) 07:42, 30 April 2009 (UTC)

You can look up the voltages in the successor to the TTL databook. The logic series was 5400, then 7400, then 4000, etc. The 1970s 7400 Low power: "1.65 to 3.3V". We need an article about this, from 28V for relays, successively lower as power consumption became greener. Maybe WtShymanski can step in? --Ancheta Wis (talk) 21:25, 30 April 2009 (UTC)
When looking up DTL (1961) I see the levels were -3V and ground. So you can see the voltages were digitized from the beginning. --Ancheta Wis (talk) 00:41, 1 May 2009 (UTC)
Here is a handy table for the different logic families. --Ancheta Wis (talk) 00:50, 1 May 2009 (UTC)
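
To illustrate what "digitized" means here, the sketch below reads a voltage against the familiar 5 V TTL input thresholds (0.8 V and 2.0 V); any voltage inside a band counts as the same binary value, which is why noise smaller than the margin cannot change the result. Other logic families use different numbers, so treat the constants as examples rather than a general rule.

```python
# Reading an input voltage as a logic level, using standard 5 V TTL thresholds.

V_IL_MAX = 0.8   # highest voltage guaranteed to read as logic 0
V_IH_MIN = 2.0   # lowest voltage guaranteed to read as logic 1

def read_logic_level(volts):
    if volts <= V_IL_MAX:
        return 0
    if volts >= V_IH_MIN:
        return 1
    return None      # forbidden region: behaviour not guaranteed

for v in (0.2, 0.7, 1.4, 2.4, 3.3):
    print(v, read_logic_level(v))
```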
If anyone can tell me what that paragraph is supposed to be saying, I'll buy him/her a donut. The whole lead is garbage and must be rewritten. For every von Neumann chip out there there are probably a half-dozen Harvard-style chips - let's not lie excessively in the first paragraph. --Wtshymanski (talk) 18:49, 24 September 2009 (UTC)
Ever notice how a perfectly clear Wikipedia article, by gentle stages, eventually becomes something that looks like the transcript of the speech of a cat lady having a bad day? One's confidence in the ever-upward spiral of Wikiprogress is shaken. List all the synonyms, show how it's spelled in different varieties of English, and, perhaps, include a diatribe on how it was *really* invented by Tesla/a Hungarian/a Canadian/an ET - put all that in the first sentence with enough links and footnotes, and you're well on the way to mania. --Wtshymanski (talk) 20:17, 24 September 2009 (UTC)

Old discussions

Old discussions have been moved to archives - use the navigation box to switch between them. I used the much nicer {{archives}} and {{archivesnav}} templates as found on the Personal computer talk pages to spruce up navigation a little. Remember when creating new archive pages that they must have a space in the title - talk:History of computing hardware/Archive 3 would be the next page, for example. --Wtshymanski (talk) 01:35, 25 September 2009 (UTC)

Commodore

Call me a massive geek, but surely the C64 and Amiga deserve some mention in here. The advancement in personal computers isn't just down to the number of transistors - those computers added some really creative features (particularly with sound hardware) which we now take for granted. Their rise and fall (there's a certain book by a congruent title) is a huge chapter in the history of computing, surely.. 86.154.39.2 (talk) 15:15, 28 August 2010 (UTC)

Introduction

I reached this article looking for a reference to the MOSAIC computer (Ministry of Supply Automatic Integrator and Calculator) and wondered if the following Introduction might be short enough and apposite:

Computing hardware subsumes (1) machines that needed separate manual action to perform each arithmetic operation, (2) punched card machines, and (3) stored program computers. The history of (3) relates largely to (a) the organization of the units to perform input and output, to store data and to combine it, into a complete machine (computer architecture), (b) the electronic components and mechanical devices that comprise these units, and (c) higher levels of organization into 21st century supercomputers. Increases in speed and memory capacity, and decreases in cost and size in relation to compute power, are major features of the history.

Five lines instead of 36. The present Introduction could become the first section, headed, say, Overview, and the pre-stored-program coverage extended to mention the abacus, the National Accounting Machines that "cross footed" under control of a "form bar" that facilitated table construction using difference methods, and machines of the mid-20th century typified by the Brunswiga (not sure of spelling) and Marchand. The overlap of punched card and stored program computers, by dint of control panels and then card programmed computers, could be mentioned. Michael P. Barnett (talk) 01:47, 24 December 2010 (UTC)

Used your 5-line suggestion for the Introduction. Please feel free to incorporate the remainder of your contribution into the article. Thank you for your suggestions. --Ancheta Wis (talk) 11:54, 26 December 2010 (UTC)

Argument at IEEE 754-1985

There is currently a slow edit war at IEEE 754-1985. I put down the Z3 as the first working computer, as stated in this article, and it was reverted. I pointed out this article as a better venue to argue matters about history but they can't be bothered to do that so I'm doing it instead. Discussion at Talk:IEEE 754-1985#Z3 first working computer. 17:50, 8 February 2011 (UTC)

Punched cards derived from Jean-Baptiste Falcon (1728)

Please put in a note that the idea of punched-card-driven looms originated from the French mechanic Jean-Baptiste Falcon in 1728, although Falcon never succeeded in building one himself. —Preceding unsigned comment added by 91.97.182.235 (talk) 15:12, 13 February 2011 (UTC)

I can't see why; they were a development of the perforated paper rolls being used for the purpose, and he didn't make it work. Who used the perforated paper rolls first and when would be more relevant. Also relevant at this level of detail, possibly, would be the barrels with pins which were used before that for controlling automatons, and as far as I know Hero of Alexandria used them first. Dmcq (talk) 19:07, 13 February 2011 (UTC)

Transition from analog to digital

I propose to rename the analog section in order to preserve the content that was removed.

Alternatively, perhaps a new section with this name might be inserted to contain that content. --Ancheta Wis (talk) 11:15, 5 May 2011 (UTC)

The business about accuracy is practically irrelevant. Digital computers are more convenient. It is like the difference between solving geometry problems the Greek way and solving them using Cartesian coordinates. The Cartesian coordinates may be more long-winded in some cases but they just work. Dmcq (talk) 11:09, 9 May 2011 (UTC)
Noise is relevant. A usable signal to noise ratio is the fundamental reason that digital circuits are more accurate than analog circuits. --Ancheta Wis (talk) 11:22, 9 May 2011 (UTC)
You mean precision, not accuracy. Doesn't matter how many bits you have in the number if it's the wrong number. --Wtshymanski (talk) 13:21, 9 May 2011 (UTC)
Yes, you are quite right about the distinction. Thank you. --Ancheta Wis (talk) 13:58, 9 May 2011 (UTC)
I believe the original idea behind ENIAC was that it should emulate a differential analyser, but that idea was abandoned early on as lacking in vision. Even Ada Lovelace and Babbage knew better. Dmcq (talk) 17:12, 9 May 2011 (UTC)