
Talk:History of computing hardware/Archive 3


Was the Harvard Mark I "Turing Complete"? -- Revisited

We currently label the Mk I as NOT Turing complete - presumably because of a lack of jump instructions. There was some discussion of this on this talk page back in 2008.

It must be noted that:

  • Turing completeness says that a machine is considered to be Turing complete if it can emulate a Turing complete machine.
  • One instruction set computer points out that a machine that implements nothing more than the 'SUBLEQ' instruction is Turing complete.
  • Harvard Mark I says that the Mk I could run from a looped paper tape - so even without a formal jump instruction, it could run the same set of instructions over and over indefinitely.
  • The following program demonstrates that it is possible to emulate a SUBLEQ machine with code inside a single 'while(1)' loop - which the Harvard Mark I could have implemented via paper tape looping:
// Initialization:
typedef signed char byte ;   // signed, so that mem[b] can be tested for <= 0
int lut [ 256 ] ;            // lut[v+128] == 1 when v <= 0, == 0 when v > 0
byte mem [ 64 ] = { 0 } ;    // ...whatever... : the initial state of memory in the SUBLEQ machine (size arbitrary)
int PC = 0 ;                 // The SUBLEQ machine's program counter.

int main ( void )
{
  int i ;
  for ( i = 0 ; i < 256 ; i++ )
    lut [ i ] = ( i <= 128 ) ? 1 : 0 ;  // 129 ones (values -128..0), then 127 zeroes (values 1..127).
  // Runtime:
  while ( 1 )  // (Implemented via a paper tape loop)
  {
    // Read instruction operands from the program counter location.
    int a = mem[PC++] ;
    int b = mem[PC++] ;
    int c = mem[PC++] ;
    // Perform subtraction:
    mem[b] -= mem[a] ;
    // Use lookup table to extract sign of mem[b] so that:
    //   c is multiplied by 1 and added to the program counter if mem[b] <= 0,
    //   c is multiplied by 0 and added to the program counter if mem[b] > 0.
    // (A relative jump; this variant of SUBLEQ is equally Turing complete.)
    PC += lut[mem[b]+128] * c ;
  }
}
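To make this concrete, here is a minimal (hypothetical) memory image that could replace the mem initialization above: a single self-referential instruction that zeroes a cell and then branches back to itself forever, exercising both the subtraction and the conditional relative jump:

// One SUBLEQ instruction, "3 3 -3", at address 0:
// mem[3] -= mem[3] leaves 0; since 0 <= 0, PC += -3 returns PC to address 0,
// so the machine loops on this one instruction indefinitely.
byte mem [ 4 ] = { 3, 3, -3, 42 } ;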

Ergo, the Harvard Mark I was indeed Turing Complete. This is rather important IMHO. SteveBaker (talk) 15:26, 3 May 2012 (UTC)

Why is 'Turing completeness' important? It is not synonymous with 'general purpose' - and that certainly would not be claimed for the Harvard Mark I. --TedColes (talk) 15:43, 3 May 2012 (UTC)
It's important because the Church-Turing thesis says that all computers that are Turing complete are equivalent (given enough time and memory). If the Mk I is Turing complete - then (with enough time and memory) it could emulate any modern computer - so we'd have to say that it should be considered to be "general purpose". Turing completeness is what truly separates the modern concept of the word "computer" from some mere calculator or sequencer. SteveBaker (talk) 16:52, 3 May 2012 (UTC)
It seems that you are correct. However, we have a problem. Here in Wikipedia, we can't publish original research. What we need is a reliable source that claims this (or the contrary). ---- CharlesGillingham (talk) 20:02, 3 May 2012 (UTC)
I'm painfully aware that this discovery is my own WP:OR - and therefore problematic without reliable sources. However, the entire section History_of_computing_hardware#Early_computer_characteristics has not one single reference - so why should this article have an unreferenced falsehood rather than an unreferenced truth? We do state as a fact that the Mk I is not Turing complete - and that is stated without references. Per WP:V we can only do that if this statement is uncontroversial. Well, following my reasoning, it most certainly is controversial because both you and I agree that it's untrue. Hence until/unless we can find a WP:RS we have three alternatives:
  1. Leave the article as it is - with an unreferenced, controversial (and seemingly false) statement.
  2. Change the article to say that the Mk I is indeed Turing complete - leaving an unreferenced (but evidently true and hopefully uncontroversial) statement.
  3. Remove that table (or at least the "Turing complete" column or the "Mk I" row) on the grounds that it "is likely to be challenged" and has no WP:RS to back it up (per WP:V).
I don't think (1) is acceptable - so we either need to change the (unreferenced) "No" to an equally unreferenced "Yes" - or nuke the table (per WP:V) on the grounds that it's both un-sourced and controversial. Ideally of course we should find a reliable source - but until we do, the article shouldn't contain an unreferenced statement that we now know to be false.
SteveBaker (talk) 12:46, 7 May 2012 (UTC)
Turing completeness is clearly controversial, so I would favour removing that column from the table. The nuclear option of deleting the whole table seems extreme, particularly as the transcluded form has been removed from a whole host of articles. As regards the lack of references, readers can look to the articles about the individual machines. --TedColes (talk) 17:04, 7 May 2012 (UTC)

I think the Turing completeness column is useful to our readers as a rough guide to how the technology evolved. The controversial entries should have a footnote that says later researchers have attempted to show the machines in question were Turing complete but those capabilities were not envisioned when the machines were developed and used. --agr (talk) 10:39, 9 May 2012 (UTC)

Only if all the entries can be verified from independent sources, and not original research, should this column be retained.--TedColes (talk) 11:38, 9 May 2012 (UTC)
This table is only useful as a "rough guide" if it actually contains true facts. Before I edited it, the article said that the Mk I is definitely not Turing complete - which was clearly false. That's not a "useful rough guide" - it's a misleading untruth!
The historical matter of whether the machine's developers were trying to make the machine Turing complete is moot because the Church-Turing thesis wasn't widely accepted or its implications understood until Kleene's paper was published in the early 1950s...six or more years after the Harvard Mk I entered service. Before Church-Turing, it really didn't seem to matter a damn whether a machine was Turing complete or not because nobody knew that Turing-completeness was the key to making a fully general-purpose computer. They couldn't have known how important that is - and therefore were unlikely to build specific features into their machines to ensure that they crossed that threshold. It's not like researchers were pushing steadily towards Turing-completeness - so the column of Yes's and No's doesn't really exhibit a trend in the design of early computers.
Neither I nor WP:V has any problem with putting unsourced material into the encyclopedia provided that it's not controversial. You don't need to find sources for "The sky is blue", "2+2=4" or "My laptop is Turing complete". But as soon as a statement becomes controversial, you either have to find references for it or remove it. Personally, I'm 100% convinced that the Harvard Mk I was Turing complete - and IMHO our article wasn't just controversial, it was downright wrong. But my argument alone should suffice to convince everyone that the statement that the Mk I is not Turing complete is at the very least controversial. So no matter what, the article can't say that.
The decision then comes down to either:
  • If everyone accepts my argument (above) - then a "Yes" next to the Harvard Mk I isn't controversial - and we can change the article to say that without a reference (although that would still be nice to have)
...OR...
  • One or more people here disagree with (or don't understand) my argument - so the table is controversial whether it says "Yes" or "No". Since it's unreferenced material - it must be deleted in order to resolve the controversy.
SteveBaker (talk) 13:21, 9 May 2012 (UTC)
If we don't know we should just put in a dash; we don't have to say yes or no. I know some people just can't stand uncertainty so will argue forever about grey things like that, and personally I'm no fan of the Turing column so I wouldn't miss it. The real point is that people couldn't be bothered with anything like that. Zuse for instance wanted to produce programmable scientific calculators that individual engineers or small groups could use, and for that price was a main constraint. Colossus was built to crack codes. Universality just wasn't one of the things the early pioneers were interested in. Compare them against the Manchester Baby, which was easily universal but totally impractical, built just to test out some ideas, especially the Williams tube memory. Universality doesn't require much, as can be seen from the game of Life. I think the Baby can be celebrated as the first computer with a modern structure, having a stored program rather than all the configuring of ENIAC, which was an automated bunch of electronic tabulators in effect. If anything I'd put down the main innovation in them or what they were for rather than the Turing completeness column. Perhaps change the 'Programming' column to 'Description' and add under the Baby for instance "Testbed for Williams tube memory. First stored program computer." Dmcq (talk) 16:53, 9 May 2012 (UTC)

Flamm citations

To anon 86.177.118.203: I patched in a phrase in the new footnote 1 which I hope matches your intent. Please feel free to alter my patch to your contribution. --Ancheta Wis   (talk | contribs) 03:25, 25 January 2013 (UTC)

In the same light, I propose to use 'accelerated' rather than 'underpinned' in your contribution because the article makes it clear that there were funding sources other than military contract, in both US and Germany. I do not deny that IC-based computers in military systems (1958-1960s) were materially funded by US (& likely USSR) contracts. --Ancheta Wis   (talk | contribs) 04:06, 25 January 2013 (UTC)

Sorry, going to be a pain! With regard to the USSR, I feel the word underpinned to describe government involvement is already an understatement; I feel underpinned is also the appropriate term to use for development elsewhere. Also, the sources say that the investment from the private sector pales into insignificance when compared with the resources ploughed in by government. Just so it's not my word (all quotes below are from reviews of Flamm's studies): "As Flamm points out, one of the crucial features of the computer was the role played by government and universities in the early stages of research and development when a great deal of 'Knightian' uncertainty (as opposed to risk) deterred private companies from significant commitment. ... [In Japan,] the Ministry of International Trade and Industry was crucial". An "insignificant" commitment from the private sector, according to the sources cited, for early stages of computer development and the computer market. According to Flamm (whose account, at least in my understanding, is what we must accurately represent), governments more than "accelerated" the development and commercial viability—it wouldn't have happened without them. "the U.S. government, especially its military arm, has played a crucial role at critical times in bringing the computer out of the experimental stage to its present strong position in the marketplace". Again: "the government's multifaceted involvement ... [included that of] research sponsor, principal customer, and regulator of computing technology". And again: "government support is crucial because of the financial disincentives for private investors to be involved in long-term Research and Development". So I'm cheering for a slightly more emphatic term than accelerate, at least for early development and the creation of a viable market!
86.177.118.203 (talk) 00:06, 26 January 2013 (UTC)
I appreciate your response, and have reverted my wording. Thank you for your précis of the Flamm works.
Computing, IC engineering, Arpanet, Quantum cryptography, and so forth, would look very different with different/alternative funding histories. And these topics are germane to the article. -Ancheta Wis   (talk | contribs) 00:47, 26 January 2013 (UTC)

categories

What category (or categories) is appropriate for machines that use integrated circuits, but don't put the entire processor on a single chip? In other words, what category covers what History of computing hardware (1960s–present) calls "Third generation" computers?

In other words, what category goes in the blank of the following?:

--DavidCary (talk) 14:55, 23 August 2013 (UTC)

Perhaps category: minicomputers ? --DavidCary (talk) 14:55, 23 August 2013 (UTC)

The category: minicomputers covers many of them, but it doesn't cover other multi-chip processors such as the Apollo Guidance Computer, the Cray-1, the first hardware prototype of the Motorola 6800, etc.

Should we start a new category, perhaps category: integrated circuit processors? --DavidCary (talk) 14:55, 23 August 2013 (UTC)

Archimedes' method

Archimedes' method of performing calculations was the use of mechanical balance (think see-saw) of countable items versus the object being measured. This method was used for estimating the number of grains of sand in the universe, etc. (see The Sand Reckoner).

Thus Archimedes' method of calculation was very concrete, as befits his status as engineer, inventor, and physicist. For this reason I propose to add his method to history of computing rather than to this article. I am pretty sure there is already a main article about this. --Ancheta Wis   (talk | contribs) 02:19, 30 September 2013 (UTC)

That sounds reasonable to me. Bubba73 You talkin' to me? 02:28, 30 September 2013 (UTC)
Done. --Ancheta Wis   (talk | contribs) 02:40, 30 September 2013 (UTC)

Claim that Zuse is commonly known as the "inventor of the computer" is wrong.

The lede previously claimed that Zuse was commonly known as *the* "inventor of the computer" and the only citations given are to discussions in blogs. Published histories of computing have variously proposed that the "inventor of the computer" is Babbage (who designed the first programmable computer), Aiken (for the Harvard Mark I, which was a highly influential electromechanical computer designed and built around the time of Zuse's Z3), Atanasoff (for the first electronic digital computer), Eckert and von Neumann (for the stored program concept), and several other milestones. Zuse's Z3 could certainly support the claim of his being the creator of the first working electromechanical programmable computer, but this does not imply that he is commonly known as the inventor of the computer. Wikipedia articles should not be used to push non-mainstream views.

For now I have moved this claim down to the section on Zuse's computer, but I think that either a separate section discussing the complex issue of who was *the* inventor of the computer should be added, or this claim should be removed (in any case, the claim needs reputable citations, not just blogs). 198.255.141.250 (talk) 16:33, 22 December 2013 (UTC)

changes

Hi, the article is rather chaotic and unorganized. It's very difficult for a casual reader to make sense of the important developments and stages. There is also a lot of important information missing.Noodleki (talk) 19:19, 7 January 2014 (UTC)

Noodleki, Thank you for responding! Now, using the WP:BRD protocol, I propose reverting myself, and adding inline tags to indicate what ought to be worked out?
To all editors, comments on my proposal? In other words, start with Noodleki's changes, and tag Noodleki's edits with concerns.
  1. For example, I think it is POV to call the earliest known computing devices primitive.
  2. The invention of zero isn't even marked in the article, and zero was momentous, in my opinion.
  3. The recognition that the carry operation was a form of branching...
  4. The upcoming quantum computers are only briefly mentioned, etc., etc.
  5. Software is only tangentially mentioned. ...
  6. Or ... some other proposal ...
  7. Such as agreeing on an outline of the changes? --Ancheta Wis   (talk | contribs) 20:35, 7 January 2014 (UTC)

Hi, I understood from the above that you would revert. I think your suggestions equally apply to the version as it stands, although I think software wouldn't necessarily come under this article's purview. Thanks.Noodleki (talk) 21:20, 8 January 2014 (UTC)

Noodleki, you are welcome to interpolate inline tags, or other comment on the talk page. Regarding your vision of the article, I would be interested in exactly what missing items you are noting. The development of the hardware follows the published history, for example. A retrospective view necessarily distorts what actually happened. If we were to follow Babbage's dream, for example, we would have seen steam-powered computation. But that is not the way computation actually developed. --Ancheta Wis   (talk | contribs) 01:02, 9 January 2014 (UTC)
I'm afraid I don't understand what you mean about inline tags. You said above that you propose to revert yourself, but you don't seem to be doing this. The changes in the article are layout improvement and better organization of material, and more information on key developments such as Babbage and Colossus.Noodleki (talk) 11:31, 9 January 2014 (UTC)
The WP:BRD protocol requires a discussion - the reverter should explain the reasons behind his/her revert, which is something you aren't doing. Your suggestions apply equally to the article as it stands, and I've already explained the basis for my changes. You also agreed earlier to revert it yourself, and I don't understand why you are not doing this.Noodleki (talk) 11:33, 12 January 2014 (UTC)

Noodleki, I am waiting for the other editors to respond. Your changes for Babbage fit nicely in the nineteenth c. and I suggest that you add them to that section. However I do not agree with your characterization of 'chaotic' and suggest to you that there is a logical flow in the article already. It goes a bit far to place as much emphasis on Babbage as your version does, as his design required repeatable manufacturing tolerances beyond the capacities of the nineteenth c. It took another century. __Ancheta Wis   (talk | contribs) 12:00, 12 January 2014 (UTC)

I think Babbage is underemphasized. After all, he was the first to describe a proper computer. '1801: punched card technology 1880s: punched card data storage' is a very strange set of sections and there is far too much emphasis on Hollerith, whose invention was a simple calculating device, similar to Pascal's invention. The article also lacks a 'flow' - it's very disjointed and doesn't explain clearly the important stages. The layout could be greatly improved, the intro shortened, the last section removed as there is a dedicated article for it already. There is also little information on analog computers. All these deficiencies were removed with my edit.Noodleki (talk) 15:06, 12 January 2014 (UTC)
Babbage's work is one stage in the history of computing hardware. There is more to computing than Babbage. You are welcome to flesh out Babbage's role, but he is not center stage today. The current article states clearly that the pace of development is unabated. --Ancheta Wis   (talk | contribs) 04:42, 14 January 2014 (UTC)
Here is an example of an inline tag.[discuss] --Ancheta Wis   (talk | contribs) 04:55, 14 January 2014 (UTC)
As an example of the pace of computing hardware development, there are multiple streams of development for qubits which are in progress right now. There is no clear winner for implementation, philosophical explanation, or technological exploitation yet. But large amounts of money are being risked right now, as in the Babbage case. IBM is taking yet another path, to make things even more interesting. __Ancheta Wis   (talk | contribs) 12:57, 14 January 2014 (UTC)
I'm not suggesting Babbage is 'center-stage'. I don't know why you bring up qubits - that could go in the Post-1960 article. Anyway, you still haven't provided an explanation for your revert, and you haven't reversed it, despite saying you would. So, I will provisionally put those changes back in, and you can point out problems that you might have with inline citations. Noodleki (talk) 12:04, 16 January 2014 (UTC)

vacuum tube computers

Is there a Wikipedia article dedicated to vacuum tube computers?

I think there's enough material in this article about vacuum tube computers to create an article (WP:SPINOUT) focused on that category of computers.

Usually when there exists both a Wikipedia category about some topic and also a Wikipedia "List of" article about that same topic, there is a WP:EPONYMOUS article dedicated to exactly that topic.

For example, there is both a list of transistorized computers article and a category: transistorized computers, so I am glad to see there is also a transistor computer article.

I see there is both a list of vacuum tube computers article and a category: vacuum tube computers, so I am surprised that there is apparently no article dedicated to vacuum tube computers.

When I click on vacuum tube computers, hoping to find an article dedicated to them, today I find it is a redirect to vacuum tube, which has much less information (mostly in vacuum tube#Use in electronic computers) about such machines than this "History of computing hardware" article.

Is there an article that more specifically discusses vacuum tube computers, to which vacuum tube computer and vacuum tube computers should redirect? --DavidCary (talk) 18:37, 28 May 2015 (UTC)

I think that there definitely should be such an article. The only article I know of (and can find) is List of vacuum tube computers, which you already know about. Stored-program computer is also relevant, but doesn't have enough information. Bubba73 You talkin' to me? 02:34, 29 May 2015 (UTC)

Wilbur machine

Hi,

Does the Wilbur machine (an analog computer, on display in the science museum in Tokyo) fit in the history of (analog) computers, or did it have any significance? --Butch (talk) 13:40, 22 November 2015 (UTC)

@Butch: Wilbur could solve up to 9 simultaneous linear equations, but was not programmable for other applications,[1] such as the orbit of Mars, or even the law of falling bodies. It was 'hard-coded', so to speak, and thus inflexible, compared to software-programmable devices. --Ancheta Wis   (talk | contribs) 14:38, 22 November 2015 (UTC)
Thanks for the responses. The Sir William Thomson device mentioned in the article was, as far as I can see, also not programmable (only 'settable'). So my question remains: should the Wilbur machine be mentioned in the article? (Notes: 1) Maybe not under 'analog computers' but under 'early devices'? 2) In the added reference to MIT the keyword 'analog computer' is used! 3) Maybe the Wilbur machine should have its own lemma. Anybody from MIT?) BTW Ancheta Wis, have you ever seen a software-programmable analog computer?--Butch (talk) 08:18, 23 November 2015 (UTC)
Analog machines are physical configurations. As arrangements of physical objects, they obey physical laws, such as the mass flow of water, or metal balls, for example. The mathematical solutions in some nomogram, say a Smith chart, are typically mathematical transformations which are geometrical shapes, not mathematical equations. So no, the analog machines are not software, they are typically hardware, used to embody a specific mathematical operation. (Think slide rule or planimeter.) By the way, a quantum computer, using some configuration of qubits would also embody some quantum mechanical experiment, as the analog for something else. --Ancheta Wis   (talk | contribs) 09:25, 23 November 2015 (UTC)
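To illustrate the point that an analog device embodies a specific operation, here is a minimal sketch (in C, purely for exposition; hypothetical, tied to no particular machine) of the slide-rule principle: multiplication carried out as the physical addition of two logarithmic lengths.

#include <math.h>
// The slide-rule principle: the product a*b is embodied as the sum of two
// lengths proportional to log(a) and log(b); sliding the scales adds lengths.
double slide_rule_multiply ( double a , double b )
{
    return exp ( log ( a ) + log ( b ) ) ;  // adding lengths = multiplying values
}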
Thanks for explaining the analog computer, but I happen to know already what an analog computer is. (During my training I wired/set up an analog computer to simulate a moon landing! For some years I was also a maintenance technician for an analog computer in a missile guidance system!) Also, adding quantum mechanics does not answer my question. So my main question remains: should the Wilbur machine be mentioned in this article (or maybe in an article of its own)?--Butch (talk) 09:45, 23 November 2015 (UTC)
Since this is a wiki, you are free to contribute to the article. --Ancheta Wis   (talk | contribs) 12:55, 23 November 2015 (UTC)
Thanks. Of course I know I can add text to the article (I've done it before ;-) I just ask for other people's opinions on whether the Wilbur machine is worth mentioning in this article. Seems a simple question to me.--Butch (talk) 13:04, 23 November 2015 (UTC)

Probably, a separate article would fare better, along the lines of the Atanasoff–Berry computer, which attacked the same application (systems of linear equations). Or, a contribution to System of linear equations, including both Atanasoff–Berry computer and Wilbur machine would add interest to a math article. --Ancheta Wis   (talk | contribs) 13:14, 23 November 2015 (UTC)

@Butch, I see that Google's quantum computer from D-wave is also a hard-coded device. That is, it embodies some quantum-mechanical experiment. In Google's case, it was quantum annealing. So we are back to the limitations of the Wilbur machine; like the Wilbur machine, the current Google machine is not general purpose, even though it ran 10^8 times faster[2] than a conventional computer working on the same problem,[3] simulated annealing. --Ancheta Wis   (talk | contribs) 15:53, 9 December 2015 (UTC)
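For comparison, here is a minimal sketch of the acceptance rule at the heart of classical simulated annealing (a Metropolis-style formulation; illustrative only, not Google's or D-Wave's actual code): a candidate move is always accepted if it lowers the energy, and accepted with probability exp(-dE/T) if it raises it.

#include <math.h>
#include <stdlib.h>
// Metropolis-style acceptance test for simulated annealing:
// always accept an energy decrease; accept an increase dE > 0
// with probability exp(-dE/T), where T is the current temperature.
int accept_move ( double dE , double T )
{
    if ( dE <= 0.0 ) return 1 ;
    return ( (double) rand ( ) / RAND_MAX ) < exp ( -dE / T ) ;
}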

References

  1. ^ Wilbur machine, 1930s, MIT. Accessed 2015-11-22.
  2. ^ However IBM researchers have previously projected no improvement using quantum annealing
  3. ^ "...Google says [the D-Wave 2X's] quantum annealing can outperform simulated annealing on a single-core classical processor, running calculations about 10^8mtimes faster" Wired, 11 Dec 2015

Ars Magna?

Hi
any particular reason Lull's Ars Magna is not included (or at least referenced) here?
T88.89.219.147 (talk) 23:50, 17 May 2016 (UTC)

There is an undefined reference to "Robinson" in the portion related to Colossus. — Preceding unsigned comment added by 146.18.173.105 (talk) 19:06, 8 June 2016 (UTC)

I had to fall back to find a version of the text without the fragment "Both models were programmable using switches and plug panels in a way the Robinsons had not been." dated January 2014. Might this be what you refer to? --Ancheta Wis   (talk | contribs) 21:09, 8 June 2016 (UTC)
In fact this sentence shows up in the very next edit. --Ancheta Wis   (talk | contribs) 21:29, 8 June 2016 (UTC)
Here is a citation for the tape drives (the Robinsons). --Ancheta Wis   (talk | contribs) 00:55, 9 June 2016 (UTC)

The Ishango bone

There is a picture of this artifact but no mention of it in the text. Such an arrangement is not helpful. Kdammers (talk) 17:53, 12 September 2016 (UTC)

I propose the caption and text "The Ishango bone is thought to be an early tally stick."[1] --Ancheta Wis   (talk | contribs) 01:09, 13 September 2016 (UTC)
  1. ^ The Ishango bone is a bone tool, dated to the Upper Paleolithic era, about 18,000 to 20,000 BC. It is a dark brown length of bone, the fibula of a baboon, with a series of tally marks carved in three columns running the length of the tool. It was found in 1960 in the Belgian Congo. --A very brief history of pure mathematics: The Ishango Bone, University of Western Australia School of Mathematics, accessed January 2007.
How about "prehistoric" or "paleolithic" instead of "early"? Kdammers (talk) 14:15, 13 September 2016 (UTC)

 Done --Ancheta Wis   (talk | contribs) 21:36, 18 September 2016 (UTC)

Stored-program computer & MESM

I assume that MESM, "the first universally programmable computer in continental Europe", that is, present-day Ukraine, should be added to History of computing hardware, before EDVAC. That or EDVAC removed from that section, since it's unclear how it contributes anything there. Or maybe both.

Ilyak (talk) 05:24, 13 March 2017 (UTC)

@Ilyak, Thank you for your note. In the process of tracing the extant entries in the article, I noticed that one of the Xtools entries for the listed computers failed to activate. To me, this indicates that the entry's article is so little visited that Xtools had not yet built up a persistent item in its stores. While revisiting MESM and other Soviet-era entries, the Xtools report for one of them finally popped up for me, but it took several visits to the entry.
What your Good Faith contribution highlights is the nature of the community of editors. We write what we know about. I personally learned of the Strela computer from a 1970s era IEEE Spectrum entry, while others contributed what they knew, such as MESM. It is my observation that our sources for our articles are gathered in the same way -- organically, step by step. Students from Iowa learn about ABC, students from Ukraine learn about MESM, students from UK learn about Turing, students from Penn learn about von Neumann, and so forth.
While adding entries, I used Gordon Bell's Computer Structures, but he had never heard of the Bletchley Park machines, so they were not in this article, at first. In the same way, although there is a Soviet-era computing article, it will take work to add these entries. The article strives for completeness, so MESM ought to be on it. I invite your contribution.
What this article traces is an evolution, from marks on sticks and clay, to dedicated hardware, first powered mechanically, then electro-mechanically, electrically, electronic, and beyond (artificial neural, qubit, etc.). We add what we know. I invite you to do so. --Ancheta Wis   (talk | contribs) 09:49, 13 March 2017 (UTC)

Zuse

I have been curious how IBM had such good knowledge of Zuse. Perhaps a history which details Dehomag can clarify this. --Ancheta Wis   (talk | contribs) 18:12, 23 August 2017 (UTC)

Based on this citation I am interested in the connection between Willy Heidinger and Konrad Zuse. --Ancheta Wis   (talk | contribs) 21:18, 23 August 2017 (UTC)

"History is written by the victors" —Anonymous

This quotation is a paraphrase of Machiavelli, The Prince, ch. XVIII:

Be it known, then, that there are two ways of contending, one in accordance with the laws, the other by force; the first of which is proper to men, the second to beasts. But since the first method is often ineffectual, it becomes necessary to resort to the second. A Prince should, therefore, understand how to use well both the man and the beast.

—ch. XVIII

Since we are seeing a revert war, might we consider:

  1. What good is it to rile up the editors of the article? What purpose is served by trolling the article? There are policies against this.
  2. "Clausewitz had many aphorisms, of which the most famous is 'War is the continuation of politics by other means.' " Might we think the Holocaust was war, but begun 10 years earlier?
  3. Francis Bacon noted "knowledge is power", and counted the invention of gunpowder as an advance of his civilization. Galileo figured out the equations for a falling body because he was paid to do so but they apply directly to gunnery tables. Think ENIAC.
  4. One of the inventors of a new mathematical notation which is just now being applied to the newest programming languages starved to death as a direct result of his membership in the Nazi party.
  5. The footnote #141 Kalman 1960 was applied directly to an aerospace defense application, as implemented in integrated circuits.
  6. The new computer languages of the 1950s forward were applied directly to an aerospace defense application
  7. Elon Musk warns of the application of AI to a new world order. The internet is destroying our political institutions; must we wait any further before designating this as a theater of war?

I am being vague because these statements could be misused against the existing order. I for one wish to preserve the stability of the existing order. --Ancheta Wis   (talk | contribs) 07:58, 8 September 2017 (UTC)

See: The Social Construction of Reality. In other words, as social beings, we belong to social systems which can be at war with each other. Can't we rise above the issues that divide us, and join in building up the social systems that unite us? --Ancheta Wis   (talk | contribs) 08:15, 8 September 2017 (UTC)

I paraphrase the preface to The Answers of Ernst Von Salomon to the 131 Questions in the Allied Military Government "Fragebogen" (this book has never been out of print in Germany since its first publication). Ernst von Salomon wrote: "As I wrote my answers, which would determine whether I lived or died, whether I would remain imprisoned or go free, I got the sense of a vast alien intelligence that had not the slightest interest in my own well-being ..." 08:31, 8 September 2017 (UTC)

"magnetic storage" under "stored program"?

I don't think that "magnetic storage" should be a subsection under "stored program". Magnetic storage isn't necessary for a stored program computer. Bubba73 You talkin' to me? 02:31, 24 September 2017 (UTC)

There is a qualitative difference, akin to reading from scrolls versus codices versus hypertext. We think and program differently in the respective cases. The techniques scale differently as well. Assembler versus FORTRAN versus the web languages. Maybe this takes more planning for the article. --Ancheta Wis   (talk | contribs) 16:02, 24 September 2017 (UTC)

Why no mention of Alonzo Church, who predated Turing?

Turing is known for articulating the idea of a universal computer, but the first description of a universal computer was the lambda calculus, invented by Alonzo Church (who then became Turing's thesis advisor). Doesn't he belong in the same section with Turing? Briankharvey (talk) 20:50, 16 October 2017 (UTC)

Cite it and write it! --Wtshymanski (talk) 20:52, 16 October 2017 (UTC)
This addition is a big step because we would be writing about computer science and abstract machines (such as the lambda calculus) rather than the simple generalization by Turing from paper tape reader and punch. It's a whole new article, history of abstract machines. If the text starts here, it will have to be moved eventually. --Ancheta Wis   (talk | contribs) 22:18, 16 October 2017 (UTC)
@Briankharvey, Other Wikipedians have delved into this history before; see Talk:Post–Turing machine. I'm afraid you are going to need a suitable citation for the claim that Church's work on a universal computer preceded Turing. There were a lot of threads that hit all around the topic. See for example, rough timeline:
  ... --> Bertrand Russell --> Alonzo Church <-- Turing
  ... --> History of computer science, List of pioneers in computer science
01:06, 17 October 2017 (UTC)

Amateur computing

Nothing on amateur computing?

john f 2.26.119.204 (talk) 09:24, 5 December 2017 (UTC)

Importance of NeXT Computer Mention in Article

A NeXT Computer and its object-oriented development tools and libraries were used by Tim Berners-Lee and Robert Cailliau at CERN to develop the world's first web server software, CERN httpd, and also used to write the first web browser, WorldWideWeb. These facts, along with the close association with Steve Jobs, secure the 68030 NeXT a place in history as one of the most significant computers of all time.[citation needed]

This strikes me as opinion, and not necessarily fitting for a topic on computing hardware. Internet history, definitely; however, it is still phrased as opinion. I happen to agree that the NeXT Computer (I believe the NeXTcube) that Tim Berners-Lee used to develop the WWW is historic, and I would add John Carmack's development of Doom on a NeXTstation, but I don't feel this paragraph fits in this article.

Perhaps the history of the WWW, an article on the history of NeXT, video games, etc., but not in this article.

Communibus locis (talk) 21:42, 18 January 2018 (UTC)

reference

The article has a book citation to Reconstruction of the Atanasoff-Berry Computer by John Gustafson. I can't find such a book but there is this paper. Is that it? Bubba73 You talkin' to me? 04:22, 8 April 2018 (UTC)

Too much women bias

Both men and women contributed, but the work of women has been exaggerated and the sources are inaccurate, based on the words of feminist authors rather than neutral ones. When a job is male-specific we never say "the field was primarily dominated by men", but if women were involved in even the slightest roles we bring up "women were involved", and for jobs primarily assigned to women we say women were more involved. This is an article on computer hardware, not a feminist propaganda article! The source Light, Jennifer S. (July 1999). "When Computers Were Women". Technology and Culture. 40: 455–483. comes from a feminist[citation needed] author rather than neutral research and is unreliable. Respected Person (talk) 10:16, 14 December 2018 (UTC)

See WP:BRD —The expectation is that we will Discuss changes to the article on this talk page. --Ancheta Wis   (talk | contribs) 16:04, 14 December 2018 (UTC)
Ms. Light is a well-credentialed historian and there is no evidence that she is a "feminist" author. I suggest we have, so to speak, a "woman bites dog" issue here. IMO the dominance of one gender among the early "computers" is worth including, as well as the other objected-to feminist "bias". Tom94022 (talk)
List of pioneers in computer science had a similar problem. Bubba73 You talkin' to me? 18:04, 14 December 2018 (UTC)
I'm learning that the list is roughly alphabetical by surname. But it's not rigorously alphabetized. Might you object if we editors added aliases to the list: for example 73, Bubba; or Lovelace, Ada; or Post, Emily? I tried the experiment and it doesn't seem to break that list. That way we could improve the alphabetic sort in that list, one pioneer at a time. --Ancheta Wis   (talk | contribs) 10:35, 15 December 2018 (UTC)

Hardwired?

The article has "...but the 'program' was hard wired right into the set up, usually in a patch panel". Is it correct to call a patch panel (plug board) hard-wired, since it is easily changed? See this dictionary. Bubba73 You talkin' to me? 03:11, 24 December 2018 (UTC)

It's a relatively more difficult way to 'program'. ENIAC was set up to solve equations by directing the results from one bank of operations to the next. --Ancheta Wis   (talk | contribs) 04:07, 24 December 2018 (UTC)
It is a lot more difficult, but it is really not "hard wired". Bubba73 You talkin' to me? 04:09, 24 December 2018 (UTC)
Certainly it was an evolution; the computations on punched cards also were directed by moving cables on plug boards, directing data from one unit to the next in this way. It's in the article. If you wish to rephrase this, feel free. But electronic computation did not spring forth fully formed. The process is still evolving, with optical components as the next phase.
In the same sense, FPGAs are another waypoint on the spectrum of 'wiring', relatively less 'hard-coded' than other circuits and more 'hard-wired' than applications software.
Feel free to improve the text. --Ancheta Wis   (talk | contribs) 04:33, 24 December 2018 (UTC)
One plug may be easily changed, but programming took weeks (ENIAC#Programming, at least till 1948 ENIAC#Improvements).
Plug boards required physical connections, so they may be considered "hard wired" (in the "directly connected", "connected by cables" and "controlled by hardware" senses: American, [1], [2]). --MarMi wiki (talk) 23:50, 29 December 2018 (UTC)