Talk:Word (computing)
Article title
This article has many possible titles, including word, computer word, memory word, data word, instruction word, word size, word length, etc. I believe the general form should be used since none of the specific forms is well established. Of course "word" has to be qualified to disambiguate from non-computing uses. I'm not particularly happy with "(computer science)", though. Perhaps "Word (computing)" would be the best choice overall. -R. S. Shaw 05:00, 6 June 2006 (UTC)
- Word is the best name, but I agree that (computing) is the better choice. "computer word", "memory word", and so on can redirect here. --Apantomimehorse 15:55, 30 August 2006 (UTC)
- I've moved this from Word (computer science) to Word (computing); I'll fix up the double redirects now. -R. S. Shaw 01:07, 19 January 2007 (UTC)
- I noticed that the 64-bit architecture page has a side panel which allows you to click on a 16-bit data size link, which links directly to this page. I think that's anathema to the idea that word size is defined by the architecture. There should be a page that redirects from 16-bit data size to Word (computer science). That way, the link from 64-bit architecture does not give the web surfer the wrong impression that 16-bit data size is always equal to word size. -- anonymous jan 18 2007 2:58 pm est —The preceding unsigned comment was added by 24.186.76.45 (talk) 19:56, 18 January 2007 (UTC).
- Preceding comment copied to Template talk:N-bit since that template is the source of the table of links in question. -R. S. Shaw 05:29, 19 January 2007 (UTC)
Merge Articles
I can't see why not to do this. It would be silly to have a different article for every word length. Merge them all into the same article. I don't mind under what category you qualify word; it seems like a bit of an arbitrary choice anyway. --FearedInLasVegas 22:39, 28 July 2006 (UTC)
- Agreed for merge. Redirect "dword" et al. here. --Apantomimehorse 15:56, 30 August 2006 (UTC)
OK, I did the merge, but it's fairly unintegrated. Anyone feel free to smooth it out...Lisamh 15:46, 21 September 2006 (UTC)
Causality backwords?
It's my understanding that the concept of "word" size is a nebulous one and not very well defined on some systems. Should the article make this clearer? The way the article makes it sound, computer designers pick a word size and then base other choices around it. While there are good reasons to use the same number of bits for registers as you do for bus widths and so on, I just don't think this is the reality any more. The x86/x64 architectures, for instance, are a mess of size choices. So what I'm proposing is a reversal of emphasis by saying: 'computers have certain bus widths and register sizes, etc., and the size of a "word" is the number of bits most common to them; this is not always a straightforward determination'. --Apantomimehorse 16:04, 30 August 2006 (UTC)
- The article shouldn't imply (if it does) that a word size is chosen out of the blue. It isn't. It's chosen exactly because it and its multiples are judged a good compromise for all the detailed design choices - memory bus width, numerical characteristics, addressing capability of instructions, etc., etc.
- A word size is rarely undefined, but may be a somewhat arbitrary choice out of the several sizes that belong to an architecture's "size family". The article does discuss the concept of a family of sizes. A modern, complicated computer uses many different sizes, but most of them are members of the small family of tightly related sizes. In the x86 case, it's clear that the word size was 16 bits originally, and that the extension to primarily-32-bit implementations extended the size family to include 32 as well as 16 and 8. As the article mentions, the choice of one of the members of the family as the single "word" size is somewhat arbitrary, and in the x86 case has been decided by historical precedent. The word size for x86 is straightforward: it is defined as 16 bits, even on an Opteron. On a new design, the definition would be chosen to be closer to what is thought to be the most important size, e.g. 64 bits (but the new design would also have much less use for other members of its size family). -R. S. Shaw 02:57, 31 August 2006 (UTC)
- I'll take your word [<--pun!] for it that a word is still a meaningful concept and that selection thereof is important in design (I think learning x86 as your first processor rots your brains), but I still feel the article gives a strange impression about which defines which: does the word size determine the sizes of the things that gel with it, or vice versa? I see you changed 'influences' to 'reflected in', and that's a good start. I'm not happy, though, with the phrase 'natural size', as it's a bit mysterious to anyone who might wonder what in the world is natural about computers. By 'natural', is it meant 'best for performance'? Or 'convenient for programmers'? I understand this is addressed down in the article, but I think the opener needs something about the importance of word size other than just saying it's important. I'd address these issues myself, but I'm not qualified to do so.
- (Heh, just realized I misspelled 'backwards' in the subject heading. Let's just pretend that was a pun.) --Apantomimehorse 22:33, 31 August 2006 (UTC)
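An illustration of the "family of sizes" point discussed above: several related widths coexist on one architecture, and which of them gets called the "word" is partly convention. The following minimal C sketch just prints the widths one compiler gives the basic types; the values in the comments are assumptions for a typical LP64 compiler on an x86-64 machine and will differ under other compilers and ABIs.

 #include <stdio.h>

 int main(void)
 {
     /* Members of the x86 size family as seen through one compiler's types. */
     printf("char    : %zu bits\n", 8 * sizeof(char));    /* 8 */
     printf("short   : %zu bits\n", 8 * sizeof(short));   /* 16 - the historical x86 "word" */
     printf("int     : %zu bits\n", 8 * sizeof(int));     /* 32 */
     printf("long    : %zu bits\n", 8 * sizeof(long));    /* 64 on LP64, but 32 on LLP64 systems */
     printf("pointer : %zu bits\n", 8 * sizeof(void *));  /* 64 - the width many would call the "machine word" */
     return 0;
 }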
[edit] "Word" and "Machine word"
"Word" seems quite ambiguous in modern usage, since people very often mean it as 16-bits rather than the actual word size of the processor (likely now to be 32 or 64 bits). There's Intel and Microsoft who, for reasons of backwards compatability, use "word" to mean a fixed size of 16-bits, just as "byte" is used to mean a fixed size of 8-bits. Although in formal contexts, "word" still means the basic native size of integers and pointers in the processor, a huge number of people believe that "word" just means "16-bits". Would it be useful to mention the phrase "machine word" as being a less ambiguous way to refer to that native size, and try to avoid the danger of the fixed 16-bit meaning? JohnBSmall 14:10, 8 October 2006 (UTC)
- Word has always been ambiguous; if it is not to be vague, it requires knowledge of the context: the particular machine. A huge number of people only have knowledge of x86 machines, so until their horizons extend they naturally think of a word as 16 bits. The "native size" of an undefined machine is intrinsically vague and/or ambiguous, and that is fine for many uses where a precise size isn't needed for the discussion to make sense. Modern computers use many different sizes, and designating one of those sizes the "word" size is rather arbitrary. Using "machine word" doesn't really clarify anything, since it is no more or less ambiguous than just "word"; you need to know the particular context to pin down the meaning. Additionally, if the context is the x86 series of machines, "machine word" is more ambiguous than "word", since the latter is known to be 16 bits by definition (a good convention, IMO), but the former leaves open questions as to what is meant. Even if you know the context is, say, an AMD64 machine, you would mislead many people if you used "machine word", since different people would take you to mean 16 bits, or 64 bits, or even 32 bits since most operands in use on such a machine are probably 32 bits. -R. S. Shaw 03:00, 9 October 2006 (UTC)
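A small, hedged illustration of the two usages discussed here: the fixed-size Intel/Microsoft convention versus the native width. The typedefs below merely mirror that convention for the sake of a self-contained example (the real Windows headers define WORD and DWORD to the same widths), and the pointer size printed at the end is only a rough stand-in for the "machine word" of whatever system the code runs on.

 #include <stdio.h>
 #include <stdint.h>

 typedef uint16_t WORD;   /* fixed at 16 bits by convention, on any processor */
 typedef uint32_t DWORD;  /* "double word": fixed at 32 bits */

 int main(void)
 {
     printf("WORD    : %zu bits\n", 8 * sizeof(WORD));   /* always 16 */
     printf("DWORD   : %zu bits\n", 8 * sizeof(DWORD));  /* always 32 */
     /* The native "machine word" is often approximated by the pointer
        or general-register width, e.g. 64 bits on an AMD64 system. */
     printf("pointer : %zu bits\n", 8 * sizeof(void *));
     return 0;
 }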
Minicomputers
There were a whole bunch of minicomputers before the VAX. I used Motorola, HP, and DEC. There were a few others in the building for "real time" computing and some pre-PC desktops.
I remember the CDC 6600 as having a 64-bit word and there were 128 characters available. The 128 characters do not match a 6-bit word, but rather a 7-bit word. You could run APL on it and it used a whole bunch of special characters.
I only read that part of the manual once, and there were a few "hall talk" discussions about the size. So, there may be a wet memory error there.
Ralph —The preceding unsigned comment was added by 67.173.69.180 (talk) 15:33, 6 January 2007 (UTC).