Talk:Word (computing)

Article title

This article has many possible titles, including word, computer word, memory word, data word, instruction word, word size, word length, etc. I believe the general form should be used since none of the specific forms is well established. Of course "word" has to be qualified to disambiguate from non-computing uses. I'm not particularly happy with "(computer science)", though. Perhaps "Word (computing)" would be the best choice overall. -R. S. Shaw 05:00, 6 June 2006 (UTC)

Word is the best name, but I agree that (computing) is the better choice. "computer word", "memory word", and so on can redirect here. --Apantomimehorse 15:55, 30 August 2006 (UTC)
I've moved this from Word (computer science) to Word (computing); I'll fix up the double redirects now. -R. S. Shaw 01:07, 19 January 2007 (UTC)
I noticed that the 64-bit architecture page has a side panel which allows you to click on the 16-bit data size link, which links directly to this page. I think that's anathema to the idea that a word size is defined by the architecture. There should be a page that redirects from 16-bit data size to Word (computer science). This way, the link from 64-bit architecture does not give the web surfer the wrong impression that 16-bit data size is always equal to word size -- anonymous jan 18 2007 2:58 pm est —The preceding unsigned comment was added by 24.186.76.45 (talk) 19:56, 18 January 2007 (UTC).
Preceding comment copied to Template talk:N-bit since that template is the source of the table of links in question. -R. S. Shaw 05:29, 19 January 2007 (UTC)

Merge Articles

I can't see why not to do this. It would be silly to have a different article for every word length. Merge them all into the same article. I don't mind under what category you qualify word; it seems like a bit of an arbitrary choice anyway. --FearedInLasVegas 22:39, 28 July 2006 (UTC)

Agreed for merge. Redirect "dword" et al. here. --Apantomimehorse 15:56, 30 August 2006 (UTC)

OK, I did the merge, but it's fairly unintegrated. Anyone feel free to smooth it out...Lisamh 15:46, 21 September 2006 (UTC)

Causality backwords?

It's my understanding that the concept of "word" size is a nebulous one and not well defined on some systems. Should the article make this more clear? The way the article makes it sound, computer designers pick a word size and then base other choices around it. While there are good reasons to use the same number of bits for registers as you do for bus widths and so on, I just don't think this is the reality any more. The x86/x64 architectures, for instance, are a mess of size choices. So what I'm proposing is a reversal of emphasis by saying: "computers have certain bus widths and register sizes, etc., and the size of a 'word' is the number of bits most common to them; this is not always a straightforward determination". --Apantomimehorse 16:04, 30 August 2006 (UTC)

The article shouldn't imply (if it does) that a word size is chosen out of the blue. It isn't. It's chosen exactly because it and its multiples are judged a good compromise for all the detailed design choices - memory bus width, numerical characteristics, addressing capability of instructions, etc., etc.
A word size is rarely undefined, but may be a somewhat arbitrary choice out of the several sizes that belong to an architecture's "size family". The article does discuss the concept of a family of sizes. A modern, complicated computer uses many different sizes, but most of them are members of the small family of tightly-related sizes. In the x86 case, it's clear that the word size was 16 bits originally, and that the extension to primarily-32-bit implementations extended the size family to include 32 as well as 16 and 8. As the article mentions, the choice of one of the members of the family as the single "word" size is somewhat arbitrary, and in the x86 case has been decided by historical precedent. The word size for x86 is straightforward: it is defined as 16 bits, even on an Opteron. On a new design, the definition would be chosen to be closer to what is thought to be the most important size, e.g. 64 bits (but the new design would also have much less use for other members of its size family). -R. S. Shaw 02:57, 31 August 2006 (UTC)
I'll take your word [<--pun!] for it that a word is still a meaningful concept and that selection thereof is important in design (I think learning x86 as your first processor rots your brains), but I still feel the article gives a strange impression about which defines which: does the word size define the size of the things which gel with the word size, or vice versa? I see you changed 'influences' to 'reflected in', and that's a good start. I'm not happy, though, with the phrase 'natural size', as it's a bit mysterious to anyone who might wonder what in the world is natural about computers. Is 'natural' meant as 'best for performance'? Or 'convenient for programmers'? I understand this is addressed down in the article, but I think the opener needs something about the importance of word size other than just saying it's important. I'd address these issues myself, but I'm not qualified to do so.
(Heh, just realized I misspelled 'backwards' in the subject heading. Let's just pretend that was a pun.)--Apantomimehorse 22:33, 31 August 2006 (UTC)
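
As a concrete sketch of the "size family" idea described above (an illustration added for clarity, not a quote from the article): the following C fragment models how one 32-bit x86 register overlays its 16-bit and 8-bit sub-registers, the way EAX contains AX, which contains AH and AL. It assumes a little-endian host, as x86 is.

    /* Hypothetical model of one x86 register and its sub-registers.
       The union members overlap in memory, just as AX is the low half
       of EAX and AL/AH are the low/high bytes of AX. */
    #include <stdint.h>
    #include <stdio.h>

    union x86_reg {
        uint32_t e;    /* full 32-bit register ("EAX") */
        uint16_t x;    /* low 16 bits ("AX") */
        uint8_t  b[2]; /* on little-endian x86: b[0] = "AL", b[1] = "AH" */
    };

    int main(void) {
        union x86_reg r;
        r.e = 0x12345678u;
        printf("EAX=%08x AX=%04x AH=%02x AL=%02x\n",
               (unsigned)r.e, (unsigned)r.x, (unsigned)r.b[1], (unsigned)r.b[0]);
        /* prints: EAX=12345678 AX=5678 AH=56 AL=78 */
        return 0;
    }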

[edit] "Word" and "Machine word"

"Word" seems quite ambiguous in modern usage, since people very often mean it as 16-bits rather than the actual word size of the processor (likely now to be 32 or 64 bits). There's Intel and Microsoft who, for reasons of backwards compatability, use "word" to mean a fixed size of 16-bits, just as "byte" is used to mean a fixed size of 8-bits. Although in formal contexts, "word" still means the basic native size of integers and pointers in the processor, a huge number of people believe that "word" just means "16-bits". Would it be useful to mention the phrase "machine word" as being a less ambiguous way to refer to that native size, and try to avoid the danger of the fixed 16-bit meaning? JohnBSmall 14:10, 8 October 2006 (UTC)

Word has always been ambiguous, since, if it is not to be vague, it requires knowledge of the context: the particular machine. A huge number of people only have knowledge of x86 machines, so until their horizons extend they naturally think of a word as 16 bits. The "native size" of an undefined machine is intrinsically vague and/or ambiguous, and that is fine for many uses where a precise size isn't needed for the discussion to make sense. Modern computers use many different sizes, and designating one of those sizes the "word" size is rather arbitrary. Using "machine word" doesn't really clarify anything, since it is no more or less ambiguous than just "word"; you need to know the particular context to pin down the meaning. Additionally, if the context is the x86 series of machines, "machine word" is more ambiguous than "word", since the latter is known to be 16 bits by definition (a good convention, IMO), but the former leaves open questions as to what is meant. Even if you know the context is, say, an AMD64 machine, you would mislead many people if you used "machine word", since different people would take you to mean 16b, or 64b, or even 32b since most operands in use on such a machine are probably 32b. -R. S. Shaw 03:00, 9 October 2006 (UTC)
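
To make the two usages being contrasted here concrete, a short C sketch (the typedef names follow the Windows API style, but this is an illustration of the fixed-size convention, not a quote from any actual header):

    #include <stdint.h>

    /* The fixed-size convention: "word" is 16 bits on every machine,
       regardless of the processor's native size. */
    typedef uint8_t  BYTE;   /* always 8 bits */
    typedef uint16_t WORD;   /* always 16 bits, even on a 64-bit CPU */
    typedef uint32_t DWORD;  /* "double word": 32 bits */
    typedef uint64_t QWORD;  /* "quadruple word": 64 bits */

    /* The "machine word" usage: whatever the target's native size is.
       A pointer-sized integer is one common stand-in. */
    typedef uintptr_t MACHINE_WORD;  /* 32 or 64 bits, target-dependent */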

Minicomputers

There were a whole bunch of minicomputers before the VAX. I used Motorola, HP, and DEC. There were a few others in the building for "real time" computing and some pre-PC desktops.

I remember the CDC 6600 as having a 64-bit word and there were 128 characters available. The 128 characters do not match a 6-bit character code (2^6 = 64), but rather a 7-bit one (2^7 = 128). You could run APL on it and it used a whole bunch of special characters.

I only read that part of the manual once, and there were a few "hall talk" discussions about the size. So, there may be a wet memory error there.

Ralph —The preceding unsigned comment was added by 67.173.69.180 (talk) 15:33, 6 January 2007 (UTC).

tword

Is "tword" worth mentioning? It is used, at least, in nasm as a ten-byte field. I found a few other references to it (for example, GoAsm). I was looking because the nasm documentation didn't really say how big it is. --Ishi Gustaedr 17:56, 20 July 2007 (UTC)

I don't think it belongs here, as it's not a generally recognized term. It seems pretty much to be a convention internal to nasm and a few other tools. -R. S. Shaw 03:39, 21 July 2007 (UTC)
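
A side note that may help anyone else who goes looking (an illustration, not from the nasm documentation): the usual occupant of a ten-byte field on x86 is the x87 extended-precision floating-point format, which is presumably why nasm names that size at all. A quick C probe, assuming a gcc-style compiler on x86 or x86-64 where long double maps to the 80-bit x87 format:

    /* Probe the x87 extended-precision type that a ten-byte "tword"
       typically holds. sizeof often reports 12 or 16 rather than 10,
       because compilers pad the type for alignment; only 10 bytes
       are significant. */
    #include <stdio.h>
    #include <float.h>

    int main(void) {
        printf("sizeof(long double) = %zu\n", sizeof(long double));
        printf("LDBL_MANT_DIG       = %d\n", LDBL_MANT_DIG); /* 64 for the x87 format */
        return 0;
    }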

Variable word architectures requires a fix and the article needs to be improved

the Model II reduced this to 6 cycles, but reduced the fetch times to 4 cycles if one or 1 cycle if both address fields were not needed by the instruction

Suggestions:

1. The article needs to differentiate more clearly between the general meaning of word and Intel's definition. Generally, word should mean the maximum amount of data which could be stored in a register, transferred through the memory bus in one cycle, etc. (their width). Now Intel (or x86 programmers) have coined a new word definition, which is 16 bits. This is only one of the possible word sizes, just as byte can mean something other than 8 bits. The IA-32 (since the 80386) natural word size is 32 bits, because that amount of data is the maximum register capacity and is the size with which the system operates. The 16-bit word is only part of the register. As well, the 16-bit word can itself be divided into high and low parts, and all of these together make one 32-bit register. The same goes for x86-64, but not the latest IA-64. Furthermore, the C language standards ANSI C and C99 define that the int size is equal to the size of the word on the particular system. On IA-32 it is 32 bits.

2.

and is still said to be 16 bits, despite the fact that they may in actuality (and especially when the default operand size is 32-bit) operate more like a machine with a 32 bit word size. Similarly in the newer x86-64 architecture, a "word" is still 16 bits, although 64-bit ("quadruple word") operands may be more common.

Seems like too much uncertainty ("may", "more like") —Preceding unsigned comment added by 193.219.93.218 (talk) 13:49, 11 June 2008 (UTC)

1. As I see it, Intel didn't really coin any new definition of "word". They defined a word as 16 bits when they came out with the 8086. They continued with that convention with the upward-compatible 80186 and 80286 (even though the latter had many 32-bit operations). The '386 was also upward-compatible, and initially most programs were using 16-bit operands (indeed, could run on an 8086 or 8088), so of course they continued to have the term 'word' mean 16 bits. To do otherwise would have only created completely unnecessary confusion. This is not unusual; the term 'word' is vague except when applied to a particular machine, where it usually has a very particular value. This usually has to be done by definition for the machine, because any modern machine uses different sizes in different contexts; it is not a simple maximum register size. The x86-64, for instance, has some registers bigger than 64 bits, e.g. 128 bits. The "Size families" section of the article talks about this.
2. The prose seemed to reflect the situation, which is not really uncertain but complicated and conditional - a "32-bit" 386 can be running 8086 code and not operating like a native "32-bit word machine" in most respects. But I have made some revisions to it which may have improved things. -R. S. Shaw (talk) 19:40, 11 June 2008 (UTC)
Thanks. I usually try to contact the maintainer first. The main reason I question this word issue is that the definition of word crosses with C's definition of int. Int is defined as a word on a particular system. Because of this I came here to verify my assumption that int would be 64-bit on x86-64. But according to this article int should be 16-bit. And we know that the int size on 32-bit systems is 32 bits. Btw, the memory bus size is 32/64 bits, and even if we operate with only ax, bx, cx, dx, si and di, data is sent from eax, ebx, ... with leading zeros appended. Could this word term be defined in some IEEE-released terminology standard? 193.219.93.218 (talk) 03:47, 12 June 2008 (UTC)
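
For what it's worth, the int question is easy to check empirically. A minimal C program, assuming an LP64 compiler on x86-64 (e.g. gcc on Linux); other compilers and ABIs can legitimately differ, which is rather the point:

    /* Type sizes on a typical LP64 x86-64 system (illustrative only;
       the results are compiler- and ABI-dependent). */
    #include <stdio.h>

    int main(void) {
        printf("short : %zu bits\n", sizeof(short)  * 8); /* 16: the nominal x86 "word" */
        printf("int   : %zu bits\n", sizeof(int)    * 8); /* 32, not 64, under LP64 */
        printf("long  : %zu bits\n", sizeof(long)   * 8); /* 64 under LP64 */
        printf("void* : %zu bits\n", sizeof(void *) * 8); /* 64: closest to a "machine word" */
        return 0;
    }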