Talk:Number
blackboard bold
blackboard bold does not mesh well with a paragraph. Pizza Puzzle
Eh, I'm used to seeing blackboard bold; plain bold reminds me of a variable, not a set.
I agree that we should use it; however, the current font for it is too big and doesn't mesh with the rest of the page. Perhaps, if somebody submits a slightly smaller .png, we can use that. Pizza Puzzle
I plan to upload a whole bunch of .pngs like this later this month. Of course, feel free to beat me to it. ^_^ Still, we should prefer markup that renders most directly in HTML, when such a thing works -- for now. -- Toby Bartels 05:57 12 Jun 2003 (UTC)
Could somebody produce a little Venn diagram picture showing the various number sets? I removed this verbal description of the Venn diagram. AxelBoldt 15:23, 29 Sep 2003 (UTC)
This statement is incorrect:
"Ratios of integers are called rational numbers or fractions."
In fact, ratios of integers are fractions but NOT rational numbers. The union of the set of integers and the set of fractions equals the set of rational numbers. The distinction between integers and fractions is that no fewer than two integers (by ratio) are required to define a fraction.
I suspect this error has been carried over to a couple of other related pages. It must be corrected.
OmegaMan
- Rationals are usually defined as equivalence classes of ordered pairs of integers. Saying that they are ratios of integers is reasonable; this is only supposed to be an informal statement. A formal construction is given in the rational number article. --Zundark 08:17, 17 Nov 2003 (UTC)
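(For readers following the thread: a sketch of the standard construction Zundark mentions, with details in the rational number article. The notation is the usual textbook one, not anything specific to this page.)

```latex
% Rationals as equivalence classes of ordered pairs of integers (a, b), b \neq 0:
(a, b) \sim (c, d) \iff ad = bc, \qquad \frac{a}{b} := [(a, b)]_{\sim}
% e.g. (1, 2) \sim (2, 4), so 1/2 and 2/4 name the same rational number.
```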
Yes, but the formal construction you directed me to is for the rational numbers, NOT the fractions. The presentation of the various sets of numbers is more clearly understandable if it incorporates a brief summary of their methodical construction, one built onto the next.
Where the integers have already been defined separately, all trivial cases of fractions which equal integers should then be eliminated as redundant (i.e., those where ratios of two integers, converted to fractions, can be simplified such that the denominator is equal to one). Then, the rationals can be defined as the union of the two underlying subsets.
Note that integers require only one integer (obviously) to define themselves reflexively, which is not possible for fractions.
You may think I am splitting trivial hairs. Still, the distinction I am making is reality-based and relevant. I am not just making this stuff up as I go along. It came directly from a "theory of arithmetic" textbook I own.
OmegaMan
This article is missing the ordinal/cardinal distinction for finite numbers. Although the finite ordinals are the same as the finite cardinals, the use to which they are put is different: "I have five beads" vs. "this is door number 5", and so they are conceptually different, even if mathematically equivalent. Can someone help put this distinction in the article? -- The Anome 13:54, 27 Jan 2004 (UTC)
Umpteen
I've added umpteen in the "see also" list, largely to de-orphan it. I'm not absolutely sure this is the right article to link to it, but I can't think of an alternative. Suggestions would be more than welcome. DavidWBrooks 20:01, 16 Feb 2004 (UTC)
Natural Numbers and Zero
I have never known zero to be included in the set of natural numbers (a.k.a. counting numbers, hence the exclusion of zero, as one never counts the zeroth member of a set). Rather, it is the only non-natural member of the set of whole numbers. I hope someone will correct this, or at least address the question if I am in error.
Arnold Karr
Peano would be the man to ask, but he's no longer around. John H, Morgan 16:36, 2 June 2006 (UTC)
Peano, who formulated the axioms that define natural number, identified zero as a natural number. Because of this we should clearly differentiate 'natural' from 'counting' numbers in anything that is written about either of them. Counting numbers are identical with the positive integers, but the definition of integer relies on axioms of natural number, making a definition of "counting numbers" as "positive integers" somewhat circular. John H, Morgan 09:31, 12 February 2006 (UTC)
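(For reference, the axioms in question, stated in the zero-based form that modern set-theoretic treatments use; Peano's own formulations varied between starting at 0 and at 1.)

```latex
% Peano axioms with zero as the first natural number:
0 \in \mathbb{N}; \qquad
n \in \mathbb{N} \implies S(n) \in \mathbb{N}; \qquad
S(n) \neq 0; \qquad
S(m) = S(n) \implies m = n;
% plus induction:
\bigl(P(0) \land \forall n\,(P(n) \Rightarrow P(S(n)))\bigr) \implies \forall n\, P(n)
```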
- It's very common to include 0 as a natural number (so that "natural number" and "finite ordinal" mean the same thing). It's undesirable for some purposes, however, so not everyone does it. --Zundark 07:42, 23 May 2004 (UTC)
- I think it's very natural to have zero items of something (in contrast to having a negative amount of something). In fact, all of us own zero items of almost everything. It is at least as natural as the usual definition of (the number) zero as the empty set.
- Of course we do not count the zeroth member of a set, but when we count something, we start out with zero items counted, before adding the first to the inventory, if there is any. When somebody asks you to count the number of apples in your pocket, you would not protest saying "I cannot count them". You would maybe say "there are none", but this is just a synonym of zero. MFH 13:31, 7 Apr 2005 (UTC)
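(A minimal illustration of the point that a tally starts from nothing; a Python sketch with a hypothetical empty pocket.)

```python
# Counting is tallying: the running total starts at zero before
# anything is counted, so "there are none" is already a valid count.
pocket = []            # hypothetical: no apples in the pocket
count = 0
for apple in pocket:
    count += 1
print(count)           # prints 0
```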
- This is, however, original research. We can only report that this is a subject on which authorities disagree. Rick Norwood 17:51, 2 June 2006 (UTC)
Mixing Numbers & Biology
I advocate the total removal of the speculative "biological basis" section, regardless of whether or not it may be wholly or partially correct. We should stick to provable information in an article involving mathematics in an online encyclopedia. OmegaMan
extensions and generalizations
The section "generalizations" should be merged into "extensions" (which could receive subsectioning). I suggest to put it after the nonstandard stuff and before the comment on abstract algebra. Please feel free to do so. MFH 18:07, 7 Apr 2005 (UTC)
Numbers
As per the discussion here, I vote to improve the GUI for the number classification. --Electron Kid 01:19, 27 October 2005 (UTC)
- Improve it how? Saying "make this better, instead of worse" doesn't contribute much.
- You basically have two choices: One, you can specify what you think is inadequate now, and give ideas for how it might be improved. People might discuss that with you, if they feel like it. Or, you can be WP:BOLD and take your best shot. If we don't like it, we'll put it back. --Trovatore 03:23, 27 October 2005 (UTC)
Integral domains
I've removed this:
- Preserving the main ideas of "quantity" except for the total order, one can define numbers as elements of any integral domain. Unfortunately, the set of polynomials with integer coefficients forms an integral domain, and polynomials are not numbers.
I just don't buy that whether we consider something as a "number" or not has much to do with whether it's in an integral domain. The extended real numbers aren't an integral domain; the ordinal numbers aren't; the cardinal numbers aren't. Matrices in general don't form an integral domain, but nonsingular matrices do, and that doesn't make the latter numbers. --Trovatore 16:45, 27 October 2005 (UTC)
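(To make the matrix point concrete: two nonzero matrices can multiply to zero, so square matrices fail the no-zero-divisors axiom of an integral domain, while the nonsingular ones alone satisfy it without being "numbers".)

```latex
\begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}
\begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix}
=
\begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix}
```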
Attempted rewrite.
This is a subject I have thought about carefully for many years. I am going to attempt a rewrite, a little at a time. Rick Norwood 21:31, 5 December 2005 (UTC)
I have removed D from the introduction, but not from the charts, so it will be easy to restore if it has any defenders. All of the other sets here are well defined sets of numbers. D, defined as the set of numbers whose decimal representation terminates, is not a well defined set of numbers, because the set changes if the base changes. In base ten, the fraction 1/3 would not be in D while the fraction 1/5 would be. But if we write in base 12, then 1/3 is in D and 1/5 is not. Rick Norwood 22:29, 5 December 2005 (UTC)
- "Decimal", by definition, means base ten. The set is quite well-defined, just not very interesting. I have no objection to its removal. --Trovatore 22:37, 5 December 2005 (UTC)
- Good point. Rick Norwood 22:45, 5 December 2005 (UTC)
Hearing no objection to the removal of D as a not particularly interesting set, I will remove it. Rick Norwood 00:42, 11 December 2005 (UTC)
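(Rick's base-dependence observation is easy to check mechanically. A Python sketch, with a function name of my own choosing: a reduced fraction has a terminating expansion in a given base exactly when every prime factor of its denominator divides the base.)

```python
from fractions import Fraction
from math import gcd

def terminates(frac: Fraction, base: int) -> bool:
    # Strip from the denominator every factor it shares with the base;
    # the expansion terminates iff nothing is left over.
    d = frac.denominator
    g = gcd(d, base)
    while g > 1:
        d //= g
        g = gcd(d, base)
    return d == 1

print(terminates(Fraction(1, 5), 10))  # True:  1/5 = 0.2 in base ten
print(terminates(Fraction(1, 5), 12))  # False: 1/5 repeats in base twelve
print(terminates(Fraction(1, 3), 10))  # False: 1/3 = 0.333... in base ten
print(terminates(Fraction(1, 3), 12))  # True:  1/3 = 0.4 in base twelve
```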
The Number template
I think for the list of constants at the bottom of the template, for a symbol for a constant to provide useful information it should have a link. Would anyone like to undertake a stub, at least, for each of the constants? I would, but I am unfamiliar with many of them. Rick Norwood 01:07, 11 December 2005 (UTC)
Mathematical Collaboration Of The Week
So this article is now the collaboration. I can't think of much to do with it. How should this be improved? --Salix alba (talk) 09:19, 1 February 2006 (UTC)
- The article is now longer. I've added a few references.
- I still think, as I mentioned above, somebody needs to explain the strange numbers in the box. If nobody knows what they mean, we should delete them.
- It might be appropriate to list the digits from 0 to 9 in Arabic symbols.
- We could get into other ways of writing numbers but I think that belongs in the article on numerals.
Rick Norwood 21:01, 1 February 2006 (UTC)
- Good edits, Jon. Rick Norwood 19:20, 4 February 2006 (UTC)
- JA: Thanks. Jon Awbrey 19:32, 4 February 2006 (UTC)
Could anybody check this?
The article states that the symbol for integer comes from the German word "zahlen". Can anybody check if this is true and give a reference? I'm asking this because "Zahlen" (=numbers) seems to be more likely than "(be)zahlen" (=to pay). There are also other possibilities like "zählen" (=to count)... --Cyc 17:02, 4 February 2006 (UTC)
- JA: It should be capitalized, but German words frequently get decapitated in the process of being anglicized, for example, zeitgeist. A more specialized influence occurs when the group of integers and its many-splintered modulations are regarded as cyclic groups. Jon Awbrey 17:14, 4 February 2006 (UTC)
List of constants
Unless someone objects, I am going to remove from the list of constants all constants with red links. So, act quickly to save your favorite constant. Rick Norwood 16:09, 5 February 2006 (UTC)
- JA: Rick, while you're at it, could you remove the notation of P for primes from the inset box. This is highly non-standard and many more writers will use P as a nonce char for "Positive integers" than anybody does for "Primes". Jon Awbrey 16:16, 5 February 2006 (UTC)
Done.
I do not know how to fix the extra space between Further Generalizations and Extensions.
I think we need more references. Rick Norwood 13:33, 6 February 2006 (UTC)
Trivia
Pfafrich -- I notice that some people use rv for reversion and others use rw. Is there any difference that I should be aware of? Rick Norwood 13:17, 6 February 2006 (UTC)
- JA: I think you may be seeing rvv = rv2 = revert vandalism as rw. Jon Awbrey 17:08, 10 February 2006 (UTC)
Thanks. Rick Norwood 18:45, 10 February 2006 (UTC)
References & Bibliography
- JA: I'll be rummaging through my pack for some suitable stuffing as the day/week wears on. Jon Awbrey 14:32, 6 February 2006 (UTC)
long addition to the article
I really don't think this belongs here -- most of it is about numerals rather than numbers. Also, I'm told that bullet points are considered unencyclopedic. Rick Norwood 14:30, 7 February 2006 (UTC)
- Yes, I know it's long at the moment. I've done a big cut and paste from Timeline of mathematics to get most of it in. I'm sure it could be cut down considerably, but I thought it's best if a complete history is there as a starting point to work with.
- As to numerals, it's worth distinguishing material about place systems from specific numerals. Place systems are an important development. Much of the first section covers 0, -1 and the roots of our decimal system from Asia. --Salix alba (talk) 15:17, 7 February 2006 (UTC)
Non(-)negative integer & Positive integer are becoming standard
- JA: In places where the longstanding equivocality of the term "natural number" has become a pain-in-the-[insert your favorite anatomical location here] on a second-by-second basis, like Ninja Sloane's OEIS, it has become standard to end all the fussin'-&-fightin' by using the dab terms non(-)negative integer and positive integer. Those who miss the excitement can still go tete2tete over the hyphen in non(-)negative. Jon Awbrey 16:18, 12 February 2006 (UTC)
- Oh, were it so easy! The problem, both historically and pedagogically, is that the natural numbers come before the integers, so your definition defines the simpler concept in terms of the more advanced concept, which is, of course, then in turn defined in terms of the simpler concept. Rick Norwood 19:40, 12 February 2006 (UTC)
- JA: This is about a current usage that avoids ambiguity — and that's a good thing. It's not about some sourcerror's apprentice game of "Quintessentially Universally Ideally Definitive Definitions In Totalizing Conceptual Hierarchies" (QUIDDITCH). Jon Awbrey 22:42, 12 February 2006 (UTC)
- The problem is that there are two current usages. Each, individually, avoids ambiguity. As long as both are in use, ambiguity is inevitable. All, then, that is necessary to avoid ambiguity is for one side or the other to give in. That hasn't happened yet. Rick Norwood 22:16, 13 February 2006 (UTC)
Relatedness of items in history section
I noticed that much of the information in the history section has little to do with the content of this article. For example, is this really the page to mention the history of solutions to quadratic equations? I do think that there needs to be some discussion regarding the scope of this page before major changes are made. Grokmoo 18:15, 13 February 2006 (UTC)
- Yes, as pointed out above, the history section could do with a good copy edit. As to why quintics are important to the development of number: they are the lowest-degree polynomials whose solutions cannot always be expressed by radicals (algebraic expressions involving roots), hence opening the way to transcendental numbers. They were also an important step in the development of Galois theory, the main technique for proving the transcendence of pi and e. --Salix alba (talk) 22:17, 13 February 2006 (UTC)
- Yes, but perhaps these sorts of things belong in number theory. If not, then there should at least be some mention of how these things relate to the main topic. Grokmoo 02:10, 14 February 2006 (UTC)
- They definitely do not belong in number theory; that field is about properties of integers, especially primes. This article is about the different number systems and how they are related. --Salix alba (talk) 09:40, 14 February 2006 (UTC)
- Number theory is not only about integers, but also about algebraic and transcendental numbers (see algebraic number theory). Of course, it doesn't necessarily follow that the connection quintics → algebraic numbers should not be mentioned here. -- Jitse Niesen (talk) 11:30, 14 February 2006 (UTC)
Minor fix
square root of −1 to Macwiki
Imaginary unit
But actually, i is not the square root of −1, since negative numbers can't have a square root. In reality, it should be said thus: i² = −1, which is not the same thing. This is actually said in the article about the imaginary unit.
- The square root function is extended from the real numbers to the complex numbers, after which the principal square root of minus one is indeed i. Still, it might be better in the article to say i squared is minus one. Rick Norwood 14:16, 11 April 2006 (UTC)
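(For what it's worth, this is how the extended principal square root behaves in practice; a quick check with Python's standard cmath module.)

```python
import cmath

print(cmath.sqrt(-1))  # 1j: the principal square root of -1 is i
print(1j ** 2)         # (-1+0j): "i squared is minus one", the safer phrasing
```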
Origins of Number
Perhaps it would be worth adding some reference to theories about the origins of counting, such as that put forward in A. Seidenberg, 'The ritual origin of counting', Archive for History of Exact Sciences, Volume 2, Issue 1, Jan 1975, Pages 1 - 40. I don't have a copy of this paper lying around, so am not best placed to add the material. — Stumps 13:38, 15 February 2006 (UTC)
Ben Standeven's edit
Good work removing bullet points and shortening. I hope you, or somebody, tackles the rest of the bullet points.
In particular, what is already covered in Numeral should not be repeated here unless very briefly. Rick Norwood 13:59, 15 February 2006 (UTC)
problems with the introduction
Here is how the introduction to this important article read before my recent edit. Below, I mention some of the problems with this version.
"Originally the word number meant a count, or tally, of discrete objects. These days this application is referred to as cardinal number. Later use of the word came to include quantifying measurements of basic physical quantities like mass, length, area, volume and time.
- Mathematicians have extended the meaning of number to include abstractions such as negative numbers, transcendental numbers and the imaginary unit, also known as i. In common usage, one may find number symbols used as labels (telephone and highway numbers). They are also used together with, or instead of, alphabet letters to indicate ordering (serial numbers and letters). The latter usage is encapsulated in the notion of ordinal number — first, second, third, etc."
"Originally the word number..." The word "number" is of relatively recent origin -- it is the meaning of the concept, not the word, that is the subject of this article.
"...referred to as cardinal number". No, ordinals are also used for counts or tallies: "first, second, third".
"...later use of the word..." we do not know enough about the origins of numbers to say definitively that counting came earlier than measuring. This is an assumption.
"...used together with, or instead of, alphabet letters..." Rather, occasionally alphabet letters are used in serial numbers, but this is a minor point, not one necessary to cover in the article.
The article, especially the introduction, needs to concentrate on the concept number. There is already an article on numeral. Rick Norwood 14:57, 18 February 2006 (UTC)
Although I am pleased to note that somebody else has taken an interest in the intro to this article, I do think several points in Rick's edit need addressing.
The first point seems OK to me; I was trying to find a suitable opening sentence but ended up with something akin to a dictionary definition of a word :-(
Although ordinals are used as an aid to counting, it is not necessary to use them to obtain a count. Putting sets into one-to-one correspondence with a collection of standard sets will also allow one to obtain the cardinality, a process that is akin to tallying against marks made e.g. on paper with a pencil. Furthermore, mathies seem to agree that cardinals are more basic than ordinals, and yet Rick's edit has removed any reference to the former in the article.
I don't know enough archeology to challenge the third point, but it must surely be known which one comes first in the historical records. I doubt very much if Neanderthals thought about measuring before they got to count.
The extension provided by adding letters to numbers to create "alphanumerics" is very useful in categorisation. I have added this thought to the intro. Nominal data in statistics is often coded this way, and the use of alphanumerics even spills over into ordinal scale data utilised by statisticians.
Final point I more or less concur with, but the introduction needs to catch the reader's attention a little, too. John H, Morgan 19:32, 18 February 2006 (UTC)
There are monolithic monuments more ancient than any writing which has survived, which require both the ability to count and to measure. The earliest surviving evidence of number that I am aware of concerns counts. Sealed jars containing a number of pebbles exist. Apparently a caravan master carried the sealed jar along with his cargo, so that the merchant at the other end could break open the jar, count the pebbles, and make sure the caravan master had delivered the required number of items. These jars predate any discovered writing. But presumably measurement was also in use, if only in the form of laying off a distance using sticks of equal length. There exist very ancient 3-4-5 triangles. In any case, I'm sure we can work together to get the intro into good shape. Rick Norwood 00:49, 19 February 2006 (UTC)
Ben Standeven's edit
Good work removing bullet points. I'm going to change the present tense to the past in the material you added, though.
The difference between References and Bibliography is this, and it is essential. References are works actually used in writing the text, or works that verify statements in the text. Bibliography is much more general, and can include other works on the same subject and also related works, even if nothing in them appears in the article. Rick Norwood 23:44, 28 February 2006 (UTC)
Inadequate summary definition
A number is an abstract entity that represents a count or measurement.
...Yet, then it goes and talks about i.
I think this article needs a more comprehensive summary definition for a number.
I noticed this because I don't really know what the definition of "number" is, myself. So I can't really help, sorry. LogicalDash 02:16, 17 April 2006 (UTC)
- Well, the article does state that mathematics has extended the definition (i.e. a representation of count and measurement) to include negative, transcendental and imaginary numbers. Maybe we could say what these sets represent, though, something like "these extensions came about as the need to solve certain equations arose, hence they represent solutions to equations that no natural number is a solution of". Ugh, that's dodgy, maybe someone can formalise it a bit :)
- Anyway, the point is numbers are nothing but inventions we've created as we needed them. We started by inventing numbers to describe how many apples someone has (count) or how many steps to the apple tree (measurement). Eventually it became useful to invent negative numbers to solve simple equations (for example, to represent debt). No-one really has -3 apples, but it is useful to represent it as such. The same thing applies to transcendental numbers and imaginary numbers. --darkliighttalk 06:49, 17 April 2006 (UTC)
- My personal point of view is that mathematicians are now really really sorry they ever considered the complex numbers numbers; while the quaternions are now usually stripped of that title, it's kind of stuck for complex numbers.
- (Other mathematicians disagree, of course).
- If it helps at all, "number", without a qualifier, is now very rarely used by mathematicians to formally refer to anything that is not a real number; the situation is similar to the word "tea", which, without qualifier, refers to a beverage made from Camellia sinensis, but which can refer to other plants when an adjective is added to it, as in "herbal tea".
- I would thus agree with the summary definition as it now stands, with the caveat that mathematics is currently un-extending the definition of "number" at least back to complex numbers. RandomP 14:09, 25 September 2006 (UTC)
- RandomP, you are the first mathematician I have ever seen or heard state that (s)he didn't like to consider complex numbers as numbers. On the other hand, I've seen a number of expositions on the theme 'Complex numbers are as real as real ones' (and of course also explain this myself when teaching). I also have no idea where you got the idea that unqualified use of 'number' only 'very rarely' refers to anything but a real number - except, of course, if you are mainly considering beginner's undergraduate texts in e.g. calculus. My experience is that in texts written by people who are mathematicians in the first place, 'number' is not used unqualified; but the qualification is often placed early, and then covers the entire usage of the term. In a book where the scope is not as fixed (say, an introduction to 'real and complex analysis'), there may be reason to qualify each usage of the term. In an 'arithmetic number theory' context, just positive integers may do fine; while in a field like algebraic geometry, where it makes a tangible difference whether or not one works over an algebraically closed field, assuming all varieties to be complex whenever the converse is not explicitly stated may be more natural. JoergenB 16:44, 25 September 2006 (UTC)
- Writers who primarily are something else, e.g., physicists, sometimes seem to be less clear (and I must admit that sometimes so are algebraic geometers). I suspect that some physicists tend to consider numbers as more 'discovered' than 'invented', and that this makes a difference. From this point of view, and since complex numbers indeed seem to be indispensable to physics, they may glide from real to complex number without the reader noticing.
- What is true, on the other hand, is that 'numbers' are not so important to distinguish in much of modern mathematics, since it is now recognised that they form just a limited set of possible basic structures. In linear algebra, e.g., 'scalars' may be elements in any fixed field, not just R or C. JoergenB 16:44, 25 September 2006 (UTC)
- Sorry, should have been clearer on this: My claim might be less surprising if I rephrased it as: if the complex numbers were a new invention today, given the overwhelming connotations that people have with the term "number", another name would probably be chosen (consider "group" as another example). I don't think a mathematician talking to a lay audience would even consider saying "number" when they meant specifically "complex number", except in order to deliberately mislead their audience.
- But my basic point stands: a complex number is as much of an unqualified (non-mathematical-context) "number" as a free group is an unqualified (non-mathematical-context) "group": the connotations just don't match.
- I like the complex numbers as a field, particularly as a topological field. For me, the intuition of the unqualified term "number" is that it is something that is defined in analogy to a measuring process; that simply doesn't work for complex numbers, because of complex conjugation. Complex numbers require knowledge of an arbitrary convention to make sense, unless they happen to be real.
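(Spelling out the conjugation point: conjugation is a field automorphism of the complex numbers fixing the reals pointwise, so nothing intrinsic distinguishes i from -i; singling one of them out as "i" is exactly the arbitrary convention meant above.)

```latex
\overline{a + bi} = a - bi, \qquad
\overline{z + w} = \overline{z} + \overline{w}, \qquad
\overline{zw} = \overline{z}\,\overline{w}, \qquad
\overline{z} = z \iff z \in \mathbb{R}
```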
- Of course, this personal point of view should be kept out of the article.
- I would, however, argue vehemently (in other forums, obviously) against declaring the elements of any new mathematical structure (with a very few predictable exceptions) "numbers".
- (I believe that "number of" is commonly used in mathematical writing as a synonym for "the cardinality of the set of". This is unqualified use of the term "number", but refers to cardinal numbers. As for the "very rarely" thing, I'd like to take that back. I still think it's rare for unqualified use of "number" by non-physicists to refer to complex non-real numbers.).
- Of course complex numbers are "as real" as the real numbers. They're just not as numbery. ;-)
- RandomP 17:24, 25 September 2006 (UTC)
Redirects
Not sure where to put this, so I put it here. Many of the x (number) pages for 3-digit numbers simply redirect to the previous 100, i.e. abb (number) redirects to a00 (number). Is this really desirable? I think it's confusing. ~iNVERTED | Rob (Talk) 22:52, 26 August 2006 (UTC)
'Decimal number' definition
I tend to use the term 'decimal number' either for a real number which is possible to write in decimal form, i.e., a rational number with least denominator dividing some power of ten; or (slightly inappropriately) for an integer in base ten representation. In the first case, 1/4 is a decimal number, but 1/3 isn't. (This sense is semantically equivalent to decimal fraction, as defined in the decimal article. However, some people dislike the term 'decimal fraction', which they consider an oxymoron.) In the second case, I would call the base ten representation 28 of a number 'decimal', but not the base sixteen representation 1C of the same number. I try to refrain from this usage, however, if I don't have confidence in my listeners' ability to distinguish numbers on the one hand and their representations on the other.
In the real numbers section, 'decimal numeral' seems to be used with a rather different meaning. (At first I didn't note that the reference is just a redirect to decimal.) I would like to know if you've discussed this before, reached a consensus on this other use of the word, and decided to make an article out of it. Otherwise, I'd like to avoid the term in this context.
Please note that this may be rather confusing. I've several times encountered university students who considered 1/3 (but not 1/4) as a good example of an 'infinite number'; and this clearly came from confusing representations of numbers and the numbers themselves. JoergenB 17:17, 25 September 2006 (UTC)
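(JoergenB's first sense has a crisp characterisation, stated here for comparison: a fraction in lowest terms terminates in base ten exactly when its denominator divides a power of ten.)

```latex
\frac{p}{q} \text{ (lowest terms) terminates in base ten} \iff q = 2^{a}5^{b};
\qquad \tfrac{1}{4} = \tfrac{25}{100} = 0.25, \qquad \tfrac{1}{3} = 0.333\ldots
```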
- Don't want to be too nitpicky, but "decimal number" isn't used in the article. "Decimal numeral" is, though things get a bit sloppy, and "decimal" is too. I'd suggest removing instances of the latter, unless you have a better idea? RandomP 17:30, 25 September 2006 (UTC)
- I stand corrected! Sloppy reading... but 'decimal numeral' is not clearer to me.
- Yes, I'd like to delete the term; but it might be there for a reason. If so, I'd like to hear that reason first. JoergenB 17:49, 25 September 2006 (UTC)
- I like "decimal numeral" - a "string" (which might be infinite) of decimal digits, with an optional minus sign, a (non-optional) decimal point, which represents a unique real number (but several numerals might represent the same one).
- It appears to me to be very close to a lay idea (I'm treating this as an article for a non-mathematical audience, BTW, just to avoid confusion) of what a real number is.
- Of course there might be a better term, but I haven't heard it. (But I hear there are professional mathematicians on WP who might be able to suggest one, hint, hint)
- "decimal" without the "numeral", though - totally unacceptable to me.
- I'll give this a shot.
- RandomP 18:43, 25 September 2006 (UTC)
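(A worked instance of the parenthetical "several numerals might represent the same one": the geometric series behind a trailing string of 9s.)

```latex
0.4999\ldots = \frac{4}{10} + \sum_{k=2}^{\infty} \frac{9}{10^{k}}
             = \frac{4}{10} + \frac{9/100}{1 - 1/10}
             = \frac{4}{10} + \frac{1}{10} = 0.5
```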
- By the way, feel free to stop me at any point and call a time-out for discussion. The article should be acceptable to everyone.
- You ask several times for a reason behind previous editorial decisions, and whether there has been discussion: generally, if the discussion is still relevant to the article, it should be visible on the talk page (or its archives); however, you might want to try looking at the (much-vandalised, and somewhat long) history. RandomP 19:06, 25 September 2006 (UTC)
Replace "decimal numeral" by "decimal representation"?
I'm undecided about this, but I might like to replace "decimal numeral", which is a nonstandard term, by "decimal representation", which is a standard term that some think applies only to positive (or non-negative) real numbers. An alternative would be to use "signed decimal representation", or a similar neologism.
Opinions?
RandomP 19:14, 25 September 2006 (UTC)
I've checked a few things now... I notice that in the 'classical (in my POV)' book "Science awakening" by B.L. van der Waerden (translated by Arnold Dresden; second English edition; P. Noordhoff ltd; no print year or ISBN given), 'numeral' is used for the single symbols only, while compounds like 'notation for numbers' and 'number systems' are used for systematic ways of representing numbers by means of symbols. E.g., the subsection "Hindu numerals" starts:
- Where do our Arabic numerals 0 1 2 3 4 5 6 7 8 9 come from? Which people invented our excellent decimal positional system?
Unless someone documents other, more modern usage, I'd be inclined to retain the limited van der Waerden usage of 'numerals'. JoergenB 09:46, 26 September 2006 (UTC)
Er, I'm not quite sure I understand: do you want to switch to van der Waerden's usage ('numerals' for single symbols only), or do you want to stay with the current usage (where 'numeral' refers to a string of digits, plus a decimal point and sign in some cases)?
If the latter, I think we've got all the support we need for that :-)
RandomP 20:55, 3 October 2006 (UTC)
- The former. I'd like to retain the older (v. d. W.) usage, which implies changing this article; unless someone presents support for what you call 'the current usage'. In other words, is this a current usage also outside wikipedia? JoergenB 15:31, 4 October 2006 (UTC)
[edit] "should" is not a good word for wikipedia
- Numbers should be distinguished from numerals, the symbols used to represent numbers
Really? Why? It's perfectly possible (if really, really awkward) to define a number to be (the mathematical representation of) a symbol used to represent that number. I think that one of the things that makes mathematics hard to understand for some people is that, ultimately, the mathematical definition of most number systems is in terms of strings of symbols to be formally manipulated according to certain rules. That's not the intuition I (and, I would think, most other mathematicians) have, but it might be a bit of a letdown for people first to be told that a number isn't really a string of symbols, only to be then taught otherwise by a moderately eccentric maths professor.
I'm not sure how to fix this. "can be distinguished"? "are distinguished"?
RandomP 19:24, 25 September 2006 (UTC)
wikipedic medieval zero
For the time being I would restrict my criticism to one paragraph of this text (History of zero), namely the one concerning the so-called medieval zeros. Wikipedia says: “When division produced zero as a remainder, nihil, also meaning nothing, was used.” The Latin word ‘nihil’, possibly sometimes abbreviated to N, does indeed mean ‘nothing’, but in early medieval Europe it never means ‘zero’, for nobody in that Europe knew the number zero. In Latin no word for zero existed, because the Romans did not know the numeral zero, let alone the number zero. Keep in mind that in early medieval Europe division was always repeated subtraction (there were no division algorithms). Where Beda Venerabilis explains dividing 198 by 30 (using in his calculations no other numerals than Roman ones), he says first that 6 times 30 makes 180 and then that there is a remainder of 18, or that 18 is left over. But he refrains from using the number zero to tell us which remainder one obtains when dividing 210 by 30; to answer this decisive question he simply says “nihil remanet” or the equivalent “non remanet aliquid” (see “De Temporum Ratione”), meaning “there is nothing left over” and “there is no remainder” respectively. So there is no reason at all to conclude that Bede meant ‘zero’ by his ‘nihil’. He and his great predecessor Dionysius Exiguus simply did not know the number zero, because nobody in early medieval Europe knew the number zero. Hence in early medieval Europe division produces either no remainder or a positive one, but never the number zero as a remainder.
In Faith Wallis’ standard work about “The Reckoning of Time” we find a modern version of Beda Venerabilis’ Easter cycle, with our modern numerals and with lunar epacts being 0 every nineteen years, and even with the year -1 (= the year 1 BC). But in Bede’s original manuscripts you will see no nonpositive numbers at all and find only ‘nullae’, meaning only ‘none’, or ‘nihil’, meaning only ‘nothing’, in the places where you would expect to find the number zero. Where Bede uses Roman numerals he never uses zero. And where he enumerates the Greek numerals he does not observe that there is no numeral zero among them.
Zero is a numeral (our tenth digit) and simultaneously a number (even the most important one). Therefore knowing the meaning of the term ‘nothing’ does not include knowing the meaning of the term ‘zero’ (if it did, Adam would be the inventor of zero). Knowing the number zero includes knowing how to use the numeral zero in one’s calculations. But Dionysius Exiguus and Beda Venerabilis and even Gerbert (who became pope Sylvester II in the year 999) could not possibly make use of the numeral zero in their computations, because in first-millennium Europe nobody knew the numeral zero, and no symbol (0 or something else) or word for this numeral existed. Conversely, they did not need the numeral zero at all, because either there were no algorithms available yet (DE around 520 and BV around 720) or one made do with simple algorithms in which the numeral zero played no part (Gerbert around 980); keep in mind these men were no mathematicians of consequence. The only mathematician of consequence in early medieval Europe was Boetius (around 500), but even in his writings we find no trace of zero. Inventing the number zero did not happen in Europe but in India. It was the great Indian mathematician Brahmagupta who (about the year 630) was the first not only to use the numeral zero in his calculations but also to make explicit the most important properties of the number zero. The number zero reached Europe only around the year 1200.
I submit that the wikipedic medieval zero rests only on the misunderstanding that zero amounts to ‘nothing’. So the wikipedic medieval zero is nothing indeed; it does not amount to anything, it is a dummy. It is neither the numeral zero nor the number zero; it is no zero at all, let alone a true zero. Inventing zero is much more by far than abbreviating ‘nothing’ to N. Consequently, there is no serious reason to maintain (and every serious reason to delete) the paragraph concerning the so-called medieval zeros. So I would propose to delete that paragraph or to replace it with a text compatible with reality. I hope to meet with approval, and to enter into consultation with everyone who wants to support me as well as with everyone who wants to resist. Away with the wikipedic medieval zero! :) Jan Z 12:48, 11 December 2006 (UTC)
- It is not very clear to me exactly what you are saying, but then again, the article's use of "true zero" isn't very clear either. One thing is clear, however: treating zero as a number does not at all depend on having a zero digit. The article is quite clearly talking about symbols representing zero/nothing in contexts where numerals were used. This may not be treating it as a number as much as Brahmagupta did, but I don't think it should be simply deleted. 15:37, 11 December 2006 (UTC)
- What DE, BV, Gerbert and everyone in Europe before about 1200 did was by no means "treating zero as a number", for their "zero" was only 'nothing', with which they didn't calculate at all. My principal complaint against the wikipedian medieval zero is that it suggests that 'zero' is identical with 'nothing' (if that were the case, Adam would be the inventor of zero). We have to distinguish between 'zero' and 'nothing'. And we have to distinguish between the numeral zero (we have ten numerals) and the number zero (we have infinitely many numbers). It is the standard of Wikipedia that matters. And yes, we have to delete the paragraph in question. Jan Z 03:19, 12 December 2006 (UTC)
- Rewriting it to be more accurate is usually more helpful than deleting. Our ten digits (we have infinitely many numerals) are completely irrelevant in this context, so your repeated mention of them (as "numerals") makes it very hard to understand your point. We clearly have to distinguish between the number zero and the digit 0, but it is you trying to connect them, not the article. JPD (talk) 14:39, 12 December 2006 (UTC)