Talk:CIE 1931 color space
From Wikipedia, the free encyclopedia
Please add new topics at the bottom
Luminance
Can someone explain where the relation Y = 0.2125R + 0.7154G + 0.0721B comes from? The matrix equation says Y = (1/0.17697)(0.17697R + 0.81240G + 0.01063B). Am I missing something?-- 19:00, 26 July 2006 MM
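One possible resolution, sketched below under the assumption that the first formula uses Rec. 709 / sRGB luma coefficients while the second uses the CIE RGB primaries: the two sets of weights belong to different RGB spaces, so they need not agree.

```python
# Assumption: the first coefficient set is Rec. 709 / sRGB luma, the
# second row is from the CIE RGB -> XYZ matrix quoted above. "R, G, B"
# mean different primaries in the two formulas, so the numbers differ.
def y_rec709(r, g, b):
    # Coefficients sum to ~1, so white (1, 1, 1) has luminance ~1.
    return 0.2125 * r + 0.7154 * g + 0.0721 * b

def y_cie_rgb(r, g, b):
    # The raw row 0.17697 + 0.81240 + 0.01063 also sums to ~1;
    # the historical 1/0.17697 factor just rescales the matrix.
    return (0.17697 * r + 0.81240 * g + 0.01063 * b) / 0.17697
```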
Grassmann spelling
If you look at the biography of Grassmann, you'll see it mentions his color work. The law comes, I believe, from his article "Theory of compound colors", Philosophical Magazine 4 (7), 1854, 254-264. You can cross-check in a few library catalogues under, e.g. Sources of color science, ed. David L. MacAdam, MIT Press [1970]. --Macrakis 23:35, 11 July 2005 (UTC)
Why is the bottom flat?
The outer contour of the CIE figure lists wavelengths (from 380 to 700 nanometers), but what are the wavelengths of the colors along the straight line from 380 to 700?
How do they fit in, and why is that line straight?
- The colors along the straight line at the bottom are called "non-spectral colors". They are colors that no monochromatic beam can match; they must be made up of a combination of two or more monochromatic beams. The outer contour is called the "spectral locus" and contains all of the colors that a monochromatic beam can match. To be a physically possible light beam, any beam must be made up of one or more monochromatic beams (or an infinite number, for a continuous spectrum).
- A second fact to note is that if you pick two points inside the range of possible colors (the gamut), every point on a straight line between the two must also lie inside the gamut. That means that every point between the two far ends of the spectral locus must lie inside the gamut. However, if a point were to lie outside the line connecting the two ends, then you would be unable to choose two or more monochromatic beams to specify that point. That means such a point is impossible. PAR 15:31, 6 September 2006 (UTC)
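The linearity argument can be checked numerically. A minimal sketch (arbitrary example stimuli, not measured data) showing that the chromaticity of an additive mixture lies on the straight line between the chromaticities of its components:

```python
def chromaticity(X, Y, Z):
    """Project a tristimulus value onto the xy chromaticity plane."""
    s = X + Y + Z
    return (X / s, Y / s)

# Two arbitrary example stimuli and their additive mixture.
A = (1.0, 2.0, 0.5)
B = (0.3, 0.1, 0.9)
M = tuple(a + b for a, b in zip(A, B))

xa, ya = chromaticity(*A)
xb, yb = chromaticity(*B)
xm, ym = chromaticity(*M)

# Collinearity check: cross product of (B - A) and (M - A) is ~0,
# i.e. the mixture's chromaticity lies on the segment from A to B.
cross = (xb - xa) * (ym - ya) - (yb - ya) * (xm - xa)
```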
- This is probably a silly question, but isn't purple a part of the spectrum? 86.5.107.158 03:08, 20 September 2006 (UTC)
- The straight line mentioned above is sometimes called the "line of purples". If we define any color on the straight line as a type of purple, then, no, there is no wavelength that corresponds to purple. The sensation of purple can only be realized by using a combination of monochromatic light sources. PAR 03:29, 20 September 2006 (UTC)
mistake
The CIE XYZ color space was deliberately designed so that the Y parameter was a measure of the brightness of a color.
Should not be "Z parameter"?
- No, it's the Y parameter. PAR 15:03, 6 September 2006 (UTC)
- Because the eye's brightness sensitivity peaks at medium wavelengths, and Y represents this part of the spectrum, it's the best choice. —Preceding unsigned comment added by 81.12.9.2 (talk) 20:31, 4 November 2007 (UTC)
chromaticity diagram
Is the xy chromaticity diagram on page 2 plotted at a fixed brightness? Could one display any number of such diagrams with different brightness values, or is this diagram always plotted at some special brightness?
- If you click on the diagram, it will take you to the diagram page. There is an explanation there of how the diagram was generated. PAR 01:54, 2 June 2006 (UTC)
normalization
Should
be instead something like
? As far as I understand, and V are separately and empirically determined up to a scalar constant, so it doesn't make much sense to set one equal to the other. Jcreed 04:32, 29 September 2006 (UTC)
- That certainly looks right. I have changed it, and will check that it is in fact true. PAR 00:25, 20 October 2006 (UTC)
Purples outside the gamut of Wright and Guild's primaries ?
I think I understand how the blue-green out-of-gamut part of the XYZ color space can be derived from the Wright and Guild color matching functions (using negative values for red) but how does that work for the purples?
- I believe you would use negative values of both red and blue. PAR 00:25, 20 October 2006 (UTC)
- That's wrong. Make it negative values of green. PAR 00:29, 20 October 2006 (UTC)
- Why aren't there negative values of green in the color matching functions as shown in the article? Those matching functions go down to 380nm, where they're outside of the Wright and Guild primaries' gamut.
Red response peak
"The 700 nm wavelength, which in 1931 was difficult to reproduce as a monochromatic beam, was chosen because it is at the peak of the eye's red response..."
If the supplied diagrams are correct, then the statement that 700 nm is at the peak of red response must be wrong. Maybe the authors meant "600 nm"? This is much closer to the peak.--WhAt 18:07, 26 December 2006 (UTC)
- Yes, that is an error, and I will fix it. To quote from "How the CIE 1931 Color-Matching Functions Were Derived from Wright–Guild Data" by Hugh S. Fairman, Michael H. Brill, and Henry Hemmendinger:
"Guild approached the problem from the viewpoint of a standardization engineer. In his mind, the adopted primaries had to be producible with national-standardizing-laboratory accuracy. The first two wavelengths were mercury excitation lines, and the last named wavelength occurred at a location in the human vision system where the hue of spectral lights was unchanging with wavelength. Slight inaccuracy in production of the wavelength of this spectral primary in a visual colorimeter, it was reasoned, would introduce no error at all."
- PAR 21:31, 26 December 2006 (UTC)
The Structure of The Article needs to be re-ordered?
Does the structure of the article need to be re-ordered? The X, Y, Z come from nowhere and are then said to be the basis "on which many other color spaces are defined", but then it says XYZ is derived from RGB. So which one is more fundamental? The XYZ space or the RGB?
Certainly the RGB is more fundamental! They wanted to eliminate the NEGATIVE values of the red component, so a linear transform was taken, after which the XYZ space was formed. Therefore the article needs to talk about RGB first, followed by XYZ. After this, the normalization may be discussed!
The order of the article is totally upside-down!!!!
--Puekai 15:19, 19 Jan 2007(UTC)
- The RGB data is the statement of the direct measurements, yes. But the XYZ is just a reformulation of those results. Both carry the same content; it's just that the RGB are the result of direct measurements, while XYZ is derived from those results. I don't think RGB is more fundamental, it is just closer to the actual experiments.
- The article is named "CIE 1931 color space" and is about the XYZ space. Someone who wants to learn about the CIE 1931 color space should not have to wade through two or three sections about RGB space first. That's only for people interested in the foundations upon which the CIE space is built. PAR 15:35, 19 January 2007 (UTC)
- I have to disagree with you both. RGB is NOT more fundamental, and is NOT a result of direct measurements. There do not exist sets of spectral sensitivity curves that you can use in a sensor to directly measure RGB, precisely because of those negative regions in the RGB color matching function, which would mean your sensor would have to have negative sensitivities to some wavelengths. That's why you need to understand this stuff. But I'll review the article, because if it really does say that "XYZ is derived from RGB" then that needs to be fixed, since it's not true. Dicklyon 16:21, 19 January 2007 (UTC)
- Well, maybe it's right what it says about the history. The CIE RGB space may have been defined first, in terms of monochromatic primaries. However, that historical order doesn't make it more fundamental. Both spaces are based on the same experimental data, but with different goals for the properties of the color matching functions. I've ordered the referenced book so I can understand the history better. Dicklyon 16:31, 19 January 2007 (UTC)
- Yes, it needs to be clear that the "RGB" space here is not what is commonly called "RGB" space, it is a very particular set of measurements, and should probably be referred to as the "CIE RGB" space, or "Wright-Guild RGB" space, or something. PAR 16:46, 19 January 2007 (UTC)
- Indeed it is clearly identified as CIE RGB in the article. It's not a set of measurements, though, any more than CIE XYZ is; less, actually, since you can actually make filters and detectors that CAN measure X, Y, and Z, but you CANNOT for R, G, and B. Dicklyon 19:07, 19 January 2007 (UTC)
- Are you sure about this? It seems strange to me that you can measure X, Y, and Z, since of the three, only Y has a definition that is more or less physical, and it even depends on the experimental Luminosity function. I would say it's far more likely that the filters and detectors measure according to other values (possibly the way we do), and then transform them to X, Y, Z using whatever mathematical formulas are needed. In this case, making "CIE RGB detectors" merely involves doing one more mathematical transformation.
- What I meant was that you can't measure R, G, and B by putting filters in front of photodetectors, since the sensitivity curves are negative at some wavelengths, and filter/detector systems can only have non-negative sensitivities at all wavelengths. The X, Y, and Z basis functions were specifically defined to be nonnegative, so that ALL chromaticities are inside the XYZ color triangle. You are correct that ANY set of sensitivities that span the same subspace of spectral space can be used and transformed by a 3x3 matrix multiply to XYZ or RGB. Dicklyon 04:33, 1 November 2007 (UTC)
- I also feel a bit uneasy about the statement that CIE RGB is not a result of measurements, since from what I understand, it was defined on the basis of these experiments (and to obtain negative values, they added the primary color to the test color). I'm also not sure about CIE XYZ not being derived from CIE RGB, since from what I understand, they first had a good look at their model of human vision in CIE RGB, then they decided they wanted to transform it linearly to something nicer, and it ended up being CIE XYZ. Ratfox 00:14, 1 November 2007 (UTC)
- If I understand your conjecture, it's that the RGB primaries are the same as what were used in the matching experiments. Maybe so. Let's look at sources and see. Dicklyon 04:33, 1 November 2007 (UTC)
- I strongly agree that the "RGB" space needs to be clearly differentiated from the 'common' RGB space. And I don't think writing "CIE RGB" is enough: the lay person could easily read this as saying that what they know as "RGB" is a 'short form' of "CIE RGB" (Like "McDonald's" versus "McDonald's Family Restaurants", the listing in the Australian telephone directory.). The section ought to start with a statement along the lines of "The CIE RGB colour space is one of many RGB colour spaces. It is based on the work of ...." and then go into the existing text.
- By the way, RGB color spaces says "Note that CIE 1931 (or CIE XYZ) is not an RGB color space." What's up with that? Completely contradicts this article!
- —DIV (128.250.204.118 04:47, 30 July 2007 (UTC))
- See if recent changes to both articles make more sense now. Feel free to make more suggestions for clarifications. Dicklyon 06:04, 30 July 2007 (UTC)
R, B, and G are never defined
I propose that these three variables be defined explicitly on this page. I know that you can find them out by clicking on the Grassmann's law link, but it is rather awkward. That article is so short, why not just include the whole thing here? There's already a section heading for it. Bababoef 08:58, 7 April 2007 (UTC)
- OK, I did it, but I'm not totally comfortable having the overline{r} notation for both the normalized color matching functions and the scaled ones. Is there any other symbol people use to distinguish these? Dicklyon 16:22, 7 April 2007 (UTC)
Er, strictly speaking all of the functions are "scaled", aren't they? Even for the normalized color matching functions the normalization constant is not defined. (Is this the way it is in the literature?) I assume you mean scaled for luminance vs. radiant power? I am not familiar with the notation in the literature, but you are right, there is clearly a notation problem here. Perhaps the simplest way to resolve this is to define R, G, and B in the following manner:
I think this helps make everything more explicit while making the notation consistent and showing that the color spaces are defined in terms of radiant power. (Assuming I did everything right here)
On a related note, why do the ratios have differing numbers of significant figures? For example, for r:g:b = 1:4.5907:0.0601 the r:g ratio has 5 sig figs, the r:b ratio has 3 sig figs, and the g:b ratio has 3 sig figs. Shouldn't they all have the same number of significant figures since this is a color standard and can be defined to arbitrary precision? Bababoef 20:07, 7 April 2007 (UTC)
- I'm not sure picking the numbers for "source radiant power" means anything that I can understand. It's possible that no scale factor is actually the right thing; it would give a "neutral" result for the "E" (equal spectrum) illuminant, looks like. I'm not really very familiar with this space, but I have a book I can consult... Dicklyon 23:40, 7 April 2007 (UTC)
It has to be the radiant power curves. Otherwise you wouldn't integrate them with the "spectral power distribution", you'd integrate them with some luminous distribution. Bababoef 00:14, 8 April 2007 (UTC)
- Oh, I see what it is now; not what you're thinking, I believe; in all cases they are weights to be applied to the spectral radiant power distribution. Those are the scale factors you need to convert from the RGB variables to the luminances or radiant powers of the three defined monochromatic primaries. Therefore, those scalings come after the RGB definition; we can use the normalized functions (the ones that integrate to 1) to define R, G, and B, and thereby get a more or less normal neutral for the colorspace, at E. Now, I still need to find a book that will verify that I've got this right... Here's a clue. Dicklyon 01:13, 8 April 2007 (UTC)
No they are not. The weights change when different primaries are used. They go with the normalized color matching functions, not the power spectrum. The weights do not change when the spectrum changes. Their function is to unnormalize the normalized color matching functions so that they can actually be used. For each set of primaries there are 2 sets of weights so that you can integrate either with a power spectrum (unnormalized to the human eye) or with a luminance spectrum (normalized to the human eye). The spectral power distribution I(λ) is a quantity used in physics and has nothing to do with human vision. It is measured in Watts/Hz (unnormalized). (Radiant intensity in Watts/(m^2 Steradian Hz) is more appropriate here actually and this should be used instead.) The radiant power weights either have to be included explicitly in the integrals or a new radiant power color matching function has to be defined. Since the normalized color matching functions are explicitly defined on this page, and I(λ) is also strictly defined through the link to "spectral power density" the definitions for R, G and B as currently written are wrong. Bababoef 11 April 2007 (UTC)
- That doesn't seem to be in accord with the books I was checking. Do you have a reference? Dicklyon 00:44, 12 April 2007 (UTC)
Human gamut not a triangle
The statement bothers me somewhat. It's true that if you can't find 3 points in the gamut that contain the gamut, then it's not a triangle, but it's also not some other things too. In other words, the converse is not true: if you can find 3 points that contain the gamut, that doesn't mean it IS a triangle. Only when you introduce the other fact, that the gamut must be convex, does the converse become true as well. PAR 15:59, 16 April 2007 (UTC)
- True; I considered mentioning convexity after I realized my statement was incomplete. Feel free to remove or fix it. Dicklyon 17:06, 16 April 2007 (UTC)
Creating the horseshoe gamut
I'm trying to understand how the CIE 1931 chromaticity diagram is created and mapped to the horseshoe gamut. I experimented in MATLAB and wrote the following code with the output image at right. The output looks somewhat different from Image:CIExy1931.png (for example, the luminosity bands in Image:CIExy1931.png from the 460nm point on the border to 550nm and from 470nm to 605nm are not present in my image).
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% CIE 1931 color space
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
A = zeros(256, 256, 3);   % preallocate (the original "A = [256:256:3]" is an empty array)
for i = 1:256
    for j = 1:i
        A(i, j, 1) = ((j - 1) / 256) * 255;
        A(i, j, 2) = ((256 - i) / 256) * 255;
        A(i, j, 3) = 255 * (1 - ((256 - i + j) / 256));
    end
end
% Normalize each pixel so that its largest channel is 255
for i = 1:256
    for j = 1:i
        rgb_max = max(A(i, j, :));
        A(i, j, 1) = (A(i, j, 1) / rgb_max) * 255;
        A(i, j, 2) = (A(i, j, 2) / rgb_max) * 255;
        A(i, j, 3) = (A(i, j, 3) / rgb_max) * 255;
    end
end
A = uint8(A);
figure, imshow(A);
Mainly I'm curious about how an image such as mine is mapped to the horseshoe gamut. Perhaps this information should be in the article?
Are the regions outside the horseshoe simply clipped, or is there a geometric transformation which occurs? There is far more pure red, green and blue in my image than in the official CIE 1931 chromaticity diagram. Thanks! ~MDD4696 23:04, 20 April 2007 (UTC)
- Interesting, my image looks very similar to the first "rogue" image at http://www.techmind.org/colour/rogues.html. ~MDD4696 23:08, 20 April 2007 (UTC)
It's really not clear what you're doing here; it seems to be a pure RGB hack with no reference to XYZ. You need to start with the data tables for the X, Y, and Z curves, parameterized by wavelength, to get the horseshoe. You can then compute for each x,y in the image what a good (X,Y,Z) is, and transform it by matrix to an sRGB image, and show that; clip when outside the horseshoe if you only want to show color corresponding to physical (non-negative) spectra. Dicklyon 23:10, 20 April 2007 (UTC)
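To make that pipeline concrete, here is a rough sketch of the xyY-to-XYZ-to-linear-sRGB steps. The matrix entries are the commonly published sRGB values; gamma encoding and the horseshoe clipping are omitted:

```python
def xyY_to_XYZ(x, y, Y=1.0):
    # Invert the chromaticity projection at a chosen luminance Y.
    X = x * Y / y
    Z = (1.0 - x - y) * Y / y
    return (X, Y, Z)

# XYZ -> linear sRGB matrix (commonly published sRGB values).
M = ((3.2406, -1.5372, -0.4986),
     (-0.9689, 1.8758, 0.0415),
     (0.0557, -0.2040, 1.0570))

def XYZ_to_linear_sRGB(X, Y, Z):
    # A negative component here means the chromaticity is outside the
    # sRGB gamut and must be clipped (or otherwise mapped) for display.
    return tuple(m[0] * X + m[1] * Y + m[2] * Z for m in M)

# The D65 white point (x, y) = (0.3127, 0.3290) should map to ~(1, 1, 1).
white = XYZ_to_linear_sRGB(*xyY_to_XYZ(0.3127, 0.3290))
```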
- So, if I interpret you correctly, I'd need a table of wavelengths which specifies values for X, Y, and Z (red, green, and blue) for each wavelength to create the horseshoe. I'm confused about what you mean by choosing a good X,Y,Z from the normalized x,y values though. Thanks for responding. ~MDD4696 23:31, 20 April 2007 (UTC)
- Click on the chromaticity diagram to get information about the image. It has a description of how it was generated. The r,g,b numbers that you use are assumed to define a particular color or chromaticity in the sRGB color space. There is then a transformation (described in that article) on how chromaticities in sRGB can be converted to CIE XYZ coordinates. The sRGB gamut is smaller than the CIE XYZ gamut, and when the color in the CIE diagram steps outside the sRGB gamut, the color is set to a presumably "close" value. (I think if you draw a line from the D65 white point, the color on that line outside the gamut will be a constant, equal to the color where it first steps outside the gamut). Also - I *DO* see a white star in your diagram! I don't know why you don't. PAR 06:30, 21 April 2007 (UTC)
How is Y parameter a measure of brightness?
Our article says "the Y parameter was a measure of the brightness". It's not only an inexact starting point for our math, it also doesn't fit with the rest of the article. Since y is (apart from normalization) the same as Y, how come the diagram shows no increase in brightness from bottom to top? The cyan and the white at y=.3 look brighter to me than the green at y=.8. What's wrong: that statement, the definition of y, or the picture? — Sebastian 18:44, 8 June 2007 (UTC)
- The correct term is luminance (relative). Y is luminance, but y is normalized so that it's a pure chromaticity coordinate; if you double the intensity (brightness, power, luminance) of a color, Y doubles and y stays unchanged. In the xy space, luminance is unspecified, so it can be rendered with whatever luminance pattern you like. I don't think there's much wrong, but maybe the more precise word would be better. Dicklyon 02:21, 9 June 2007 (UTC)
Thanks for your reply, but it still doesn't make sense to me. I assume, by "xy space", you mean the projection of xyz space onto the xy plane (by neglecting z). If that's what you mean, then it is not Y (luminance) that's unspecified, but z. Or did you mean a projection on the xz plane? — Sebastian 22:33, 9 June 2007 (UTC)
- Yes, xy is the projection of xyz, ignoring z. Since xyz is by definition normalized to x+y+z = 1 by dividing X, Y, and Z by X+Y+Z, y is not proportional to Y. Dicklyon 23:06, 9 June 2007 (UTC)
I see! I was confused by the word "normalization" which for me suggested a normalizing constant, but of course 1/(X+Y+Z) is not a constant. Are you sure "normalized" in the article is the right term? — Sebastian 00:32, 10 June 2007 (UTC)
- To me, normalized means adjusted by a factor to make it a predetermined size, in this case as measured by an L1 norm (Manhattan distance). Dicklyon 01:01, 10 June 2007 (UTC)
Funny, that similarity between "norm" and "normalization" never occurred to me! I'd say it's coincidental, though. Is "normalization" really commonly used in the context of the color space, or is that just our article? In the first case, I'd vote for an explanation so others don't run into the same misunderstanding as I did. When I read "normalization", I always think: "It's just a formality, not a new physical aspect." — Sebastian 02:24, 10 June 2007 (UTC)
- I too do not understand the sentence about Y being the measure of intensity. First of all, at that point of the article, and in the math formulas below, it has not been stated what X, Y, and Z are. It is not even stated what x and y are, though if X, Y, and Z were defined, the formulas would be enough to define them. I suppose (but this is just my opinion) that it is common to specify numbers as (x, y, Y), and that you are referring to that in the sentence
- The derived color space specified by x, y, and Y is known as the CIE xyY color space and is widely used to specify colors in practice.
- However here Y is not the same Y that appears in the math formulas. This is just confusing. And this confusion is reflected in the article. -- AnyFile 11:59, 21 August 2007 (UTC)
What is Y?
The introduction says "X, Y, and Z [are] roughly red, green and blue, respectively". But section "The CIE xy chromaticity diagram" says: "the Y parameter was a measure of the brightness or luminance of a color". Which one is it? The two are certainly not the same as can be seen by the fact that yellow (586 nm) appears brighter than green (546 nm). — Sebastian 00:32, 10 June 2007 (UTC)
- It's both. It's exactly luminance, by definition, and it's highly correlated with, or "roughly", green. Dicklyon 00:58, 10 June 2007 (UTC)
- Thank you. How are the other two, X and Z, defined, then? If one can see them as weighting functions, do they have negative values at blue and red, respectively? (Maybe the answer is somewhere in our articles, or maybe I could ask this in a better way; I'd have to study this some more.) — Sebastian 02:30, 10 June 2007 (UTC)
- They were picked to be a non-negative basis for the 3D subspace of spectral space spanned by the human cone sensitivities. So, no, they are not negative anywhere, though most other sets of color-matching functions are. The Z is picked to be the narrowest possible non-negative function on the blue end, and then X is left as the smallest possible non-negative final dimension, which ends up with a bump on the blue end as well. Dicklyon 05:34, 10 June 2007 (UTC)
- Can the article be updated to reflect this? I believe it is good practice to define terms before using them. I do not understand this topic enough to do that myself - that was why I was reading it! 81.178.102.250 15:34, 10 July 2007 (UTC)
- Good idea. I'll see if I can find a ref... Dicklyon 15:41, 10 July 2007 (UTC)
- At constant radiance, yellow is not brighter than green. It looks that way on a computer because "pure yellow" is red+green, which will be brighter than just green. If we normalised the xy diagram to be even vaguely constant-intensity in sRGB, most of it would look pretty dark — the xy diagram is only supposed to approximate chromaticity. ⇌Elektron 02:26, 30 August 2007 (UTC)
Sorry, but I can not see it
It is written in the article, in the point list describing the diagram
- It is seen that all visible chromaticities correspond to non-negative values of x, y, and z (and therefore to non-negative values of X, Y, and Z).
Sorry, but I really can not understand the presence of this sentence. That all visible colors could be written by giving 3 positive numbers X, Y, Z was a hypothesis, so you can not conclude that. Moreover it has not been proved that all the colors are in the diagram (by the way, I can not see brown there) -- AnyFile 12:25, 21 August 2007 (UTC)
Moreover I can not understand why the borders of the gamut are where they are in the figure. Why, for example, is the point (x=0.1, y=0.85) outside the border? Of course I can understand that it could not be that x+y>1 (because that was how x and y were defined), but not why, for example, the pair I wrote above can not be in the visible part.
Is it that a light beam whose components lie outside the gamut can not be seen by a human eye? (I am strongly feeling that all of this description is not about colors, but about the perception of colors by human eyes.)
And why is the point (1,0) not inside the gamut? If a pure red is X=1, Y=0, Z=0, then it should be at x=1, y=0 (and the overall intensity Y=1). -- AnyFile 12:57, 21 August 2007 (UTC)
- That all colors can be represented by 3 numbers X,Y, and Z is an experimental fact, it is found by experimentation. It cannot be "proven" in a mathematical sense, and it is not a "hypothesis". If X,Y, and Z actually do represent every color, then it follows that the chromaticities x and y will represent every chromaticity.
- The color brown is simply the color yellow at low luminosity. Brown and yellow have the same chromaticity.
- With regard to the (0.1, 0.85) point - The first thing you have to understand is that the curved portion of the gamut is the line of monochromatic colors. This is an experimental fact. These points correspond to light at a single wavelength. Every light beam can be thought of as a mixture of a number of different monochromatic beams. The second thing you have to understand is that the human eye is linear (Grassmann's law). This too is an experimental fact. This means that for two points on the xy chromaticity diagram, any mixture of the colors corresponding to these two points will lie on a line connecting the two points, and will lie between them on this line. It also means that any color can be constructed as the mixture of two monochromatic colors. The (0.1, 0.85) point is not on the monochromatic line, therefore it must be a mixture of monochromatic colors. You cannot find any set of monochromatic colors that will mix to give (0.1, 0.85). Geometrically, you cannot find two points on the monochromatic curve such that (0.1, 0.85) lies between them! This follows from the two experimental facts mentioned above.
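The geometric test described above can be sketched in code. The locus table below is a coarse, approximate sampling of the CIE 1931 spectral locus (rounded values, for illustration only), closed by the line of purples; a point lies inside the gamut when it is on the same side of every edge of this (approximately convex) polygon:

```python
# Approximate (x, y) chromaticities of monochromatic light in order
# along the spectral locus (coarse, rounded sample values).
locus = [
    (0.1741, 0.0050),  # 380 nm
    (0.1440, 0.0297),  # 460 nm
    (0.0913, 0.1327),  # 480 nm
    (0.0082, 0.5384),  # 500 nm
    (0.0743, 0.8338),  # 520 nm
    (0.2296, 0.7543),  # 540 nm
    (0.3731, 0.6245),  # 560 nm
    (0.5125, 0.4866),  # 580 nm
    (0.6270, 0.3725),  # 600 nm
    (0.7347, 0.2653),  # 700 nm (edge back to 380 nm = line of purples)
]

def inside_gamut(px, py):
    # Inside a convex polygon <=> the point is on the same side of
    # every edge (all edge cross products share one sign).
    signs = []
    n = len(locus)
    for i in range(n):
        x1, y1 = locus[i]
        x2, y2 = locus[(i + 1) % n]
        cross = (x2 - x1) * (py - y1) - (y2 - y1) * (px - x1)
        signs.append(cross < 0)
    return all(signs) or not any(signs)
```

With this sampling, a near-white point such as (0.3, 0.3) tests as inside, while (0.1, 0.85) tests as outside, matching the argument above.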
- You say this article "is not about the colors, but about the perception of colors by human eyes". This is a bad statement. There is no such thing as "color" that is independent of the eye. Without human eyes, there is no color, there is only light with different spectra.
- With regard to the (1,0,0) point, pure red is NOT (1,0,0) in the XYZ space. It is pure red in the sRGB space. We are not talking about sRGB here.
- PAR 13:39, 21 August 2007 (UTC)
Converting a full set of chromaticities (xy for RGB+W) to a full set of triplets (Yxy/XYZ)
The sets of RGB-primaries along with white point are often defined with chromaticities solely, i. e. 4 xy-pairs for red, green, blue and white. For example, this is the only information stored in EDID, as well as numerous reference tables of standard RGB spaces throughout the Wikipedia and many (but not all) other sources.
How can these chromaticity sets be converted to XYZ- or Yxy-sets?
Of course, within common mathematics we can't just add 4 single values for luminance — there will always be some 4 degrees of freedom. But with the constraint that R+G+B=W and the assumption that we don't need absolute Y-values, just relative ones (i. e. fixing luminance of white point to Y=1 or Y=100), it sounds possible.
As a result, I developed some equations, starting with Xr+Xg+Xb=Xw, etc.
- The source chromaticity data: .
- Additionally, we fix YW = 1 (or any other positive value, maybe 100).
- Some intermediate variables (as in Yxy->XYZ, but without Y-multiplier):
- Some auxiliary variables (differ only in first X'; the subscript G is referring to the RGB-primary we start with):
- After some reductions, we finally have the answer:
As you can see, all this seems to give unambiguous results. But despite looking very credible and passing all the checks (by definitions), the results for well-known RGB spaces, such as sRGB or Adobe RGB, don't match predefined values of Y (refer to Bruce Lindbloom). Could someone tell me what I'm missing? Which degree of freedom was not eliminated, and how could this happen with such precise formulae?
For your convenience, I've prepared a small Excel spreadsheet which does these calculations automatically. It can be downloaded here: avsco.nm.ru/EDID2XYZ_eng.xls (only 17 Kbytes in size, no archiving).
Any hints are appreciated. Thanks in advance. 213.234.235.82 11:26, 17 October 2007 (UTC)
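For comparison, the usual way this conversion is done (e.g. in Lindbloom's derivation of RGB-to-XYZ matrices) is to take each primary's XYZ at unit luminance as a matrix column and solve a 3x3 system for the scale factors that make R=G=B reproduce the white point. A sketch using sRGB primaries and D65 as example inputs:

```python
def xy_to_XYZ(x, y, Y=1.0):
    # XYZ of a chromaticity (x, y) at luminance Y.
    return (x * Y / y, Y, (1.0 - x - y) * Y / y)

def solve3(m, v):
    # Cramer's rule for a 3x3 linear system m @ s = v.
    def det(a):
        return (a[0][0] * (a[1][1] * a[2][2] - a[1][2] * a[2][1])
              - a[0][1] * (a[1][0] * a[2][2] - a[1][2] * a[2][0])
              + a[0][2] * (a[1][0] * a[2][1] - a[1][1] * a[2][0]))
    d = det(m)
    out = []
    for k in range(3):
        mk = [[v[i] if j == k else m[i][j] for j in range(3)] for i in range(3)]
        out.append(det(mk) / d)
    return out

# Example inputs: sRGB primaries and D65 white point (chromaticities only).
prims = [(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)]
white = (0.3127, 0.3290)

C = [xy_to_XYZ(x, y) for x, y in prims]           # each primary at Y = 1
m = [[C[j][i] for j in range(3)] for i in range(3)]  # columns = primaries
W = xy_to_XYZ(*white)                              # white normalized to Y = 1

S = solve3(m, W)   # relative luminances of the R, G, B primaries
```

For sRGB/D65 inputs this reproduces the published relative luminances (about 0.2126, 0.7152, 0.0722) to within rounding of the chromaticity inputs.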
- I haven't studied it in detail yet, but where you say R+G+B=W and Xr+Xg+Xb=Xw, you probably mean to use R=G=B at whitepoint or something like that. That is, spaces are specified such that equal amounts of the primaries add to form the color defined as white. Dicklyon 17:30, 17 October 2007 (UTC)
- This topic is about primaries, i. e. red, green, blue and white at maximum intensity. 213.234.235.82 17:42, 17 October 2007 (UTC)
- Primaries and whitepoints are chromaticities, independent of intensities. Dicklyon 04:17, 18 October 2007 (UTC)
- Color stimulus is unthinkable without intensity. My question is about the primary stimuli of a real display device, which has a certain luminosity aside from chromaticity. Only this way can the sum of red, green and blue render the specified white point (no matter whether Y is normalized to 1 for white or not). Otherwise, if we just take some arbitrary "reddish", "greenish", "bluish" and "neutral" stimuli, which do not form a single system, the conversion from 4×xy to 4×XYZ would not be possible, of course. 62.118.220.182 08:40, 18 October 2007 (UTC)
- But if you're going to ask for help, listen to the answers (and preferably not on an article talk page, since this is supposed to be about article content; but I'll assume that's where you're trying to go with it). A chromaticity is a color with intensity factored out. An xy specification of the primaries doesn't tell you how intense those primaries are, and in general they are variable, the max limit notwithstanding. The whitepoint is a specification of the neutral chromaticity that you get by producing "equal" amounts of the three primaries; a color at the whitepoint chromaticity can be white, gray, or even nearly black. By "equal" is meant that the colorspace specifications of the primary intensities, generally called the R, G, and B values, are numerically equal; these multiply whatever the intensities of the primary sources are; for example, if you scale the R, G, and B range to be 0 to 1, then they multiply the maximum intensity of the primaries. If you just want to look at brightest white, that's R=G=B=1, but in general any neutral color has R=G=B and generates a color with chromaticity equal to the white point. There is no sense in which the sum R+G+B is a meaningful quantity in colorimetry as I know it. Hope this helps. Dicklyon 14:54, 18 October 2007 (UTC)
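The point about intensity factoring out can be checked in a couple of lines: scaling an XYZ triple by any positive factor leaves its (x, y) chromaticity unchanged. The D65 triple below is a standard published value, used purely as an example.

```python
# Illustration: chromaticity is intensity-invariant. Scaling an XYZ
# triple (here the D65 whitepoint, Y normalized to 1) by any positive
# factor leaves (x, y) unchanged.

def chromaticity(X, Y, Z):
    s = X + Y + Z
    return (X / s, Y / s)

white = (0.9505, 1.0, 1.0891)          # D65 whitepoint at full intensity
dim = tuple(0.05 * c for c in white)   # a dark gray: same chromaticity

print(chromaticity(*white))  # about (0.3127, 0.3290)
print(chromaticity(*dim))    # same chromaticity, far lower intensity
```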
- As for help figuring out why your specific formulas are giving you wrong results, you might try emailing Bruce Lindbloom directly. He has in the past been quite responsive and helpful. Or perhaps for a broader response, try the Apple ColorSync users mailing list. --jacobolus (t) 20:14, 18 October 2007 (UTC)
[edit] Red, Green and Blue Not Objective Concepts,
The article manages to imply that there are objective color values that are universally recognized as red, green, and blue. It should be more clear about the fact that the primary colors used in any color model are chosen to make the model work, not derived from objective fact. --Isaac R 17:58, 18 October 2007 (UTC)
[edit] Isn't the word tristimulus counterintuitive?
From what I understand, the XYZ- or xyz-space has been chosen so that the so-called "desirable properties" would be met. For instance, the Y value had to be equivalent to the experimental luminosity of the different wavelengths to the human eye. And the X and Z values were defined in a purely mathematical way so as to fit the gamut as best as possible, and so that the other conditions would be met, such as being positive everywhere, etc. For instance, if I were to take the values on the graph here and normalise them so that the sum is 1 for each wavelength, I would get a parametrisation of the horseshoe shape on the chromaticity diagram.
Now, on the Tristimulus page, it is said that tristimulus values are "the amounts of the three primary colors in a three-component color model needed to match that test color. The tristimulus values are most often given in the CIE 1931 color space, in which they are denoted X, Y, and Z."
Something is wrong. From the definition, tristimulus values are the amounts of the three primary colors needed to match a test color (sometimes "negative", unless the gamut is a perfect triangle, which it is not). In this case, there is a color corresponding to each positive set of values, since they correspond to the superposition of (existing) primary colors. However, in the case of X,Y,Z, there are triples not corresponding to any color, such as (1,0,0), (0,1,0) and (0,0,1). X,Y,Z are not supposed to correspond to any primary colors. Or do they correspond to convenient "imaginary" colors that are of no particular interest, except for generating a nice parametrisation? If this is the case, I'll add something to that effect on the Tristimulus page. Ratfox 23:29, 31 October 2007 (UTC)
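The two halves of the question can be checked directly: the XYZ "primaries" (1,0,0), (0,1,0), (0,0,1) map to the chromaticities (1,0), (0,1), (0,0), which all lie outside the spectral horseshoe (hence "imaginary" colors), while normalizing a color matching function sample does land on the spectral locus. The 600 nm CMF values below are rounded entries from the 1931 2° observer table, used only for illustration.

```python
# The corners of the XYZ "triangle" versus a real spectral-locus point.

def xy(X, Y, Z):
    # Project an XYZ triple to its (x, y) chromaticity
    s = X + Y + Z
    return (round(X / s, 4), round(Y / s, 4))

# The imaginary XYZ primaries: chromaticities (1,0), (0,1), (0,0),
# all outside the horseshoe of physically realizable colors
print(xy(1, 0, 0), xy(0, 1, 0), xy(0, 0, 1))

# Rounded 1931 2-degree CMF sample at 600 nm: normalizing it gives a
# point on the spectral locus, near (0.627, 0.3725)
print(xy(1.0622, 0.6310, 0.0008))
```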
[edit] Are there any tables to generate the images in the article?
What research was done to produce the images in the article (the curves)? There must have been a research project behind the images with the curves in them, from which the images were then generated. Is it possible to get the results of that research, for further work? Or at least the data the images were generated from?
Since the images could be analyzed and the data extracted from the curves, the data should be published somewhere where it can be obtained. Analyzing the images would be quite a workaround... :-\
-Kri 23:25, 11 November 2007 (UTC)
[edit] What are the extremes of the visible colour spectrum?
What I mean is, what range of wavelengths are considered to be part of the visible colour spectrum, as opposed to infrared or ultra-violet? I can see from the article that it's roughly 400-700 nm, but I wondered whether there is a more precise definition used by the CIE, or other standards body. For example, the article visible spectrum uses the range 380-750 nm. Do those values come from the CIE? (a reference would be appreciated). Thanks, Thunderbird2 (talk) 12:03, 12 January 2008 (UTC)
Oh, and defining wavelength ranges for individual colours would be nice too :-) Thunderbird2 (talk) 12:11, 12 January 2008 (UTC)
- There are a number of different tabulations of the CIE color matching functions, beginning with the 1931 standard version which covered 360-830 nm. The "abridged" values were a more "user friendly" version (back when calculations were done by hand) and they ran from 380-780 nm. There are others, 1964 improved version, etc. etc. I don't know if there is a set of numbers that the recent literature has settled on, but the 360-830 probably covers everything. Check out Wyszecki and Stiles ("Color Science") for tables of about twenty different versions of color matching functions. PAR (talk) 17:10, 12 January 2008 (UTC)
- But with respect to the original question, parts of that extended range would certainly be considered IR and UV as opposed to visible, even though it has some (very small) visible effect; 400 and 700 are the most common "conventional" boundaries, but I don't know that they have any special status. Dicklyon (talk) 18:12, 12 January 2008 (UTC)
- Thanks for your replies - both helpful. Thunderbird2 (talk) 18:44, 12 January 2008 (UTC)
[edit] CIE Standard colorimetric observer
Please can someone write a separate section—or even article—to explain what it is and why it is needed. I'm not comfortable with the flow of the article (to put it mildly) so I don't know where to insert it myself.--Adoniscik (talk) 01:36, 21 January 2008 (UTC)
- It should not be a separate article. The standard observer is the set of three spectral sensitivity curves that are used to convert a spectrum to an XYZ tristimulus vector; other observers (sets of curves) will give other kinds of tristimulus values. Dicklyon (talk) 02:23, 21 January 2008 (UTC)
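That description ("three curves that turn a spectrum into XYZ") can be made concrete with a tiny numerical sketch. The CMF samples below are a few rounded values from the 1931 2° tables; a real computation would integrate over the full table at 1 nm or 5 nm steps.

```python
# Minimal sketch of the standard observer: three sampled weighting
# curves, and a Riemann-sum approximation of X = integral S(l) xbar(l) dl
# (and likewise for Y and Z). CMF values are rounded, illustrative only.

CMF = {  # wavelength nm: (xbar, ybar, zbar), 1931 2-degree observer
    450: (0.3362, 0.0380, 1.7721),
    520: (0.0633, 0.7100, 0.0782),
    600: (1.0622, 0.6310, 0.0008),
}

def spectrum_to_XYZ(spectrum, step=1.0):
    # spectrum: dict mapping wavelength (nm) -> spectral power
    X = sum(p * CMF[l][0] for l, p in spectrum.items()) * step
    Y = sum(p * CMF[l][1] for l, p in spectrum.items()) * step
    Z = sum(p * CMF[l][2] for l, p in spectrum.items()) * step
    return (X, Y, Z)

# A monochromatic 600 nm beam of unit power: its XYZ is just the CMF
# row at 600 nm, which is exactly what "standard observer" means here
print(spectrum_to_XYZ({600: 1.0}))
```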
- The reason you might make it an article instead of a section is that the 1931 definition isn't the only one. In any case, it needs more attention so people flocking from other articles where the concept is mentioned do not have to read the entire article to get a definition. It's scattered right now.--Adoniscik (talk) 02:33, 21 January 2008 (UTC)
- They're going to have to read some of this article anyway to understand the standard colorimetric observer; too much necessary context overlaps between this article and that one. I also oppose article proliferation in this case, unless some section of this article becomes so long and detailed that it must be split out. --jacobolus (t)
[edit] Undo removal of external links
Why?
- The so called (→External links: clean up) removed the links I typically use on this page
- But dead links were NOT removed
- This is exactly what a link clean-up should NOT do
http://cvrl.ioo.ucl.ac.uk/ —Preceding unsigned comment added by 80.156.4.26 (talk) 19:24, 27 February 2008 (UTC)