Talk:Input lag

I've removed the monitor list, as it is completely unsourced. The same goes for the whole article, however. - JohnyDog 13:11, 27 September 2006 (UTC)


This article should be changed to note that input lag isn't only an LCD phenomenon, but rather a digital display phenomenon. Plasmas, DLPs, and even digital CRTs can suffer from it as well. It probably has most to do with the image buffering and scaling these displays do, and with the processing involved in deinterlacing when the video source is an interlaced format.

Um, video from a computer usually isn't interlaced, and scaling does not occur at the native resolution, AFAIK. Nil Einne 19:05, 31 July 2007 (UTC)
He does not say they do... Computer material requires less processing and thus should (not will) have less input lag. Some displays always pipe data through a scaler, even at native resolution, and thus have uniform input lag.
The above anonymous poster is correct, according to A/V Synchronization: How Bad Is Bad? and HDTVs and Video Game Lag: The Problem and the Solution. --Musaran (talk) 10:37, 23 February 2008 (UTC)
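
For a rough sense of where such processing lag might come from, here is a minimal back-of-the-envelope sketch in Python. The stage list and the assumption that each stage buffers a whole field or frame are purely illustrative, not measurements of any particular display:

    # Illustrative only: rough added latency for a 60 Hz interlaced source,
    # assuming (hypothetically) each processing stage buffers a whole
    # field or frame before passing it on.
    FIELD_MS = 1000 / 60          # one 60 Hz field period, ~16.7 ms
    FRAME_MS = 2 * FIELD_MS       # one full deinterlaced frame, ~33.3 ms

    stages = {
        "deinterlacer (buffers one field)": FIELD_MS,
        "scaler (buffers one frame)": FRAME_MS,
        "output frame buffer": FRAME_MS,
    }

    for name, ms in stages.items():
        print(f"{name}: {ms:.1f} ms")
    print(f"total added lag: {sum(stages.values()):.1f} ms")  # ~83 ms here

With these assumed stages the lag is several frames, which is consistent with computer material at native resolution (no deinterlacing or scaling needed) lagging less, unless the display pipes everything through the scaler anyway.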


The article begins by describing this effect and why it is, by its very nature, variable and not accurately measurable, and then goes on to point out several times that 'manufacturers do not advertise input lag for their displays.' This should be obvious - we've already determined that there is no set input lag value for any given display. Generally poorly written and should be removed if not revised/sourced. Tagged as unreferenced. 67.86.141.73 01:24, 16 November 2007 (UTC)

"we've already determined": Who is "we"? Where?
"there is no set input lag value for any given display": Tests measure reproducible lag values. Do you mean there is no such thing as input lag, or that it depends on display settings and material mode? If so, how would that be incorporated to this article? --Musaran (talk) 10:37, 23 February 2008 (UTC)
It is accurately measurable, as many computer hardware review sites have shown. It just requires (as, ironically, already mentioned) a "control" display, an "input lag" display, a camera with a high shutter speed, and free stopwatch software. The only part of it that is not "accurately measurable" is the fact that during testing you will find monitors have minimums, maximums, and averages for how long it takes to display each frame, similar to benchmarking frames per second. See here: [1] -- Cody-7 (talk) 21:45, 28 February 2008 (UTC)
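
For illustration, a minimal sketch of the kind of on-screen stopwatch this method relies on (not any specific tool the reviews used). It just draws an elapsed-millisecond counter that can be cloned to both displays and photographed:

    # Minimal on-screen millisecond stopwatch (an illustrative sketch of the
    # "free stopwatch software" mentioned above). Clone the desktop to both
    # displays, photograph them together at a high shutter speed, and the
    # difference between the two readings is one sample of the input lag.
    import time
    import tkinter as tk

    root = tk.Tk()
    root.title("stopwatch")
    label = tk.Label(root, font=("Courier", 96))
    label.pack()
    start = time.perf_counter()

    def tick():
        elapsed_ms = (time.perf_counter() - start) * 1000
        label.config(text=f"{elapsed_ms:10.0f} ms")
        root.after(1, tick)  # update as fast as the event loop allows

    tick()
    root.mainloop()

Repeating the photograph many times is what yields the minimums, maximums, and averages described above; each reading is of course quantized by the displays' own refresh cycles.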

Sources

I've added a source to the first paragraph, but some of these specific claims are hard to cite. For example, the claim that "Most sensitive users can tolerate latency under 20ms." I happen to have a 24" LCD that has been measured at 20ms of lag, and I agree 20ms is probably a bearable number for a sensitive user, but how do I back up that claim?
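
As a back-of-the-envelope check of why 20ms might be bearable, it can be compared against common refresh periods (illustrative arithmetic only, not a sourced threshold):

    # Illustrative arithmetic: how many refresh periods is 20 ms?
    for hz in (60, 75, 120):
        frame_ms = 1000 / hz
        print(f"{hz} Hz: one frame = {frame_ms:.1f} ms, "
              f"20 ms = {20 / frame_ms:.2f} frames")
    # At 60 Hz, 20 ms is just over one frame of delay.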

I'd be grateful if you all could help find some more sources. I'm removing the no-sources template at the top because the article now has at least one source. I feel that tag clutters the page; it's obvious it will still need more sources. Thanks. -- Cody-7 (talk) 22:10, 28 February 2008 (UTC)