Multiple sub-Nyquist sampling encoding
MUSE (Multiple sub-Nyquist sampling encoding) was a dot-interlaced digital video compression system that used analog modulation for transmission to deliver 1125-line high-definition video signals to the home. Japan had the earliest working HDTV system, named Hi-Vision (a contraction of HIgh-definition teleVISION), with design efforts going back to 1979. The country began broadcasting wideband analog HDTV signals in the late 1980s, using 1035 active lines interlaced in the standard 2:1 ratio (1035i) out of 1125 lines in total.
History
MUSE, a compression system for Hi-Vision signals, was developed by NHK Science & Technology Research Laboratories in the 1980s. It employed two-dimensional filtering, dot-interlacing, motion-vector compensation and line-sequential color encoding with time compression to 'fold' an original 20 MHz Hi-Vision source signal into a bandwidth of 8.1 MHz.
- Japanese broadcast engineers immediately rejected conventional vestigial sideband broadcasting.
- It was decided early on that MUSE would be a satellite broadcast format, as satellite broadcasting is economically well suited to Japan.
Modulation research
- Japanese broadcast engineers had studied many kinds of HDTV broadcasting over a long period.[1] At first they assumed that SHF, EHF or optical fibre would have to be used to carry HDTV, because the signal is so wideband, and that HLO-PAL would be used for terrestrial emission.[2][3] HLO-PAL is a conventionally constructed composite signal (Y+C, like NTSC and PAL): it uses a Phase Alternating by Line with Half-Line Offset carrier to encode the wideband/narrowband chroma components, so that only the very lowest part of the wideband chroma component overlapped the high-frequency end of the luminance, while the narrowband chroma was completely separated from luminance. PAF, with Phase Alternating by Field (like the first NTSC color system trial), was also experimented with and gave much better decoding results, but NHK abandoned all composite encoding systems. For satellite transmission, frequency modulation (FM) has to be used because of transponder power limitations, and demodulated FM noise is "triangular", rising with frequency; if a subcarrier-based composite signal were sent over FM, the demodulated chroma would therefore be noisier than the luminance (see the sketch after this list). For this reason NHK studied[4] and decided on[2] separate Y/C component emission for satellite use. For a time it seemed that FCFE (Frame Conversion Fineness Enhanced), an I/P-conversion compression system,[5] would be chosen, but in the end MUSE was adopted.[6]
- Separate transmission of Y and C components was explored. The MUSE format as finally transmitted uses separated component signalling. The improvement in picture quality was so great that the original test systems were recalled.
- One more power-saving tweak was made: the lack of visual response to low-frequency noise allows a significant reduction in transponder power if the higher video frequencies are emphasized prior to modulation at the transmitter and de-emphasized at the receiver.
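As noted above, demodulated FM noise has a "triangular" spectrum whose density rises with frequency, so chroma carried on a high subcarrier collects far more noise than the same bandwidth carried at baseband. The minimal sketch below illustrates that penalty by integrating an f^2-shaped noise density over two equal-width bands; the band edges are illustrative assumptions, not actual Hi-Vision figures.

```python
# Sketch: the "triangular" FM noise penalty for a chroma subcarrier.
# Demodulated FM noise power density grows roughly as f^2, so a band placed
# around a high subcarrier collects far more noise than the same bandwidth
# at baseband. Band edges below are illustrative, not Hi-Vision values.

def fm_noise_power(f_low_mhz: float, f_high_mhz: float) -> float:
    """Integrate a noise density proportional to f^2 over [f_low, f_high] (MHz)."""
    return (f_high_mhz ** 3 - f_low_mhz ** 3) / 3.0

baseband_chroma = fm_noise_power(0.0, 3.0)       # chroma sent at baseband (separate Y/C)
subcarrier_chroma = fm_noise_power(12.0, 15.0)   # same 3 MHz placed around a high subcarrier

print(f"noise penalty for the subcarrier placement: {subcarrier_chroma / baseband_chroma:.0f}x")
```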
Technical specifications
- Aspect Ratio: 16:9
- Scanlines (compressed/active/total): 1,032/1,035/1,125
- Pixels per line (Fully interpolated): 1122 (still image)/748 (moving)
- Interlaced ratio: 2:1
- Refresh rate: 60.00 Hz (in order to improve compatibility with 50 field/s systems)
- Sampling frequency for broadcast: 16.2 MHz
- Motion-vector compensation: horizontal ±16 samples (at a 32.4 MHz clock) per frame; vertical ±3 lines per field
- Audio: 48 kHz 16-bit (2 channels) / 32 kHz 12-bit (4 channels, supporting F3-R1, i.e. three-front/one-rear, surround)
- Audio compression format: DPCM with quasi-instantaneous companding
MUSE is an 1125-line system (1035 visible) and is not pulse- and sync-compatible with the digital 1080-line system used by modern HDTV. Originally it was an 1125-line, interlaced, 60 Hz system with a 5:3 (1.66:1) aspect ratio and an optimal viewing distance of roughly 3.3H.
For terrestrial MUSE transmission a bandwidth-limited FM system was devised; the satellite transmission system uses uncompressed FM.
The pre-compression bandwidth is 20 MHz for Y and 7.425 MHz for chrominance.
The Japanese initially explored the idea of frequency-modulating a conventionally constructed composite signal, similar in structure to a composite NTSC signal, with Y at the lower frequencies and C above it. Approximately 3 kW of power would have been required to achieve a 40 dB signal-to-noise ratio for such a composite FM signal in the 22 GHz band, which was incompatible with satellite broadcast techniques and bandwidth.
To overcome this limitation, it was decided to use separate transmission of Y and C. This reduces the effective frequency range and lowers the required power: approximately 570 W (360 W for Y and 210 W for C) would be needed to achieve a 40 dB signal-to-noise ratio for a separate Y/C FM signal in the 22 GHz satellite band, which was feasible.
One more power saving follows from the characteristics of the human eye. The lack of visual response to low-frequency noise allows a significant reduction in transponder power if the higher video frequencies are emphasized prior to modulation at the transmitter and then de-emphasized at the receiver. This method was adopted, with emphasis/de-emphasis crossover frequencies of 5.2 MHz for Y and 1.6 MHz for C. With this in place, the power requirement drops to about 260 W (190 W for Y and 69 W for C).
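The following is a minimal sketch of the emphasis/de-emphasis idea, assuming a first-order emphasis curve and synthetic channel noise with the rising FM noise spectrum; the 5.2 MHz crossover is the Y figure quoted above, while the sampling rate and noise model are illustrative. The signal is boosted before the channel and divided back afterwards, so only the noise added in between is attenuated.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 40.0e6                        # assumed sampling rate for the sketch, Hz
n = 1 << 14
freqs = np.fft.rfftfreq(n, d=1.0 / fs)

crossover = 5.2e6                  # Y-channel emphasis crossover quoted above, Hz
emphasis = np.sqrt(1.0 + (freqs / crossover) ** 2)   # assumed first-order boost curve

# Synthetic "triangular" FM channel noise: amplitude density rising with frequency.
noise = freqs * (rng.standard_normal(freqs.size) + 1j * rng.standard_normal(freqs.size))

plain = np.fft.irfft(noise, n)                      # receiver without de-emphasis
deemphasized = np.fft.irfft(noise / emphasis, n)    # receiver divides by the emphasis curve

print(f"noise power, no de-emphasis:   {np.mean(plain ** 2):.3e}")
print(f"noise power, with de-emphasis: {np.mean(deemphasized ** 2):.3e}")
```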
Sampling systems and ratios
The subsampling in a video system is usually expressed as a three-part ratio. The three terms are the number of brightness ("luminance", "luma" or Y) samples, followed by the number of samples of each of the two color ("chroma") components, U/Cb then V/Cr, for each complete sample area. For quality comparison only the ratio between those values matters, so 4:4:4 could just as well be written 1:1:1; traditionally, however, the value for brightness is always 4, with the other values scaled accordingly.
Sometimes four-part ratios are written, such as 4:2:2:4. In these cases the fourth number is the sampling-frequency ratio of a key channel. In virtually all cases that number is 4, since high quality is very desirable in keying applications.
The sampling principles above apply to both digital and analog television.
MUSE implements a variable sampling system, ranging from roughly 4:2:1 down to roughly 4:0.5:0.25 depending on the amount of motion on the screen.
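A small sketch of the simplified luma:Cb:Cr reading of the ratio used above, treating each term as a sample count per reference area and comparing the retained data against full 4:4:4 sampling; the scheme labels and the arithmetic are illustrative, not an exact description of the MUSE sampling lattice.

```python
# Interpret a y:cb:cr subsampling ratio (the simplified reading used in this
# article) as sample counts per reference area and compute how much of the
# full 4:4:4 sample count each scheme keeps.
def retained_fraction(y: float, cb: float, cr: float) -> float:
    return (y + cb + cr) / (3 * y)

schemes = {
    "4:4:4 (no subsampling)": (4, 4, 4),
    "4:2:2 (studio video)": (4, 2, 2),
    "MUSE, still picture (~4:2:1)": (4, 2, 1),
    "MUSE, heavy motion (~4:0.5:0.25)": (4, 0.5, 0.25),
}
for name, ratio in schemes.items():
    print(f"{name:34s} -> {retained_fraction(*ratio):.0%} of full-rate samples")
```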
Audio subsystem: Digital Audio Near-instantaneous Compression and Expansion
MUSE had a discrete 2- or 4-channel digital audio system called "DANCE", which stood for Digital Audio Near-instantaneous Compression and Expansion.
It used differential (DPCM) audio transmission and, unlike MPEG-1 Layer II, was not psychoacoustics-based. It used a fixed transmission rate of 1,350 kbit/s. Like the PAL-world NICAM stereo system, it used near-instantaneous companding (as opposed to the syllabic companding used by the dbx system) and non-linear 13-bit digital encoding at a 32 kHz sample rate.
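The sketch below is a rough illustration of near-instantaneous (block) companding of the kind NICAM and DANCE belong to: each short block of samples shares a single scale factor chosen from the block's peak, so quiet blocks keep more effective resolution. The block length, bit widths and the use of plain PCM samples (rather than DANCE's DPCM differences) are illustrative assumptions, not the actual DANCE parameters.

```python
import numpy as np

BLOCK = 32            # samples per companding block (assumed)
MANTISSA_BITS = 10    # bits kept per sample after companding (assumed)

def compand(block: np.ndarray) -> tuple[int, np.ndarray]:
    """Pick one right-shift for the whole block from its peak, then drop those LSBs."""
    peak = int(np.max(np.abs(block)))
    shift = max(0, peak.bit_length() - (MANTISSA_BITS - 1))  # leave room for the sign bit
    return shift, block >> shift

def expand(shift: int, coded: np.ndarray) -> np.ndarray:
    """Receiver restores the original scale; LSBs dropped at the coder stay lost."""
    return coded << shift

rng = np.random.default_rng(1)
audio = (rng.standard_normal(BLOCK * 8) * 2000).astype(np.int32)  # synthetic ~14-bit signal

decoded = np.concatenate(
    [expand(*compand(audio[i:i + BLOCK])) for i in range(0, audio.size, BLOCK)]
)
print("worst-case companding error:", int(np.max(np.abs(audio - decoded))))
```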
It could also operate in a 48 kHz 16-bit mode. The DANCE system was well documented in numerous NHK technical papers and in an NHK-published book issued in the USA called Hi-Vision Technology.
The DANCE audio codec was superseded by Dolby AC-3 (a.k.a. Dolby Digital), DTS Coherent Acoustics (a.k.a. DTS Zeta 6x20 or ARTEC), MPEG-1 Layer III and many other audio coders. The methods of this codec are described in an IEEE paper.[7]
Real world performance issues
MUSE had a four-field dot-interlacing cycle, meaning that it took four fields to complete a single MUSE frame. Stationary images were therefore transmitted at full resolution. However, because MUSE lowers the horizontal and vertical resolution of material that varies greatly from frame to frame, moving images were blurred. Since MUSE used motion compensation, whole camera pans maintained full resolution, but individually moving elements could be reduced to only a quarter of the full frame resolution. Because the mix between motion and non-motion was encoded on a pixel-by-pixel basis, the loss was less visible than one might expect. Later, NHK developed backwards-compatible MUSE encoding/decoding methods that greatly increased the resolution of moving areas of the image and also increased the chroma resolution during motion. This so-called MUSE-III system was used for broadcasts starting in 1995, and a very few of the last Hi-Vision MUSE LaserDiscs used it ("The River" is one Hi-Vision LD that did).
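A toy illustration of the four-field dot-interlace cycle described above, assuming a simple offset pattern rather than the actual MUSE sampling lattice: every sample position is transmitted exactly once per four-field cycle, so a still picture is fully rebuilt after four fields, while decoding that cannot reuse earlier fields (as in moving areas) sees only a quarter of the samples.

```python
import numpy as np

# Assign each sample position of a small block to one of four fields (toy pattern,
# not the real MUSE lattice): fields 0/1 cover even rows, fields 2/3 odd rows,
# with a horizontal offset that alternates every other row pair.
H, W = 4, 8
field_of_sample = np.empty((H, W), dtype=int)
for y in range(H):
    for x in range(W):
        field_of_sample[y, x] = 2 * (y % 2) + (x + y // 2) % 2

print(field_of_sample)   # which of fields 0..3 carries each sample position
for f in range(4):
    print(f"field {f} carries {np.mean(field_of_sample == f):.0%} of the samples")
```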
MUSE's "1125 lines" are an analog measurement, which includes non-video "scan lines" during which a CRT's electron beam returns to the top of the screen to begin scanning the next field. Only 1035 lines have picture information. Digital signals count only the lines (rows of pixels) that have actual detail, so NTSC's 525 lines become 486i (rounded to 480 to be MPEG compatible), PAL's 625 lines become 576i, and MUSE would be 1035i. To convert the bandwidth of Hi-Vision MUSE into 'conventional' lines-of-horizontal resolution (as is used in the NTSC world), multiply 29.9 lines per MHz of bandwidth. (NTSC and PAL/SECAM are 79.9 lines per MHz) - this calculation of 29.9 lines works for all current HD systems including Blu-ray and HD-DVD. So, for MUSE, during a still picture, the lines of resolution would be: 598-lines of luminance resolution per-picture-height. The chroma resolution is: 209-lines. The horizontal luminance measurement approximately matches the vertical resolution of a 1080 interlaced image when the Kell factor and interlace factor are taken into account.
Shadows and multipath still plague this analog frequency modulated transmission mode.
Japan has since switched to a digital HDTV system based on ISDB, but the original MUSE-based BS Satellite channel 9 (NHK BS Hi-vision) was broadcast until September 30, 2007.
Cultural and geopolitical impacts
Internal reasons inside Japan that led to the creation of Hi-Vision
- (1940s): The NTSC standard (as a 525 line monochrome system) was imposed by the US occupation forces.
- (1950s-1960s): Unlike Canada (which could have switched to PAL), Japan was stuck with the US TV transmission standard regardless of circumstances.
- (1960s-1970s): By the late 1960s many parts of the modern Japanese electronics industry had gotten their start by fixing the transmission and storage problems inherent in NTSC's design.
- (1970s-1980s): By the 1980s there was spare engineering talent available in Japan that could design a better television system.
MUSE, as the US public came to know it, was initially covered by the magazine Popular Science in the mid-1980s. The US television networks did not give MUSE much coverage until the late 1980s, as there were very few public demonstrations of the system outside Japan.
Because Japan had its own domestic frequency allocation tables (which were more open to the deployment of MUSE), it became possible for this television system to be transmitted by Ku-band satellite technology by the end of the 1980s.
In the late 1980s the US FCC began to issue directives that would allow MUSE to be tested in the US, provided it could be fitted into a 6 MHz System M channel.
The Europeans, in the form of the European Broadcasting Union (EBU), were impressed with MUSE but could never adopt it, because it is a 60 Hz system rather than the 50 Hz standard commonplace throughout the rest of the Old World.
The EBU's development and deployment of B-MAC, D-MAC and, much later, HD-MAC were made possible by Hi-Vision's technical success. In many ways the MAC transmission systems are better than MUSE because of the total separation of colour from brightness in the time domain within the MAC signal structure.
Like Hi-Vision, HD-MAC could not be transmitted in 8 MHz channels without substantial modification and a severe loss of quality and frame rate. A 6 MHz version of Hi-Vision was experimented with in the US, but it too had severe quality problems, so the FCC never fully sanctioned its use as a domestic terrestrial television transmission standard.
The kind of US standards effort that had led to the creation of NTSC in the 1950s was reactivated in the early 1990s, in the form of the ATSC working group, because of Hi-Vision's success. Many aspects of the DVB standard are based on work done by the ATSC working group; however, most of the impact is in the support for 60 Hz (as well as 24 Hz for film transmission), uniform sampling rates and interoperable screen sizes.
Device support for Hi-Vision
Hi-Vision Laserdiscs
There were a few MUSE laserdisc players available in Japan (Panasonic LX-HD10/20 and Sony HIL-C2EX). These could play Hi-Vision as well as standard NTSC laserdiscs. Hi-Vision laserdiscs are extremely rare and expensive.
The HDL-5800 Video Disc Recorder recorded both high-definition still images and continuous video onto an optical disc and was part of Sony's early analog wideband HDVS high-definition video system. It could record HD still images and video onto either the WHD-3AL0 or the WHD-33A0 optical disc: the WHD-3AL0 in CLV mode (up to 10 minutes of video or 18,000 still frames per side) and the WHD-33A0 in CAV mode (up to 3 minutes of video or 5,400 still frames per side).
The HDL-2000 was a full band high definition video disc player.
Video cassettes
W-VHS allowed home recording of Hi-Vision programmes.
See also
- Analog high-definition television system
- HD-MAC, a planned high-definition analog video standard in Europe
- NTSC, PAL and SECAM, the analog TV systems these formats were meant to replace
Related standards
- NICAM-like audio coding is used in the HD-MAC system.
- Chroma subsampling in TV, indicated as 4:2:2, 4:1:1, etc.
- ISDB
References
- [1] 石田順一・二宮佑一 (1982) 「3-1信号方式(3.信号方式と伝送)」, 『テレビジョン学会誌』 36(10), 882-888, 1982-10-20
- [2] 藤尾孝 (1980) 「高品位テレビジョン方式 : 規格, 信号方式と放送方式」, 『テレビジョン学会技術報告』 4(28), 19-24, 1980-11
- [3] 藤尾孝 (1981) 「高品位テレビジョン」, 『テレビジョン学会誌』 35(12), 1016-1023, 1981-12-20
- [4] 河本太郎 and others (1979) 「IT40-11 BSによる高品位テレビのYC分離伝送実験」, 『テレビジョン学会技術報告』 3(26), 61-66, 1979-11
- [5] 藤尾孝 (1984) 「ICS66-5 高品位テレビジョンシステム」, 『テレビジョン学会技術報告』 8(1), 33-39, 1984-04
- [6] 藤尾孝 (2006) 「ハイビジョン(HDTV)が世に出るまで:電子画像メディアの感性化,コインの表と裏(<小特集>電子情報通信むかしばなし)」, 『電子情報通信学会誌』 89(8), 728-734, 2006-08-01
- [7] http://ieeexplore.ieee.org/iel1/30/2796/00085585.pdf?arnumber=85585