Contents

 

 

10.0 Video Systems

 

10.1 Broadcast Video

10.1.1 NTSC

10.1.2 PAL

10.1.3 SECAM

 

10.2 Videotext

10.2.1 Broadcast Videotext

10.2.2 Wired Videotext

10.2.3 Graphics

 

10.3 Two Way CATV Systems

 

10.4 Videoconferencing

10.4.1 H.261 or Px64

 

Assignment Questions

 

For Further Research

 

 

 

 

 


10.0 Video Systems

 

Objectives

This section will:

Examine how television can be used to support broadcast data services

Discuss two way CATV systems

Review videoconferencing and the H.261 standard

 

Video systems can be segregated into two broad categories: broadcast and interactive video. The first group is composed of the terrestrial and satellite TV industry, where the user's participation is limited to viewing. Other video systems, however, support user interaction.

Videotex allows many subscribers to retrieve information from remote databases, and display the information on a TV screen. This service can be either wired or broadcast.

http://www.dtg.org.uk/

 

10.1 Broadcast Video

There are three analog video broadcast formats in use today:

NTSC  [National Television System Committee]

PAL  [Phase Alternation Line]

SECAM  [Séquentiel Couleur Avec Mémoire]

http://www.epicmultimedia.com.au/eformats.cfm

http://www.vidpro.org/standards.htm

 

Each of these standards has been implemented in a number of ways.

The world's first commercial color broadcasting system went into service in the U.S. in 1954, and was based on the NTSC standard. This system was fully compatible with the existing black & white transmission facilities and receivers. In subsequent years, the European community also developed color broadcasting systems, namely PAL in Germany and SECAM in France.

All of the systems derive the luminance signal from the same weighted sum of the primaries:

    Y = 0.30 R' + 0.59 G' + 0.11 B'

The primed terms denote the gamma corrected values.
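As a quick illustration of this weighting (a minimal sketch; the function and test values are illustrative, and the inputs are assumed to be gamma corrected primaries in the range 0 to 1):

    # Luminance from gamma-corrected primaries, per Y = 0.30 R' + 0.59 G' + 0.11 B'
    def luminance(r, g, b):
        """Return the luminance Y for gamma-corrected R', G', B' values in [0, 1]."""
        return 0.30 * r + 0.59 * g + 0.11 * b

    print(luminance(1.0, 1.0, 1.0))   # white -> 1.0
    print(luminance(0.0, 1.0, 0.0))   # green -> 0.59 (the eye is most sensitive to green)
    print(luminance(0.0, 0.0, 1.0))   # blue  -> 0.11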

World Television Formats[1]

Parameter \ System Code        | M (N)         | B          | C        | G (H)       | I          | D K (K')    | L
Lines per picture              | 525 (625)     | 625        | 625      | 625         | 625        | 625         | 625
Field Frequency [Hz]           | 60 (50)       | 50         | 50       | 50          | 50         | 50          | 50
Line Frequency [Hz]            | 15734 (15625) | 15625      | 15625    | 15625       | 15625      | 15625       | 15625
Video Bandwidth [MHz]          | 4.2           | 5          | 5        | 5           | 5.5        | 6           | 6
Channel Bandwidth [MHz]        | 6             | 7          | 7        | 8           | 8          | 8           | 8
Audio above Video [MHz]        | 4.5           | 5.5        | 5.5      | 5.5         | 6          | 6.5         | 6.5
Vestigial Sideband Width [MHz] | 0.75          | 0.75       | 0.75     | 0.75 (1.25) | 1.25       | 0.75 (1.25) | 1.25
Video Modulation Polarity      | negative      | negative   | positive | negative    | negative   | negative    | positive
Audio Modulation               | FM ±25 KHz    | FM ±50 KHz | AM       | FM ±50 KHz  | FM ±50 KHz | FM ±50 KHz  | AM
FM Preemphasis [µs]            | 75            | 50         | —        | 50          | 50         | 50          | —

 

 

Bandwidth Comparisons

Broadcasting Systems Comparison

Similarities

• Use the same colorimetry principles

• Similar imaging and display technology

• Wideband luminance and narrowband chrominance

• Backward compatible with older monochrome systems

Differences

• Line and field rates

• Component bandwidths

• Frequency allocations

• Color encoding formats

 

Selected Country List

Country         | Color | Code
Australia       | PAL   | B
Brazil          | PAL   | M
Canada          | NTSC  | M
China           | PAL   | D
France          | SECAM | L
Germany [West]  | PAL   | B G
Germany [East]  | SECAM | B G
Hong Kong       | PAL   | I
Japan           | NTSC  | M
Switzerland     | PAL   | B G
United Kingdom  | PAL   | I
USA             | NTSC  | M
USSR [former]   | SECAM | D K

 

10.1.1 NTSC

Since the chroma sub-carrier is an odd multiple of 1/2 the horizontal sweep rate, the sub-carrier appears to alternate its phase on each scan line. Consequently, any color signal passing through the video amplifier of a B&W TV set appears to visually cancel out.
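A quick arithmetic check, using the NTSC line rate of 15,734.264 Hz and the standard multiple of 455/2:

    f_chroma = (455 / 2) × f_H = 227.5 × 15,734.264 Hz ≈ 3.579545 MHz

Because 227.5 is an odd multiple of 1/2, the chroma energy falls midway between harmonics of the line rate and interleaves with the luminance spectrum.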

NTSC is prone to color errors due to differential gain and differential phase variations. In the early days of television, it was apparently nicknamed Never The Same Color since any drift or change in the chroma synchronization leads to a change in hue.

Broadcast Color

Color-coding is achieved by combining the outputs of three separate color camera tubes into two color difference signals, namely B-Y and R-Y. These color difference signals can be vectorially added to provide a composite signal where the magnitude corresponds to the saturation and the phase to the hue.

Since the eye can resolve finer chrominance detail in orange and cyan hues than in green and magenta, the chrominance signals are advanced by 33° from the U & V axes before being broadcast. These new signals are designated I & Q.

The defining equations are:

    U = 0.493 (B - Y)
    V = 0.877 (R - Y)

    I = V cos 33° - U sin 33°
    Q = V sin 33° + U cos 33°

The chroma reference burst is set to 180° (yellow) instead of 0° (blue) to reduce the cross coupling between the chrominance and luminance signals.

The PAL system uses the U & V signals, while the NTSC system uses the I & Q signals.

Since there is less detail in color than in luminance, the I & Q channels are allocated much less bandwidth than the Y signal. The Q axis represents colors that the eye cannot readily distinguish; therefore, the Q signal is somewhat band limited. The positive Q axis corresponds to magenta or purplish hues, and the negative to green. The eye can more readily discern the colors associated with the I axis, where positive corresponds to orange and negative to cyan.

Q is band limited to ±0.5 MHz

I is band limited to -1 to +0.5 MHz
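The following minimal Python sketch ties these pieces together (the function name is illustrative, the 0.493 and 0.877 weights and the 33° rotation come from the equations above, and the band limiting of I and Q is not modelled):

    import math

    def rgb_to_yiq(r, g, b):
        """Convert gamma-corrected RGB (0-1) to Y, I, Q per the equations above."""
        y = 0.30 * r + 0.59 * g + 0.11 * b       # luminance
        u = 0.493 * (b - y)                      # scaled B-Y
        v = 0.877 * (r - y)                      # scaled R-Y
        a = math.radians(33)                     # 33 degree axis rotation
        i = v * math.cos(a) - u * math.sin(a)
        q = v * math.sin(a) + u * math.cos(a)
        return y, i, q

    # Saturated red lands mostly on the I (orange-cyan) axis: Y = 0.30, I = 0.60, Q = 0.21
    print(rgb_to_yiq(1.0, 0.0, 0.0))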

 

Frequencies

Sound           | 4.5 MHz above the picture carrier
H sync          | 15,734.264 ± 0.044 Hz
V sync          | 59.94 Hz
Chroma carrier  | 3.579545 MHz

Modulation Levels [% of peak carrier]

H sync pulse peak | 100%
Blanking          | 75%
Black             | 70%
White             | 12.5%

 

10.1.2 PAL

http://www.palsite.com/

http://www.dubvideo.com/pal.htm

 

This system was developed in Germany and adopted in most European countries, including Great Britain. PAL broadcasting started in 1967. This technique overcame the very strict phase and amplitude integrity requirements of the NTSC system: the line-by-line alternation of color information on one of the chroma signals results in a visual self-cancellation of transmission irregularities.

The chroma signal is generated in a similar way to the NTSC system, but the phase of one of the chroma signals is reversed on alternate lines, both spatially and temporally.

• The color burst is +135° on odd lines of the 1st & 2nd fields, and even lines of the 3rd & 4th fields

• The color burst is +225° on even lines of the 1st & 2nd fields, and odd lines of the 3rd & 4th fields

• Any phase distortion tends to cause alternate lines to deviate in opposite directions, and is visually canceled out by the eye

• The average phase of the burst is held to 180° ± 2°, thus allowing the system to tolerate a phase differential of 40°

The sub carrier burst is suppressed during the vertical sync pulses. To ensure that all fields start and stop with the burst in the same phase, the blanking is advanced by 1/2 a line for each field for 4 fields, and then returned to its original starting position.

PAL Chrominance

The U and V signals are of equal bandwidth. The V signal alternates phase on alternate lines. The receiver can identify which line has the phase reversal by means of a swinging burst: the phase of this burst is switched ±45° at the line rate. This burst signal is not present during the vertical sync period.

To prevent visual artifacts in the picture, the chroma sub-carrier must have a fixed relationship to the horizontal and vertical sync rates. This relationship is defined as:

    f_sc = (1135/4) × f_H + f_V / 2

where f_H is the line rate and f_V is the field rate.

The swinging chroma sub-carrier repeats itself after 8 fields. Therefore, when a PAL signal is edited or mixed, the sources must be synchronized to this eight-field pattern, or random phase changes, and hence color changes, would occur.
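Evaluating the relationship above with the 625-line rates from the table in section 10.1 (f_H = 15,625 Hz, f_V = 50 Hz):

    f_sc = (1135 / 4) × 15,625 Hz + 50/2 Hz = 4,433,593.75 Hz + 25 Hz = 4,433,618.75 Hz ≈ 4.43 MHz

The quarter-line offset combined with the 25 Hz term is what stretches the repetition pattern out to eight fields.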

10.1.3 SECAM

SECAM, usually a 625-line system, was developed in France and went into service in 1967. It has gone through at least three versions; the one currently in use is known as optimized SECAM, or SECAM III.

SECAM separates the two color difference signals and transmits them on alternate lines. As a result, the television set must contain a one-line memory element so that the RGB signals can be recovered through a linear matrix. The tolerances on this delay line are less critical than in the other systems.

The R-Y and B-Y color difference signals frequency modulate two different sub-carriers.

Color Signal | Sub-carrier Frequency [MHz] | Characteristics
R-Y          | 4.406250                    | odd lines; 282 × the horizontal rate
B-Y          | 4.250000                    | even lines; 272 × the horizontal rate
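A quick check against the 625-line horizontal rate of 15,625 Hz:

    282 × 15,625 Hz = 4,406,250 Hz = 4.406250 MHz
    272 × 15,625 Hz = 4,250,000 Hz = 4.250000 MHz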

 

The vertical color resolution is 1/2 that of the NTSC and PAL systems, but this is not very significant. The chroma signals are frequency modulated, and therefore immune to amplitude variations. However, the ever-present sub-carrier tends to cause interference patterns during B&W transmission.

As in any FM system, pre-emphasis is used to improve the S/N. However, the sub carrier amplitude is also increased if the luminance signal contains components near the chroma carrier.

Two frames or 4 fields are required for the system to complete its chroma swing cycle. There are two different ways the receiver can determine what the chroma phase burst should be at the beginning of a frame. The first approach known as SECAM V, transmits what are known as bottle signals during 9 lines of the vertical blanking period.

A second approach, known as SECAM H, uses the two chroma sub carrier bursts on the horizontal sync pulse to derive the sequence information. The chroma sub-carrier is reversed on every 3rd line and between each field.

SECAM FM Color Modulation[2]

SECAM Chrominance

The principal countries using this system are: Egypt, France, Gabon, Iran, Iraq, Ivory Coast, Lebanon, Morocco, Saudi Arabia, Senegal, Tunisia, the USSR, & Zaire. A 525-line version of SECAM is used in Cuba, Haiti, & French Guinea.

10.2 Videotext

10.2.1 Broadcast Videotext

This facility is a one way information system. A sequencing database continuously broadcasts a collection of pages. The user then simply extracts the page of interest when it comes up. To reduce the delays inherent in this approach, the number of pages is limited to a few hundred.

The easiest way to provide this service is to send the data on the unused lines during the vertical blanking interval. In North America, lines 14-18 and 20 are used. Line 21 is reserved for closed captioning.

The data burst rate must be high enough that a significant part of the page can be transmitted on each line. The spectral content of the data burst must also be compatible with the TV signal. In the NTSC system, the pulse rate is 5.72 Mbps and is applied through a raised cosine filter limited to 4.48 MHz. In the PAL and SECAM systems the pulse burst rate is 6.93 Mbps.
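As a rough capacity estimate (illustrative figures only, ignoring run-in, framing, and addressing overhead): with about 50 µs of usable time per scan line,

    5.72 Mbps × 50 µs ≈ 286 bits ≈ 35 bytes per line

so each VBI line carries a little under one 40-character row of text, and a complete page spans several fields even when all of the allocated lines are used.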

Because of the high burst rate, teletext signals are rendered unintelligible by even short echo delays (< 1.5 µs). The decoder in the set must therefore be able to cancel these echoes. For this reason, the teletext broadcast range is less than the acceptable video range.

10.2.2 Wired Videotext

This is also known as viewdata, and is provided in Canada by Telidon. This system utilizes the PSTN as the interconnect medium. Some potential uses may include:

• Electronic encyclopedia

• Telephone directories

• Time tables

• Catalogues

10.2.3 Graphics

There are three basic ways to encode graphics information:

Alpha mosaic - Characters or graphic elements are positioned in fixed cells on a rectangular grid. This is the most common display method for computer terminals.

Alpha geometric - Text is displayed as in the alpha mosaic format, but graphics elements are sent as command strings to define the shape.

Photographic - The image is transmitted pixel by pixel. This uses an enormous amount of memory, and is generally limited to some small portion of the display area.
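A rough storage comparison illustrates why the photographic mode is so expensive (illustrative figures only: a 40 × 24 character mosaic page at one byte per cell, versus a full-screen image at 352 × 288 pixels and one byte per pixel):

    Alpha mosaic:   40 × 24 × 1 byte ≈ 1 Kbyte per page
    Photographic:   352 × 288 × 1 byte ≈ 100 Kbytes per image

The alpha geometric approach falls in between, since a shape is described by a short command string rather than by its pixels.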

10.3 Two Way CATV Systems

Two way systems are a significant departure from traditional cable TV facilities. One of the most obvious differences is in the distribution amplifiers, which must now be bi-directional repeaters. The design of such amplifiers is eased somewhat when one realizes that the downstream signal requirements are considerably different from those in the upstream direction.

In the downstream path (to the subscriber), there may be up to 52 video channels, and one or two of these may be reserved for signaling to provide:

• Information on request

• Polling

• Equipment adjustment

The much less demanding upstream path consists only of signaling information:

• Requests for service

• Alarms or metering

• Polling responses

A relatively easy way to provide upstream signaling is by means of FDM. Each subscriber is allocated a return channel. There may be up to 500 such analog carrier channels spaced at 20 KHz intervals from 5 to 15 MHz.
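A quick consistency check: 500 channels × 20 KHz spacing = 10 MHz, which exactly fills the 5 to 15 MHz return band.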

An alternate approach is to use a TDM signaling scheme. Each subscriber is allocated a time slot, which may be acquired on either a scheduling or demand basis.

Some CATV systems provide a return video channel. This would allow local programming to be received by the head end (cable station), and redistributed over the network. Normally this is provided by 2 carriers spaced 6 MHz apart, between 18 and 30 MHz.

The distribution system may include satellite links, terrestrial microwave links, fiber optics, and coaxial cable. To take the greatest advantage of these transmission facilities, local video switches may be deployed at hub sites.

10.4 Videoconferencing

Videoconferencing has been used in the business world for quite some time. It not only saves travel time and expense, but a videoconference can be convened much more quickly than a conference where all parties must be in the same room.

Videoconferencing has migrated into the school system. These systems use compression algorithms and a reduced frame rate [15 frames per second] to minimize bandwidth demands. Some example systems include:

System | Location         | Comments
WHETS  | Washington State | Operated by Washington State University, the system has over 30 nodes.
NSBSD  | Alaska           | Provides two way video hookups to 8 isolated communities. The system uses a 23 GHz microwave link to the Alascom satellite system, and 62.5 micron multimode fiber to some schools. It also forms the backbone of a WAN.

 

10.4.1 H.261 or Px64

http://www-mobile.ecs.soton.ac.uk/peter/h261/h261.html

 

A new international standard for video compression, H.261 (more commonly known as Px64), is currently being developed. This video coding format uses a discrete cosine transform based, motion compensated, differential pulse code modulation algorithm, and comes in two basic types:

CIF [common intermediate format], with a resolution of 288 lines by 352 pixels at 30 frames per second

QCIF [quarter CIF], with a resolution of 144 lines by 176 pixels for small screen video applications

CIF and QCIF Blocks

Each block represents 8 equivalent lines x 8 equivalent pels. Blocks 1 - 4 contain luminance information, block 5 contains the blue chrominance signal, and block 6 contains the red chrominance signal.

The total number of blocks in a CIF frame is 6 x 33 x 12 = 2376, and the total number of pels per frame is 2376 x 8 x 8 = 152064.
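The same arithmetic is easy to reproduce, and it also shows why compression is essential (a small illustrative Python script; the 6 blocks per macroblock, 33 macroblocks per group, and 12 groups per CIF frame come from the figures above, 3 groups per QCIF frame from the packet table below, and the raw rate assumes 8 bits per pel at 30 frames per second):

    # Px64 (H.261) frame arithmetic: blocks, pels, and uncompressed bit rate
    BLOCKS_PER_MACROBLOCK = 6      # 4 luminance + 2 chrominance blocks
    MACROBLOCKS_PER_GROUP = 33
    PELS_PER_BLOCK = 8 * 8

    def frame_stats(groups, fps=30, bits_per_pel=8):
        blocks = BLOCKS_PER_MACROBLOCK * MACROBLOCKS_PER_GROUP * groups
        pels = blocks * PELS_PER_BLOCK
        raw_bps = pels * bits_per_pel * fps
        return blocks, pels, raw_bps

    for name, groups in [("CIF", 12), ("QCIF", 3)]:
        blocks, pels, raw_bps = frame_stats(groups)
        print(f"{name}: {blocks} blocks, {pels} pels, raw ~{raw_bps / 1e6:.1f} Mbps")

    # CIF: 2376 blocks, 152064 pels, raw ~36.5 Mbps -- about 19:1 compression is
    # needed just to reach the highest Px64 rate of 1920 Kbps (p = 30).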

This standard is below the quality of a standard NTSC signal, which has a digital equivalent of approximately 480 lines by 500 pixels.

Although the standard appears to have a great deal of flexibility, particularly when it comes to bandwidth and digitizing voice & data channels, there apparently will be no established standards on pre-processing, encoding, or post-processing.

Px64 Packets

Px64 Packet Components

Designation | Function
PSC         | Picture start code
TR          | Temporal reference [time stamp]
T1          | Type 1 [identifies: split screen, document camera, freeze picture release, CIF/QCIF]
PEI1        | Picture extra insertion 1
P           | Parity
PEI2        | Picture extra insertion 2
PS          | Picture spare [not used]
GOB Data    | 3 QCIF or 12 CIF data fields
GBSC        | Group of blocks start code
GN          | Group number
T2          | Type 2 [not used]
Q1          | Quantizer information [indicates which quantizer tables to use]
GS          | GOB spare [not used]
MB Data     | Macro block data
MBA         | Macro block address number [1 - 33]
T3          | Type 3
Q2          | Quantizer information
MVD         | Motion vector data
CBP         | Coded block pattern
Block Data  | DCT [discrete cosine transform] coefficients followed by a block delimiter

 

Px64 Bandwidth Allocation

P  | Kbps | Channels
1  | 64   | B or 56 Kbps
2  | 128  | 2B or 112 Kbps
3  | 192  | 3B
4  | 256  | 4B
5  | 320  | 5B
6  | 384  | 6B or H0 [1/4 T1]
12 | 768  | 2H0 [1/2 T1]
18 | 1152 | 3H0 [3/4 T1]
24 | 1536 | 4H0 or H11 [T1]
30 | 1920 | 5H0 or H12

 

H.320 and H.324 Comparisons

                          | H.320              | H.324
Date                      | 1990               | 1995
Framing & Demultiplexing  | H.221              | H.223
Control                   | H.230              | H.245
Point to Point Call Setup | H.242              | H.245
Multipoint Call Setup     | H.243              | H.243
Video Coding              | H.261              | H.261/H.263
Audio Coding              | G.711/G.722/G.728  | G.723
Data                      | —                  | T.120
Network                   | Above 64 Kbps      | Below 64 Kbps
Network Interface         | I.400              | V.34
Network Implementation    | N-ISDN             | POTS

 

The relevant emerging CCITT standards are:

H.221  | Communications framing
H.230  | Control and indication signals
H.242  | Call set-up and disconnect
H.261  | Video coding
H.263  | Video compression using discrete cosine transforms, motion compensation, variable length coding, and scalar quantization
H.320  | N-ISDN video telephony at rates of 64 Kbps and up; the desktop rate would be 128 Kbps
H.321  | B-ISDN video
H.322  | ISO Ethernet video
H.323  | Ethernet & token ring video
H.324  | PSTN video over POTS using V.34 modems [28.8 Kbps]
G.711  | 64 Kbps PCM audio
G.722  | 48/56/64 Kbps ADPCM audio
G.723  | Audio compression at 5.3 and 6.3 Kbps
T.120  | Data conferencing via a shared whiteboard
AV.254 | 16 Kbps audio
AV.321 | Multipoint

 

Assignment Questions

 

Quick Quiz

1.  The (PAL, SECAM) system switches between the two chroma signals on alternate lines.

2.  The vertical color resolution of the SECAM system is [1/2, 2] times that of the PAL system.

3.  The former USSR uses the [NTSC, PAL, SECAM] video signal format.

4.  Upstream signaling in cable TV networks is done by FDM but not TDM. [True, False]

5.  The quarter CIF definition has a resolution of 288 lines by 352 pixels. [True, False]

6.  Blocks 5 and 6 in the CIF picture encode the red and green chrominance signals. [True, False]

7.  The Px64 standard does not include call setup procedures. [True, False]

Composition Questions

1.  Why are chrominance signals band limited?

2.  Define luminance, hue, and saturation.

3.  Why are the color difference signals [R-Y & B-Y] adjusted to create the U & V signals?

4.  Why is the luminance signal defined as:  Y = 0.3R + 0.59G + 0.11B?

5.  What is the magnitude and phase angle relationship between the primary and complementary colors?

6.  What are the differences between wired and broadcast videotex?

7.  What are the three basic ways to encode graphics information?

8.  What are the applications of videotex?

9.  Suggest some applications for Px64.

For Further Research

 

Godfrey & Chang; The Telidon Book

Le Bee, Jean-Pierre; "Rerouting Plus", Telecommunications, April 1990

Winsbury, Rex; Viewdata in Action

 

Broadcast Television

http://www.nab.org/

http://www.ww-radio.ch/bmc_index.htm

http://www.cemacity.org/mall/product/video/hdtv.htm

http://www.smpte.org/

http://www.videointernational.com/Standards.html

http://www.ee.surrey.ac.uk/Contrib/WorldTV/index.html

 

Teletext

http://www.teletext.co.uk/

http://hpslweb.cern.ch/

http://www.verbatoria.com/antik/teletext.htm

http://www.impress.co.jp/teletext/

http://www.ozemail.com.au/~carbuhn/index.html

http://pdc.ro.nu/teletext.html

 

Videotext

http://elektra.e-technik.uni-ulm.de/~mbuck/vtx/links.html

 



[1]     Video Techniques, Gordon White

[2]     Based on fig. 21-23 Television Engineering Handbook, K Blair Benson ed

     WHETS: Washington Higher Education Telecommunications System

     NSBSD: North Slope Borough School District