The Players
The Vendor Perspective
The Researchers
Bell-Northern Research
The Regulators
Standards Organizations
Selected Standards Organizations
International Standards Organizations:
National Standards Organizations:
The Problem of Standardization
ISDN Standardization Bodies
Standards are essential to networking: without them, there would be massive incompatibility problems. They form the basis of products and drive markets.
The global networking industry is worth billions of dollars; revenues for networking hardware and software in 1996 were $89 billion. Consequently, vendors that can shape standards to their technologies are the big winners.
“Ninety percent of the people [involved in the IETF] used to come from research and education. Only 10 percent were vendors. Now the reverse is true.” — Scott Bradner, a senior technical consultant at Harvard University and a director at the IETF
It’s possible for the big players to hijack the standardization process and kill off projects that give competitors an advantage or benefit users.
“Vendors will vote things down that aren’t in their products’ interest,” — Elizabeth Adams, managing director of the Network Management Forum.
Rival factions have been able to strong-arm a standards body into signing off on two competing specs. And some vendors can push standards through before they’re stable, forcing forklift upgrades for early implementers.
Robert Madge, chairman and chief executive officer of Madge Networks Inc. (San Jose, Calif.), says it’s not uncommon for switch vendors to wait until competitors have cast features in silicon and then intentionally make changes to a standard, forcing them to reforge their chips. “It’s a game vendors play,” he says.
And it’s usually a game for the top guns, the ones with the financial clout to swing decisions in their favor--typically by packing committees with their delegates. Microsoft Corp. now sends 10 times as many representatives to IETF meetings as it did two years ago. Some vendors also appear to be head-hunting key individuals on standards committees, offering them jobs on the basis of their influence.
As a result, some standards bodies have passed new bylaws that attempt to curb undue vendor influence, but few believe these efforts will do much good.
“These days standards have more to do with how much the vendors are willing to spend on participating than with what is right or technically workable. I don’t have a high opinion of standards bodies.” — David Kaufman, president of Desktalk Systems Inc. (Torrance, Calif.)
Vendors can sometimes speed up the standardization process. For example, the ATM and Frame Relay Forums created specifications that were simply approved by official organizations like the IETF and IEEE. The ITU-T indicates it now turns out standards five times faster than in the past, thanks to vendor assistance.
Vendor participation helps guarantee that standards are based on commercial demand rather than elegant engineering.
“Having more vendor involvement is a boon. It helps ensure that standards closely reflect the real-world requirements of the people that use them.” — Bradner
In the past, standards were largely the work of government-sanctioned organizations. And although vendors have always been involved in the standards process, their influence is now predominant. Consequently, there are two types of organizations working on standards: the official ones and the vendor-driven groups.
In the U.S., a standards body is official if it’s endorsed by ISO. Overseas, government endorsement sets the official groups apart from vendor forums. For example, the Commission of the European Communities (CEC) set up ETSI on behalf of European governments.
Everything else is a consortium, and there’s a reason why there are now so many of them.
“Consortia were started to drive market awareness and promote interoperability. The standards organizations don’t promote standards.” On the other hand, “Just about anyone can start a consortium.” — Gary Robinson, director of standards at Sun Microsystems Inc.
Consortia help bring standards to life more quickly. A notable example is 100Base-T. Most of the spec was defined by the Fast Ethernet Alliance, which passed its recommendations along to the IEEE. The full spec was ratified in two-and-a-half years.
In contrast, ANSI has spent over 10 years trying to complete FDDI (beginning in 1984).
“Unlike fast Ethernet, where everything is defined in one document,” says Bob Finke, deputy head of networking and telecommunications at Lawrence Berkeley National Laboratory (Berkeley, Calif.), FDDI was split into many documents--four of which are compulsory. ANSI published MAC (media access control) in 1987; PHY (physical layer protocol) in 1988; and PMD (physical media dependent) in 1989. SMT (station management) wasn’t published until 1994.
“Because SMT is a mandatory part of the standard, the delay almost killed FDDI,” says Karl Shimada, vice president of Rising Star Research (Lakewood, Colo.).
Vendors are in close contact with real-world products. This reduces the risk of creating specs that are too ambitious, complicated, or expensive. The OSI standards from ISO are a classic example of this problem.
ISO “started with a blank sheet of paper and tried to design something for an ideal world. OSI is very elegant but impractical. There was no commercial drive to get it done.” In contrast, “TCP/IP was invented by people who wanted to communicate. That’s why it succeeded.” — Bill Pechey, chief engineering officer for Europe at Hayes Microcomputer Products Inc.
Unfortunately, consortia reduce much of the debate that goes on at official standards bodies and shift it instead into a vendor-run forum, usually behind closed doors. That reduces users’ chances of having a say. The ATM Forum is a good example: Voting rights are open only to principal members, and that costs $10,000 a year. Users pay $1,500 to join ENR (End-User Network Roundtable). But they can’t vote. In 1995 the forum set up a liaison program that lets end-user representatives attend technical sessions and market awareness meetings and report back to ENR, which is allowed to submit written comments.
“An open standardization process with no closed doors would force vendors to be honest,” — Tom Nolle, president of CIMI Corp.
“It’s absolutely true that we’ve created a strategy to increase our focus on standardization,” “Our customers want interoperability across the board. The way to do that is to engage early and often in the standards process.” — Cornelius Willis, group product manager of Internet platforms at Microsoft.
Willis indicates that there also is a PR element to Microsoft’s involvement. “We will always be the Antichrist to some people, but standards will help us convince others that we’re just another corporation.”
Cisco has good reason to be concerned with what happens at IETF meetings since the group is in charge of IP routing specs. Cisco’s market leadership in large part is thanks to its proprietary routing technology--IGRP (Interior Gateway Routing Protocol). But times have changed, and Cisco acknowledges that it can’t continue to dominate with a single-vendor technology. “It’s a different play now. Standards are now part of Cisco’s market development strategy,” says Don Listwin, senior vice president of market development.
What’s more, IGRP has a new challenger--IP switching from Ipsilon Networks Inc. The new scheme could supplant IGRP on some parts of the network, which gives Cisco a vested interest in shaping IGRP’s successor. The solution that it’s championing is tag switching, which it has submitted to the IETF as a proposed spec.
Cisco’s tag-switching strategy is laid out clearly in an internal e-mail sent February 3 by Tom Downey, director of product marketing for the core business unit, to senior staff involved in work at the IETF. One of Cisco’s stated “Goals/Objectives” is to “... encourage customers who value standards to wait on purchasing this technology until the standards are defined (and hence not buy IP switching today).”
It’s a tactic that some industry observers think could work very well. “By promoting tag switching Cisco is very much a threat to Ipsilon,” says Kevin Fong, general partner with Mayfield Fund (Menlo Park, Calif.), a venture capital firm. “It’s a marketing tactic. Cisco is creating controversy and there’s now genuine confusion. It’s very deliberate.”
Like Microsoft, Cisco also is very concerned about its public image. Another stated goal is to “Be perceived as making forward momentum in the IETF tag standards work, to counter claims that Cisco wants the standards to go slow.”
But sending more reps doesn’t always guarantee that a vendor can influence a standards body in its favor. For example, the IETF requires “general consensus” to approve any standard. Consensus has been reached when no objections are raised. Those who oppose a standard must justify their objections with sound technical reasons. In the event of a standoff the chair of the working group can designate a specific direction to follow--for instance, choosing between two technical points that have stymied the effort.
Not all organizations work the same way. The IEEE, for example, allows those who have attended three consecutive plenary or interim meetings to vote in working groups--whether or not they’re IEEE members. And there’s no limit to the number of delegates vendors can bus in.
Many in the industry believe that Hewlett-Packard Inc. (HP, Palo Alto, Calif.) took advantage of the IEEE’s open door policy to push its 100VG-AnyLAN technology onto the standards books--resulting in two 100-Mbit/s Ethernet specs.
As mentioned, the IEEE 802.3 working group started on 100-Mbit/s Ethernet in 1992. It had two proposals to consider: the one backed by HP and AT&T Microelectronics is now known as 100VG; the other from Sun, Synoptics (now Bay Networks Inc. [Santa Clara, Calif.]), and Grand Junction (now part of Cisco) is 100Base-T.
The IEEE regularly has to choose between competing proposals. In fact, that’s one of the keys to the entire standards process: Technologies are debated and decided on by ballot. The IEEE requires proposals to win 75 percent of the vote before they can move on to the next stage in the standards process.
Most observers agree that 100Base-T had far more industry support than 100VG. Despite that, when the 802.3 working group called for a vote, it ended in a split decision: 70 percent favored 100Base-T; 30 percent, 100VG.
“HP stuffed the IEEE 802 committee with its people,” — Doug Spreng, executive vice president of interface products at 3Com Corp.
“We’re not going to send 50 or 80 people to a meeting to pack it,” — Patricia Thaler, principal engineer for LAN architectures and standards at Hewlett-Packard Co.
Either way, the IEEE decided to compromise: Both standards were put forward for ratification--and both were passed.
It turned out that two nearly identical standards were more than the market could bear. 100VG has sold respectably in Europe but made only marginal inroads in North America. HP has since abandoned its plans for a higher-speed version, leaving current customers without a clear migration strategy.
“It would have been a lot easier if the IEEE had just come out with one standard in the first place,” says Marcos Castallanos, network manager with XL Group Inc. (Miami, Fla.), a flower importer. Castallanos shelled out for a 100VG hub but decided that 100Base-T was a better bet as prices started coming down.
Even some HP employees think one standard makes more sense. “It’s unfortunate that we wound up with two 100-Mbit/s specs,” admits Gary McNally, general manager for Roseville Networks.
Some vendors aren’t simply sending more delegates to meetings; they also appear to be trying to hire folks who have the most pull with standards bodies. Steve Waldbusser, one of the authors of RMON MIB (remote monitoring management information base) and SNMP version 2, says that his role in net management at the IETF played a big part in moving from academia to the post of chief network architect at International Network Services (INS, Mountain View, Calif.). Waldbusser is the former manager of network development at Carnegie-Mellon University (Pittsburgh).
“Clearly this was a factor in INS hiring me,” “The time I put into writing MIBs and participating on committees keeps INS in the forefront. It means we’ll be leaders in standards I help write.” — Waldbusser
Fred Baker, a senior software engineer at Cisco and the chair of the IETF’s IESG (Internet Engineering Steering Group)--which oversees the entire task force--has been approached by Microsoft headhunters. Did the job offer have anything to do with Baker’s position at the IETF? “Sure,” he says. “I’m a recognized person in the industry.”
Microsoft denies head-hunting on the basis of influence. “We’re aggressive recruiters,” says Willis. “We hire people because they’re smart. It would be ludicrous to hire someone because he or she is part of a pro bono standards effort.”
Vendors have two related reasons for investing in standards. First, standards create a market. Second, vendors that can standardize their own technologies can get a jump on the competition.
Standards also can be crucial for startups looking to win the backing of venture capitalists.
“It’s what the market wants,” says Terry Glarner, a consultant with Norwest Venture Capital (Minneapolis). “Apple is struggling in part because it wasn’t involved in standards.” He adds that being part of the standards process can be the deciding factor between two otherwise identical startups.
But vendors have to spend money to make it. And when it comes to standards, it takes a lot of cold cash. Anil Singhal, chief executive officer of Frontier Software Development Inc. (Chelmsford, Mass.), says he spent $7,500 attending five standards meetings in 1996 (all in the U.S.). “That doesn’t include my time,” he adds.
Some committees are far more costly: They meet more frequently and overseas. Last year the ATM Forum met six times: in Los Angeles; Anchorage, Alaska; Orlando, Fla.; Baltimore; Montreux, Switzerland; and Vancouver, Canada.
The IEEE also is pricey. “A typical meeting costs around $2,000 to attend, including the hotel and flight,” says Jim Carlo, chair of IEEE 802. “But with lost work time it’s closer to $5,000 per meeting.”
End-user companies face the same sorts of issues; they need to factor in the cost of lost productivity while their highly paid IS professionals are away at meetings. In many cases this far outweighs the more visible expenses of bed, board, and Boeing 767. That helps explain why end-users are underrepresented at standards meetings. On average, end-user corporations make up less than 5 percent of the membership of most standards groups.
“I’d rather spend more on staff or tools than pay to attend standards [efforts],” says Nitin Naik, director for support services with NASA Classroom of the Future (Wheeling, W. Va.), a government program that demonstrates how leading-edge technology can be used in schools.
It’s the same story when it comes to overseas users. “The process is still too labor-intensive to permit even major users to participate,” says Nick White, deputy chairman of Intug (International Telecommunications Users’ Group) and global network manager for Unilever PLC (London).
Corporate networkers who are interested in shaping standards sometimes run into an attitude problem--at their own companies. “There’s this feeling that you’re off on a boondoggle,” says Mike Erlinger, professor at Harvey Mudd College (Claremont, Calif.) and a consultant for The Aerospace Corp. (El Segundo, Calif.). He recalls an IEEE meeting planned for Cancun, Mexico. It turned out that hotel and airfare were cheaper than in the U.S., but the meeting was moved to a U.S. location because so many companies found it hard to believe their engineers would get any work done in Mexico.
When end-users don’t have enough say, it removes a potential safeguard against vendor excesses.
“It’s good when you have at least one major user [participating in a standard]. It’s a sanity check. I can read the vendors the riot act and tell them if this is a mistake. If I can’t see a benefit to users, then I know it shouldn’t be in the standard.” — Lawrence Berkeley’s Finke, a member of the IEEE 802.3z gigabit Ethernet task force
A good example of what can happen when there’s not enough end-user involvement can be seen in the history of the ill-fated MIC (Management Integration Consortium).
The consortium was founded in May 1994 with a simple goal: Get net management software vendors together to agree on a standard way to define data across their applications. If it had succeeded, the benefits would have been huge. Console vendors would have had a way to promote their wares as true platforms for third-party applications; application vendors would have been able to easily port their products across multiple consoles, opening the market to a wider range of applications; and users, above all, would have gotten more control over their networks, enabling them to further their own business interests.
But to achieve this aim, the leading net management platform vendors would have had to expose the underpinnings of their operating systems to their chief competitors. “This would make it tough for platform vendors to maintain proprietary solutions designed to promote their computing environments,” says Jill Huntington-Lee, vice president of marketing communications at Micromuse (New York).
In January 1995, as the final draft of the MIC spec was being prepared, a bomb fell: Executives at HP, IBM, Sun, and Digital Equipment Corp. (DEC, Maynard, Mass.) published a joint letter of resignation from MIC. Without their cooperation, the standard quickly went belly up.
“An HP employee told me that HP joined the MIC in order to destroy it,” — an industry source who requested anonymity.
Sun and IBM also were considered villains by some observers. “I remember a guy from IBM after we had one of the early MIC meetings,” recalls Desktalk’s Kaufman. “Afterward he said, ‘What the MIC is doing is trying to define something that’s different from our infrastructure in Netview. So why are we going to support it? We’d have to rewrite our product.’“
The users were the real losers. “It was a tragedy,” says Michael Emanuel, who at the time was vice president of marketing at Network Managers (UK) Ltd. (Guildford, Surrey, U.K.), a MIC member. He now works as a product manager at Microsoft.
More end-user involvement also might have averted a forklift upgrade for early implementers of ATM (who thought they were following the rules). The ATM Forum originally came up with three service classes: CBR (constant bit rate), UBR (unspecified bit rate), and VBR (variable bit rate). Vendors implemented these in hardware and some net managers deployed the products.
Subsequently, the ATM Forum released a fourth service class: ABR (available bit rate). It got a mixed reception--for good reason. On the one hand the new spec made more efficient use of bandwidth. On the other, it forced users of first-generation switches to scrap their gear and buy new boxes.
One of the reasons that ABR was not offered along with the other three services was a lengthy battle between backers of two opposing flow control schemes: rate-based and credit-based. Ultimately, the rate-based camp won out, but not without bitter infighting.
Some believe the ATM Forum’s problems in trying to reach a consensus related more to vendors’ product plans and development than technical debate. “It was a brutal war because some vendors already had switch implementations that would be hard to change,” says Mike Goguen, then chair of the PNNI (private network-to-network interface) working group who now works for Sequoia Capital (Menlo Park, Calif.), a venture capital firm. “They poured considerable dollars into development and wanted the spec to go their way.”
Flow control isn’t the only ATM spec that has created a problem. When the forum ratified UNI (user network interface) 3.1, it wasn’t backward-compatible with UNI 3.0 (UNI is a protocol that defines signaling between ATM end-stations and switches). Thus, equipment implementing the different specs couldn’t communicate.
Vendors had to scramble to release UNI 3.1 software, and corporate networkers were forced to upgrade their networks all at once (rather than roll out UNI 3.1 in stages). Needless to say, this didn’t win the ATM Forum any friends among vendors or net managers.
George Dobrowski, chair of the ATM Forum’s worldwide technical committee, acknowledges the UNI interoperability problem. “With hindsight we know this was wrong.”
Both users and vendors have openly criticized the ATM Forum for delivering “unstable specs.”
“The ATM Forum forced users to buy into half-baked technology before it had any business value,” says CIMI’s Nolle. As a result, the market for ATM froze: Users stopped buying products; vendors stopped implementing the latest specs--ironically enough, these are precisely the sorts of things the ATM Forum had been set up to prevent.
The Forum seems to have learned its lesson. This past June it signed off on the Anchorage Accord, which lays out a foundation for ATM that ensures all future additions will be backward-compatible.
The ATM Forum is not the only standards body looking to put its house in order. The Network Management Forum has made changes to ensure that there is a balance between users and vendors on its board of trustees.
Since the 100VG debacle, the IEEE has changed its rules to allow the chair to take a “one-vote-per-company” ballot. But Jeff Thompson, current chair of the 802.3 working group, indicates “the end result is very rarely different.” It’s also now easier to dismiss a chair with a no-confidence vote.
And the IETF recently revised its rules to prevent vendors from derailing standards either by refusing to license patented technologies or by charging too much for licenses. It made the change in the wake of an incident a few years back involving Motorola Inc. (Schaumburg, Ill.).
The IETF was trying to define a way to standardize compression over PPP (point-to-point protocol) connections. When the final draft was being readied, Motorola announced it had a patent on the technology and declared it was not interested in licensing it, according to the IETF’s Baker.
Motorola denies this. “The standard was developed by a group in which Motorola was not participating. When we learned about it, we told the IETF that we had a patent and would license on reasonable terms,” says Ed Roney, vice president and director of standards and technology transfer.
But Baker is adamant. “Motorola was using the process of the IETF for proprietary advantage.” It took two years to sift through the various issues involved in this dispute and set a new IETF policy for handling intellectual property, which is spelled out in RFC 2026, Section 10.
Until this incident, the IETF had relied on a “tear-off” contract that Baker acknowledges was too elementary to cover the scope of intellectual property agreements for large companies. “We didn’t document the process well enough in the past,” he says.
And intellectual property rights still dog the standards process. A spec for 56-kbit/s modems is stalled in committee at the ITU-T.
“Intellectual property, not technology, may become the deciding factor in this situation. Neither side has been forthcoming with their technologies.” — Ken Kretchmer, TR30 committee of the TIA
Even with all the changes at various standards bodies, there’s no doubt that vendors still have an almost total lock on the standards process worldwide. It remains to be seen whether their influence will ultimately be a force for good or for bad. On the one hand, net managers should reap the benefit of speedily developed, commercially viable specs. On the other, their interests may sometimes end up taking second place to those of the vendors.
Bellcore is the research and development arm of the RBOCs. It is also a key player in the establishment of international standards. For example, it is the publisher of the set of about 50 technical references (TRs) collectively called ‘National ISDN-One’. These standards include ones produced by other organizations, such as the 2B1Q protocol defined by ANSI.
Bell-Northern Research is owned jointly by Bell Canada and Nortel [formerly Northern Telecom].
There are more than 70 carriers in Canada. Ownership ranges from the private sector to cities, provinces and even the federal government.
Standards agencies are composed of representatives from various special interest groups, including manufacturers, operating companies, research labs, government agencies, and others. Consequently, they are inherently quite political, and consensus is frequently difficult to achieve. Standards bodies are often looked at with disdain by those who must implement the results of their deliberations.
In spite of these difficulties, standards organizations are essential to the well-being of modern society and are taking on greater relevance in today’s world. Major corporations that once could create de facto standards with impunity are now having a very difficult time doing so.
Organization: AIW (APPN Implementors Workshop)
Focus: APPN and SNA
Type: Vendor consortium
Membership: 45 vendors and consultants
User Input: none
Organization: ANSI (American National Standards Institute)
Focus: LANs and WANs
Type: Standards body
Membership: 1,400 companies, organizations, government agencies, and institutions
User Input: No formal channel
Organization: ATM Forum
Focus: ATM
Type: Vendor consortium
Membership: 880 members (280 primary and 600 auditing), including vendors, carriers, and consultants; 155 user companies in End-User Network Roundtable (ENR)
User Input: End-User Network Roundtable established 1993
Organization: DMTF (Desktop Management Task Force)
Focus: PC management
Type: Vendor consortium
Membership: 100 companies, nearly all vendors
User Input: No official procedure
Organization: ETSI (European Telecommunications Standards Institute)
Type: Standards body
Membership: 410 full member companies (includes 27 user outfits), 25 associate members, and 84 observers
User Input: Encourages user input by offering membership discounts and operating a special forum
Organization: Frame Relay Forum
Focus: Frame relay
Type: Vendor consortium
Membership: 300 members (equipment vendors and carriers)
User Input: No formal channel
Organization: GEA (Gigabit Ethernet Alliance)
Focus: Gigabit Ethernet
Type: Vendor consortium
Membership: 85 members (equipment vendors and consultants)
User Input: No formal channel
Organization: IEEE (Institute of Electrical and Electronic Engineers)
Type: Standards body
Membership: 320,000 individual members from 147 countries
User Input: No formal channel
Organization: IETF (Internet Engineering Task Force)
Focus: Internet and related technologies
Type: Standards body
Membership: Will not disclose membership; more than 2,000 people attended December meeting; vendors now heavily represented
Meetings: U.S. and Canada
User Input: No formal channel
Organization: ISO (International Organization for Standardization)
Focus: Information technology
Type: Standards body
Membership: National standards institutes from 118 countries
User Input: Indirect user input via national standards institutes
Organization: ITU (International Telecommunication Union)
Type: Standards body
Membership: 187 governments and 400 other members (mostly PTTs, carriers, VAN operators, and vendors)
User Input: Some countries invite user comments when formulating positions
Organization: Network Management Forum
Focus: Net management
Type: Vendor consortium
Membership: 200 members; roughly 80% are telcos or their suppliers
User Input: Via User Advisory Council
Organization: OMG (Object Management Group)
Focus: Object-oriented software
Type: Vendor consortium
Membership: 700 members, nearly all vendors
User Input: Via End-User Special Interest Group (20 to 50 members)
Organization: The Open Group
Focus: Open systems
Type: Vendor consortium
Membership: 450 member companies, including 90 vendors
Meetings: Quarterly meetings: 3 in U.S., 1 international
User Input: Via Customer Council (150 representatives)
Organization: OURS (Open User Recommended Solutions)
Focus: Information technology
Type: Vendor consortium
Membership: 40 members, roughly 50% vendors
User Input: Group itself is intended as a user liaison with vendors
Organization: SMDS Interest Group
URL: http://smds-ig.org; ftp.casc.com
Type: Vendor consortium
Membership: 20 service providers, 8 user companies
User Input: Via SMDS User Group
Organization: W3C (World Wide Web Consortium)
Focus: World Wide Web
Membership: About 160 members, including 135 vendors
User Input: No official procedure
ISO’s information technology standards are created in conjunction with the IEC (International Electrotechnical Commission).
APPN = Advanced Peer-to-Peer Networking
SMDS = Switched Multimegabit Data Service
• ITU - International Telecommunications Union. This is an agency of the UN. Until March 1, 1993, it consisted of four groups: the General Secretariat, the IFRB, the CCIR, and the CCITT. It was then reorganized into three sectors: Radio Communications [formerly the IFRB and CCIR], Telecommunications Standardization [formerly the CCITT, now designated ITU-T], and Telecommunications Development [new].
• IFRB - International Frequency Registration Board. This administered international radio frequency assignments and organized the WARC (World Administrative Radio Conference) meetings. The two most recent conferences were in 1987 and 1992. The IFRB is now part of the Radio Communications Sector of the ITU.
• CCIR - Consultative Committee on International Radio. This agency consists of a number of groups responsible for wireless standards, including the technical aspects of spectrum usage and interworking. One of its more significant working groups is IWP8, which has been developing FPLMTS (Future Public Land Mobile Telecommunications Systems). The CCIR is now part of the Radio Communications Sector of the ITU.
• CCITT - Comité Consultatif International Télégraphique et Téléphonique. This agency consists of a number of study groups that make recommendations for wired telecommunications networks. Its scope includes signaling protocols, registration procedures, and numbering plans. The CCITT is now part of the Telecommunications Standardization Sector of the ITU [ITU-T].
• IEC - International Electrotechnical Commission
• ISO - International Organization for Standardization. This is a voluntary, non-treaty organization comprised of representatives from other standards bodies. It is responsible for the seven-layer OSI model and works closely with the CCITT.
The WARC conferences bring together various agencies from throughout the world to try to resolve common issues in radio administration. The conferences are held whenever they are required and may last several months. Each region can be expected to have a slightly different perspective; these differences arise because of population distribution and density, historical spectrum deployment, and commercial interests.
At the moment there appears to be a slight difference of opinion over the use of the existing spectrum in the 2-GHz band. The U.S. prefers to use it for LEO mobile satellite services, while the Europeans would prefer to use it for FPLMTS.
Multi-National Standards Organizations:
• CEPT - European Conference of Postal and Telecommunications Administrations
• ECMA - European Computer Manufacturers Association. This is a small trade organization that contributes to ISO and issues its own standards.
• RACE - R&D in Advanced Communications for Europe. This agency is composed of 25 organizations that are attempting to integrate UMTS (Universal Mobile Telecommunications System), PSTN, and broadband systems.
• ANSI - American National Standards Institute. Consists of manufacturers, carriers, and users of telecommunications equipment. It is the U.S. voice in ISO.
• IEEE - Institute of Electrical and Electronic Engineers. A professional society which contributes to ANSI and issues its own standards, such as IEEE 802.6.
• EIA - Electronic Industries Association. A manufacturing trade organization which contributes to ANSI and issues RS-232-C.
• NBS - National Bureau of Standards. This is a government agency which issues standards for equipment sold to the US government.
The telecommunications needs of various parts of the globe can vary dramatically. This is often due to the distribution of population and the economic well-being of the country. However, the direction that telecommunications takes also depends upon national pride and corporate advantage. Consequently, it may take years for suitable standards to emerge, and when they do, they’re often so vague that incompatibilities and inconsistencies are inevitable.
The problems created by a lack of cooperation are well illustrated by the painfully slow deployment of ISDN.
• Study Group VII - defining user-to-network requirements for public data networks
• Study Group XI - responsible for Signaling System #7
• Study Group XVIII - responsible for ISDN
• Technical Committee 97, Subcommittee 6
• Technical Committee 32
• ETSI [European Telecommunications Standards Institute]
• ECSA [Exchange Carriers Standards Association]
• Bellcore, the research group of the RBOCs [Regional Bell Operating Companies]
Each one of these organizations has its own unique economic and technical concerns. Consequently, there is no universally accepted and mutually compatible version of ISDN today. This has been identified as one of the chief obstacles to the deployment of ISDN in the public network.
This has a negative impact on high-volume end users, as can be seen in the following comment:
“The single most important thing that must happen is for Nortel and AT&T, the two largest manufacturers of switching equipment in North America, to come to an agreement on ISDN protocols and put it into production”
To overcome the problem, COS [Corporation for Open Systems International] has announced a major agreement among several of the large equipment vendors, to standardize the signaling protocols between CPE and the network. Members of this group include: AT&T, Apple, Bell Atlantic, Bellcore, Boeing, DEC, GM, IBM, Kodak, Motorola, Nortel, NYNEX, Siemens Stromberg-Carlson, and Southwestern Bell. The plan is to base the standard on Bellcore’s technical specification known as National ISDN-1 [SR-NWT-001937].
The difficulties associated with this approach are already beginning to be recognized by some:
“The problem [with the COS announcement] is that they’re trying to do it outside of the classic standards bodies because they’re too slow. But unless everyone who is relevant is included, all that’s accomplished is the development of multiple standards.”
It is expected that ISDN services afforded by National ISDN-1, may become widely available by late 1992. However, “the regional Bell holding companies have said that it will be another four to five years before ISDN is truly a ubiquitous national service”.
No wonder that some have said “ISDN is not dead, it’s just sleeping”.
This inability to agree seems to affect all sectors of the communications industry. For example, each equipment vendor is taking a slightly different approach to implementing SMDS. The IEEE 802.6 protocol must be used at all external interfaces of SMDS equipment; however, while QPSX Communications and Siemens also use it as their internal switching protocol, AT&T does not. It uses a proprietary one.
Herein lies the source of the major problems limiting the deployment of advanced technology and services, particularly as it impacts on CPE:
• The lack of will to develop sensible standards
• Industrial protectionism sponsored by proprietary protocols
Section II: Role of Standards
Standards serve several critical functions. First, standardization can lower the costs of production through increasing returns to scale and through learning by doing. Second, by ensuring compatibility, standardization increases the number of agents with whom an individual can communicate. Finally, standards are promulgated to ensure that the equipment provided meets certain requirements. Without agreed-on requirements, equipment could be rendered useless almost as soon as it is deployed. (Paetsch, 1993).
Through a variety of mechanisms, one standard may come to dominate the market. There are no guarantees that the dominant standard is the technically superior standard. There are generally two ways by which an inferior standard can come to dominate the market. The first method is the "snowball effect" or "the bandwagon". The more users choose a particular standard, the better it looks in relation to alternative standards. Through random selection by users, one standard will become bigger and more widely used (larger market share). The second method is the market's discomfort with uncertainty. If two competing standards are available but their value is not clearly understood, then an early adoption of one over the other could lead to the choice of an inferior standard out of a simple need to reduce uncertainty. (Information Technology Standards: The Economic Dimension, Publication #25, OECD 1991, pp. 42-43)
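The "snowball effect" the OECD describes can be sketched as a Polya-urn process: each new adopter favors a standard in proportion to its current installed base, so early random choices get amplified into dominance regardless of technical merit. A minimal simulation (the standard names, step count, and seed are illustrative, not from the source):

```python
import random

def bandwagon(steps=10_000, seed=1):
    """Polya-urn sketch of the snowball effect: each new adopter picks a
    standard with probability proportional to its current user base."""
    rng = random.Random(seed)
    users = {"A": 1, "B": 1}          # two competing standards, equal start
    for _ in range(steps):
        total = users["A"] + users["B"]
        pick = "A" if rng.random() < users["A"] / total else "B"
        users[pick] += 1
    return users

shares = bandwagon()
total_users = sum(shares.values())
print({k: round(v / total_users, 2) for k, v in shares.items()})
```

Re-running with different seeds typically yields very different final market shares, which is the point: the winner is determined by early accidents of adoption, not by which standard is superior.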
An effective standard setting process must address certain key issues. Standards should codify established practice and should ensure reasonable quality. Thus a standard is only issued after the requirements have been proven through practical experience including field trials. Standardization is a slow process that must take into account the investment in pre-standard equipment to ensure the stability of the standard. In the rapidly changing fields of technology, many standards experience early obsolescence through this evolutionary process. During this evolutionary period, a proprietary system may emerge and gain dominance thus becoming a "de facto" standard. This may result in a longer term disadvantage to industry and users.
Some experts assert that standards should define the direction of future development and that very limited constraints should be applied sufficiently early to ensure the compatibility of emerging systems. The standard-setting process should allow the user a choice based on cost and quality instead of having to judge between competing systems of different technical design. The danger of this approach is that, without a full-scale field trial, untried specifications may be unsound. While early standardization may discourage further innovation, experimental standards allow confidence to be gained in the specifications that are eventually standardized. ["Trends of Change in Telecommunications Policy," #13, OECD, 1987, pp. 165-66]
In establishing a position with regard to international standards, policy makers must assess several needs. These are outlined below. Pace of Standardization: The main argument against complete standardization is lost variety. Standards can be kept alive in the laboratories, but not in the marketplace. By keeping several standards alive in the R&D arena, policy makers can ensure that if the dominant standard in the market proves inferior, there are several more in the labs awaiting field trial. This can reduce switching costs. By slowing down the standardization process to encourage the search for information about novel standards, the policy maker can help deter the adoption of the wrong standards by the market.
Extended Agendas: In influencing the standard-setting process, the policy maker may have a broader agenda. This agenda may include promoting the domestic standard on the world market.
Keeping Up: Every standard is part of a technology complex rather than a single technology. A standard may be obsolete by the time it is chosen. For example, ten years ago nuclear power was considered a very bad way of generating electricity. In the 1990s, if global warming is as serious a problem as is predicted, nuclear power may become a more attractive technology after all. These uncertainties make standards adoption very challenging for policy makers. (#25, OECD 1991, pp. 46-49)
Search for Innovative Technologies: Consumers are generally unwilling to rigorously search for the best technology. The incentive to use the technology with the most immediate payoff outweighs the more forward looking work of R&D and the search for more information. This suggests a positive role for policy makers in supporting strategic research. Strategic research has the advantage of not being necessarily associated with a particular standard. It can also have the benefit of reducing uncertainty about the standards before they reach the market.
Future Applications: Once a standard has been introduced to the market, consumers determine whether or not it is successful. This suggests that a market can drive the adoption of a standard that is inferior from the point of view of future applications. It is the role of a policy maker to understand the future applications and the group of consumers who will be able to take advantage of them. The problem is determining which standard has the greater future potential.
Compatibility: Consumers generally care about future compatibility of existing standards. It is the role of the policy maker to find early adopters who can form a user base to help a standard along a learning curve. There is a conflict between the benefits of early adoption and the risk of future incompatibilities. Technological innovation sometimes requires the abandonment of previous standards in favor of a new approach.
Section III: International Standards Bodies
Source: The Internet Society.
A series of international bodies have been created to deal with the growing international complexities brought on by technological advances in the fields of telecommunications, electronics, and information systems. One of the most significant of these bodies that is still playing a critical role in shaping telecommunications standards is the International Telecommunication Union (ITU). The ITU was formed at the Telegraph and Radiotelegraph Conference in Madrid in 1932. It was created in an effort to concentrate the regulation and coordination issues concerning radio, telephone, and telegraphy. In 1947, in Atlantic City, the ITU was reorganized and incorporated into the United Nations organization as a specialized agency (Paetsch, 1993).
By 1993, the ITU included 166 member countries; more than 300 nongovernmental agencies such as private operating companies; and scientific, industrial, and international organizations. A Convention serves as the legal structure of the ITU. This Convention is revised by the Plenipotentiary Conference which meets every six to eight years to decide on purposes, structures, functions, and general provisions related to telecommunications. Because the ITU has no jurisdiction over sovereign countries, the Plenipotentiary Conference is structured as a treaty-level meeting. For the decisions of the Conference to be binding, all ITU member countries must ratify the revised Convention. The ratifying countries are then obliged to sign the provisions into national law (Paetsch, 1993).
The ITU also holds periodic World and Regional Administrative Conferences such as the World Administrative Telephone and Telegraph Conference (WATTC), the World Administrative Radio Conferences (WARC), and the Regional Administrative Radio Conferences (RARC). The WARCs in particular are important for the wireless communication industry. These conferences result in revised radio regulations that are annexed to the International Telecommunication Convention (Paetsch, 1993). At the 1992 World Administrative Radio Conference (WARC-92), held in Malaga-Torremolinos, Spain, representatives endorsed the concept of "universal personal communications" with both terrestrial and satellite components. WARC-92 also allocated spectrum for Mobile Satellite Service in the L-band (1500-1700 MHz), thereby giving global PCS ventures the necessary legitimacy to move ahead with deployment plans (Telecommunications, December 1993).
There are also five permanent organs of the ITU: the General Secretariat, the International Frequency Registration Board (IFRB), the International Radio Consultative Committee (CCIR), the International Telegraph and Telephone Consultative Committee (CCITT), and the Telecommunications Development Bureau (BDT). As the consultative committees, the CCITT and CCIR adopt thousands of technical and operational standards to ensure network compatibility. These consultative committees form study groups that issue recommendations on specific technical problems. Because these recommendations are informal and do not require treaty-level adoption, their effectiveness depends on the cooperation of the member countries. The CCITT manages the non-radio study groups, including groups concerned with the design and standardization of ISDN, broadband ISDN, and intelligent networks. The CCIR is responsible for studying technical and operating issues in order to standardize telecommunications on a global basis (Paetsch, 1993).
In recent years, there has been a great deal of criticism leveled at the ITU's standard-setting process. Critics have charged that its process for approval of standards is too cumbersome and time-consuming. ITU's role as the dominant international standard-setting body has been seriously called into question. While national government telecommunications operators support a prominent role for ITU, other countries and user groups argue against too strong a role for the ITU on the grounds that it could adversely affect competition. While the debate continues over the role of the ITU and who should have input into the ITU standard-setting process, regional efforts at standard-setting have accelerated. Some analysts have suggested that standards will be increasingly influenced by regional organizations (Paetsch, 1993). These regional standard-setting efforts are discussed in greater detail below.
In addition to the ITU, two other organizations play a role in international standard-setting. The International Standards Organization (ISO) is a non-governmental group formed in 1947 to promote the development of standardization in all fields except electronic engineering. The ISO is organized into technical committees; those on telecommunications and information exchange, interconnection of equipment, and integrated circuit cards should influence the telecommunications market. The International Electrotechnical Commission (IEC) was established in 1904 to promote electrotechnical standards and to ensure the reliability and compatibility of equipment. In 1993 there were 82 technical committees of the IEC relating to telecommunications (Wallenstein, 1989).
European PCS Scenarios
Industry Structure in Europe
With the exceptions of Britain and Sweden, telecommunications industries in Europe have been traditionally dominated by Postal Telegraph and Telephones (PTT) government agencies and domestic manufacturers. Because of the hegemony of the PTTs, the domestic telecommunications industry was characterized by excessive protectionism, lack of specialization and economies of scale, burdensome regulation in customer premises equipment, and high prices (Paetsch, 1993).
As a general rule, one of the most effective obstacles that the PTTs set against private involvement was their refusal to connect private networks to the public systems. Where they did provide such connections, the PTTs charged customers at high, volume-based rates, and only data traffic was allowed.
The first liberalization steps came in the early 1980s with the arrival of telex retailers in the UK. The idea allowed customers from other European countries to take advantage of the somewhat cheaper international rates to and from the UK by using it as a telecommunications hub. In addition, the British Telecommunications Act of 1981 liberalized customer premises equipment and gave users more autonomy over their leased-line usage, including the possibility of selling unused portions.
In 1984, the European Community attempted to establish a cohesive telecommunications policy by enacting the Council Recommendation of November 1984, which advised member countries to "stop the fragmentation of the European market, reduce prices, and expand the market." This initiative led to the publication in 1987 of one of the most influential steps toward liberalization of the telecommunications sector in Europe: the Green Paper on Telecommunications.
The Green Paper of 1987 was intended to promote discussion on liberalization, customer premises equipment, and network deployment, oriented toward a market structure that would benefit users with better and cheaper technologies. Its effect was recognized as an important strategic contribution to the general unification efforts that were to be consolidated by 1992 with the European Community. Although limited in scope, the paper emphasized the importance of access provisions for private companies and of interconnection between countries. In addition, it proposed a phased relaxation of customer premises equipment regulation. It also recommended the separation of regulatory and operational functions, aiming for fairer license-allocation procedures. Recognizing the crucial role of standards formulation, the paper recommended the creation of the European Telecommunications Standards Institute (ETSI) (Paetsch, 1993).
The Green Paper also focused on opening competition in value-added services. The decision of the European Commission in 1989 to introduce a Directive revoking the exclusive rights of PTTs to provide services other than voice created enormous controversy. As a result, the Commission had to enforce its decision, particularly in the more conservative countries (France, Greece, Portugal, and Spain), based on Article 90 of the Treaty of Rome, by which the Commission was entitled to impose directives on individual countries when they refused to comply.
The need for the introduction of Personal Communication Services in Europe was nurtured by two major assumptions. First, the rapid growth of users in the cellular phone system generated fear in the industry that the system would run out of capacity too soon. Second, PCN was conceived to be used mainly by pedestrians, who have particular needs regarding size, weight, functionality, quality, and cost of the service. PCN in Europe has evolved into three distinct categories: cordless, cellular, and hybrid systems. The differences between cordless and cellular are mainly cell size, transmission power, and codec complexity.
Regulatory Environment in Europe
With the enactment of the Mobile Green Paper, the European Commission took a step toward a more cohesive policy framework in the European Union. It was said that mobile telephony was particularly significant for the European Union for it fits the concept of freedom of movement for people, goods, services, and capital in which the Union operates.
The goals of the European Commission with regards to mobile communications can be synthesized in the following points:
• facilitate the development of a Union-wide market for mobile services
• define a common framework for mobile services infrastructure, the development of networks, and the supply of terminals
• promote personal communication services towards a mass oriented market
• maintain links with other markets and bodies that promote international standards that foster innovation
The Commission is particularly sensitive to defining the roles of other European telecommunications organizations and is committed to finding channels of collaboration with them. Special emphasis was given to the results of the World Administrative Radio Conference held in 1992 in Spain, where future frequency allocation for mobile services was discussed. The Union is also taking part in negotiations on the provision of telecommunications services among GATT members.
The UK was the first country to conceptualize a Cordless Personal Communications Network. It was originally based on the cellular phone system, and the idea was to consider a second generation of cordless equipment to replace the first generation already in use (cordless phones used in home and office settings). A manufacturer extended this concept and used a technology known as CT2 to install base stations (telepoints) throughout the country. As a result, owners of CT2 handsets were able to connect to the phone network from certain areas of coverage. The concept was then expanded so that a pager could also be integrated into the telepoint handset. By doing this, the telepoint user could be alerted to incoming calls.
Because the standard-definition process during this period was somewhat anarchic, non-compatible CT2 systems were developed. At this point the British Department of Trade and Industry (DTI) proposed a common interface, but some manufacturers did not embrace the initiative. In order to enforce its standardization concept, the DTI issued license requirements for telepoint operators that included compliance with the Common Air Interface (CAI) standard by the end of 1990. In addition, they were required to establish roaming capabilities among their systems. Eleven operators complied by 1989 and were granted telepoint operating licenses. This enforcement mechanism slowed the development of service infrastructure due to the lack of availability of CAI-compliant equipment. This circumstance, added to the inability of operators to coordinate their planning efforts, significantly affected the telepoint market. Companies started to lose interest in the service and eventually went out of business. In 1994, Rabbit, controlled by Hutchison Personal Communications, was the only company left in Britain providing telepoint services.
A year after awarding the telepoint licenses, the DTI announced the launch of PCN. The original idea did not focus on technical capabilities. Instead, the DTI was looking for any viable alternative that the market could provide to compete with British Telecom in local access. Other rationales for the new system included the limited capacity of cellular systems, low penetration, high costs of terminals and services, and low functionality.
Standards in Europe
During the years of PTT hegemony, international standards were defined by teams of national administrators and major manufacturers from industrialized countries in coordination with the ITU (Steinfield, 1994). Viewed as a non-tariff trade barrier, international standardization efforts were aimed at maintaining incompatibility, thus perpetuating the benefits of the PTTs and their allies. The ETSI was forced to wage an arduous battle with the established structures from its inception, since the conservative group opposed its creation and tried to limit its scope to merely research responsibilities. However, the Green Paper of 1990 on the Development of European Standardization gave a final push to the ETSI by extending its standardization scope to the entire industry as opposed to individual countries. The ETSI is formed by 200 members from 21 countries, of which 60% are manufacturers, 14% are national administrations (PTTs), 11% are public network operators, and 10% are users and service providers (Steinfield, 1994). In response to a US concern that the ETSI might become a "standards fortress," US representation was admitted as a non-voting "associate member."
The ETSI has 12 technical committees (TCs), three of which are in charge of mobile communication standards: the TC on paging systems (PS), the TC on GSM (Groupe Spécial Mobile), and the TC on radio equipment and systems (RES). ETSI's highest authority rests in the Technical Assembly, which makes the final decisions on standards. The ETSI has adopted an effective task-oriented method in which flexible teams are put together to work on specific standards needed by the industry. As a result, standards are now produced in a few months, in contrast with the several years the process used to take. By October 1992 more than 300 standards had already been issued by the ETSI (Steinfield, 1994).
The formulation of the GSM cellular standard, one of the first developments undertaken by the ETSI, was considered a success. GSM is important because it has been adopted as an industry reference for product development since its inception in the mid-1980s. Some have argued that GSM, which was conceived of as a digital system, explains why Europe is so far ahead of the U.S. in implementing digital systems. The U.S. is slowly trying to convert its first-generation analog systems to digital systems. A year ago, Business Week reported Europe being 12 to 18 months ahead of the US in digital product availability. In addition, 70 countries have already adopted GSM, giving the standard a substantial advantage over other technologies.
Following the GSM success, the ETSI formulated the CT2 standard and the Digital European Cordless Telephone (DECT) standard. CT2 standards are based on frequency division multiple access with time division duplex (FDMA/TDD) transmission, employ digital speech-coding techniques, and support dynamic channel allocation. Cordless is essentially an extension of the fixed network; it uses transmission power of around 10 mW, with coverage ranging from about 50 meters indoors to 200 meters outdoors. In the UK, the system operates in 40 x 100 kHz channels in the 864.1-868.1 MHz band. The DECT standard, which uses a system developed by Ericsson, Philips, and Siemens, serves both home cordless systems and telepoint (pay phone) systems (Steinfield, 1994). DECT operates in the 1.88-1.90 GHz band, using higher data rates and higher peak transmission power. As opposed to CT2, DECT is based on time division multiple access (TDMA) technology. The DECT standard is fully backed by the Commission of the European Communities.
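The UK CT2 channel plan quoted above is internally consistent, as a quick check of the figures shows (all numbers taken from the text; the variable names are just for illustration):

```python
# Check that 40 channels of 100 kHz exactly fill the 864.1-868.1 MHz band.
band_low_mhz, band_high_mhz = 864.1, 868.1
channel_khz = 100

band_khz = round((band_high_mhz - band_low_mhz) * 1000)  # 4000 kHz of spectrum
channels = band_khz // channel_khz                       # channels that fit

print(channels)  # 40
```

So the 4 MHz of spectrum is fully partitioned into the 40 stated FDMA channels, with no guard band left over between the stated edges.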
After the U.K. launched PCN in 1991, ETSI acted proactively to prevent the creation of a new proprietary standard and the proliferation of similar reactions in other countries. ETSI urged the UK regulators to coordinate the standardization process on a Europe-wide basis. As a result, ETSI was responsible for the formulation of the PCN standard. Some debate arose over which standard to use as the base for PCN, with GSM and DECT competing. The decision was finally made in favor of GSM, partly because of pressure from countries already using the system, but also because of the major financing and technical assistance coming from the UK, which had the biggest GSM deployment in place.
In 1991 the ETSI concluded the technical PCN standard for Phase I. It was based on the GSM specifications and, owing to its operation in the 1.8 GHz band, was named DCS 1800. The main differences from the cellular system were a different radio frequency-link definition with reduced output power (250 mW), consistent with a smaller cell size (400 m to 8 km). PCS (DCS 1800) is not a new, superior network, but a digital cellular network (GSM) operating at higher frequencies and offering more capacity (Paetsch, 1993).
Originally, PCS service was designed only for national roaming. In 1989, despite the fact that Phase II was not concluded, the first three PCS licenses were awarded in Britain to Mercury PCN, Unitel, and a British Aerospace consortium. The first PCS system was launched in Britain by a company called Mercury One-2-One in September 1993, undercutting most GSM calling costs by as much as 50%. The second PCS system was also established in Britain, by the Hong Kong-based company Hutchison Whampoa.
In other European countries, the GSM-derived DCS 1800 standard was well received. Since its initial launch, the standard has had technical improvements that increased its capacity and eased its congestion problems. E-Plus Mobilfunk, a German consortium, started operation on May 27, 1994, and analysts predict this will be the largest PCS network in the world, with 3.3 million subscribers by 2000 (Business Week, May 23, 1994). In France, the idea of a PCS service based on GSM was initially rejected, since France Telecom favored a system based on the DECT standard, which allows connectivity to private communications facilities. However, in 1991 France announced its intention of setting up a testbed for DCS 1800.
Industry Structure in the U.S.
The U.S. wireless communication sector consists of subscriber equipment manufacturers and service carriers. Both sectors are highly concentrated and tend to be dominated by large electronics firms. This is due to three factors: extensive technological barriers to entry, heavy capital requirements, and regulatory restrictions (Business Economics, April 1994). Becoming a player in the PCS market requires a tremendous amount of capital to build PCS systems, relocate the current users of 2-GHz spectrum, and market the final product. Companies will not see returns on their investments for some years (Telecommunications, "PCS: A Progress Report"). Nevertheless, some small and mid-size firms are beginning to have a presence. Major producers of wireless equipment include Motorola, which is expected to offer handsets for PCS. Forerunners in the U.S. PCS service industry include the regional Bell Operating Companies (RBOCs); long distance carriers such as Sprint, MCI, and GTE; and cellular operators such as McCaw Cellular (Business Economics, April 1994).
As with other segments of the telecommunications industry, there is a trend towards consolidation in offering national PCS service. US wireless carriers are seeking partnerships in an effort to establish a single, nationwide wireless network of services. While a proposed joint venture between MCI and Nextel suddenly collapsed, MCI is now seeking other wireless partnerships. In similar fashion, Sprint is joining up with Bell Atlantic and NYNEX (Telecommunications, November 1994). AT&T purchased McCaw Cellular in an effort to establish a national wireless network (AT&T's $12 Billion Cellular Dream). The RBOCs are also establishing partnerships.
In addition to establishing national networks, U.S. service carriers recognize that global networks are crucial to PCS profitability. The concept of Global Personal Communications Service extends the reach of terrestrial PCS islands by overlaying satellite service. Global PCS will initially extend and augment the services of incumbent carriers by providing access to leading-edge telecommunication services in unserved or underserved locales. Because of its cost - no less than $3.00/minute, and currently in the $7.00 range for Inmarsat (International Maritime Satellite Organization) services - global PCS will not compete with existing, cheaper terrestrial options, including cellular radio. Analysts predict that the rollout of PCS will create a new profit center for incumbent wireline carriers rather than pose a financial and facilities bypass threat. Because of its worldwide coverage, global PCS needs only a few percentage points' worth of market penetration to generate ample returns. If PCS achieves "best case" market penetration, it stands to become an important part of the telecommunication infrastructure in its own right, and not merely because of its ability to augment and extend wireline facilities. (Telecommunications, Dec. 1993)
The growth of the PCS industry includes expansion in the electronics and software markets. Improvements in, and the marketability of, PCS will make wireless devices ubiquitous. Additionally, analysts believe that developing easy-to-use software will be the key step in developing an application that will entice users to buy devices that support wireless data. Companies such as Mobile Telecommunication Technologies Corp., which received a nationwide narrowband PCS license to build a two-way wireless data network, are teaming with investors such as Microsoft to develop wireless data communications solutions for MTel's data network (Telecommunications, "PCS: A Progress Report").
Regulatory Environment in the U.S.
The Federal Communications Commission (FCC) serves as the primary regulatory body for the U.S. telecommunications industry and is responsible for allocating spectrum in the U.S. market. In October 1991, the FCC issued a policy statement asserting that PCS promised important economic, competitive, and other public interest benefits and that PCS should be very broadly defined. In making allocation decisions regarding PCS, the FCC recognized that because of issues related to equipment cost, size, power, and performance, and because of international considerations, PCS devices and services have to operate in the 2-GHz band, between 1.8 and 2.2 GHz. This posed a problem because for decades the 2-GHz band had been allocated to commercial applications: common carrier, public safety, and some video microwave services. U.S. PCS suppliers and providers argued that PCS allocation in the 1.8 to 2.2-GHz band was primarily an international phenomenon and that the U.S. PCS industry risked being foreclosed from the international market if it were not accommodated at home (Business Communications Review, February 1994).
In September 1993, the FCC allocated 160 MHz exclusively for PCS. It allocated another 3 MHz in the 900-MHz band for "narrowband" PCS systems such as advanced paging and enhanced messaging. The FCC adopted a scheme to relocate existing wireless users occupying frequencies now allocated to PCS, and it allocated additional spectrum for new, global mobile satellite service (MSS). The FCC's actions paved the way for PCS service to be offered in about 2,960 new licensable markets. By authorizing a total of up to seven licensees across two overlapping regional market tiers - two licensees with 30 MHz each in each of the 51 Major Trading Areas (MTAs), and five licensees (one with 20 MHz and four with 10 MHz) in each of the 492 Basic Trading Areas (BTAs) - the FCC attempted to spread the PCS opportunity to as many potential new entrants and participants as possible. However, existing communications statutes continue to bar foreign firms from owning more than 25% of any U.S. spectrum license (Business Communications Review, February 1994).
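As a quick arithmetic check, the broadband license structure described in this paragraph can be tallied in a short sketch. The figures come from the text above; note that the FCC's "about 2,960 licensable markets" figure covers more license categories than the broadband licenses counted here, so only the per-market spectrum and the broadband license count are reproduced:

```python
# Broadband PCS license structure as described in the text (September 1993
# FCC allocation). Illustrative tally only, not an authoritative count.

MTA_COUNT = 51   # Major Trading Areas
BTA_COUNT = 492  # Basic Trading Areas

mta_blocks = [30, 30]               # two 30-MHz licenses per MTA
bta_blocks = [20, 10, 10, 10, 10]   # one 20-MHz and four 10-MHz licenses per BTA

# Spectrum available across all seven licensees in any overlapping MTA/BTA market
spectrum_per_market = sum(mta_blocks) + sum(bta_blocks)

# Total broadband licenses created by this structure
total_licenses = MTA_COUNT * len(mta_blocks) + BTA_COUNT * len(bta_blocks)

print(spectrum_per_market)  # 120 (MHz of the 160-MHz allocation covered by these licenses)
print(total_licenses)       # 2562
```

The remaining 40 MHz of the 160-MHz allocation is not broken out in the paragraph and is therefore left out of the sketch.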
The FCC chose to allocate licenses using a simultaneous multiple-round auction. In July 1994, the FCC completed its first auction of ten nationwide narrowband PCS licenses of three types: five 50/50 kHz licenses, three 50/12.5 kHz licenses, and three 50 kHz licenses, one of which was awarded as a Pioneer Preference. (The FCC's Pioneer Preference scheme was adopted in 1990 to provide licensing possibilities to small, entrepreneurial firms that might otherwise find the application and auction process prohibitive (Business Communications Review, February 1994).) By November 1994, the FCC had completed its auction of 30 regional narrowband licenses. The FCC auction of 99 broadband PCS licenses in the 51 MTAs was completed on March 13, 1995. Two 30-MHz licenses were sold in each of the MTAs except New York, Los Angeles, and Washington, in which one of the two licenses was awarded as a Pioneer Preference (Cramton, The PCS Spectrum Auction: Theory to Practice). The FCC rules provide for a 10-year licensing term (Business Communications Review, February 1994).
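The auction counts in this paragraph are internally consistent, which a brief sketch (using only figures stated above) makes explicit:

```python
# Narrowband: five 50/50 kHz + three 50/12.5 kHz + three 50 kHz license slots,
# one of which was a Pioneer Preference award rather than an auction sale.
narrowband_slots = 5 + 3 + 3
narrowband_auctioned = narrowband_slots - 1
print(narrowband_auctioned)  # 10 licenses went to auction in July 1994

# Broadband: two 30-MHz licenses in each of the 51 MTAs, with one license in
# each of New York, Los Angeles, and Washington awarded as a Pioneer
# Preference instead of being auctioned.
broadband_slots = 51 * 2
broadband_auctioned = broadband_slots - 3
print(broadband_auctioned)  # 99 licenses in the March 1995 auction
```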
Standards in the U.S.
The standard-setting process in the United States is influenced in three ways: by the FCC, by contributions to global standards development processes, and through participation in voluntary standard-setting. The FCC is the only agency in the U.S. that has the authority to set mandatory standards. For the most part, the FCC has limited its standard-setting to ensuring efficient use of the radio spectrum. The U.S. also has two public advisory committees that assist the Department of State in formulating positions with regard to the actions of the ITU. These organizations are the U.S. Organization for the International Telegraph and Telephone Consultative Committee (U.S. CCITT) and the U.S. Organization for the International Radio Consultative Committee (U.S. CCIR) (Paetsch, 1993).
In some cases, participants concerned with a given standard (e.g., manufacturers, purchasers) will voluntarily agree to standardize certain features of a telecommunications product. To assist in this effort, a well-established standards body or a trade association initiates a process to obtain comments and build consensus. If consensus is reached, these voluntary standards may be accepted by the American National Standards Institute (ANSI). ANSI's primary role is to coordinate the standardization activities of the private sector. Its membership includes standards organizations, trade organizations, federal and state government bodies, professional groups and corporations. Increasingly, standards are being set by industry participants without any government involvement (Paetsch, 1993).
Competition among digital standards in the U.S. has focused primarily on TDMA and CDMA. U.S. wireless carriers have attempted to secure a competitive advantage in highly contentious market areas by making strategic choices between these standards. In 1988 the Cellular Telecommunications Industry Association (CTIA) published its User Performance Requirements, spelling out its major goals for the move from analog to digital technology. The resulting IS-54 standard is roughly based on a proposal first released by Ericsson Corporation in 1988. As noted in a Hughes Network Systems publication, the chief goals were backward compatibility and a tenfold capacity increase in spectrum use. The CTIA incorporated these goals into the IS-54 standard for TDMA. IS-54 served as the sole digital cellular standard until 1993, when CTIA developed the IS-95 standard for CDMA (Telephony, January 10, 1994).
While TDMA is currently deployable, U.S. analysts believe CDMA deployment is still some years off. Companies are making strategic decisions to either deploy TDMA and offer customers digital service now, or claim that the higher quality of CDMA technology is worth the wait. In particularly competitive market areas such as Washington/Baltimore and Chicago, these decisions are viewed as important signals of the prominence of these standards.
McCaw Cellular was the first cellular carrier in the U.S. to deploy TDMA, offering service commercially at the beginning of 1993. Southwestern Bell has deployed TDMA in its Chicago market and is rolling out service in Boston, Washington/Baltimore, and other markets in 1994 (Telephony, January 10, 1994). BellSouth cited TDMA's availability, proven performance, and the ability to quickly roll out enhanced services to customers as its reasons for choosing TDMA (Telephony, June 20, 1994). Bell Atlantic Mobile announced in early 1994 that it is deploying TDMA digital technology in the Washington/Baltimore market rather than waiting for CDMA. The move, which marks the first time a CDMA proponent has chosen to deploy TDMA, will allow Bell Atlantic Mobile to go head-to-head with rival Southwestern Bell. Bell Atlantic explained that the market is ready for digital today, and that TDMA is proven, available, and working (Telephony, February 28, 1994). In support of the TDMA choice, analysts assert that CDMA will not be fully field-ready for some time; thus, if companies want to deploy digital in the near future, TDMA is the answer. TDMA supporters maintain that they do not see TDMA as a short-term transitional technology to CDMA, but rather as a digital solution for the "foreseeable future" (Telephony, June 20, 1994).
CDMA proponents include AirTouch, Ameritech, AT&T, GTE, Motorola, Northern Telecom, Nynex, Qualcomm, and US West. The group is in the process of defining a set of technical requirements for CDMA equipment and services in order to facilitate market introduction of CDMA, and is testing CDMA technology in three phases. Phase one, which involved defining laboratory test procedures for CDMA subscriber equipment, has been completed; it established compliance with the CDMA standard IS-95 and with IS-98, the minimum performance standard for subscriber equipment. Phase two involves tests between subscriber equipment and CDMA network infrastructures; those test requirements are still being defined. The final phase will involve testing subscriber equipment in the field on true commercial systems (Telephony, August 15, 1994). CDMA proponents point to the technology's superior voice and service quality (Telephony, February 28, 1994). CDMA technology may also sharply reduce the number of cell sites required, significantly lowering costs to carriers (Telephony, March 7, 1994).
The proposed MCI investment in Nextel sparked renewed debate over the prospects for a modified version of GSM technology as a standard for digital cellular in the U.S. (GSM is used by Nextel). While GSM has been virtually shut out of the U.S. standards debate, some analysts claim that GSM, which offers an entrenched equipment base, current availability, and support for several advanced services, could play a key role as companies try to roll out new services and fend off competition. The collapse of the MCI-Nextel partnership has further silenced talk of GSM in the U.S., with industry experts claiming there is little demand for GSM in the U.S. market (Telephony, March 7, 1994).
Industry Structure in Asia
Privatization of national telecommunications authorities in the Asia-Pacific region is at various stages of progress, but the need to expand beyond domestic markets is evident. Demand is increasing for mobile wireless services, for multiple market operators, and for better telephone line penetration. With the deregulation of the customer premises equipment (CPE) market in many countries of the region, accompanied by the privatization of most telecommunications authorities, competition for lucrative contracts is on the rise. These sweeping changes have produced a policy shift beyond simply fulfilling the region's basic telecommunications requirements: value-added services are increasingly demanded by this explosive market. The influence of firms from Japan, Canada, the United States, and the European Community is increasingly evident in a highly competitive telecommunications market (Ernst Otto Weiss, pp. 6-10).
The Asia-Pacific region has no option but to develop its telecommunications infrastructure at a highly accelerated rate. It is estimated that over the next twenty years the region will need 500 million new telephone lines. Estimates of the size of the telecommunications market range from 70-75 billion U.S. dollars to more than 100 billion U.S. dollars. Most telephone facilities are concentrated in urban areas, leaving rural populations without even basic facilities (Karlheinz Kaske, p. 18). The mobile communications market in the Asia-Pacific region is forecast to catch up with and surpass the U.S. and European markets.
In Japan, the mobile cellular market was monopolized by the Nippon Telegraph and Telephone Corporation (NTT) until late 1988 when the Ministry of Posts and Telecommunications (MPT) decided to introduce competition. The Japanese Digital Cellular standard (JDC) was developed by NTT, in association with various equipment manufacturers including AT&T, Motorola, and Ericsson (Sweden). Two of these, Motorola and Ericsson, agreed to cooperate with NEC in promoting the JDC standard in Asia. This was a unique development since Motorola was at the same time promoting AMPS in the United States and Ericsson was committed to GSM in Europe. In early October 1992, the Japanese telecommunications ministry announced that cellular telephone services could be included in Japan's overseas development assistance program (ODA). The MPT believes that cellular systems - which can be installed in a short period of time, and at a relatively low cost - represent the best way of providing developing countries with a modern communications infrastructure. Many developing countries have already begun installing cellular systems since their ordinary networks are inadequate (Megumi Komiya, pp. 76-90).
Testing continues for PHS systems
Japan's Personal Handy-Phone System (PHS) has its roots in the first cordless telephones of the 1970s -- called CT1. The second generation of this technology, CT2, gave cordless telephone users a wider range of several hundred feet. PHS aims to break these constraints and allow users to roam anywhere within a few hundred feet of cell stations installed and operated by service providers. Operating near its base station at frequencies in the 1.9-GHz band, PHS functions like a cordless telephone and is billed as if it were any other home or office telephone. Out of range, however, the handy phone switches to the nearest cell station. Like a cellular phone, it can send and receive calls as long as it remains within range of a cell station; unlike a cellular phone, it cannot be used in cars or other fast-moving vehicles. Its chief benefit is very low cost (about one third that of cellular service), which would allow almost ninety percent of the population of the developed countries to subscribe to PHS. The second benefit is a very wide usable bandwidth: unlike cellular technology, which uses much of its bandwidth to provide fast cell-to-cell switching (for use in fast-moving vehicles), PHS has sufficient bandwidth for multimedia use. PHS service providers can thus compete with both telephone and cable television providers (Financial Times Ltd, September 1, 1994).
PHS service tests are being carried out mainly by telecommunications carriers with the cooperation of manufacturers, while interconnectivity tests are being conducted by 40 equipment manufacturers under the supervision of the Research and Development Center for Radio Systems (RCR). The service tests have been started by six groups including Nippon Telegraph and Telephone Corp., DDI Corp., Kokusai Denshin Denwa Co. (Overseas Communications Japan), Japan Telecom Co., Ltd., Teleway Japan Corp., Shikoku Information and Telecommunication Network Co., Inc., and Tokyo Telecommunication Network Co., Inc. The remaining two groups, including Tokyo Telemessage Inc., and the Kansai Personal Handy Phone Research Association, are scheduled to start tests soon. The interconnectivity tests are now under way in the metropolitan Tokyo area by individual makers using the facilities of the companies participating in the service tests.
A remarkable feature of PHS is its operational flexibility. It can be used at home or in the office by accessing a base unit connected to the public switched telephone network, and it can also be used on the street by accessing the cell stations installed in that area. PHS includes a location-registration function for outdoor use. A single cell station will cover an area within a radius of 100 to 200 meters. To prevent illicit use of the terminals, PHS also has an authentication function. Call charges for PHS are expected to be lower than existing cellular mobile phone charges.
Three types of PHS system configuration are being studied: operation over public digital networks, connection to public networks, and establishment of an independent network. In the first type, PHS operation, including authentication and location registration, depends on the public network, although the authentication function may be incorporated into PHS itself. In the second type, all PHS functions except roaming are incorporated in PHS, and public networks are used to connect calls. In the third type, all PHS functions are incorporated in PHS, and public networks are used only for network interconnection. In any of the configurations, both outgoing and incoming calls are available; however, for the configuration connecting to public networks, an outgoing-calls-only variant is also being considered, with radio pagers used in place of a receiving function. PHS could eventually be introduced to the world market, since it adopts the 1,900-MHz frequency approved by the World Administrative Radio Conference (WARC).
Regulatory Environment in Asia
On June 24, 1994, based on a report from the "Study Group for Evaluation of the PHS Field Trials" and on opinions from telecommunications carriers and field-trial users, MPT established its service guidelines for the commercialization of PHS. MPT issued Type I carrier licenses to 21 PHS carriers on January 31, 1995. The 21 carriers are divided into three groups: the ASTEL group, the DDI Pocket Telephone group, and the NTT Personal Communications Network group. PHS service will start in July 1995 in Tokyo and Hokkaido; most other carriers are scheduled to begin service in October 1995.
Included in the MPT guideline were the following points:
• Initially, among frequencies in the 1.9 GHz band, the 12 MHz bandwidth would be allocated to up to three PHS carriers in each regional block. Once the PHS business was under way and the status of frequency utilization and demand trends was understood, the allocation of additional frequencies would be studied.
• Within five years of the launch of the business, each carrier must strive to provide PHS service over an area containing at least 50% of the population of the regional block.
• To ensure the sound development of the PHS business and fair competition, public switched network carriers - that is, carriers offering subscriber telephone service or ISDN service via subscriber lines - would allow PHS carriers to connect to the public switched network under fair conditions.
On November 1, 1994 the MPT imposed additional guidelines for the expansion of PHS in Japan:
• To expand usage, PHS carriers were expected to provide services at reasonable, diverse rates and over a wider service area, including industrial parks and exhibition sites.
• Carriers were requested to develop plans for the interconnection facilities required for PHS networks to operate efficiently. Fair terms and conditions, including fair interconnection charge schedules, were mandated.
• The numbering system for PHS would conform to a specified format: (service identification number) -xx (carrier identification number) -xxxxx (subscriber number).
• PHS carriers were to push forward toward inter-regional roaming which would allow for nationwide utilization of PHS services.
• Related organizations were required to provide for the use of public facilities and buildings (traffic lights, utility poles, public telephone booths and railway stations) on reasonable terms and under equal conditions so that PHS carriers can establish base stations efficiently.
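The numbering guideline above can be illustrated with a minimal parser. The MPT format gives only the field order and the widths of the carrier (two-digit) and subscriber (five-digit) fields; the three-digit service-identification width and the sample number below are assumptions made purely for illustration:

```python
import re
from typing import Optional

# Assumed pattern: <service id> - <2-digit carrier id> - <5-digit subscriber no.>
# NOTE: the service-id width (3 digits here) is NOT specified in the guideline.
PHS_NUMBER = re.compile(r"^(?P<service>\d{3})-(?P<carrier>\d{2})-(?P<subscriber>\d{5})$")

def parse_phs_number(number: str) -> Optional[dict]:
    """Split a PHS-style number into its fields, or return None if malformed."""
    m = PHS_NUMBER.match(number)
    return m.groupdict() if m else None

print(parse_phs_number("070-12-34567"))
# {'service': '070', 'carrier': '12', 'subscriber': '34567'}
print(parse_phs_number("070-1234567"))
# None (carrier and subscriber fields not delimited)
```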
During the summer of 1994, the government of Hong Kong became the first outside Japan to adopt Japan's PHS. This month (April 1995), five Japanese telecommunications service companies and equipment manufacturers are conducting a joint experiment with PHS in China. NTT, DDI Corp., NEC Corp., Fujitsu Ltd., and Matsushita Electric Industrial Co. are conducting the experiment with the cooperation of China's Ministry of Posts & Telecommunications. With Western companies entering the Chinese market with their own phone systems, the Japanese companies decided that a joint experiment would be a quicker way of marketing their products than conducting individual experiments (Japan Economic Newswire, January 7, 1995). The Japanese government is developing a proposal which, if accepted by its neighbors, could make PHS the pan-Asian standard for wireless communication (Financial Times Ltd, September 1, 1994).
Standards in Asia
In many parts of Asia, the current network infrastructure is composed of elements compatible with European technical standards. Since mobile communications must interface with the fixed network, the ground may already be prepared in some Asian countries for the introduction of the European cellular standard, GSM. In fact, the People's Republic of China, Singapore, and India all seem to be adopting GSM for the near future. Hong Kong is using a combination of U.S. and European systems (Megumi Komiya, p. 86).
As for telecommunications infrastructure, smooth globalization begins at a stage where regional users can actively use telecommunications. Based on this concept, the AII (Asia-Pacific Information Infrastructure) initiative was proposed within the APT and has been promoted for the Asia-Pacific region. This was followed by Korea's APII proposal at APEC in 1994. Japan has been creating an info-communications infrastructure and providing the required technology and know-how through the AIC (Asian ISDN Council). These efforts will continue actively.
The Japanese are pushing hard throughout Asia to have their PHS standard adopted. Thailand and Hong Kong will be next online, starting in early 1996, although standards have yet to be finalized. Operators using the PHS system are to begin services in Singapore and Malaysia in October 1995. But the struggle is only beginning: experts expect these standards to go head to head until one dominates the international market (Asiaweek, February 3, 1995).
Section V: Policy Alternatives
Based on the context provided above, this Green Paper recommends consideration of three policy alternatives: de facto standard setting, definition by the ITU, and definition by international research and development consortia. Given the differences among the PCS markets in Europe, the United States, and Asia, any standardization effort is likely to encompass elements of each of these alternatives.
De facto Standard Setting
Under this alternative, standard-setting responsibility is left to the private sector, where standards may be set either formally or informally. Industry participants may voluntarily seek to establish a standard for a specific technology or process; perhaps with the assistance of a standard-setting body, the participants solicit comments and attempt to build consensus, and the standard recommended by this process is then adopted by standard-setting bodies. More informally, proponents of a particular technology may try to force a standard by building a critical mass in the marketplace; once that critical mass has been reached, the technology becomes the de facto standard. Particularly in the U.S., standards are often set by the private sector with no government intervention. Proponents of this method argue that a rigid standard-setting process stifles innovation and may lead to the adoption of less effective standards. Critics counter that the lack of formal standard-setting procedures increases the chance that technologies are deployed only to be immediately outdated.
Definition by ITU
National governments rely on international organizations such as the ITU to define standards, which participating nations then incorporate into national guidelines. While the ITU has the largest international membership of any standard-setting body, it lacks enforcement mechanisms. Additionally, sovereign nations may be unwilling to vote for technically superior standards that are unpopular with domestic providers. As a result, standard-setting efforts could become logjammed.
Definition by International Consortia
This alternative proposes that standards be defined by research consortia responsible for the entire standardization effort. In principle, such a consortium would be responsible for:
• assessing existing technologies
• producing standards alternatives based on needs and technologies available
• deploying testbeds
• evaluating results
• defining standards
• coordinating implementation
For example, in Europe this task has been assigned to the Research and development in Advanced Communications technologies in Europe (RACE) program. Established in 1985 with the main objective of integrating existing technologies and advocating the transition towards more general standards, RACE is a collaborative European research program running from June 1987 to December 1995 (including Phases I and II and an extension). It receives a financial contribution from the European Community of 1,103 MECU, which represents less than 50% of the overall effort, estimated at 2,500 MECU.
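The funding split quoted above can be restated numerically (figures from the paragraph; MECU = million European Currency Units):

```python
# European Community contribution to RACE versus the estimated overall effort,
# both in MECU, as quoted in the text.
ec_contribution = 1103
total_effort = 2500

share = ec_contribution / total_effort
print(f"{share:.1%}")  # 44.1% - consistent with "less than 50% of the overall effort"
```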
The role of international standardization in telecommunications is changing. An increasingly important role for standardization is the reduction of costs through access to a larger global market. The coordination of national standards benefits users by making available a wide range of services and new telecommunications devices, with a greater choice of suppliers. The definition of a standard service gives suppliers greater confidence that substantial investments in research and development will yield returns in this exploding market; this is particularly important in the integration of various levels of service. These benefits must be weighed against the disadvantages: too great an emphasis on standards may inhibit the development of new and imaginative ways of deploying technology, which in turn limits the ability of market forces to choose among emerging products ("Trends of Change in Telecommunications Policy," #13, Organization for Economic Co-operation and Development (OECD), 1987, pp. 166-167). As policy makers work with industry participants to bring PCS to an international market, all of these issues must be carefully balanced.