Two of the world’s most important telecommunications standards - cellular and WiFi - have gone from complementary coexistence to escalating competition with each other. At the same time, the participants in cellular and WiFi standard development, the technology itself, and the applications of the technology have begun converging. At stake is the lucrative IoT market – a winner-take-all market that does not need the same bells and whistles as smartphones or even connected cars. This competition seems to be bringing out the worst in some of the participants in standard development and in some SEP holders. Enforcement of existing laws, and contemplation of new ones, may be necessary for this new modality.
As the technological differences that have characterized different modes of communication disappear, new regulations and policies will be needed that are focused more on services and industry/market structure than on technology.[1]
Here is the first post in what I anticipate will be an occasional series on these issues. This post focuses on the history of WiFi and cellular technologies and standards. I have previously posted about some of the early days of WiFi and cellular (Is SEP Licensing Necessary to Encourage SEP Development - Part 1 (sepessentials.com)) but from a somewhat different perspective. I thought it would be useful to add some additional information to my previous posts for context.
WiFi
The origins of what became WiFi (the IEEE’s 802.11 standard) go back to the mid-1980s, when the U.S. Federal Communications Commission released two bands (900 and 2400 MHz) for unlicensed use. Several companies began making wireless devices in the unlicensed bands for bar code scanning and inventory management.[2] By the mid-1990s, at least one company, Aironet, was already making wireless access points that looked remarkably like today’s 802.11-based access points, but using its own ARLAN technology. Other companies also were making their own proprietary wireless devices for various applications.
At the same time, starting in 1990, the IEEE set up a working group to explore establishing a standard for “wireless ethernet.” The first official meeting recorded in the IEEE’s meeting minutes took place in October of 1990,[3] but the process of forming the working group and reviewing technologies had begun several years before that meeting.[4] The 802.11 Working Group’s chairman for its first decade or so was the legendary “father of Wi-Fi” Vic Hayes, who was employed by NCR (i.e., National Cash Register) at the time the working group was formed.[5]
As Mr. Hayes describes, NCR was looking to create a standardized technology to use for wireless cash registers. The stated purpose of the new IEEE standard was therefore, not surprisingly, to “provide wireless connectivity to automatic machinery, stations that require rapid deployment, which are portable, or hand-held or which are mounted on moving vehicles within a local area.”[3] In other words, Wi-Fi's intended use was primarily for what we would now call IoT devices, e.g. bar code scanners, point-of-sale devices and other types of “automatic machinery,” and also for connected cars.[6]
NCR was acquired by AT&T in 1991, shortly after the working group was formed. With the acquisition, Vic Hayes moved over to AT&T, but he remained the chair and driving force behind the creation of what became known as “WiFi” in 1999, with the release of 802.11a and 802.11b and the formation of the Wi-Fi Alliance. Most of the early wireless companies (i.e., those that already had solutions in the unlicensed bands prior to standardization) participated in the development of 802.11, including Aironet, Harris, the WaveLAN companies (NCR, which developed the WaveLAN technology, and AT&T, which acquired that technology when it acquired NCR) and others. In the process, most of them contributed or abandoned their proprietary solutions in favor of standardization.
In those early days, the WiFi developers had some interest in cellular technologies but viewed them as fundamentally different. The minutes of the first plenary 802.11 working group meeting note that ETSI was working on what would later become one of the cellular standards. The group concluded that the ETSI “radio has peculiar characteristics . . . It might be wise to watch what they do.” The minutes also noted that what ETSI was trying to develop was very different from what the working group was trying to develop: ETSI was aiming for a standard “optimized for voice transmission,” not one that would carry both voice and data.[7]
Cellular
Cellular was developed as a mechanism for mobile telephone calls. The history of cellular is much more convoluted than WiFi because, for many years, there were several different, mostly national cellular “standards”.
ETSI was only established in 1988 (just about the same time what became the 802.11 working group began to form) with the goal of creating a European standard for 2G cellular communications.[8] By the early 1990s, ETSI had adopted GSM as its baseline technology and the European version of 2G was launched in 1991.
The Japanese (who are credited with deploying the first 1G cellular network) had their own version of 2G that did not interoperate with the European version. Even within the United States, different cellular telephone companies and handset suppliers used different, non-interoperable baseline 2G technologies (GSM and CDMA). 2G technologies were essentially voice-only. They did not really carry what we would call data: does anyone still remember how you used to have to text using the letters on the number pad because cell phones only handled SMS texts?
3G was released in 2001, several years after the release of the first successful Wi-Fi standards, 802.11a and 802.11b in 1999. In its initial incarnation, it was still all over the map with different countries using different underlying technologies (CDMA2000 vs. W-CDMA) or the same underlying technology but different radio interfaces.[9]
3G is often credited with being the first cellular standard to truly carry data. But, speaking as someone who lives in a country where our networks often dial you back down to 3G, I can’t say that I find that to be true. I cannot surf the web or download information on my phone when it is in 3G mode. Real data carrying in the modern sense did not emerge until the implementation of 4G/LTE.
But what early cellular technologies did do was allow people to make telephone calls on a mobile device. With a cell phone, you could be out and about and still dial a telephone number and make a telephone call. You were no longer locked into using a land-line, circuit-switched telephone to make calls. That was the original purpose of cell phones and of the various cellular communication standards.
Early Co-Existence
In short, the WiFi and cellular standards were designed—at least in their original incarnations—to carry different types of information (data vs voice) using different protocols (IP vs cellular) for use on different devices (IoT devices and laptops vs phones). They started out as being complementary technologies and complementary standards. Cellular allowed people to make telephone calls when they were outside of a house, office, retail facility or manufacturing plant. Wi-Fi allowed people to access data when they were in a house, office, retail facility or manufacturing plant.
The two sets of standards were largely created by different companies for different applications. Although there was some discussion in the very early years about whether they would compete, because of these differences, they coexisted well for many years. This began to change, however, about when 4G/LTE came into play. In my next post in this occasional series, I’ll explain how and why things changed, a change that has only accelerated in the last few years.
[1] One of the best parts of writing this blog is doing research that sometimes results in finding hidden, forgotten gems tucked away in random parts of the Internet. I came across this fascinating paper put out by the U.S. Office of Technology Assessment in September 1995 about wireless technologies which can be found here: Wireless Technologies and the National Information Infrastructure (princeton.edu). It has an interesting discussion of the many potential future uses of wireless technology (with lots of prescient examples) and the additional technologies that needed to be developed or improved to make wireless work better (battery and improved spread spectrum technologies for example). It also discusses the competition issues that impact the future of wireless and the entirety of the U.S.’ “evolving telecommunications and information infrastructure-more formally known as the National Information Infrastructure (NII).”
[2] This history draws on information from another WiFi legend, Fred Niehaus: TBS - 08 – Fred Niehaus – The History of WiFi (youtube.com) (as well as other sources). This one is interesting too: TBS - 09 – Fred Niehaus – The History of WiFi Part 2 (youtube.com).
[4] Home Page (ieee802.org): “Before the IEEE 802.11 Working Group was formed, an IEEE 802 Task Group - Task Group 4L - met to consider the requirements, technological capability and how to serve these with an IEEE 802 standard. The group met for roughly four years and ended up writing a project authorization request (PAR) for the original 802.11 Working Group.”
[5] Watch Conversation with the father of WiFi, Mr Vic Hayes - YouTube; see also Living Legend: Vic Hayes | Network World and Victor Hayes - IEEE Computer Society.
[6] A May 1994 Byte magazine article characterized the 900 and 2400 MHz bands (which is what the original proprietary wireless solutions operated in) as being for “Industrial” uses. Byte May 1994 (vintageapple.org) at 108.
[9] See, for example, 3G Standards: the battle between WCDMA and CDMA2000 | Anders Henten - Academia.edu