My posts reflect my personal experiences and conclusions, grounded in my observations and supporting evidence. I also recognize the value of diverse perspectives in understanding standards and SEP-related issues. To enrich the conversation, I have decided to host occasional guest posts that offer insights and reflections from individuals with unique expertise or interesting viewpoints.
My first guest post is from Earl Nied. Earl writes his own posts (see Earl Nied | LinkedIn) but has graciously consented to write one for SEPEssentials. Earl was a member of ETSI for nearly twenty years, a member of the DVB Project for about fourteen years, and has been Vice-Chair and Chair of the American National Standards Institute’s (ANSI) Intellectual Property Rights Policy Committee. He spent more than 40 years at Intel, where he was involved in Intel’s standards and IPR-related work. His standards-related experience is deep and vast.
So, I will turn from my posts on why the F/RAND commitment is important to prevent anti-competitive behavior to Earl's guest post on how the F/RAND commitment can increase a contributing company's bottom line. (The U.S. typically uses the term RAND, whereas Europe typically uses FRAND; for purposes of this article, RAND and FRAND are used interchangeably to mean the same thing.) What follows is Earl's description of his "personal journey" with standardization and the benefits of licensing on reasonable terms (and what happens when a commitment to license on reasonable terms is withheld). Earl has witnessed first-hand how standards and a commitment to license on reasonable terms can lead to the development of an entire industry.
The Importance of RAND: A Personal Journey
By Earl Nied
Chapter 1: The Foundational Years
My journey into technology and standards-related issues began in 1972 during my freshman year of college. By the mid-1970s, I was working in a computer center filled with giant machines and the hum of fans, magnetic tape, paper card readers, and the staccato of printers, all cooled with a massive air conditioning system. One day, I witnessed something simple that left an indelible mark on me: individual bits on magnetic tape made visible to the naked eye under a thin solution of rust particles suspended in water. I watched those tiny bits—just 64 per square inch—come to life. At that moment, something inside me clicked. These days, we are way beyond that simple demonstration, considering the five terabits per square inch we can store today (more than 78 billion times denser). But it sparked a fascination that would shape the rest of my life. It ignited my passion for understanding how things worked and, more importantly, how I could build something of my own.
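For readers who want to check that comparison, here is a quick back-of-the-envelope sketch in Python (the 64-bit-per-square-inch and five-terabit figures are the ones cited above):

```python
# Back-of-the-envelope check of the storage-density comparison above.
bits_per_sq_inch_1970s = 64    # bits visible on tape under the rust solution
bits_per_sq_inch_today = 5e12  # roughly five terabits per square inch

ratio = bits_per_sq_inch_today / bits_per_sq_inch_1970s
print(f"About {ratio:.2e} times denser")  # ~7.81e+10, i.e. more than 78 billion
```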
By 1975, my curiosity had evolved into a full-fledged obsession, and I embarked on one of my proudest projects: designing and building my own parallel-processor computer system. This project wasn’t just about assembling parts—it was about diving into the unknown. The system evolved to include a custom floppy disk controller, memory, an operating system, and various peripherals, like a display terminal and printer. I remember staying up late with the steady hum of electronics in the background. I carefully soldered and wire-wrapped connections, testing and debugging the result multiple times until it worked.
One of the biggest challenges was the memory circuits. At first, I built a 2-kilobit memory using 20 integrated circuits and 400 wiring connections, an intricate, painstaking process. But as I progressed, I realized I was fighting an uphill battle. Instead of creating it all myself, I could use existing memory boards designed for the Altair 8800 microcomputer. They were cost-effective and built for the S-100 bus standard (IEEE 696-1983). I added two of these memory boards to my custom design, expanding my system's memory by 16,384 bytes (0.016 MB), a 250% increase. That moment was a revelation: I didn't need to reinvent every aspect of the system. I could focus on designing the more complex and leading-edge aspects and use existing subsystems already honed by the expertise of others.
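For readers keeping track of the units, here is a small illustrative sketch in Python; the assumption that each Altair board held 8 kilobytes is inferred from the 16,384-byte total for two boards:

```python
# Unit check for the memory figures in this chapter.
hand_built_bits = 2 * 1024            # the hand-built 2-kilobit memory
print(hand_built_bits // 8, "bytes")  # 256 bytes from 20 ICs and 400 connections

board_bytes = 8 * 1024  # assumed size of one Altair S-100 board (16,384 / 2)
added = 2 * board_bytes # two boards added to the custom design
print(added, "bytes =", added / 2**20, "MB")  # 16384 bytes = 0.015625 MB (~0.016)
```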
A few years later, I used my homemade microprocessor system and terminal (by then with 20 kilobytes of memory) to demonstrate a proof-of-concept interactive program that recreated the functionality of a return-on-investment analysis program running on a small IBM mainframe computer at a commercial real estate firm. Thinking back, the scale of it all is humbling. My system operated on an 8-bit processor with just 20 KB of memory, smaller than a single email attachment today. A modern smartphone uses a multicore 64-bit processor with about 8 GB of memory, a mind-boggling difference. But at the time, my system felt monumental, a testament to what could be achieved with curiosity and persistence and the judicious use of components made by other people with different expertise than mine.
Chapter 2: Intel’s Early Support for Standards
I joined Intel in 1982. By then, Intel had already built a reputation for pushing the limits of what was possible in memory chip manufacturing and was poised to revolutionize the computing world through microprocessors. I arrived at Intel just as the pace of innovation was accelerating rapidly. What had started as a company built around memory chips quickly evolved into a force driving the future of computing.
At the heart of this evolution was Moore's Law, the observation made by Intel's co-founder, Gordon Moore, that the number of transistors on integrated circuits doubles approximately every two years. To put this into perspective, imagine starting with a 1908 Ford Model-T with a 20-horsepower engine. If engine power followed the same exponential growth as Moore's Law, by 1918, just ten years later, the Model-T would be roaring with 640 horsepower, showcasing the incredible speed of technological advancement in computing, a pace unparalleled in other industries [note from Marta: see my post Homage to the Semiconductor Chip for additional information]. What might have seemed like a simple prediction was turning into a self-fulfilling prophecy that drove Intel's relentless march forward. With each generation of microprocessors, Intel wasn't just making chips faster; it was completely redefining what computers could do.
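To make the analogy concrete, here is a minimal sketch in Python of horsepower growing at a Moore's-Law pace, assuming the commonly cited two-year doubling period and the Model-T's 20-horsepower starting point:

```python
# Illustrative sketch: a 1908 Model-T engine growing at Moore's-Law pace,
# i.e. doubling roughly every two years.
base_hp = 20        # 1908 Ford Model-T, in horsepower
doubling_years = 2  # Moore's observation: counts double ~every two years

for year in (1908, 1913, 1918, 1928):
    hp = base_hp * 2 ** ((year - 1908) / doubling_years)
    print(f"{year}: ~{hp:,.0f} hp")
# 1908: ~20 hp | 1913: ~113 hp | 1918: ~640 hp | 1928: ~20,480 hp
```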
But there was a problem. The microprocessor, impressive in its exponential evolution, needed a whole ecosystem of equally sophisticated supporting systems to realize its full potential. Just as I had learned in my earlier homemade project, Intel's leadership understood that the company couldn't effectively build the entire ecosystem alone. So Intel made one of its smartest strategic moves: embracing and driving the development of industry standards. Intel understood that collaborating on interface technology and developing and using standard protocols were essential for progress.
In those early days, Intel, like many others, contributed its own technology for standardization. Intel shared its Multibus technology with the IEEE for standardization purposes, and the IEEE adopted it as IEEE 796.
By contributing its technology for standardization, Intel gained a lot. The standard allowed complex device circuitry (disk and memory controllers, etc.) to share a bus with the computing circuitry (CPU, RAM, ROM, clocks, etc.), which resulted in industrial controller applications and workstations (such as those from Sun, HP, and Silicon Graphics) that used Intel chips. I saw in those early days that companies would contribute to interface standards because standardization allowed them to sell more products for more applications than they would achieve if they had to build out all the supporting technology on their own. Intel's reward for its contribution to the standard was the creation of an entire ecosystem of products that relied upon Intel's chips, something that Intel could not have done on its own.
Chapter 3: Without Standards, We Would Not Have Had the Desktop Computer Revolution
In those days, IBM was a titan in the computing world, with its powerful but proprietary mainframes. IBM dominated the computing landscape for decades. In the early 1980s, the company set its sights on a new frontier: personal computers (“PCs”). When IBM launched its first PC, it commercialized a revolutionary new design — a motherboard containing the microprocessor and a “PC” bus that allowed plugging in specialized circuitry for peripheral devices. This simple but effective design laid the foundation for a computing revolution.
As the demand for personal computers grew, so did the need for more powerful systems. Initially, the PC bus IBM designed was known as the ISA (Industry Standard Architecture) bus. While no standard-setting organization was involved, it was popularized by many manufacturers, who then built special circuits for IBM PCs, PC clones, and alternative microprocessor-based systems. Most implementers accepted IBM’s license royalty structure to use the ISA bus technology. This willingness of IBM to open its proprietary technology to others and license it on generally acceptable terms helped propel the ISA bus forward, supporting the development of the entire PC industry.
In 1987, IBM attempted to step away from its open ISA technology and replace it with a more proprietary version. While IBM's Micro Channel Architecture (MCA) offered superior performance to the ISA bus, the market rejected it because it was an expensive, proprietary, closed system. In a way, this marked a critical moment in the history of computing: open and interoperable standardized technology available on RAND terms would prove far more valuable than technically superior but proprietary interface solutions.
As an employee of Intel, I had a front-row seat to the PC industry’s rapid evolution. Intel’s microprocessors were advancing at a breathtaking pace, following the curve of Moore’s Law. But there was a growing problem. Even when the ISA bus was extended (EISA), it couldn’t keep up, and very few companies supported the use of IBM’s proprietary MCA. No matter how powerful Intel’s processors became, they were bottlenecked by outdated bus architectures.
Intel grappled with an age-old problem: modern, complex products depend on supporting technologies and products. Components of modern products do not exist in a vacuum. That same aging bus dragged down the performance of Intel's next-generation processors. Intel needed a new bus solution to unleash its microprocessors' full potential. Intel eventually developed its own proprietary bus solution: the Peripheral Component Interconnect (PCI) bus. This new architecture could finally provide its processors the bandwidth they needed to achieve full speed and performance.
But here is where things got interesting. Intel knew that for PCI to succeed, it had to be widely adopted—and quickly. So, Intel made a bold choice. Instead of keeping PCI as a proprietary technology, Intel partnered with other major players in the PC industry, including IBM, Compaq, Dell, HP, and others. Together, these competitors created a loose consortium to develop and manage the “standard,” intending to make PCI the new backbone of personal computing.
But the industry was wary. Would Intel demand significant royalties for this new technology, as IBM had done with the MCA bus? That did not happen, however, because Intel recognized that the goal was not to lock down the market with royalties. Instead, the goal was to improve the entire ecosystem to fully utilize the power of Intel’s processors and increase overall sales. So, Intel made PCI available under a limited, royalty-free license to reassure other manufacturers and encourage rapid adoption. While other founders did not make the same royalty-free commitment, few (except IBM) sought royalties.
Ultimately, the PCI group's members operated with a complex mixture of licensing strategies. But they worked almost universally within a framework that was, by most accounts, reasonable and nondiscriminatory in spirit and practice. The result was that PCI quickly overtook older standards like ISA/EISA and IBM's MCA, becoming the dominant bus architecture for the next generation of computers.
Although I wasn’t directly involved in Intel’s early PCI decision-making, I thought Intel’s choices were logical and eminently reasonable. Looking back, it is clear that PCI’s story wasn’t just about a technical breakthrough—it was about a strategic vision for how open standards could fuel innovation. Intel did not just build a faster bus; Intel opened the bus technology up, licensed it on reasonable terms, and thus played a pivotal role in creating a more robust, more dynamic, and larger industry for itself and others.
Chapter 4: The Importance of the RAND Commitment to the Continuing Success of the PCI Standard (and its successors)
By 2000, PCI had proven so successful that the companies involved created a formal standards organization, the PCI Special Interest Group (PCI-SIG), with a formalized definition of RAND. The PCI-SIG continues to this day, overseeing the evolution and continuing development of the PCI standard, which has since transformed into PCI Express. Now in its seventh generation, PCI Express remains a critical component of modern computing, powering everything from desktops to data centers.
The PCI example highlighted that companies were generally willing to collaborate on interface standards and implement those standards provided that the competitive landscape remained balanced. This neutrality encouraged nearly all PCI-SIG members to actively contribute to the standard, drawn by the potential to participate in a growing market for complementary products. Participants recognized the strategic advantage of working with competitors to establish a unified interface standard while differentiating through their product implementations.
Standard-essential patents (SEPs) were involved, but they did not deter implementation because of the RAND (Reasonable and Nondiscriminatory) commitment. That commitment was foundational to the successful implementation of the standard: it established a framework in which SEP holders agreed to license their patents fairly, ensuring that access to the standard was not impeded.
The PCI SIG primarily attracted manufacturing companies invested in the success of the standard, as their core revenues relied on product and industry growth. While one member sought limited royalties, most participants (including Intel) generally refrained from royalty demands except as part of a defensive strategy. Most members holding patents leveraged them only for cross-licensing negotiations and other defensive purposes.
I saw first-hand how the RAND licensing commitment encouraged broad-based market expansion with minimal licensing disputes, allowing for the rapid growth and widespread adoption of an entirely new ecosystem of products.
Chapter 5: Rambus and the JEDEC Standard
Unlike the smooth progress of the PCI standard, the development of JEDEC's memory standards faced disruption when a member company, Rambus, concealed a patent essential to the standard. Once the standard was finalized, Rambus refused to license the patent on RAND terms, leading to over a decade of litigation. Rambus sought to exploit the JEDEC IPR policy, deceiving and disadvantaging other JEDEC members. The resulting litigation, including actions by the FTC, the European Commission, and individual parties, proved costly not only for Rambus but also for JEDEC, memory chip manufacturers, and OEMs. This experience led SSOs to enhance their IPR policies to reduce the risk of misinterpretation of the RAND commitment and to prevent potential abuses during standards development. Although Rambus continued to innovate in memory technology after leaving JEDEC, its extensive record of patent litigation produced mixed results. Unlike the smooth industry experience of the PCI SIG, the abuse of the RAND commitment led to a litigation-intensive environment in the memory space.
Chapter 6: The Critical Role of RAND in PC Development
Thanks to the computer industry's proactive efforts, RAND-based standards have become ubiquitous. A 2010 study led by Professor Brad Biddle found 251 standards in a single laptop, with 75% of them developed under RAND commitments. The study highlights the critical role of RAND obligations in enabling the interoperable, complex technical products that support entire industries.
The RAND commitment has become a cornerstone of industry collaboration, fostering innovation while enabling flexible licensing of intellectual property in the IT sector. The information and communications technology (ICT) sector also largely relies on RAND, though some key contributors seek royalties within this framework. Nonetheless, a version of RAND—RF-RAND, which includes a royalty-free component—has proven particularly effective in certain sectors, such as Internet protocols and specific hardware standards like USB. RAND and its derivatives continue to drive growth and ensure interoperability across diverse industry sectors.
Conclusion:
If followed in spirit, RAND provides flexibility for licensing negotiations between diverse parties, supporting fair competition and industry-wide economic growth. When RAND is applied through good-faith negotiations, standards developers can cooperate to create the rules and protocols necessary for workable interfaces and standards between products while promoting competition on product implementations. Rambus's abuse threatened to destroy the trust underpinning that cooperation, undermining other companies' willingness to participate in the standards-development process.
To continue to grow our industries, we must remain diligent, learn from these types of practical experiences, and carry these critical lessons forward into future SSO IP policies. If everyone operates in good faith to develop standards, declares relevant patents, and honors commitments to license on RAND terms, we can achieve a robust and competitive future.