There is a quiet revolution happening in the world of processor architecture. While most people navigate their digital lives powered by chips running Intel’s x86 or Arm’s architecture, a third contender, one born in a university lab and governed by a Swiss non-profit, is steadily infiltrating everything from smartwatches to data centre servers. Its name is RISC-V, and its rise is one of the most consequential developments in computing since the PC era.

To understand why RISC-V matters, we first need to understand the landscape it is disrupting.
A Tale of Three Architectures
Modern processors speak in instruction set architectures, or ISAs, the fundamental “language” that software uses to communicate with hardware. Think of an ISA as a contract: software written for one ISA will run on any chip implementing it, regardless of the manufacturer. For the past four decades, two ISAs have dominated computing.
x86, developed by Intel in 1978 and later extended by AMD, became the backbone of personal computing and the server industry. It is a Complex Instruction Set Computer (CISC) architecture, one that packs a rich, historically accumulated set of instructions, making it enormously capable and backward-compatible, but also complex and power-hungry. If you are reading this on a Windows PC or a Mac running an Intel chip, you are running x86.
ARM (now Arm Holdings), born in Cambridge in 1983 and refined continuously since, took a different philosophy: Reduced Instruction Set Computer (RISC). Fewer, cleaner instructions, executed with blazing efficiency. This made Arm the undisputed king of mobile: your smartphone, regardless of brand, almost certainly runs an Arm-based chip. Apple’s M-series processors, which displaced Intel in Macs, are also Arm-based. The architecture’s efficiency makes it increasingly attractive for servers too, as seen in Amazon’s Graviton lineup and Ampere’s cloud chips.
Both x86 and ARM are proprietary. To build a chip using Intel’s x86 design, you need a licence from Intel or AMD, a process that is expensive, legally constrained, and subject to geopolitical risk. Arm licenses its architecture to chip designers (Qualcomm, Apple, MediaTek, Samsung, etc.) but ultimately controls the roadmap, the terms, and the prices. In 2022, Arm raised its licensing fees substantially, a move that sent shockwaves through the industry. Both models mean that the fundamental instruction set at the heart of your processor is controlled by a private entity, in a specific country, subject to that country’s trade regulations. RISC-V was designed to break that model entirely.
The Fifth RISC: A Clean Slate
RISC-V (pronounced “risk five”) emerged from the University of California, Berkeley in 2010, led by professors Krste Asanović and David Patterson. The “V” stands for the fifth generation of RISC architectures developed at Berkeley since 1981. It is a clean, modern RISC design: lean base instruction sets, modular extensions, and, crucially, completely open and royalty-free.
Since March 2020, the architecture has been governed by RISC-V International, a Swiss non-profit business association. It freely publishes all documents defining RISC-V and permits unrestricted use of the ISA for the design of software and hardware.
The decision to relocate governance to Switzerland was not accidental. RISC-V International incorporated there to calm concerns that political disruption could undermine the open collaboration model, a move that reflected community sentiment and managed strategic risk for organisations investing in RISC-V for the next 50+ years. With its headquarters in Geneva, a city synonymous with international neutrality and institutional trust, and one I proudly call home, RISC-V International sends a deliberate signal: this architecture belongs to the world, not to any single nation or commercial interest.
RISC-V International has grown to over 4,500 members as of 2025, with the open-source instruction set architecture challenging ARM and x86 dominance. Members include Google, NVIDIA, Qualcomm, Samsung, Western Digital, and hundreds of universities and startups. Only members of RISC-V International can vote to approve changes, and only member organisations may use the trademarked compatibility logo.
Modular by Design, Powerful by Extension
One of RISC-V’s most elegant architectural features is its modularity. The base integer instruction sets, RV32I (32-bit) and RV64I (64-bit), are intentionally minimal and frozen. Designers then add standardised extension letters based on their needs:
- M: Integer multiplication and division
- A: Atomic operations (for multi-core synchronisation)
- F / D: Single- and double-precision floating-point
- C: Compressed instructions (for code density in embedded use)
- H: Hypervisor support (for virtualisation)
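These letters compose into the ISA strings you see in compiler flags and device trees, such as rv64gc, where g is shorthand for the general-purpose combination imafd (plus control-register extensions omitted here for simplicity). The following Python helper is purely illustrative, not part of any official toolchain, but it sketches how such a string decodes:

```python
# Hypothetical helper (not an official tool): decode a simple RISC-V
# ISA string such as "rv64imafdcv" into its base width and extensions.
KNOWN = {
    "i": "base integer",
    "m": "integer multiply/divide",
    "a": "atomic operations",
    "f": "single-precision floating-point",
    "d": "double-precision floating-point",
    "c": "compressed instructions",
    "v": "vector operations",
    "h": "hypervisor",
}

def decode_isa_string(isa: str) -> dict:
    isa = isa.lower()
    if not isa.startswith("rv"):
        raise ValueError("ISA string must start with 'rv'")
    width = int(isa[2:4])          # 32 or 64 follows the "rv" prefix
    letters = isa[4:]
    # "g" abbreviates the general-purpose combination imafd.
    letters = letters.replace("g", "imafd")
    exts = [KNOWN.get(ch, f"unknown ({ch})") for ch in letters]
    return {"width": width, "extensions": exts}

print(decode_isa_string("rv64gcv"))
```

Real ISA strings also carry multi-letter "Z" extensions and version numbers, which this sketch deliberately ignores; the point is simply that a chip’s capabilities are spelled out, letter by letter, in its name.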
And then, perhaps most significantly for our era: V.
The “V” That Could Define the AI Decade
The V extension is RISC-V’s vector processing capability, and it is arguably the most strategically important module in the entire ISA.
Where standard scalar instructions operate on one data element at a time, vector instructions operate on entire arrays of data simultaneously. This is the engine behind AI inference and training: matrix multiplications, dot products, convolutions, the mathematical building blocks of neural networks, are all fundamentally vector operations.
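The difference is easiest to see in a dot product, the primitive inside every neural-network layer. The plain-Python sketch below (not real RVV code; VLEN and the function names are illustrative) models the "strip-mining" pattern the V extension uses: each pass asks the hardware how many elements it can take this round (the vsetvl idiom) and then processes that whole chunk as a single vector operation.

```python
VLEN = 8  # elements per vector register; purely illustrative

def dot_scalar(a, b):
    # Scalar code: one multiply-accumulate per loop iteration.
    acc = 0.0
    for x, y in zip(a, b):
        acc += x * y
    return acc

def dot_vector(a, b):
    # Each while-loop pass models ONE vector instruction sequence:
    # set the vector length, load vl elements, multiply, accumulate.
    acc = 0.0
    i, n = 0, len(a)
    while i < n:
        vl = min(VLEN, n - i)            # vsetvl: elements this pass
        va, vb = a[i:i+vl], b[i:i+vl]    # vector loads
        acc += sum(x * y for x, y in zip(va, vb))  # fused multiply-add + reduce
        i += vl
    return acc

a = [1.0] * 20
b = [2.0] * 20
assert dot_scalar(a, b) == dot_vector(a, b) == 40.0
```

The scalar loop issues twenty multiply-accumulates; the strip-mined version needs only three passes at VLEN = 8. On real silicon each pass is a handful of instructions regardless of the data size, which is exactly why vector hardware dominates AI workloads.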
At the RISC-V Summit in October 2025, SiFive’s senior principal architect John Simpson explained how RISC-V’s open architecture enables implementers to tailor cores for specific AI and machine learning domains. He emphasized that profiles such as RVA23 provide a shared baseline that preserves software portability across vendors, with the ecosystem depending on this balance between customization and standardization to scale without fragmentation.
Data from Epoch AI show that AI workloads have shifted from primarily vector-oriented computation toward large-scale matrix operations, increasing demand for efficient matrix multiplication and broader support for reduced-precision data types. RISC-V’s open extension model means that industry players can standardise new AI-specific instructions collaboratively, without waiting for a proprietary licensor to decide when, or whether, such capabilities serve their commercial interests.
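Why does reduced precision matter so much? A toy sketch, assuming simple symmetric per-tensor quantisation (the scale values and helper names are illustrative, not any particular library’s API): weights and activations are stored as int8, products are accumulated in a wide integer, and only one float rescale happens at the end.

```python
# Illustrative only: int8 storage, int32-style accumulation, float rescale.
def quantize(values, scale):
    # Map floats onto the int8 range [-127, 127].
    return [max(-127, min(127, round(v / scale))) for v in values]

def int8_dot(qa, qb, scale_a, scale_b):
    # Integer multiply-accumulate (what a matrix engine does in hardware),
    # then a single floating-point rescale of the result.
    acc = sum(x * y for x, y in zip(qa, qb))
    return acc * scale_a * scale_b

a = [0.5, -1.25, 2.0]
b = [1.0, 0.75, -0.5]
scale = 0.02
qa, qb = quantize(a, scale), quantize(b, scale)
approx = int8_dot(qa, qb, scale, scale)
exact = sum(x * y for x, y in zip(a, b))
print(round(exact, 4), round(approx, 4))  # close but not identical
```

The answer is slightly off, but each operand takes a quarter of the memory of a 32-bit float and integer multipliers are far cheaper in silicon, which is the trade AI accelerators are built around.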
Semidynamics, a European IP core supplier, recently introduced a RISC-V Tensor Unit that supports streaming workloads, sparse and dense tensor operations, and AI dataflow processing. By embedding vector and tensor capability directly into the CPU, the company is tackling energy efficiency challenges central to AI data centres. The result is that RISC-V is not simply chasing ARM and x86; it is being designed from the ground up for an AI-first computing paradigm.
Certifications, Safety, and Industrial Trust
For RISC-V to move from research curiosity to industrial workhorse, it needs to meet the demanding certification standards that govern safety-critical applications. Progress here has been substantial.
The Fraunhofer Institute for Photonic Microsystems became the first organization to develop a RISC-V core meeting the ISO 26262 automotive functional safety standard, which governs the reliability of electronic systems in road vehicles up to ASIL-D, the highest safety integrity level. Their EMSA5 IP core is a 32-bit processor with a five-stage pipeline available in both general-purpose and safety variants, with the safety variant meeting ISO 26262 automotive standards.
Beyond automotive, the IEC 61508 standard for functional safety of electrical systems and Common Criteria security certifications are being pursued by multiple RISC-V IP vendors. The RISC-V International Compliance Task Group maintains a public repository ensuring interoperability testing, while the Linux Foundation and RISC-V International jointly launched the RISC-V Foundational Associate (RVFA) certification exam to test foundational knowledge of the ISA and validate the growing talent pipeline the industry requires.
Progress in 2024–2025: The Ecosystem Matures
The last eighteen months have been transformative for RISC-V’s software ecosystem, which has historically been the architecture’s weakest link. A key theme for 2024 was optimising software to harness RISC-V vector instructions, with the RISE Project (RISC-V Software Ecosystem, a Linux Foundation initiative) coordinating work across QEMU support for ACPI, RISC-V IOMMU, Linux kernel interrupt drivers, and vector support for debugging and profiling clients.
By February 2025, significant progress had been made in supporting Android on RISC-V and introducing major AI and machine learning libraries into compatible environments. On the silicon front, the headlines have been striking. In March 2025, Alibaba’s DAMO Academy launched the server-grade Xuantie C930 core, which supports the next-generation RVA23 profile required by Ubuntu Linux from October 2025, and was advertised as ideal for servers, personal computers, and autonomous cars.
In January 2025, SpacemiT announced a server processor with up to 64 RISC-V cores, the VitalStone V100, manufactured on a 12nm-class process. These are not prototype chips; they represent serious, production-grade engineering ambition. Industry analysts noted at the RISC-V Summit North America in October 2025 that the performance gap between high-end Arm and RISC-V CPU cores is narrowing rapidly, with near parity projected by end of 2026.
Real Chips, Real Deployments
RISC-V is no longer a theoretical architecture. Here is a sample of chips and products actively using it today:
NVIDIA has been one of the quiet pioneers, folding RISC-V cores into its GPUs and SoCs and expecting to ship a billion RISC-V cores across its products. These cores handle security, microcontroller, and management tasks within NVIDIA’s silicon.
Espressif Systems (the makers of the wildly popular ESP32 family) added a RISC-V ultra-low-power coprocessor to the ESP32-S2, then launched the ESP32-C3, a single-core, 32-bit, RISC-V-based MCU, now found in millions of IoT devices worldwide.
SiFive remains the most prominent pure-play RISC-V IP company. The SiFive Intelligence X390 processor is designed for edge AI and machine learning, featuring dual vector ALUs and a doubled vector length, delivering a fourfold improvement in vector computation and a quadrupling of sustained data bandwidth.
SpacemiT’s K1 chip, an octa-core 64-bit RISC-V SoC, already powers the Muse Book laptop and DeepComputing’s portable computers, making RISC-V laptops a commercial reality, not just a concept.
Tenstorrent, the AI chip startup co-founded by legendary computer architect Jim Keller, builds high-performance AI processors using RISC-V CPU cores and chiplet architectures, focusing on scalable compute from edge to data centre.
Raspberry Pi also ships products incorporating RISC-V microcontrollers alongside their Arm-based cores, reflecting the architecture’s growing role in hobbyist and educational hardware.
The Road Ahead: Mobile, Desktop, and Server
The three battlegrounds for RISC-V’s mainstream ambitions each present distinct challenges and opportunities.
Mobile is perhaps the most contested space. Arm’s grip on smartphone SoCs is near-absolute, underpinned by decades of software optimisation, vast IP libraries, and deeply entrenched supply chains. RISC-V is still maturing in Android support, but growing confidence from developers signals this is a medium-term target rather than a distant aspiration. The combination of V extension AI acceleration and the architecture’s power efficiency story makes it a credible future contender, particularly in lower-end segments and wearables where proprietary licensing costs bite hardest.
Desktop is the boldest frontier. SpacemiT’s laptops prove the concept works, but performance parity with Apple Silicon or AMD’s Zen architecture remains elusive. The RVA23 profile, which mandates a broad set of extensions including vector support, and converging toolchain quality suggest that credible desktop chips could emerge by 2027–2028.
Server is where the near-term momentum is strongest. Senior Alibaba Cloud executives predicted at a March 2025 conference that RISC-V would become a mainstream cloud architecture as early as 2030. The economics are compelling: cloud providers operating at hyperscale are acutely sensitive to licensing costs and eager for architectural diversity that reduces supplier dependence. Custom RISC-V server silicon, optimised for specific workloads like AI inference or storage, is already a serious engineering and commercial effort.
Why This Matters Beyond Technology
There is a dimension to RISC-V’s rise that goes beyond megahertz and benchmark scores. In an era of semiconductor geopolitics, export controls, supply chain fragility, and the weaponisation of technology standards, an open, neutral, Swiss-governed architecture is not just an engineering choice. It is a strategic one.
China has embraced RISC-V as a strategic priority in response to U.S. and allied export controls on advanced semiconductors, offering a license-free, open alternative that organizations can use without direct dependency on Western IP. But RISC-V’s openness equally empowers European chip designers, Indian government initiatives, African universities, and Silicon Valley startups. It democratizes access to processor design in a way that no proprietary architecture can.
The V in RISC-V officially stands for the fifth iteration. But in practice, in the AI era, on the floors of chip fabs, in the roadmaps of every serious semiconductor company, it stands for something else: Vector. Versatility. Vision.
The open architecture revolution is not coming. It is already here.