[–] barsoap@lemm.ee 2 points 1 year ago* (last edited 1 year ago)

Microcoding has been a thing since the 1950s; it's the default. Early RISCs tried to get away without it, and for a brief time RISCs weren't microcoded kinda by definition, but it snuck back in because microcode is just too useful to give up: hard-wiring everything isn't worth it. You can maybe get away without it on MIPS, but Arm? Tough luck. A hard-wired RISC-V can be done, and it can make microcontroller-scale chips simpler, but you can also implement the full RV32I instruction set in terms of RVC (the compressed subset) and be faster. Not to mention that once you get to things like the vector extensions, you definitely want microcode. The Cray-1 was hard-wired, but Cray, too, dropped that approach for a reason.
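
To make the vector-extension point concrete, here's a minimal sketch in C of what microcoding buys you: a decoder expands one architecturally complex instruction (a vector add) into a sequence of simple internal micro-ops. The micro-op format, the `expand_vadd` helper, and the register numbering are all invented for illustration; this is not real RISC-V microarchitecture.

```c
#include <stdio.h>

/* Toy micro-op: a simple scalar operation the hard-wired core understands.
 * In the "RISC" model discussed above, these could themselves be ordinary
 * ISA instructions; the format here is made up. */
typedef struct {
    enum { UOP_LOAD, UOP_ADD, UOP_STORE } op;
    int dst, src1, src2;   /* register / element indices */
} uop_t;

/* Expand a hypothetical "vadd.vv vd, vs1, vs2" over `vlen` elements into a
 * microcode sequence: load, load, add, store, repeated per element.
 * A real vector unit would use sequencer state, not a flat array. */
static int expand_vadd(uop_t *out, int vd, int vs1, int vs2, int vlen) {
    int n = 0;
    for (int i = 0; i < vlen; i++) {
        out[n++] = (uop_t){ UOP_LOAD,  1, vs1, i };  /* t1 = vs1[i]   */
        out[n++] = (uop_t){ UOP_LOAD,  2, vs2, i };  /* t2 = vs2[i]   */
        out[n++] = (uop_t){ UOP_ADD,   3, 1,   2 };  /* t3 = t1 + t2  */
        out[n++] = (uop_t){ UOP_STORE, vd, 3,  i };  /* vd[i] = t3    */
    }
    return n;  /* micro-ops emitted for one ISA instruction */
}

int main(void) {
    uop_t seq[64];
    int n = expand_vadd(seq, /*vd=*/8, /*vs1=*/9, /*vs2=*/10, /*vlen=*/4);
    printf("1 vector instruction -> %d micro-ops\n", n);  /* prints 16 */
    return 0;
}
```

One instruction fans out into sixteen micro-ops here; hard-wiring that expansion for every vector op (and every vector length) is exactly the kind of thing nobody wants to do without a microcode sequencer.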

I guess these days RISC more or less means "a decent chunk of the instruction set is not microcoded but can itself be used as microcode", whereas on modern CISC processors the instruction set and the microcode may have no direct correspondence at all.
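
A rough way to picture that distinction (entirely schematic; neither struct is modeled on any real core): under the "RISC" reading, a microcode ROM entry is just an ordinary architectural instruction word fed back through the normal decoder, while a CISC microcode entry is a private internal control word with its own fields.

```c
/* "Modern RISC" reading: microcode entries ARE ISA instructions, so the
 * sequencer simply replays architecturally valid instruction words. */
typedef struct {
    unsigned isa_encoding;    /* a valid architectural instruction word */
} risc_ucode_entry;

/* Modern CISC: an internal control word with no one-to-one mapping back
 * to any architectural instruction. All field names and widths invented. */
typedef struct {
    unsigned alu_ctl  : 6;   /* ALU operation select                    */
    unsigned src_sel  : 4;   /* internal, non-architectural source regs */
    unsigned dst_sel  : 4;
    unsigned mem_ctl  : 3;   /* load/store unit control                 */
    unsigned seq_next : 12;  /* next microcode ROM address              */
} cisc_ucode_entry;
```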