mofosyne
This is about unstable cores that randomly output incorrect calculations, and ways to mitigate that via better hardware testing and by duplicating the parts of the core that fail most often.

I did, however, initially think from the title that it was about 1-bit CPUs like the MC14500B Industrial Control Unit (ICU), a CMOS one-bit microprocessor designed by Motorola in 1977 for simple control applications. It completely lacks an ALU, so it essentially cannot count, but it was designed for PLCs.

freeqaz
Unrelated to the topic being discussed, but my mind immediately went to "per core pricing", which is common for databases. Some SQL servers are licensed by the number of CPU cores in a system, and manufacturers would often offer an SKU with fewer, faster cores to compensate for this.

Taking that thought further, the idea of adding "silent" cores is interesting to me. What if your CPU core were actually backed by multiple cores, to get the "fastest" speed possible? For example, imagine two CPU cores that appeared as one, where each core guesses the opposite branch of the other (branch prediction), so that one of them is "right" more of the time.
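(A minimal sketch of that idea in Python, treating two threads as the two hypothetical "silent" cores that each speculate down one side of a branch; the function names are mine, not from any real CPU API. In hardware terms this resembles eager/multi-path execution rather than prediction per se.)

```python
import threading

def eager_branch(cond_fn, then_fn, else_fn):
    """Toy model of two 'silent' cores: each starts down one side of a
    branch before the condition is known; the correct side's result is
    kept and the other core's work is thrown away."""
    results = {}

    def run(name, fn):
        results[name] = fn()  # each "core" speculates down one path

    t_then = threading.Thread(target=run, args=("then", then_fn))
    t_else = threading.Thread(target=run, args=("else", else_fn))
    t_then.start()
    t_else.start()

    taken = cond_fn()  # the branch condition resolves in parallel
    t_then.join()
    t_else.join()

    # Keep only the result from the path actually taken.
    return results["then"] if taken else results["else"]

# Both paths get computed; only one result survives.
print(eager_branch(lambda: 3 > 2, lambda: "fast path", lambda: "slow path"))
```

The cost is exactly the inefficiency noted below: one core's work is always wasted, which is why real designs mostly prefer prediction over brute-force duplication.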

An interesting thought that had never occurred to me. It's horribly inefficient, but for constrained cases where peak performance is all that matters, I wonder if this style of thinking would help. ("Competitive Code Execution"?)

bla3
[2021]