pyth0
I believe the main reason for the focus on these earlier decades is the openness of both the hardware and the software at the time. Manuals published by computer manufacturers contained detailed schematics of every circuit and assembly listings of the BIOS and system programs. As a hobbyist I can build my own SBC from those schematics and probe physical pins on the chips to debug the board. As things became smaller and more integrated, chips started bundling more functionality behind closed-source firmware, and integrating them into your own designs became increasingly difficult.
SAI_Peregrinus
Others have answered why not later, but why not earlier is also a question. Earlier computers were a lot more work. CPUs weren't single chips; they were collections of dozens or hundreds of ICs. Before that they were thousands of individual transistors; Ben Eater has a YouTube series on designing and building such a machine. Before that came vacuum tubes, of types that are no longer produced and are rarely found in working condition. Before that came electromechanical systems with machined cams, cogs, wheels, lots of relays, etc. So the earlier in computing you go, the more expensive and difficult it gets.

Early 80s parts are a bit of a sweet spot combination of low price, enough challenge to be interesting but not so much as to be frustratingly difficult, and an interesting turning point in computing history as the first single-IC CPUs became popular.

jmclnx
I think it is a form of nostalgia. But the early 80s was also a fast-changing time for tech, with new and much more powerful hardware coming out almost every day.

Plus, tech was not controlled by large corporations the way it is now. Back then there were many small hardware/software vendors competing and, in many cases, helping each other advance. Now everything is vanilla and controlled by large corps.

Me, I miss the days before GUIs. I always thought plain-text interfaces were, and still are, better in many cases than point-and-click.

PaulHoule
Computers haven’t really changed much since the 1990s, I mean I have been running Linux since 1994. Even in the late 1970s the DEC VAX had an architecture basically similar to modern computers.

What else could you be into? Computers before the IBM 360 sucked. Personally I have some nostalgia for the 1970s PDP-11 and PDP-10, but not enough to really put a lot of time into setting up emulators. (The PDP-11 has the awkward limitation of a 16-bit user address space; running RSTS/E you basically get a BASIC environment, or something like a CP/M environment, which is just slightly better than what you get with a 16-bit micro. I've used installations where 20 people could enjoy that at once, but as an individual hobbyist you might as well use or emulate a micro. Those GiGi terminals beat the pants off the Apple ][ and such, though.)

Mass producing 8-bit micro hardware has tough economics today (can’t beat RasPi) but it is possible to make a computer of that era with mainly or entirely discrete components. (Even a 6502 or display controller!). Forget about making an IBM PC clone or an Amiga. (An FPGA can get there if you have enough $$)

If I weren't already overloaded with side projects I'd like to deeply explore the 24-bit world of the eZ80: like the old 360, it's a place where micros never really developed a comfortable ecosystem. You could have more memory than the IBM PC, and memory that is easier to code for; back in the day you could run Linux with X Windows, or Win 95, in 16 MB of memory, so such a system could kick ass. The AgonLight 2 has a programmable display controller based on the ESP32, and it should be very possible to build a more capable sprite-based graphics system than most contemporary game consoles had, maybe even as good as the coin-op boards that ran games like Street Fighter.

Ekaros
I think it's because the 90s, which are far more interesting, form a continuum. The PC is partially backwards compatible across decades, so it's hard to isolate any single segment, unlike the "8-bits", which were mostly one-and-done. You can pick out some points, like the end of DOS and the beginning of Windows 95/98, but after that it kinda blurs. Tech got better and old stuff kept mostly working, so you can't really stick yourself at a single point in time.

Not that there aren't opportunities, like the peak of DOS gaming with the Gravis UltraSound and the various Sound Blasters. Or early 3D gaming.

sfmz
There is a 30-year lag in collectibles; people in their 50s tend to buy what they coveted in their late teenage years. From what I understand, this trend is also present in the car market.
TillE
8-bit computers are fully comprehensible, they're something you can build on a breadboard from components with straightforward datasheets. They're a perfect learning tool.

I don't think future generations are going to be very interested in tinkering with a C64 or an Apple II, but the 6502 will live on for a very long time.

dtagames
It represents a unique time in the history of computers -- an important gap between the mainframe era that came before it and the standard PC and internet eras that came after it. The wide variety of inexpensive and incompatible machines offered at that time has never been seen before or since.

This maverick era produced all kinds of innovation that simply doesn't happen today with one-fits-all computers that are controlled remotely by operating systems and cloud services. Modern machines bear no resemblance to 80's home computers except in their purity as computing machines, Turing machines, with simple I/O and programming.

Once you find out "how it all works" and that all computers are the same, you develop a certain interest in how we got there, and perhaps a wistfulness for what things might have become. The home computers of the early 80's are where that answer lies.

OJFord
I don't think you can give one reason, because everything feeds everything else and then there's a network effect of sorts of popularity.

But one factor I think not mentioned yet is the film industry: I think (aside from feed-in popularity) the 80s gets outsized attention in TV & film because it's before modern technology was quite so prolific - plot lines that don't work with mobile phones, the web, etc.

And then that feeds people's interest too. Like, you can be born after WarGames or Ferris Bueller's Day Off for example, watch them, and then it's appealing because it's different and interesting, and you want more.

kazinator
It is probably due to age demographics. People who are nostalgic for those years of the microcomputer revolution are (1) relatively large in number compared to earlier generations in computing and (2) still relatively young and active. Many will be heading into retirement about now, too.

Then there is the practicality. The machines from that era were small. If you get your hands on the actual hardware, you can have it at home, set up in a small nook somewhere. Not so easy with some IBM 709 or something.

The stuff is easy to work on, and many components are still easily available.

simne
Because 83/84 was the golden era of personal computers. Even though the 8-bits were very limited, they were affordable.

Imagine: sales of Commodores and consoles reached millions of units annually.

What's also important is that, fortunately, the 8-bits were good enough, and (mostly) fun enough, for the tech level of the time (VHS, AM radio).

Earlier computers were extremely expensive, so they were limited to military/science or business use. Later computers became a boring commodity.

dcminter
Firstly - is it really true that this is the main focus? I see a bunch of interest in the beige box era PCs as well.

Assuming it's true, though, then I would imagine there are several contributing factors:

The 80s is when computing arrived for the masses - and most of those masses were children at the time; the first computer I owned was a ZX81 and I was 9 years old when I got it. That lends it powerful nostalgia value. For later generations computers were likely more part of the background.

That generation of people is also now entering their late 40s or 50s. They probably have some income (especially if they got into IT) and their outgoings are likely tailing off - if they have kids then those kids are leaving the nest or have already. So there's spare cash to spend on all the bits and pieces that they couldn't afford back when they owned them the first time!

It's all far enough in the past that you can see it through rosy spectacles. Ram Pack wobble, slow tape loading, limited memory and primitive graphics all become features instead of limitations.

Then for younger generations who are getting into this the above points mean that there's a background of somewhat knowledgeable people to propagate information about these machines.

Add on top of that the limited nature of the machines, meaning one can have a complete-ish understanding of them (or at least the illusion of one). That's always been appealing.

Personally I find the 1970s minicomputers far more fascinating! But my dad worked with some of those and I adore Unix culture so I'm probably atypical.

Phiwise_
It's because the IBM PC was released in 1981 and, over the next several years as prices came down, steadily overran pretty much every other competing platform and took all the variety out of the market, from a historical perspective. If you're interested in 40-year-old computers you could have a collection with a Z80 machine, a 68k, a 6502, a TMS9918A, or an 8088, and that's just variety in CPU architecture, and just some of the popular ones. Everything else was the wild west, too. Go backward too many years from there, though, and the home and small-business computer manufacturing industry just isn't as big, and specimens that aren't just glorified calculators become an order of magnitude harder to find. You have to put up a lot of cash, comparatively, to get the fun part of the hobby of owning and using the machines yourself rather than just reading about them (which you can do for any machine in any era), plus they're harder to service. If you're collecting 20 years later, though, things had gotten so much more standardized and developed that it feels almost like a different hobby. Almost everything you'll buy will be some variant of the HP-versus-eMachines dichotomy: either an expensive IBM PC compatible using all the common standards, with a high-tier x86 that mostly does what they all do but faster, and maybe an add-in specialty card; or a cheap IBM PC compatible with some of the common standards, some things shaved off for cost, a low-tier x86 processor that's just more frustrating than your fast one, and a motherboard covered in cheap components that you have to solder in replacements for before it even works again.

I've painted a bit of a skewed picture here, but not by much. You can still collect later computers, and people do, but it's understandable that most people are drawn to the "cambrian explosion" of the whole line of history, no? Variety is the spice of life, and plenty its staple food to be spiced.

afavour
Because it’s what a significant number of current developers grew up with. It’s also the first time we saw computers represented in movies etc.

In the future I bet we will see a wave of nostalgia around Windows XP (in fact we’re already seeing it)

icedchai
I realize you said the early 80's, but the 80's in general encompassed a ton of progress: from the 8-bit Apples, Commodores, and Ataris, to the IBM PC and the clone wars, to the rise of the 16-bit Mac, Amiga, and ST. Even the 386 was available by 1986. There were also many high-end 32-bit workstations (Sun, NeXT, SGI...). Tons of variety in hardware.

The 90's was more about software (NT, Linux), and connectivity (dialup Internet goes mainstream, the first home broadband connections, etc.) Hardware felt mostly incremental: faster CPU, more RAM.

h2odragon
16-bit computing was the domain of inspired individual efforts. 32-bit computing brought about "communities" that overshadowed the individual geniuses contributing to them.

Right now there's an extra boost because of the "that's what I used as a kid" factor. In the future the DOS era will still be studied, as that time when collaborative development had not yet been invented.

geor9e
Draw a graph: write all the decades since computers were invented on the x axis, and "how many computers were sold that year and still work well enough to be resold in 2024" on the y axis. Don't waste time looking it up, just best-guess it. Now draw a vertical line where you define the word "retro" to apply, and erase everything to the right. It's a subjective word, so there's no wrong answer. Now re-ask your same question and look at your drawing.
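That erase-to-the-right exercise can be sketched in a few lines of Python. Every number below is an invented placeholder guess (the exercise itself says to guess), not real sales data:

```python
# decade -> units sold then that still work well enough to resell in 2024
# (pure guesswork, as the exercise instructs)
guessed_survivors = {
    1950: 10,
    1960: 1_000,
    1970: 50_000,
    1980: 5_000_000,
    1990: 20_000_000,
    2000: 80_000_000,
}

def retro_view(cutoff_decade):
    """Erase everything to the right of the subjective 'retro' cutoff."""
    return {d: n for d, n in guessed_survivors.items() if d <= cutoff_decade}

# Draw the vertical line just after the 1980s and the 80s dwarf
# every earlier decade, which is the point of the exercise.
view = retro_view(1980)
print(max(view, key=view.get))  # -> 1980
```

Move the cutoff a decade later and the peak moves with it, which is why "what counts as retro" tracks the observer's age.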

Personally, I see the most interest in "Windows 98 era" retro computing (Winamp, Age of Empires 2, millennium aesthetic), and rarely see 80s stuff anymore, but the people whose interests I follow are probably younger. 10 years ago I would have said 90s, and 10 years before that, 80s, so perhaps you're 20 years older than me? Just a guess.

stavros
Because pixel games are an interesting aesthetic that's still pleasing now, whereas 3D games with ten polygons are the same as what we have now, just looking much much worse.

We still get games with "pixel" aesthetics, and they look just as good as Zelda did back in the day. We don't get low-poly games like Tomb Raider 1.

morkalork
60s/70s computing was dominated by mainframes and the dumb remote terminals attached to them, wasn't it? There's not much you can do on your own with that tech, and (I'm just guessing here) it's nowhere near as available anymore. Plus, of everyone I've met who has done programming with punch cards, nobody said they missed it.
wilsonnb3
I think it will remain the time period with the most interest, similar to how electric guitar and amp enthusiasts view the 60s.

For electric guitars, the 60s wasn't the decade where they were invented (30s) or the decade where they found their "final form" (50s), it was when they entered the cultural zeitgeist.

Despite the popularity of 50s guitars/amps, and later decades slowly rolling into "retro" status, the era of the Beatles, The Who, Hendrix, the Rolling Stones, etc. will always be the most popular.

A curious part of this is a sort of generational nostalgia transfer, where (for example) the popular bands from the 90s had this nostalgia of the 60s, so they used old guitars and amps and were influenced by 60s music which caused their fans to have the same view.

projektfu
1. Recapitulation of our childhood is particularly strong in my generation. For example, even after the Transformers movies, we obviously thought it wasn't enough like our memories, so we made Bumblebee.

2. Software for some 80s systems is ubiquitous.

3. Sound blaster hell/IRQ hell is a real thing and DOS is frustrating more than fun.

4. Prior generation computers were made for work and fun was a rare side effect of letting off steam.

5. Since Windows 2000 modern computers run the same software, so your retro is limited to beige boxes with flaky capacitors.

6. Non-wintel 90s+ computers are interesting and are indeed becoming a focus of retro computing. Power PC Macintosh, Acorn, BeBoxes, NeXT, SGI, even HP and DEC RISC computers are finding enthusiasm.

0xbadc0de5
The terms "retro" and "vintage" are a sliding window. In 20 years, computers from the 1990's and 2000's will be retro. There are also practical matters like price, availability, and size: older computers from the early 1970's tend to fill entire rooms, whereas those from the 1980's onward fit on your desk.
gchamonlive
The 8086 is from the late 70s, but the x86 architecture would still take ten more years to start dominating the market. Recent processors don't differ too much in terms of instructions, so those early x86 chips are basically just simplified versions of what we currently have, which understandably doesn't pique much interest. The early 80s, still rife with other architectures, are therefore more attractive.

I wonder whether, now that the market is slowly shifting to ARM and RISC-V architectures, we might see a similar trend 30 years from now, with people starting to procure x86 chips with similar interest.

nickdothutton
You can more or less fully understand an entire 80s home computer: the CPU, the memory access, the peripherals, and the graphics subsystem (…if you can call it that). Operating systems were simple by modern comparison. Hardware is relatively cheap and emulators are reliable, and there was also a wealth of shareware available for many use cases, now much more easily accessible than when I ran a public-access BBS for it.
yen223
Because people who were kids in the 80s are now entering middle age
TedHerman
As someone who has been programming since the 1970s, I'll have to admit that I don't get this interest in the early 80s.
maartenh
IMHO the constraints of these systems make them still quite fun to code for, just like solving sudoku puzzles can be fun.

I found out about this [1] amazing live coding session by lftkyro on YouTube, showing how to build a live music pattern editor on the C64.

[1] https://youtu.be/ly5BhGOt2vE?si=1EzOnELSb5fd-dqA

pikuseru
It’s because the folks who are now the middle-aged adults with disposable income, in their 40’s and 50’s, grew up with computers from the 80’s.

Although one of the benefits of them is that you can still play and program for them (and even on them.) They’re simpler and more immediate.

paxys
No, future generations will be interested in collecting technology from their own youth. There is already huge interest in old iPods, for example. Even original iPhone models are a collectible item now.
dr_dshiv
8-bit 6502 tech is so radically awesome for the specs. So much creativity was required to make things work amid the heavy constraints. Like, basic NES/Famicom games had 2K of RAM, 8K of program code, and 8K of graphics.
beardyw
Well, the 60s and 70s were mostly mainframes, a bit big for home tinkering. Babbage's difference engine would be an impressive project.

So basically, the 80s is the earliest easily accessible period.

krylon
A C-64 fits on your desk easily. A PDP-11 or a Cray-1 does not. This surely isn't the entire answer, but I suspect it does play a role.
pera
It's been like this since the late 90s, "retro" basically means microcomputers
yawpitch
At a guess, because that’s roughly the time most people currently in a mid-life crisis consider their childhood?

My guess is “retro” moves on roughly as fast as we do.

paulcole
Anyone answering this should be required to mention their age before sharing their half-baked theory.

I’m 41 and the reason for this is that people born in the mid-1970s to late-1980s are old enough and wealthy enough now to have nostalgia for that time period and the time to pursue that hobby.

jacknews
Firstly, it was the golden age where computers that you could own personally became capable enough to run decent games etc, but still be completely understood by a single person.

Also, people who were teenage hackers in that era are now middle-aged and beyond, and nostalgic.