

Pretty neat. But why an FPGA? I would imagine that if you want to run software targeting really old chips, like the Z80, you might as well run it on a modern x86/ARM/RISC-V processor with an emulator on Linux.
“The complexity for minimum component costs has increased at a rate of roughly a factor of two per year.”
A key part of Moore’s law that is often omitted is that Moore was not just talking about transistor density but about cost. When people say we’ve reached the end of Moore’s law, it is not because we’re no longer able to increase transistor density (just look at TSMC’s roadmap) but because the “complexity for minimum component costs” is no longer increasing. Chips are still getting faster, but they’re now also more expensive.
I recently switched to Linux and the latest KDE surprised me with how powerful it is. Scaling works. Fonts are rendered nicely. It’s just easy to use. Most of the time I don’t even think about the fact that I’m running Linux anymore.
Judging from the video description, this seems to be a remake of the original Frostpunk in Unreal Engine. Not sure why they’re doing it; the original is still just fine.
I’m aware that’s true for complex multi-chip systems like arcade boards (the MiSTer project, for example). But a simple Z80? I expect it to emulate virtually perfectly. Maybe not? Hence why I’m curious.
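For what it’s worth, a software Z80 core is essentially a big switch over opcodes with a T-state count charged per instruction, which is part of why a single chip is easy to emulate accurately. A minimal sketch in C of that loop structure (only a few opcodes with their documented T-state costs; a real core needs the full opcode table, flags, interrupts, and memory refresh):

```c
/* Minimal sketch of a cycle-counted Z80-style fetch/decode/execute loop.
 * Illustrative only: a handful of opcodes, no flags or interrupts.
 */
#include <stdint.h>
#include <stdio.h>

typedef struct {
    uint8_t  a;              /* accumulator */
    uint16_t pc;             /* program counter */
    uint64_t tstates;        /* total clock cycles consumed */
    int      halted;
    uint8_t  mem[0x10000];   /* flat 64 KiB address space */
} Cpu;

static void step(Cpu *c) {
    uint8_t op = c->mem[c->pc++];            /* fetch */
    switch (op) {                            /* decode + execute */
    case 0x00: c->tstates += 4; break;                         /* NOP    */
    case 0x3E: c->a = c->mem[c->pc++]; c->tstates += 7; break; /* LD A,n */
    case 0x3C: c->a++; c->tstates += 4; break;                 /* INC A  */
    case 0x76: c->halted = 1; c->tstates += 4; break;          /* HALT   */
    default:   c->tstates += 4; break;       /* unimplemented: treat as NOP */
    }
}

int main(void) {
    Cpu cpu = {0};
    /* tiny test program: LD A,0x41 ; INC A ; HALT */
    uint8_t prog[] = {0x3E, 0x41, 0x3C, 0x76};
    for (unsigned i = 0; i < sizeof prog; i++) cpu.mem[i] = prog[i];

    while (!cpu.halted)
        step(&cpu);

    printf("A=0x%02X after %llu T-states\n",
           cpu.a, (unsigned long long)cpu.tstates);
    return 0;
}
```

The hard part on arcade boards isn’t any one chip but keeping several emulated chips in lockstep on a shared bus and against video/audio timing, which is where an FPGA’s parallelism helps.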