Some conversations I’ve had recently have made me reflect on
priorities and how I want to spend my time. Because gaming consoles (and their
controllers) are designed mainly to play games, they have traditionally been
plug-and-play and easy to use. Excluding cases where additional hardware is
needed (4-player adapters, peripheral controllers, VR headsets, etc.) or modern
games that require large downloads, all you need to do is pop in the
cartridge/disc, turn on the TV, and pick up the controller.
But PC gaming often requires checking system requirements, configuring
hardware, and potentially other tasks like moving files. To be fair,
platforms/launchers like Steam have made things far easier than they were when
I was trying to get WarCraft II to connect to my friend’s computer over a
dial-up modem in high school.
What I’m getting at is that while, yes, a PC can offer higher resolutions,
better frame rates, and other visual fidelity enhancements over a console, it
requires more time and effort (and often more cash for a better CPU/GPU or a
larger monitor), and that’s not something I really want to spend my time or
money on. If I were chasing the “best possible experience” with regard to
graphics, I would be upgrading my hardware on a yearly basis and spending all
of my budgeted hobby money to do so.
The reason I play games is to spend time having fun, and I want to arrange my
priorities to do exactly that, so I want to spend as little time as possible in
configuration menus and hardware optimization. I don’t want to spend more time
researching/learning about games than I spend actually playing them, nor do I
want to spend time/money on something, take weeks to get it set up, and never
actually play the games I intended to play. I admit there is a satisfying
feeling of relief in knowing that I have the capability to play something
(because I have it installed on my 3DS or bought an N64 flashcart), but if I
never actually get around to playing it, what good does it do?

I also don’t want to bankrupt myself in the process, so I
don’t need to have the bleeding edge of technology to accomplish this goal.
Basically, I’m trying to maximize playing time while minimizing cost and other
hassles like fiddling around with settings. Does that mean I won’t have
the absolute best possible experience? Yes, but it’s about 95% as good in my estimation, and
that’s good enough for me. I believe that the law of diminishing returns
applies here, where spending $500 will get you 92% of your goal, and spending another
$500 will only get you an additional 3 to 4%. Everyone has a threshold for “good
enough,” and mine is low enough that console gaming is fine even if I don’t
have 120 fps and VRR. Sometimes the frame rate may stutter a little, or load
times might be a bit longer, but at the end of the day, I just ask whether the
game was playable and whether I had a good time playing it. It is said that the
pursuit of perfection is the enemy of progress (or, in this case, of “good
enough”). I would rather play 30 games over the course of a year that are good
to excellent than only 2 games that are perfect (or, put another way, I would
rather play 30 fun games that run pretty well than only 2 games that run
perfectly).
Would I like to have the best possible experience playing a game? Again, sure,
but if I can achieve 95% of that right now with no additional
cost/effort/time, while it would take hundreds of dollars and hours of my time
downloading mods/updates and tinkering with settings just to get to 98%, then
it’s not worth it to me.
As an alternative example, would I like to watch movies on a
50-foot screen with perfect sound quality and nobody else around other than my
wife? Sure. But I don’t want to spend the money to rent out a theater, or to
buy a house big enough to build my own private theater, every time I want to
watch a movie. Instead, I can watch my 46” TV from 2012 for no additional money
while lying on my couch, without having to put up with other people in a
theater, and that’s good enough, even without modern enhancements like HDR.
When my TV finally goes kaput, then yeah, I’ll get something far more capable
than my current setup. But I don’t see the utility of replacing a piece of
equipment that is still functioning as intended.

There are occasions when I will do some of this kind of thing. I spent the better part of a summer trying to get a Philips CD-i emulator to run so I could get a chance to play the Unholy Triforce of Zelda games. It never ran perfectly and would crash during the infamous cutscenes, so the workaround was to skip them and watch them on YouTube at the appropriate times. But once I had it running well enough that I could experience the games (mostly) as intended, I was satisfied and haven’t really touched it since, having achieved my goal of playing all of the Zelda spinoff games.
Also, for what it’s worth, I did use a few emulators like ZSNES on the computer somewhat frequently because they required almost no configuration and worked without too much effort. Later emulators for the N64 required more work, so I didn’t use them except a few times to see how well they performed. And when it comes to the hacking scene for the 3DS, most of the work was done for me, and I just had to follow basic instructions to get it working.
To summarize, I think everyone has differently optimized priorities (“different
strokes for different folks”), and mine include frugality, so I don’t feel the
need to run out and upgrade my hardware until it’s absolutely necessary, as
long as the length of play, the ability to play the game, and the amount of fun
to be had outweigh the time/effort/cost required to actually play it.