When 3D games took a wrong turn

When I told my friend Sera there were 3D games on the 8-bit home computers from 3-4 decades ago, she wouldn't believe me. Luckily, the Internet remembers. Elite (yes, it's that old); The Sentinel; Total Eclipse; heck, even Starglider had a Spectrum port, albeit limited to wireframe graphics so it could still run in real time. Anything with flat shading was restricted to turning and moving in fixed increments. But it worked! You could even look up and down, an ability Id Software games only gained with the Quake engine, a whole decade later.
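The fixed-increment trick those flat-shaded engines relied on is worth making concrete: with only a handful of allowed headings, the sine and cosine of every one can be precomputed, so rotating a point costs two table lookups and four multiplies, no trigonometry at runtime. A minimal sketch in Python (the real machines used fixed-point integer tables, of course -- floats are just for legibility here):

```python
import math

# The engine only allows, say, 16 fixed headings around the circle,
# so sin/cos for every heading can be precomputed once at startup.
STEPS = 16
SIN = [math.sin(2 * math.pi * i / STEPS) for i in range(STEPS)]
COS = [math.cos(2 * math.pi * i / STEPS) for i in range(STEPS)]

def rotate(x, y, step):
    """Rotate a point about the origin by a fixed increment: pure table lookup."""
    s, c = SIN[step % STEPS], COS[step % STEPS]
    return (x * c - y * s, x * s + y * c)
```

Four steps out of sixteen is a quarter turn, so `rotate(1.0, 0.0, 4)` lands on (0, 1) -- and the player never notices the angles are quantized.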

A couple of years after that, pretty much any 3D game required hardware acceleration to work at all, even though computers were getting faster and faster. By 2008, the average CPU ran at ten times the frequency one could expect in 1996 or so, even before accounting for improved architectures. Yet somehow one could still expect every new game to demand a GPU upgrade.

And despite that, they still crawled. Still crashed. Still all looked the same.

Of course they did. If you're old enough to have coded your own 3D engine, where did you start learning? From NeHe's tutorials, by any chance? And if you're newer to the craft, what are you making your games in? Unity, or Unreal?

Doesn't sound like many options at all if you put it that way, now does it.

In fact, it's much like those ancient home computers, most of which came with hardware support for sprites and scrolling to supplement their feeble CPUs... and as a result most games made for them were platformers, shoot 'em ups and action adventures -- the only genres that could benefit. Most... but not all.

Welcome to the brave new world, where people think "2D acceleration" means smearing a texture all over a quad that faces the camera. Cue beginners asking why their crisp pixel art is a muddy mess when loaded into the game engine. Cue experts having to teach them again and again how to fix what shouldn't have been broken in the first place...
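What the fix boils down to is sampling with nearest-neighbour instead of the bilinear filtering most engines default to (in OpenGL that's the `GL_NEAREST` texture filter; Unity's importer calls it "Point" filtering). Bilinear blends neighbouring texels, which is exactly the mud. A standalone sketch of nearest-neighbour integer upscaling, with a hypothetical `scale_nearest` helper:

```python
def scale_nearest(image, factor):
    """Integer upscale with nearest-neighbour sampling: every source pixel
    becomes a crisp factor-by-factor block; no in-between colours are invented."""
    return [[row[x // factor] for x in range(len(row) * factor)]
            for row in image for _ in range(factor)]
```

Scaling the 2x2 image `[[1, 2], [3, 4]]` by 2 yields four solid 2x2 blocks -- exactly what pixel artists expect, and exactly what bilinear filtering destroys.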

(By the way, stop thinking of pixel art as a nostalgic throwback born of hardware limitations and nothing more. It's a valid artistic medium in its own right, one that even predates computers. Vector displays, now *those* were born of hardware limitations... and still gave birth to some of the most influential videogames ever.)

In 1993, the creators of Star Fox had to put a GPU in every game cartridge. A quarter of a century later, we can run games that don't look much worse in a web browser... in software... through a layer of emulation. (No really, look up Zepton for the Pico-8.) How much better could we do by going straight to Lua, never mind C++? How cool, different and strange could our games look?
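To make "software rendering" concrete: the heart of a flat-shaded renderer is just filling triangles into a pixel buffer, and that fits in a dozen lines. Here's a minimal edge-function rasterizer sketched in Python -- a toy, not an engine, and a Lua version for Pico-8 would look much the same:

```python
def fill_triangle(buf, p0, p1, p2, color):
    """Flat-shade one triangle into a 2D pixel buffer using edge functions:
    a pixel is inside when it lies on the same side of all three edges."""
    def edge(a, b, x, y):
        # Signed area test: positive on one side of edge a->b, negative on the other.
        return (b[0] - a[0]) * (y - a[1]) - (b[1] - a[1]) * (x - a[0])

    xs = [p0[0], p1[0], p2[0]]
    ys = [p0[1], p1[1], p2[1]]
    # Walk the triangle's bounding box, clipped to the buffer.
    for y in range(max(min(ys), 0), min(max(ys), len(buf) - 1) + 1):
        for x in range(max(min(xs), 0), min(max(xs), len(buf[0]) - 1) + 1):
            w0 = edge(p1, p2, x, y)
            w1 = edge(p2, p0, x, y)
            w2 = edge(p0, p1, x, y)
            if (w0 >= 0 and w1 >= 0 and w2 >= 0) or \
               (w0 <= 0 and w1 <= 0 and w2 <= 0):
                buf[y][x] = color
```

Everything else in a software 3D engine -- projection, sorting, shading -- is bookkeeping layered on top of this loop.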

We have no clue, because the very idea of software rendering has been considered ludicrous for at least a decade. After all, every computer has a GPU nowadays, right? Android even requires one to run at all.

Why walk when you can drive a car everywhere?

There you go. 3D took a wrong turn when we started thinking it should be the be-all and end-all of videogames. After all, when you have a hammer, everything looks like a nail. And GPUs were the hot new peripheral to own around the turn of the millennium.

From there to forgetting they were just means to an end was only a step. People are great at mistaking means for ends anyway. And here we are today, complicating things more and more with every new "simplification".

Enjoy your ever-faster ride down the road to nowhere.
