It doesn’t happen often that I have one overarching theme for this newsletter; usually it’s just that I discuss a single link at length. This time it’s different. Get ready for yet another big rant about 3D graphics. But not just yet.
I want to start with a little video. Via Shamus Young, here’s a fascinating viewpoint on what the mechanics of Civilization (the game) betray about the developers’ view of actual human civilization. Too long, didn’t watch version: remember Fry’s reaction to the theme park version of history in the pilot episode of Futurama?
Incidentally, the point they make in the video is very similar to Aaron Reed’s critique of the Star Wars prequels from a few months ago: somehow, along the way, we’ve grown used to the idea that history is a preordained series of events, rather than being shaped naturally by the actions and interactions of many individuals. To the degree that we acknowledge people at all, it’s a handful of historical figures seen as demigods who did everything by themselves…
Troubling, isn’t it?
In unrelated news, the same Shamus Young continues to rant against OpenGL, sadly without saying much that’s useful. Yes, it’s a quaint, inconsistent and overly complex API. But remember, they did try to simplify it with OpenGL ES, which forms the basis of WebGL. The result was mandating the use of shaders, which not only means a lot more boilerplate you need to get started, but opens a whole can of worms when it comes to compatibility and security.
What, then, are we supposed to do? Sticking to a single, proprietary platform is obviously out of the question in 2014. And there’s no open alternative out there.
Unless, of course, you consider the obvious solution of software rendering.
I’ve written about it many times before, and every time it elicited strong reactions. My readers keep complaining about how they absolutely do need hardware acceleration to do fancy stuff, and I keep telling them they don’t need to get fancy. And I could start again with my usual litany of examples from the 1990s, countless games that looked gorgeous while being incredibly low-tech. Or even ugly games from the 1990s that people are still fond of anyway. Final Fantasy VII, anyone? Yep, you could trivially run it in software these days. (Also, redo the sets in the same cartoonish style as the characters and nobody’s even going to notice the ugliness anymore.)
But then I stumbled upon this article about the making of Starglider, and I remembered that (wireframe) 3D was already doable on 8-bit microcomputers that were unimaginably slow by modern standards. How much could we do nowadays without ever touching a GPU?
(Technically, you could also do flat-shaded 3D on those, just not quite in real time; for that you needed a beefier 16-bit CPU.)
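Wireframe 3D really is that cheap, computationally speaking. Here’s a minimal sketch in plain Python, no GPU and no libraries beyond the standard one: project the corners of a cube through a simple perspective divide and connect them with edges. All the names and parameters (the rotation angle, camera distance, scale) are mine, chosen for illustration, not taken from Starglider or any particular engine.

```python
import math

# Cube vertices centered on the origin.
vertices = [(x, y, z) for x in (-1, 1) for y in (-1, 1) for z in (-1, 1)]

# Edges connect vertices that differ in exactly one coordinate.
edges = [(i, j) for i in range(8) for j in range(i + 1, 8)
         if sum(a != b for a, b in zip(vertices[i], vertices[j])) == 1]

def project(point, angle, distance=4.0, scale=100.0):
    """Rotate around the Y axis, then apply a perspective divide."""
    x, y, z = point
    c, s = math.cos(angle), math.sin(angle)
    x, z = x * c + z * s, -x * s + z * c    # rotate around Y
    z += distance                           # push in front of the camera
    return (scale * x / z, scale * y / z)   # perspective divide

# One "frame": a list of 2D line segments, ready to hand to any
# drawing surface -- a canvas, a framebuffer, even an SVG file.
segments = [(project(vertices[i], 0.5), project(vertices[j], 0.5))
            for i, j in edges]
```

Twelve multiplications-per-vertex territory; the 8-bit machines did essentially this with fixed-point arithmetic and lookup tables instead of `math.cos`.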
We’re unlikely to learn the answer, as even 2D games have started relying on 3D acceleration in recent times. See, people want to do fancy stuff in 2D as well… and video card manufacturers have been so obsessed with 3D that 2D acceleration functions are likely still limited to the basic blitting needed for graphical desktops. (That is, before they tried to go 3D with desktops as well; luckily, users quickly got bored of wobbly windows, and now we’re past that horrid fad.)
You’re going to say it doesn’t matter, that nowadays even smartphones have GPUs. But then why do we keep hearing all these horror stories about bugs, incompatibilities and proprietary platforms being left unsupported mid-project? With software rendering, all you need is a general-purpose CPU and a drawing surface. You don’t depend on anyone.
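To make the “CPU and a drawing surface” point concrete: a software framebuffer is just an array of bytes, and rendering is writing into it. Here’s a toy sketch in pure Python that draws a diagonal line and saves it as a grayscale image; the choice of the PGM format and all the names are mine, picked only because any image viewer can open the result with zero dependencies.

```python
WIDTH, HEIGHT = 64, 64
framebuffer = bytearray(WIDTH * HEIGHT)  # one grayscale byte per pixel

def put_pixel(x, y, value=255):
    """Clip to the surface, then write one byte. That's all rendering is."""
    if 0 <= x < WIDTH and 0 <= y < HEIGHT:
        framebuffer[y * WIDTH + x] = value

for i in range(WIDTH):  # a simple diagonal line
    put_pixel(i, i)

# Dump the surface as a binary PGM image: header, then raw pixels.
with open("line.pgm", "wb") as f:
    f.write(b"P5\n%d %d\n255\n" % (WIDTH, HEIGHT))
    f.write(framebuffer)
```

Swap the file for a window’s pixel buffer and you have a renderer that runs anywhere a general-purpose CPU does.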
To end on a cheerful note, I’ll point out that not one but two game bundles last week were focused on game-making tools; the latter even appears to specialize in them. And as the runaway popularity of Twine demonstrates, the democratization of game development is important to many people.
Have fun making games.
Weekly Links #21 by Felix Pleșoianu is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.