I still remember when they announced the replacement of the Hubble Space Telescope's onboard computer in 1999. The new one had a 486 CPU running at 25MHz and 2MB of RAM. Ridiculously more powerful than the Space Shuttle's... and much weaker than what I had back then: a 66MHz 486 with 16MB of RAM. Which was already obsolete, yet could still run Opera 7 and Word 6 (or was it 7?) as well as all the classic games of the early 1990s.

This isn't a "get off my lawn" rant. It is, however, a plea for moderation.

Pretty much every modern computer has 3D hardware acceleration, right? Smartphones and tablets invariably come with OpenGL, and even older feature phones (remember Java ME?) have some sort of 3D support. They put 3D video chipsets in servers, for crying out loud — it's cheaper than operating separate assembly lines.

Well, my trusty laptop, which has served me so well since 2006, doesn't have one. It just doesn't. I know, right? Deal with it.

No worries, though, there are plenty of 2D games still being made. Or so I thought, until I started stumbling across newer titles that wouldn't run at more than one frame per second or so. Turns out, they were made with OpenGL. Not that OpenGL itself assumes hardware acceleration; the problem is that the developers were obviously unconcerned with how their games perform without it. And since a Celeron CPU isn't too beefy either...

What in the world happened to good old 2D acceleration? Video players use it; the HTML5 canvas uses it. The near-ubiquitous SDL library takes advantage of it whenever possible. It's all you really need for a game such as, say, Cave Story.
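
To make that concrete, here is roughly what leaning on 2D acceleration looks like from the programmer's side. A minimal SDL 1.2 sketch (the resolution, sprite file, and delay are made up): ask for a hardware surface, convert your sprites to the display format once, and the blits can then ride on whatever 2D acceleration the platform offers.

```c
#include <SDL.h>

int main(void)
{
    if (SDL_Init(SDL_INIT_VIDEO) != 0)
        return 1;

    /* Ask for a hardware surface; SDL quietly falls back to a software
     * one if the platform has nothing better to offer. */
    SDL_Surface *screen = SDL_SetVideoMode(640, 480, 32,
                                           SDL_HWSURFACE | SDL_DOUBLEBUF);

    /* Load a sprite once and convert it to the display format, so every
     * subsequent blit can be handled by the 2D hardware. */
    SDL_Surface *bmp = SDL_LoadBMP("sprite.bmp");
    SDL_Surface *sprite = SDL_DisplayFormat(bmp);
    SDL_FreeSurface(bmp);

    SDL_Rect dst = { 100, 100, 0, 0 };  /* w/h ignored by SDL_BlitSurface */
    SDL_FillRect(screen, NULL, SDL_MapRGB(screen->format, 0, 0, 0));
    SDL_BlitSurface(sprite, NULL, screen, &dst);  /* the actual blit */
    SDL_Flip(screen);
    SDL_Delay(2000);

    SDL_FreeSurface(sprite);
    SDL_Quit();
    return 0;
}
```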

OpenGL, however, can't use 2D acceleration. Not even in orthographic mode. It's a 3D API through and through; absent 3D hardware acceleration, it will revert to pure software rendering.
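
For the record, "orthographic mode" only swaps the projection matrix: the sprites are still textured quads pushed through the full transform, rasterize, sample and blend pipeline, and with no GPU every one of those stages runs on the CPU. A rough fixed-function sketch (GLUT window for brevity; texture upload omitted, so the quad stays plain):

```c
#include <GL/glut.h>

static GLuint sprite_tex;   /* texture assumed uploaded elsewhere; left at 0 here */

static void draw(void)
{
    glClear(GL_COLOR_BUFFER_BIT);

    /* "2D mode": map GL coordinates 1:1 onto screen pixels. */
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrtho(0, 640, 480, 0, -1, 1);
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();

    /* A single "blit" is really a textured quad: transformed, clipped,
     * rasterized and sampled, all on the CPU if there is no 3D hardware. */
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, sprite_tex);
    glBegin(GL_QUADS);
        glTexCoord2f(0, 0); glVertex2f(100, 100);
        glTexCoord2f(1, 0); glVertex2f(132, 100);
        glTexCoord2f(1, 1); glVertex2f(132, 132);
        glTexCoord2f(0, 1); glVertex2f(100, 132);
    glEnd();

    glutSwapBuffers();
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA);
    glutInitWindowSize(640, 480);
    glutCreateWindow("ortho \"2D\"");
    glutDisplayFunc(draw);
    glutMainLoop();
    return 0;
}
```

A dedicated 2D blitter, by contrast, does the equivalent rectangle copy in one go.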

Now, that shouldn't be a problem either. The first game in the Thief series famously used a software renderer, despite coming out in 1998, and did just fine. Think about it for a moment. My smartphone has twice the CPU speed and RAM of a typical PC from that era. Multiply by two again for the aforementioned laptop.

Why can't it run a mere platformer decently? Frogatto, I'm looking at you here.

Listen, if you're making a 3D game that genuinely pushes the state of the art in computer graphics then by all means, require the latest and greatest hardware: dual GPUs, quad-core CPUs, gigabytes of RAM, you name it. But if you're not, for the love of computing, don't use more resources than you have to. You may find yourself requiring OpenGL for an ASCII-based roguelike.

At least libtcod only requires plain SDL. And that's still too much for people who, for some reason, prefer to work in text mode.
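
For the text-mode camp, this is about all an ASCII roguelike strictly needs; a curses sketch, with no SDL and no OpenGL anywhere in sight:

```c
/* A roguelike "frame" drawn with curses alone: no SDL, no OpenGL, no bitmaps. */
#include <curses.h>

int main(void)
{
    initscr();              /* enter text mode */
    noecho();
    curs_set(0);            /* hide the cursor */

    mvaddstr(2, 4, "#########");
    mvaddstr(3, 4, "#.......#");
    mvaddstr(4, 4, "#...@...#");   /* our hero */
    mvaddstr(5, 4, "#########");
    refresh();

    getch();                /* wait for a key */
    endwin();               /* back to the shell */
    return 0;
}
```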

Comments

A member of Frogatto's dev team here, actually.

From poking around this site, I can't really tell, but it looks like you don't do much graphically intensive game development. I see some ASCII RPG making, but I don't see any hardcore, push-a-machine-to-its-limits blitting. Ironically, that's not the sole purview of all-out AAA 3D games, you see; you can do that, trivially, with just a 2D game. All you have to do is use 2D art, make it actually be 32-bit, use non-binary alpha blending, and have more than one layer. Our last game, Battle for Wesnoth, does exactly that; it uses SDL, it's really, really well optimized for the best 2D acceleration out there that we can count on to be available everywhere, and it… struggles… to do even 30fps on modern hardware. Furthermore, Wesnoth has whole categories of effects that are really off-bounds because there's no way we can reasonably blit them. It's crazy; I mean, surely a 2D game shouldn't ever cost that kind of processing. Doesn't "2D" exempt us from ever worrying about that? Actually, no.

These things REALLY aren't obvious. It's easy to assume that because we're just another 2D game, we have pretty much the same requirements as Cave Story, right? We're actually several orders of magnitude higher. It's what happens when you make sprites that are 8x as big, have a screen that's much, much bigger, use blending effects, etc. Suggesting that we drop our graphical fidelity to the same level as Cave Story just isn't on the table. I'm not gonna toss out years of painstaking work hand-animating sprites.

The other thing that's not obvious is the hardware-dependent hell most 90s gamedevs had to go through to get decent performance. I'm flicking through that Thief dev diary link, reading about the things they had to do to manage to blit quickly. Not only are those things that restrict game-making to the John Carmacks of our world (seriously, if that were mandatory, I could never make games), but if you're dropping down to x86 assembly, being cross-platform goes right out the window. Good luck with a Linux/Mac port.

The best analogy I can think of is the ghetto web development was stuck in with IE6. If you weren't a web developer, there didn't seem to be anything to complain about, and it seemed inconsiderate for devs to make some site that didn't work on your browser. I mean, it should be easy to make everything work in IE6, right? It's not like Firefox can do anything "important" IE can't. And from the most generalized, hand-waved perspective, that was true; they could both blit text and graphics, the rest is just details. But those details are the lion's share of the work.

Like web standards, blitting standards are moving forward. We devs have our reasons, and we will leave behind anyone who doesn't board the train in time (especially when Android/iOS devices are moving in lockstep to provide great, standardized 3D hardware support).

I suggest that when you buy your next laptop, you get one with the best 3D card you can find, for future compatibility.

— Jet


I hope that my above comment doesn’t come off as pandering or argumentative; I think it could easily seem so from how I phrased my closing statement. Someone on IRC pointed me to your blog post; I just wanted you to have some idea of where we’re coming from.

Making games is really hard, and for the effects we want to do, we really have no choice but to require OpenGL. I wish I had enough time in my life to cater to everyone's different machines, but in practice, our time is our greatest limitation, and if we have standards support as widespread as OpenGL's, with all the benefits it gives us (and the fact that it's an open standard), we just can't afford not to use it. I hate to leave people out in the cold, but it just wouldn't be possible to make the games I make without this. I wish there were some standards body mandating that all computers had to ship with this stuff, because it really hurts everyone when someone doesn't have it.

— Jet


The main things that make software OpenGL perform like crap are alpha blending and texture filtering. Those are operations that are very easy to do in dedicated hardware but are HORRIBLE to implement in software for a general-purpose CPU.

Of course, both of those operations are very helpful for making 2D graphics look a lot better in a resolution-independent way. Alpha blending can be avoided by using alpha testing instead, but not in a way that allows you to anti-alias for “free.”
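
In fixed-function GL terms the trade-off is just a couple of state switches; a quick sketch (the 0.5 cutoff is arbitrary):

```c
#include <GL/gl.h>

void use_alpha_blending(void)
{
    /* Every textured pixel is read back and mixed with the framebuffer:
     * cheap in dedicated hardware, painful in a software rasterizer. */
    glDisable(GL_ALPHA_TEST);
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
}

void use_alpha_testing(void)
{
    /* Pixels are either kept or discarded outright: no read-modify-write,
     * but also no smooth edges. */
    glDisable(GL_BLEND);
    glEnable(GL_ALPHA_TEST);
    glAlphaFunc(GL_GREATER, 0.5f);   /* keep texels with alpha > 0.5 */
}
```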

— fluffy


Hey, Jet. Thanks for dropping by. A different viewpoint is always welcome, and I understand your reasons for doing things the way you did. The thing is, Battle for Wesnoth runs just fine on the same machine on which Frogatto crawls. Is it only because it’s turn-based? I take it the answer’s no.

fluffy, I noticed that while learning what little OpenGL I know. 😛

— Felix


Hmm, my laptop, which I've been using for work for a couple of years and is the only one I have, doesn't have a 3D chip either. Just a small integrated chipset with hardly any 3D functions, no shaders, and very poor performance.

Now I know why Qt in normal mode ran faster than Qt with OpenGL.

Thanks, buddy 🙂

We're wasting lots of CPU cycles lately…

— Nande