Comments on A different kind of optimization


A member of Frogatto’s dev team here, actually.

From poking around this site, I can’t really tell, but it looks like you don’t do much graphically intensive game development. I see some ASCII RPG making, but I don’t see any hardcore, push-a-machine-to-its-limits blitting. Ironically, that’s actually not the sole purview of all-out AAA 3D games; you can hit those limits trivially with just a 2D game. All you have to do is use 2D art, make it actually be 32-bit, use non-binary alpha blending, and have more than one layer. Our last game, Battle for Wesnoth, does exactly that; it uses SDL, it’s really, really well optimized for the best 2D acceleration we can count on being available everywhere, and it struggles to do even 30fps on modern hardware. Furthermore, Wesnoth has whole categories of effects that are off-limits because there’s no way we can reasonably blit them. It’s crazy: surely a 2D game shouldn’t ever cost that kind of processing? Doesn’t ‘2D’ exempt us from ever worrying about that? Actually, no.

These things REALLY aren’t obvious. It’s easy to assume that because we’re just another 2D game, we have pretty much the same requirements as Cave Story, right? We’re actually several orders of magnitude higher. That’s what happens when you make sprites that are 8x as big, have a screen that’s much, much bigger, use blending effects, etc. Suggesting that we drop our graphical fidelity to Cave Story’s level just isn’t on the table. I’m not gonna toss out years of painstaking work hand-animating sprites.
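For a sense of scale, here’s a rough back-of-envelope sketch of how many blended pixels a layered 2D game touches per second. The window size and layer count here are illustrative assumptions, not Frogatto’s actual numbers:

```python
# Illustrative fill-rate arithmetic (assumed numbers, not Frogatto's
# actual specs): blended pixels touched per second.
width, height = 800, 600   # assumed window size
layers = 4                 # assumed average overdraw from layered art
fps = 60

pixels_per_second = width * height * layers * fps
# Every one of those pixels needs a per-channel multiply-add when
# alpha blending in software.
print(pixels_per_second)   # 115200000 — over a hundred million per second
```

Even with these modest assumptions, that’s over a hundred million blend operations a second landing on the CPU if there’s no hardware acceleration.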

The other thing that’s not obvious is the hardware-dependent hell most ’90s gamedevs had to go through to get decent performance. I’m flicking through that Thief dev diary link, reading about the things they had to do to manage to blit quickly. Not only do those things restrict game-making to the John Carmacks of our world (seriously, if that were mandatory, I could never make games), but if you’re dropping down to x86 assembly, being cross-platform goes right out the window. Good luck with a Linux/Mac port.

The best analogy I can think of is the rut web development was stuck in with IE6: if you weren’t a web developer, there didn’t seem to be anything to complain about, and it seemed inconsiderate for devs to make a site that didn’t work in your browser. I mean, it should be easy to make everything work in IE6, right? It’s not like Firefox can do anything “important” that IE can’t. And from the most generalized, hand-waved perspective, that was true: they could both blit text and graphics, and the rest is just details. But those details are the lion’s share of the work.

Like web standards, blitting standards are moving forward. We devs have our reasons, and we will leave behind anyone who doesn’t board the train in time (especially when Android/iOS devices are moving in lock-step to provide great, standard hardware 3D support).

I suggest that when you buy your next laptop, you buy one with the best 3D card you can find, for future compatibility.

-- Jet 2017-09-20 14:28 UTC

I hope that my above comment doesn’t come off as pandering or argumentative; I think it could easily seem so from how I phrased my closing statement. Someone on IRC pointed me to your blog post; I just wanted you to have some idea of where we’re coming from.

Making games is really hard, and for the effects we want to do, we really have no choice but to require OpenGL. I wish I had enough time in my life to cater to everyone’s different machines, but in practice, our time is our greatest limitation, and if we have standards support that’s as widespread as OpenGL is, with all the benefits it gives us (and the fact that it’s an open standard), we just can’t afford not to use it. I hate to leave people out in the cold, but it just wouldn’t be possible to make the games I make without this. I wish there were some standards body mandating that all computers had to ship with this stuff, because it really hurts everyone when someone doesn’t have it.

-- Jet 2017-09-20 14:30 UTC

The main things that make software OpenGL perform like crap are alpha blending and texture filtering. Those are operations that are very easy to do in dedicated hardware but are HORRIBLE to implement in software for a general-purpose CPU.
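To make that concrete, here’s a minimal sketch (in Python, just to show the arithmetic) of the source-over blend a software rasterizer must perform per pixel. This is the standard 8-bit alpha-compositing formula, not code from any particular engine:

```python
def blend_pixel(src, dst):
    """Source-over alpha blend of one RGBA pixel (8-bit channels).

    A software renderer must do this multiply-add for every channel,
    of every pixel, of every layer, on every frame; a GPU does the
    same math in dedicated hardware, massively in parallel.
    """
    a = src[3]          # source alpha, 0..255
    inv = 255 - a       # remaining weight for the destination
    rgb = tuple((src[i] * a + dst[i] * inv) // 255 for i in range(3))
    return rgb + (255,)  # assume an opaque destination surface
```

This is why a binary (on/off) alpha channel is so much cheaper: the blend collapses to “copy the pixel or skip it,” with no arithmetic at all.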

Of course, both of those operations are very helpful for making 2D graphics look a lot better in a resolution-independent way. Alpha blending can be avoided by using alpha testing instead, but not in a way that allows you to anti-alias for “free.”
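A sketch of the alpha-testing alternative, for contrast: it’s a binary keep-or-discard decision per pixel, so there’s no per-channel arithmetic at all, but edges come out hard rather than anti-aliased. The function name and threshold here are illustrative, not from any specific API:

```python
def alpha_test_pixel(src, dst, threshold=128):
    # Binary decision: the source pixel either fully replaces the
    # destination or is skipped entirely. Cheap even in software,
    # but sprite edges stay jagged; there is no smooth falloff.
    return src if src[3] >= threshold else dst
```

Compared to true blending, this is essentially free on a CPU, which is why older 2D games with 1-bit transparency ran fine in software.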

-- fluffy 2017-09-20 14:30 UTC

Hey, Jet. Thanks for dropping by. A different viewpoint is always welcome, and I understand your reasons for doing things the way you did. The thing is, Battle for Wesnoth runs just fine on the same machine on which Frogatto crawls. Is it only because it’s turn-based? I take it the answer’s no.

fluffy, I noticed that while learning what little OpenGL I know. 😛

-- Felix 2017-09-20 14:31 UTC

Hmm, my laptop, which I’ve been using for work for a couple of years and is the only one I have, doesn’t have a 3D chip either, just a small integrated chipset with no 3D functions and no shaders, and it’s very slow.

Now I know why Qt in normal mode ran faster than Qt in OpenGL mode.

Thanks buddy 🙂

We are wasting lots of CPU cycles lately…

-- Nande 2017-09-20 14:34 UTC