Two weeks ago, version 12 of the Opera web browser was released with WebGL support, which means that now all the browsers that matter can natively render 3D graphics. Great news, right? A new era of games on the Web is about to begin!

Except, you know, not.

As far as I can tell, no browser actually supports WebGL on Linux. That’s it for me right there, never mind the infamous security issues inherent in the API. Even on Windows, if I understand correctly, WebGL only runs on a few select GPUs.

Hardly a technology for the Open Web.

I was reading this state-of-WebGL article and reminiscing about the days when I first heard about this technology. That was, what, two or three years ago? And look where we are now: nowhere. I’d like to believe it’s just a matter of time, if not for a pesky thing called experience. See, this whole story has happened before with a little something called VRML. Never heard of it, you say? Precisely!

It’s been 15 years since VRML was supposed to be the next big thing. Much has changed in computing, but 3D on the Web is still a solution in search of a problem (and terrible implementations aren’t helping). Few applications actually require it, mostly multiplayer games, and so far those have done just fine with Java, for all the criticism.

William Gibson’s vision of cyberspace was naive sci-fi, people.

Mind you, I could be wrong (wouldn’t be the first time), but I’m not holding my breath. Even if WebGL does become widespread, safe and stable in a couple more years, it will be completely irrelevant by then due to one or more of:

  1. Unity3D in its browser plugin incarnation becoming the new Flash. Hardly a happy ending, but the public will choose the product that works.
  2. Java breaking free of its stigma and taking off as a platform for 3D games on the Web. It’s increasingly popular elsewhere anyway, thanks to Android and high-profile titles such as Minecraft.
  3. 2.5D, raycasting, voxels and other software rendering techniques that have been used successfully for years before the days of consumer GPUs. Wolfenstein 3D has already been remade for the browser, both officially and less so, and there’s no shortage of retrogamers out there who understandably may want more than the same old platformers at some point.

Needless to say, I’m mainly interested in option 3 (and Nightwrath in option 2, which isn’t bad either). That’s because whichever technology takes hold on the desktop (hopefully all of them — the importance of diversity can’t be overestimated), consoles and mobile devices need browser-native solutions that work on whatever hardware happens to be available. And pure software algorithms just so happen to not need any special hardware support. Just what the doctor ordered for the digital Tower of Babel we live in right now.
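To give a taste of how little machinery option 3 actually needs, here’s a minimal, purely illustrative sketch (mine, not taken from any particular remake) of the grid-based raycasting behind Wolfenstein-style 2.5D rendering — the map, positions and constants are all made up:

```python
import math

# A tiny Wolfenstein-style raycaster: for each screen column, march a ray
# through a grid map until it enters a wall cell, then turn the distance
# into a wall height. No GPU, no floating-point texture tricks required.
MAP = [
    "#####",
    "#...#",
    "#.#.#",
    "#...#",
    "#####",
]

def cast_ray(px, py, angle, step=0.01, max_dist=20.0):
    """March from (px, py) along `angle` until a wall cell; return distance."""
    dx, dy = math.cos(angle), math.sin(angle)
    dist = 0.0
    while dist < max_dist:
        x, y = px + dx * dist, py + dy * dist
        if MAP[int(y)][int(x)] == "#":
            return dist
        dist += step
    return max_dist

def render_columns(px, py, facing, fov=math.pi / 3, width=10):
    """Return a wall height (0..1) per screen column -- the whole 2.5D trick."""
    heights = []
    for col in range(width):
        angle = facing - fov / 2 + fov * col / (width - 1)
        dist = cast_ray(px, py, angle)
        # Correct the fisheye distortion, then invert: near walls are tall.
        dist *= math.cos(angle - facing)
        heights.append(min(1.0, 1.0 / max(dist, 1e-6)))
    return heights

columns = render_columns(1.5, 1.5, 0.0)
```

Drawing the scene is then just one vertical strip per column, which even a modest canvas implementation can handle.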


Firefox on Linux supports WebGL. However, it isn’t very secure or stable; it’s pretty easy for an errant shader to bring the whole browser down, for example.

Even worse, on Mac OS X, an errant shader can cause a kernel panic, so.

Making shaders safer requires a lot more compromises throughout the entire stack, including better graphics drivers that impose cycle-time limits on shader execution, and it’s not an easy problem to solve. Static code analysis can only get you so far; deciding whether a shader halts is just as impossible as deciding whether any other program halts.
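Since halting can’t be decided up front, the practical mitigation is exactly that cycle budget: count work as the code runs and abort past a limit. A toy sketch of the idea (the interpreter and names are invented for illustration, nothing like a real driver):

```python
# Illustrative "cycle budget" enforcement for untrusted code: we cannot
# predict whether `program` halts, so we charge one unit per instruction
# and abort once the budget is spent.
class BudgetExceeded(Exception):
    pass

def run_with_budget(program, budget):
    """Interpret a toy instruction list, aborting after `budget` steps."""
    spent = 0
    pc = 0
    while pc < len(program):
        spent += 1
        if spent > budget:
            raise BudgetExceeded(f"aborted after {budget} instructions")
        op = program[pc]
        if op == "loop":      # jump back to the start -- may never halt
            pc = 0
        else:                 # anything else is a no-op that advances
            pc += 1
    return spent
```

A straight-line program finishes and reports its cost; a looping one gets killed instead of hanging the host — which is what you’d want a driver to do to an errant shader.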

Not to mention the possibility that buffer overruns will be found in various graphics drivers; drivers tend to be written under the assumption that the programmer knows what they’re doing and that any code being run is inherently trustworthy. In a browser-based environment, that assumption goes completely out the window.

As much as I like their decision to go with GLSL from a programming standpoint, from an everything-else standpoint they really should have gone with some other shader specification mechanism. Everything else in the WebGL stack can be made secure or at least validated, but GLSL compiles directly to untrusted, unmanaged code that’s run by physical hardware, in a place that is completely out of the browser’s control.


Oh, and regarding your bullet points:

1. Unity3D’s plugin isn’t supported on Linux either (also it doesn’t solve the secure shader problem with WebGL)

2. Pure Java is terrible for mobile, mostly due to issues with Oracle’s licensing. (Android’s programmed in Java-the-language but the actual VM has nothing to do with Oracle’s; you can’t run an arbitrary .jar on an Android device without installing a bytecode VM, and that runs terribly)

3. Software rendering really sucks for mobile, where GPUs can do a lot more work with special-purpose circuitry that takes less power than a software implementation. One of the most basic operations in graphics (linear interpolation) requires a lot of work from a general-purpose CPU, but a GPU’s dedicated circuitry can do it faster with less power. GPUs can also do some clever stuff to make less work for themselves, in ways that are difficult to implement in software (not to mention a lot less likely for a novice programmer to do); look at how PowerVR implements its depth testing, for example. (Hint: it’s not a z-buffer.)
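To make the interpolation point concrete, here’s a small illustrative sketch (my own, not from any GPU documentation): the naive form pays a multiply-add per sample, while the incremental forward-differencing form — roughly what a GPU’s interpolator bakes into silicon, and the same trick old software rasterizers leaned on — pays one addition per step:

```python
def lerp(a, b, t):
    """Naive per-sample interpolation: one multiply-add every time."""
    return a + (b - a) * t

def lerp_steps(a, b, n):
    """Incremental form: compute the slope once, then one add per step.
    This is the forward-differencing scheme a hardware interpolator
    effectively implements in dedicated circuitry."""
    step = (b - a) / n
    value = a
    out = []
    for _ in range(n + 1):
        out.append(value)
        value += step
    return out

samples = lerp_steps(0.0, 10.0, 5)
```

On a CPU you still burn general-purpose instructions either way; in hardware the per-step add is nearly free, which is the gap software rendering has to fight on mobile.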


Um, fluffy… I wasn’t talking out of my ass when I wrote that no browser seems to support WebGL on Linux. I tried. With the official nVidia driver, I might add.

That said, thanks for explaining the security issue with shaders; that’s exactly what I was talking about, but didn’t want to go into details.

As for the power issues on mobile devices, they didn’t stop manufacturers from putting Flash on many of them. If that’s not a resource hog, I don’t know what is.


Funny, the latest Firefox (x64) on my Ubuntu 12.04 system supports WebGL just fine, using the Intel graphics driver. Maybe you have to get a specific build of it or something, or maybe Ubuntu configures it with WebGL by default. There are a lot of variables to worry about, though.

I’m also pretty sure that Chrome does as well, although the last time I checked you had to explicitly enable it with a command-line option.


Also, while it is true that Adobe did release an Android version of Flash, basically nobody uses it — because it performs like ass and kills your battery.


No, I mean devices much weaker than that, such as my old MP3 player (and my E51, but that one’s already quite powerful).


Certain subsets of Flash can be used in a mobile-friendly way, sure, especially if you set the graphical quality to low or medium. There are also embedded Flash-like environments (such as by Hooked Wireless) that translate the performance-killing graphical primitives into GPU- or low-spec-CPU-friendly ones, but they only support limited subsets of the runtime.

When it comes down to it, the parts of Flash that work well for mobile are basically the exact same parts that you get in HTML5 Canvas instead (limited drawing primitives and JITted ECMAScript).

Oh, and another performance/battery killer on Adobe’s Flash stack is its floating-point-based MP3 player. Presumably your MP3 player provided a hardware CODEC instead. 🙂


So it does, but I thought it was more because the CPU in a media player is typically weak, and you need a Pentium I equivalent at the very least to decode MP3s in software in real time.

And the Canvas API is all you need to do a lot of nice tricks. Which is exactly the point I’m trying to make as of late. 🙂


Speaking of VRML, the whole Unity/WebGL thing was previously done (on Windows, at least) with WildTangent: i.e., hardware-accelerated 3D in the browser, where you could write decent 3D games in JavaScript. I’m under the impression that WildTangent (erroneously/unfairly?) began to be perceived as malware and thus went out of fashion and became a distant memory.

— Alan