Wednesday, 18 March 2015

Unity and quality settings

We've been pretty bogged down in learning the Unity IDE and picking our way through the differences between JavaScript and UnityScript (yep, that really is a thing - and it's ever-so-slightly different to "regular" JavaScript). But this evening we took an hour out to play with the quality settings. And we got some interesting results.

At the minute we're developing on a PC - it's a quad-core 3GHz something-or-other with 6GB of RAM and a whopping 1GB graphics card (these numbers actually don't mean very much to me, personally, but the guy I got the machine off seemed to think they sounded impressive). It's the most powerful PC I've ever used. But then again, for the last ten years or so, most of my coding has been done in Notepad (well, EditPlus2 if I'm honest) or Visual Studio!

Anyway, it's a half-decent machine (as far as I'm concerned anyway) and it runs 3D graphics fairly well. So during a bit of "downtime" we had a play with the quality settings.

I didn't even know this kind of thing existed - it was only after asking Steve about how he prepares his software for "real-world applications" that he suggested using the same code-base and simply dropping the graphics quality for lower-end devices. It seemed like an idea worth trying, so we had a play to see what the different settings did:

On our machine, there was actually very little difference between the first three settings: "fastest", "fast" and "simple". Maybe we didn't have enough lights and effects to tell them apart; in all of these settings, there were few or no shadows on any of the objects.
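Incidentally, these levels don't have to be switched by hand in the editor - Steve's one-code-base idea can be sketched from script too. Here's a minimal UnityScript sketch; the level indices assume Unity's default quality list ("Fastest" through "Fantastic", indices 0 to 5), and the 512MB graphics-memory cut-off is purely an illustrative threshold, not a recommendation:

```
// UnityScript sketch: pick a quality level based on the device's graphics memory.
// Indices assume Unity's default quality list: 0 = Fastest ... 5 = Fantastic.
function Start () {
    if (SystemInfo.graphicsMemorySize < 512) {
        QualitySettings.SetQualityLevel(1);  // "Fast" for lower-end devices
    } else {
        QualitySettings.SetQualityLevel(3);  // "Good" for everything else
    }
}
```

Dropped onto any object in the startup scene, this would let the same build tune itself to the hardware it finds itself on.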


As we stepped up to "good" quality, shadows started to appear - and we actually turned them off, as they were a little distracting. At this stage, we were more concerned with how our shapes were rendered than with the "sparkle" that particle systems and lighting occlusion (is that even a thing?) added to a scene.


Compared to "simple" mode, where the edges of all the shapes on screen had very definite "jaggies" along them, the "good" mode did a decent job of smoothing all the edges. So we were expecting great things from "beautiful" mode...


"Beautiful" mode sharpened up a lot of things in the scene; it was only after comparing screenshots between "good" and "beautiful" that we noticed what the actual difference was. The bitmaps on the floor tiles are much sharper (on second glance, the deforms on the floors in "good" mode had actually made them look quite blurry - we just hadn't noticed at the time).

But in sharpening up some of the bitmaps, something else happened too. Our animated (soldier) character started to display little white dots along the seams of the arms. They only appeared every now and again, and only for a single frame of animation. But they were noticeable enough to be distracting.

If you looked at the surroundings (as you might in a first-person shoot'em up), "beautiful" was definitely an improvement over "good". But if you looked at animated characters in the scene (as you might in a third-person shooter, for example), "good" actually gave better results than "beautiful" - the characters were certainly better animated, and the edges were not so jaggy (though the perspective distortion on the bitmaps did make them appear a bit more blurry).

Strangely, things didn't improve with the ultimate "fantastic" setting.



Once again, the scenery got that little bit sharper, but the animated character still had those annoying flashes of white every now and again. Not there all the time, but noticeable enough if you watched the animation - a little like the way you might spot a rabbit's tail as it runs away from you across an empty field. If you look for it directly, it's hard to catch - but just gaze at the scene and every now and again you see a flash of white.


While "good" does create distorted floor (and ceiling) tiles, we're actually thinking of sticking with "good" as the default quality setting, if only because those flashes of white on the animated character are so distracting. The jagged edges of the walls and floors (and, for that matter, around the character) in "beautiful" mode are pretty off-putting too.

Just when we thought we'd found the best graphics settings for us, we discovered a single line of script that confirmed our choice: QualitySettings.antiAliasing.

This can be set to zero, two, four or eight - the number of anti-aliasing samples per pixel.
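Setting it from script really is a one-liner - this UnityScript sketch just drops it into a startup function (any script on an object in the scene would do):

```
// Force 8x multisample anti-aliasing from code rather than the editor.
// Valid values are 0, 2, 4 and 8 (MSAA samples per pixel).
function Awake () {
    QualitySettings.antiAliasing = 8;
}
```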


The differences in quality are most noticeable with antiAliasing set to eight.
The screenshot above shows the same scene, at the "good" setting, with antiAliasing set to zero (left) and eight (right). The scene on the right is much easier on the eye!

The decision was sealed when we flipped back to "beautiful" settings: even with antiAliasing set to maximum, we still got jaggies around our animated character. At "good" quality, the character sits "inside" the scene - he looks like he's part of the spaceship interior. At "beautiful" quality or above, the jaggies around the edges - and in particular along the gun barrels - make it look like an animated overlay, plopped on top of a 3D render.

So there we have it. It may be peculiar to our machine, and things may work perfectly fine on other devices. But for now, we're sticking with max anti-aliasing and limiting our graphics quality to "good". We'll just have to learn to live with the slightly blurry tiles (or perhaps whizz the camera around quickly, and call it motion blur!)
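Put together, our current defaults boil down to a couple of lines in a startup script. Again this is a UnityScript sketch, with the "Good" index assuming Unity's default quality list:

```
// Our current defaults: "Good" quality with anti-aliasing turned right up.
function Awake () {
    QualitySettings.SetQualityLevel(3);  // "Good" - index 3 in the default list
    QualitySettings.antiAliasing = 8;    // max MSAA to tame the jaggies
}
```

(The anti-aliasing line comes second deliberately - SetQualityLevel applies the level's own settings, so we override the MSAA value afterwards.)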