Wednesday, December 2, 2009

When in doubt, use 0 instead of -1 for null

I use 16 bits in Albion to represent linetypes, which are looked up in a table. Since the dawn of time, I've had one special value for the 'continuous' linetype: -1. While lying awake in the summer heat, swatting mosquitoes, I realized it would be perfect if I could repurpose the top 2 bits to encode the line cap style (butt, square, round, and one spare). But there is just no clean way to do this and still have -1 represent the 'continuous', aka default, linetype.
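Here's a minimal sketch of the kind of packing I mean; the field names and layout are invented for illustration, not Albion's actual code:

#include <cstdint>

// 16-bit linetype word: top 2 bits = cap style, bottom 14 bits = table index.
enum CapStyle { CapButt = 0, CapSquare = 1, CapRound = 2 };

const int      kCapShift   = 14;
const uint16_t kIndexMask  = 0x3FFF;
const uint16_t kContinuous = 0;   // the null/default sentinel

uint16_t Pack(CapStyle cap, uint16_t index) { return uint16_t(cap << kCapShift) | (index & kIndexMask); }
CapStyle CapOf(uint16_t lt)                 { return CapStyle(lt >> kCapShift); }
uint16_t IndexOf(uint16_t lt)               { return lt & kIndexMask; }

// With 0 as the sentinel, the default decodes cleanly: cap = butt, index = 0.
// With -1 (0xFFFF) as the sentinel, both fields read as all-ones --
// CapOf(0xFFFF) == 3, IndexOf(0xFFFF) == 0x3FFF -- so every reader must
// special-case the sentinel before touching either field.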
I think the reason this problem exists is that in C, -1 is just two characters to type (and read), which makes it a concise way to represent null. If we were writing integer constants in binary, we wouldn't do this. The individual bits are likely to become useful as positional fields sooner or later, if only because a CPU is so good at masking and shifting, and the concept is so universal and easy to implement and understand.
So, long story short: all things being equal, use 0 for null, not -1. If you take the view that a set bit represents information, zero is the value best aligned with 'nothing'.

Saturday, November 14, 2009

8 cores, 8 times as slow

We all know that most workloads don't scale linearly as you add more cores. But you know you've really screwed something up when your times go UP as you add more cores!

Here's how I managed to achieve it:
In Albion's new renderer, I split the view up into tiles of, say, 256x256 pixels, and render the tiles on separate threads - regular graphics stuff. When all goes perfectly, the threads don't need to communicate with each other at all, and you get pretty much linear speedup. But one of the times when threads do need to communicate is when they're using shared resources - and fonts are one of those.
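In sketch form, the scheme is something like this (simplified for illustration, using C++11 threads for brevity; a real renderer would hand tiles to a pool of one worker thread per core rather than spawning one per tile):

#include <algorithm>
#include <thread>
#include <vector>

struct Rect { int x, y, w, h; };

void RenderTile(Rect tile) { /* draw the scene clipped to 'tile' */ }

void RenderView(int width, int height) {
    const int kTile = 256;
    std::vector<std::thread> workers;
    // Carve the viewport into kTile x kTile tiles, clamped at the edges.
    for (int y = 0; y < height; y += kTile)
        for (int x = 0; x < width; x += kTile)
            workers.push_back(std::thread(RenderTile,
                Rect{ x, y, std::min(kTile, width - x), std::min(kTile, height - y) }));
    for (size_t i = 0; i < workers.size(); ++i)
        workers[i].join();
}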
It's simplest for me to have just one font cache for the entire process, but obviously you need to synchronize access to it. When I originally created this font cache, I wasn't really thinking about synchronization, so when it came time to make it multithreaded, I just added a huge lock around every entry point and figured I'd make the locking finer-grained when I needed to. Yesterday I definitely needed to.
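The original scheme boiled down to this (a sketch with invented names, not the actual cache):

#include <cstdint>
#include <map>
#include <mutex>
#include <string>
#include <utility>

struct Glyph { /* rasterized bitmap, metrics, ... */ };

class FontCache {
public:
    const Glyph& Get(const std::string& font, uint32_t codepoint) {
        // One big lock around the whole entry point: correct, but it
        // serializes every render thread -- even on a cache hit.
        std::lock_guard<std::mutex> hold(big_lock_);
        std::pair<std::string, uint32_t> key(font, codepoint);
        auto it = cache_.find(key);
        if (it == cache_.end())
            it = cache_.insert(std::make_pair(key, Rasterize(font, codepoint))).first;
        return it->second;
    }

private:
    Glyph Rasterize(const std::string&, uint32_t) {
        return Glyph(); // the expensive part: hit the font engine and rasterize
    }

    std::mutex big_lock_;
    std::map<std::pair<std::string, uint32_t>, Glyph> cache_;
};

The painful part is that even a pure cache hit takes the lock, so threads that never miss still queue up behind one another. The usual finer-grained fixes apply: a reader-writer lock so hits can proceed in parallel, or a small per-thread glyph cache in front of the shared one.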
In the particular scene I was looking at, I was zoomed in close, so the spatial culling was making every tile render essentially the same objects. As it turned out, I had 8 threads doing close to 8 times the total work, yet unable to run in parallel. I don't think it gets worse than that!
This was really easy to find - you just hit pause in the debugger while it's running, and you see all your threads stalled at the same wait point.



Tuesday, April 14, 2009

WPF rendering on Vista is ugly and blurry

UPDATE: I discovered the problem. It was the NVidia drivers for my 8600 GT. The previous drivers were 78.13 (7813); the new drivers are 82.50 (8250). I'm running Vista x64 with .NET 3.5 SP1.

On my Vista machine, WPF rendering is nasty. I don't know what the engine is doing. My DPI setting is the default (96 DPI). Aero is turned on.


On Vista, witness the nastiness:

[screenshot: ugly, blurry WPF rendering on Vista]

On XP, the kind of quality we've come to expect:

[screenshot: crisp WPF rendering on XP]

I can't puzzle it out. It's incredibly odd that I can't find any mention of this anywhere on the web.