“Premature optimization is the root of all evil” is a famous quote in computer science (popularized by Donald Knuth), and it absolutely holds true. Before optimizing, you must make sure that you are optimizing the actual bottleneck, and that your optimization doesn’t end up making things worse.
These rules may seem obvious, but not everyone adheres to them; you’d be surprised how many newsgroup postings I see where people are asking how to solve the wrong problem because they didn’t take the time to profile their program and locate their real bottleneck.
One example of this kind of premature (or perhaps just not-fully-thought-through) optimization that bothers me on a daily basis is in the Microsoft Terminal Server client (mstsc.exe). Terminal Server is a remote windowing protocol, and as such its designers took great pains to improve responsiveness for users. In most cases, improving responsiveness over the network means minimizing the amount of data sent from the server to the client. In this spirit, the designers of Terminal Server implemented an innocent-seeming optimization: the Terminal Server client detects when it has been minimized, and when this happens, it sends a special message asking the server to stop sending window updates to the client. When the user restores the Terminal Server client window, the server resynchronizes with the client.
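To make the mechanism concrete, here is a minimal sketch (in Win32 C) of how a remote windowing client might implement this kind of minimize detection. The SendSuppressOutputPdu helper is hypothetical, standing in for whatever message the real client sends to pause and resume server updates:

#include <windows.h>

/* Hypothetical helper: tells the server to pause (allowUpdates == FALSE)
   or resume (allowUpdates == TRUE) display updates for this connection. */
void SendSuppressOutputPdu(BOOL allowUpdates);

LRESULT CALLBACK ClientWndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
{
    switch (msg)
    {
    case WM_SIZE:
        if (wParam == SIZE_MINIMIZED)
        {
            /* The window is no longer visible; ask the server to stop
               sending display updates in order to save bandwidth. */
            SendSuppressOutputPdu(FALSE);
        }
        else if (wParam == SIZE_RESTORED || wParam == SIZE_MAXIMIZED)
        {
            /* The window is visible again; ask the server to resume
               updates and resynchronize the display. */
            SendSuppressOutputPdu(TRUE);
        }
        return 0;
    }

    return DefWindowProc(hwnd, msg, wParam, lParam);
}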
This may seem like a clever little optimization with no downsides, but on closer inspection it actually worsens the user experience (at least in my opinion). First, consider how Terminal Server resynchronizes with the client when the client indicates that it wants to receive windowing data again. In its user interface design, Windows follows the model of not saving data that can be recalculated on demand. In many ways this is a perfectly valid model, and there are a number of good reasons for it (especially given that as you open more windows, it becomes nontrivially expensive to cache bitmap data for every window on the screen, all the more so on the very low-end systems that 16-bit Windows had to run on). As a result, when Windows wants to retrieve the contents of a window on screen, the typical course of action is to send the window a WM_PAINT message. This message asks the window to draw itself into a device context, a storage area from which the bits can then be transferred to the screen, a printer, or any other visual display device.
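For readers who haven’t written Win32 code, a minimal WM_PAINT handler looks something like the following sketch. Note that nothing about the window’s contents is saved between paints; every repaint recalculates the output from scratch, which is exactly the “recalculate on demand” model described above:

#include <windows.h>

LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
{
    switch (msg)
    {
    case WM_PAINT:
        {
            PAINTSTRUCT ps;
            HDC hdc = BeginPaint(hwnd, &ps); /* DC clipped to the invalid region */

            /* Recalculate and redraw the window contents from scratch;
               nothing was cached from the previous paint. */
            TextOut(hdc, 10, 10, TEXT("Redrawn on demand"),
                    lstrlen(TEXT("Redrawn on demand")));

            EndPaint(hwnd, &ps);
        }
        return 0;

    case WM_DESTROY:
        PostQuitMessage(0);
        return 0;
    }

    return DefWindowProc(hwnd, msg, wParam, lParam);
}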
If you’ve been paying attention, you might see where this starts to go wrong with Terminal Server. When you restore a minimized Terminal Server client window, the client asks the server to resynchronize. This is necessary because the server has stopped sending updates in the meantime, so the client has to assume that its display data is now stale. To perform this resynchronization, Terminal Server has to figure out what has changed in the overall desktop bitmap that describes the entire screen. Terminal Server is free to cache the entire contents of a session’s interactive desktop (and indeed it must, so that the whole desktop doesn’t have to be transferred to the client as a bitmap during resynchronization), but it still needs to compare the last copy of the bitmap that was sent to the client with the “current” view of the desktop. To produce that current view, Terminal Server essentially asks each visible window on the desktop to paint itself. Then, Terminal Server can update the client with new display data for each window.
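Conceptually (and this is only a sketch of the idea, not the actual Terminal Server implementation), the comparison step might look something like the following, where QueueTileUpdate is a hypothetical helper that queues a dirty rectangle for transmission to the client:

#include <stdint.h>
#include <string.h>

#define TILE_SIZE 64

/* Hypothetical helper that queues one changed rectangle for the client. */
void QueueTileUpdate(int x, int y, int w, int h);

void DiffFramebuffers(const uint32_t *lastSent, const uint32_t *current,
                      int width, int height)
{
    for (int ty = 0; ty < height; ty += TILE_SIZE)
    {
        for (int tx = 0; tx < width; tx += TILE_SIZE)
        {
            int tw = (tx + TILE_SIZE > width)  ? width  - tx : TILE_SIZE;
            int th = (ty + TILE_SIZE > height) ? height - ty : TILE_SIZE;
            int dirty = 0;

            /* Compare this tile row by row against the cached copy
               that was last sent to the client. */
            for (int row = 0; row < th && !dirty; row++)
            {
                const uint32_t *a = lastSent + (size_t)(ty + row) * width + tx;
                const uint32_t *b = current  + (size_t)(ty + row) * width + tx;
                dirty = (memcmp(a, b, (size_t)tw * sizeof(uint32_t)) != 0);
            }

            if (dirty)
                QueueTileUpdate(tx, ty, tw, th);
        }
    }
}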
The problem here is that many programs don’t repaint themselves very gracefully. Many have the unpleasant tendency to trigger multiple draw operations over the same region before the end result is achieved, which manifests as a slightly annoying flicker when a window repaints. Even Microsoft programs exhibit this problem; Visual Studio 2005 tends to do it, for instance, as does Internet Explorer when drawing pages with foreground images overlaid on background images.
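To illustrate, here is a sketch of a flicker-prone multi-pass paint routine alongside the standard double-buffering fix, in which all of the passes are composed into a memory bitmap and only the finished result reaches the screen (and, over Terminal Server, the wire). The drawing calls themselves are placeholders:

#include <windows.h>

/* Flicker-prone: each pass is realized on screen (and, under Terminal
   Server, sent over the network) before the final result is in place. */
void PaintWithFlicker(HWND hwnd)
{
    PAINTSTRUCT ps;
    HDC hdc = BeginPaint(hwnd, &ps);

    FillRect(hdc, &ps.rcPaint, (HBRUSH)(COLOR_WINDOW + 1)); /* pass 1: erase */
    /* ... pass 2: draw the background images ... */
    /* ... pass 3: draw the foreground images over them ... */

    EndPaint(hwnd, &ps);
}

/* Double-buffered: all passes go to a memory bitmap, and only the finished
   result is copied to the screen in a single BitBlt. */
void PaintBuffered(HWND hwnd)
{
    PAINTSTRUCT ps;
    HDC hdc = BeginPaint(hwnd, &ps);

    RECT rc;
    GetClientRect(hwnd, &rc);

    HDC     memDC  = CreateCompatibleDC(hdc);
    HBITMAP memBmp = CreateCompatibleBitmap(hdc, rc.right, rc.bottom);
    HBITMAP oldBmp = (HBITMAP)SelectObject(memDC, memBmp);

    FillRect(memDC, &rc, (HBRUSH)(COLOR_WINDOW + 1));
    /* ... draw the background and foreground passes into memDC ... */

    /* Exactly one draw operation reaches the screen (and the wire). */
    BitBlt(hdc, 0, 0, rc.right, rc.bottom, memDC, 0, 0, SRCCOPY);

    SelectObject(memDC, oldBmp);
    DeleteObject(memBmp);
    DeleteDC(memDC);

    EndPaint(hwnd, &ps);
}

With the buffered version, the client sees a single draw command per repaint instead of one per pass, which is precisely what a well-behaved program running over Terminal Server should aim for.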
Now, while this may be a minor annoyance when working locally, it turns out to be a big problem when a program is running over Terminal Server. What would otherwise be an innocuous flicker lasting a couple of milliseconds on a “glass terminal” display turns into multiple draw commands being sent and realized over the network to the Terminal Server client. This translates into wasted bandwidth as redundant draw commands are transmitted and, even worse, a lack of responsiveness when restoring a minimized Terminal Server client window (since the client has to wait for the programs on the Terminal Server to finish updating themselves during the resynchronization process). If you have several programs running on the Terminal Server, this can amount to three or four seconds of waiting before the Terminal Server session responds to input from the client.
While this is annoying in and of itself, it may still not seem all that bad. After all, the problem only happens when you minimize and restore a window, and you don’t generally minimize and restore windows all the time, right? It turns out that with the Terminal Server client, most people do exactly that when working in fullscreen mode. Remember that fullscreen Terminal Server obscures the taskbar on the physical client computer and, in many cases, results in task-switching keystrokes such as Alt+Tab or the Windows key being sent to the remote Terminal Server session rather than the physical client system. To switch to another program on the physical client computer, then, you need to either minimize the Terminal Server window or (perhaps temporarily) take it out of fullscreen mode. At least for me, if I want to switch to a program on the physical client system, the logical choice is to hit the minimize button on the Terminal Server client info bar at the top of the fullscreen Terminal Server client window. Unfortunately, that little minimize button invokes the clever redraw optimization that stops the server from updating the client. This means that when I switch back to the Terminal Server session, I have to wait several seconds while the programs running in the session finish redrawing themselves and transmitting their draw operations to the client (which is especially painful with bitmap-heavy content, such as Internet Explorer on a page with foreground images overlaying a background image).
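(As an aside, which key combinations go to the remote session is controlled by the connection settings; if I recall the setting name correctly, the relevant knob in a saved .rdp file is keyboardhook:

keyboardhook:i:0   (key combinations apply to the local computer)
keyboardhook:i:1   (key combinations apply to the remote session)
keyboardhook:i:2   (key combinations apply to the remote session in fullscreen mode only)

None of these values help with the minimize problem itself, though; they only control where the keystrokes land.)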
As a result, thanks to somebody’s “clever optimization”, my Terminal Server sessions now take several seconds to come back when I switch away from them to work on something locally (perhaps to copy and paste some text from the remote system to my computer) and then switch back.
Now, Terminal Server is on the whole a great example of a highly optimized program (and it’s absolutely usable, don’t get me wrong about that). For one, it beats the pants off VNC and every other remote windowing system I have ever used. However, this just goes to show that even with the best of intentions, one little optimization can blow up in unintended (and negative) ways if you are not careful.
Oh, and if you run into this little annoyance as frequently as I do, there is one thing you may be able to do to alleviate it (at least going forward). When using the Windows Vista (or later) Terminal Server client to connect to a Windows Vista or Windows Server “Longhorn” Terminal Server (or Remote Desktop), you can avoid this lack of responsiveness when restoring minimized Terminal Server windows by enabling desktop composition on the Terminal Server connection.

This may seem counter-intuitive at first (enabling 3D transparent windows would certainly make you think that much more data needs to be transferred, slowing down the experience as a whole), but if you are on a high-bandwidth, low-latency link to the target computer, desktop composition actually improves responsiveness when restoring minimized Terminal Server windows. The reason is that with desktop composition enabled, Windows breaks from the traditional model of not saving data that can be recalculated. Instead, Windows saves the contents of every window on the screen, so that if it needs the bits of a window, it doesn’t have to ask the window to redraw. (This enables all sorts of neat tricks, such as the way a window can appear to be drawn twice in the new live-preview Alt+Tab window on Windows Vista without a major performance hit; try it with a 3D game in windowed mode to see what I mean.)

Because of this caching of window data, when resynchronizing with the client after a minimize-and-restore operation, the server end of Terminal Server doesn’t need to ask every program to redraw itself; all it needs to do is fetch the bits out of the cache that desktop composition maintains for each window. As a result, the differences sent to the client reflect only “real” differences, not the multiple layers of a redraw operation. (Try this with an Internet Explorer window open on a page with foreground images overlaying background images, and the difference between Terminal Server with and without desktop composition is immediately visible.) This means there are no more painful multi-step redraw operations visible in real time on the client, at least in pathological bitmap-drawing cases such as Internet Explorer (and no annoying flicker in the less severe cases).
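If you want to try this out, desktop composition can be requested from the Experience tab of the Terminal Server client, or (again, if I recall the setting names correctly) directly in a saved .rdp file; note that composition also requires a 32-bit color depth on the connection:

allow desktop composition:i:1
session bpp:i:32

Of course, both the client and the server have to support desktop composition for this to take effect.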
Sounds like you found the problem I’ve seen for the last N years (I guess my usage really amplifies the effect beyond what most people would experience)…
For the longest time I would reconnect to my TS session (change networks, restore the TS window, whatever) and I’d have to wait for explorer to finish repainting itself… it seems to loop through redrawing the desktop with the system tray, then without the system tray (the icon alignment adjusts slightly with each flicker)… this alone is mildly annoying, but especially so when I’m waiting HALF AN HOUR for the updates to finish.
I suspect it gets batched up when I minimize the session all night or all day at work (I’m constantly connected to at least two TS sessions). I’m slightly disappointed I can’t simply change this behavior… oh well.
Glad to hear Longhorn will fix this (though arguably I’ll be looking more into app remoting instead, so it’d seem the desktop problem wouldn’t be as likely to affect me).
Thanks,
-Scott
Keep in mind that desktop composition mode is still fairly sensitive to bandwidth and latency, though, so on slow or high-latency links you’re probably better off with standard GDI mode.