We currently have no option but to connect to several machines in the field over a dial-up connection that, at best, connects at around 33.6 kbps. To say performance is lacking is an understatement.
Previously, at the same location, machines running Windows XP on the same external modems gave us much better performance. I blame most of the performance problem on all of the 'eye candy' in Windows 7; however, we have already disabled just about everything we can to cut down on overall bandwidth use.
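For reference, the client-side .rdp options we have been applying to strip out the visual extras look roughly like the following. I am quoting these from memory, so the exact names and values on our clients may differ slightly, and the "connection type" line may only apply to the newer RDP 7 client:

session bpp:i:8
connection type:i:1
compression:i:1
bitmapcachepersistenable:i:1
disable wallpaper:i:1
disable full window drag:i:1
disable menu anims:i:1
disable themes:i:1
allow desktop composition:i:0
allow font smoothing:i:0

Even with all of that turned off and compression enabled, the problem described below still builds up over the course of a session.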
The most noticeable of the issues, and the one I am hoping to resolve here, is difficult to explain. It is almost as if the screen images destined for the remote client are being generated faster than they can be sent, and the longer the session stays connected, the larger the gap grows between what is being done and what is being seen.
I will try to describe it better if this is not clear enough; just ask me to clarify.
Is there a way to tell the host to periodically purge whatever buffer or queue is used to send the desktop imagery back to the remote client, or to transmit that imagery only when a region of the screen has actually been updated? I believe something like one of these options would compensate for the lack of available bandwidth.
Thanks in advance for any help.