Server Based Computing: Goldmine Or Snakepit? (Part 2)
Wednesday, 06 September 2006 by Michel Roth
In the second part of this series, I will discuss some of the disadvantages of Server Based Computing and the limitations of Windows Server Based Computing. We'll also talk about what you need to know to make your Server Based Computing initiative a success.

Cons

Of course Server Based Computing isn't a miracle cure for all your computing needs. There are some issues you need to know about. Consider:

  • More Knowledge Required
    Dealing with Server Based Computing requires in-depth knowledge of Windows and all the applications you use. This is because Terminal Server introduces the concept of a multi-user operating system. You need to think about this before you blindly (try to) install applications. One of the biggest challenges will be to configure the environment in such a way that applications and users can co-exist on the same server without conflicting with each other. Another challenge is to manage your Server Based Computing environment in such a manner that all servers remain identical (a small example of scripting an installation with that goal in mind is sketched after this list).
  • Start-up Costs
    The initial cost of deploying Windows Server Based Computing can be significant. There are quite a few costs associated with starting up a new Server Based Computing initiative. Consider the investment in new (Terminal) servers: Terminal Servers tend to be high-end servers with lots of memory and plenty of processing power to accommodate as many simultaneous users as possible. Then there's the cost of TSCALs (Terminal Server Client Access Licenses), and if you're using a third-party product to add to the functionality of Terminal Server, Citrix for example, you have to take those costs into account as well. Sometimes there's even more software associated with a Windows Server Based Computing deployment, like some kind of management framework or an application distribution suite. The cost of these kinds of tools should also be taken into account.
  • More SPOFs
    SPOF stands for Single Point of Failure. The definition is sometimes subject to debate, but let me elaborate on what it means in this context. In Server Based Computing, by definition, everything runs on the server. If one of these servers has a problem, everyone on that server has a problem. If, for example, the Terminal Server Licensing server has a (serious) problem, everyone has a problem. You can of course mitigate these SPOFs, but that results in additional costs. The point is simply that you have to think about it.
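To make the point about keeping servers identical a little more concrete: applications on a Terminal Server should be installed in install mode so that their settings are registered for all users, and scripting that step is one way to make sure every server gets exactly the same installation. The Python sketch below wraps a silent installer in "change user /install" / "change user /execute"; the installer path and its switches are hypothetical placeholders, and the script is an illustration rather than a finished deployment tool.

# Minimal sketch: run a silent installer in Terminal Server install mode
# so its settings are registered for all users, then switch back.
# The installer path and silent switches below are hypothetical.
import subprocess
import sys

INSTALLER = r"\\fileserver\packages\someapp\setup.exe"   # hypothetical path
SILENT_ARGS = ["/quiet", "/norestart"]                    # depends on the installer

def install_on_terminal_server():
    subprocess.run(["change", "user", "/install"], check=True)    # enter install mode
    try:
        result = subprocess.run([INSTALLER, *SILENT_ARGS])        # run the setup silently
        return result.returncode
    finally:
        subprocess.run(["change", "user", "/execute"], check=True)  # back to execute mode

if __name__ == "__main__":
    sys.exit(install_on_terminal_server())

Run the same script (or your deployment tool's equivalent) on every Terminal Server and the odds of servers drifting apart go down considerably.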

If the shoe doesn't fit...

It's important to know the limitations of (Windows) Server Based Computing so you can steer clear of these areas when you are implementing Server Based Computing. Typical problem areas of Server Based Computing are:

  • Graphically Intensive Applications
    Server Based Computing isn't well suited to handling large amounts of heavy graphics (think AutoCAD or video editing here). This is because of the mechanics of the protocols used in Windows Server Based Computing. The Terminal Server protocols, RDP and ICA for example, are built for low-bandwidth connections. One of the reasons these protocols use so little bandwidth is that all the Terminal Server sends to the client is screen updates. When the Terminal Server runs a graphically intensive application or video, the protocol simply cannot keep up because it was not designed for this. Citrix has tried to combat this with their SpeedScreen technologies, but this is just a "patch" (the protocol is still fundamentally the same). Citrix is also working on Project Ocelot, which should enable extreme graphics in Terminal Server environments. Another company that specializes in enhancing Terminal Server graphical performance is ThinAnywhere.
  • Scientific Applications
    OK, so calc.exe in scientific mode is scientific enough for me, but that's not what I'm talking about. Scientific programs are programs that need to claim the CPU for long periods of time or claim monstrous amounts of memory to perform very complex calculations. As long as the output isn't too graphical, these programs might run just fine. The only problem is that you can only have one instance or so running per server, because that application eats up your whole server. This defeats the whole purpose of Terminal Server.
  • Bandwidth
    Server Based Computing is always "live". The actual computing is done on the Terminal Servers. All the client does is send keystrokes and mouse movements to the Terminal Server to operate the session. Because of this mechanism, the (quality of the) network connection from the Terminal Server to the client is extremely critical. Slow or unstable network connections seriously hinder Terminal Server sessions. No network connection means no show. Citrix has put some technologies into their Presentation Server product that make their protocol less susceptible to these kinds of network problems, but they only help with minor issues (latency, packet loss); no network at all still means the end of the road, even for Citrix. This puts special demands on network connections (LAN as well as WAN) in Server Based Computing environments (a rough way to sample connection latency is sketched after this list).
  • Peripherals
    Earlier I told you that all the client does is send keystrokes and mouse movements to the Terminal Server. But what if you have a scanner, smartcard reader or webcam? Then you will most likely have a problem. The problem stems from the fact that these devices need to be supported on the local operating system. Local operating system? Yes, every Thin Client device needs some kind of operating system to work. These operating systems tend to be extremely small and optimized. Drivers usually take up the bulk of an installation, so they are typically left out. Another limiting factor is that a lot of Thin Clients are based on some kind of Linux, and drivers for Linux aren't as abundant as they are for Windows. And even if the driver is present in your Thin Client operating system, that is still no guarantee that your device will work in your Terminal Server session.
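Because the quality of that connection matters so much, it helps to get at least a rough feel for the latency between a client location and the Terminal Server before rolling out. The Python sketch below simply times a few TCP connections to the server's RDP port (3389 by default); the host name is a made-up placeholder and this is only a quick illustration, not a replacement for real network monitoring.

# Rough sketch: time a few TCP connections to a Terminal Server's RDP port
# to get a feel for the latency a session would experience.
# The host name below is a hypothetical placeholder.
import socket
import time

HOST = "termserv01.example.local"   # hypothetical Terminal Server
PORT = 3389                         # default RDP port
SAMPLES = 5

def connect_times(host, port, samples):
    times = []
    for _ in range(samples):
        start = time.perf_counter()
        with socket.create_connection((host, port), timeout=5):
            pass                                              # connect, then close immediately
        times.append((time.perf_counter() - start) * 1000.0)  # milliseconds
    return times

if __name__ == "__main__":
    rtts = connect_times(HOST, PORT, SAMPLES)
    print("connect times (ms):", ", ".join("%.1f" % t for t in rtts))
    print("average: %.1f ms" % (sum(rtts) / len(rtts)))

Consistently high or wildly varying numbers are a warning sign that users at that location are going to have a poor session experience.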

My Personal Advice For Server Based Computing

During the years that I've been working with Server Based Computing, I've come up with my own "best practices" based on my experiences.

  • Get Help
    Because Server Based Computing requires in-depth knowledge of Windows and all the applications you use, you should really hire in expertise to help you design, build and implement your Server Based Computing environment. For example, in the early stages of your environment you will need to make important (design) decisions, which you cannot make properly if you do not have the required knowledge. These decisions can determine the success of your Server Based Computing implementation. You get the best results if the hired help works closely with the future in-house administrators of the environment.
  • Invest In Knowledge
    You should make sure that after the implementation of a Server Based Computing environment (by the aforementioned hired help), there is enough knowledge in your organisation to manage the new environment. There are plenty of examples out there where the design and implementation were flawless, but management was done by people with too little knowledge of the environment, resulting in a flawed environment.

    Therefore you should invest in knowledge. As mentioned in the previous paragraph, make sure future administrators work closely with the hired help. In my opinion this is the most effective form of knowledge transfer. Also invest in courses and workshops, not just for IT staff but also for managers.
  • Go SBC All The Way
    If you deploy Server Based Computing properly, you can save a lot of money. It is, however, important that you stay as true to the concept as you realistically can. This is very important. If you do a half-baked implementation of Server Based Computing, it will cost you more money than it saves you. So don't, for example, run an environment that is half Client-Server and half Server Based Computing, because that will make your environment even more complex to manage. Try to deploy Thin Clients as much as possible. If you want to utilize your older hardware, "convert" it to Thin Clients.
  • Carefully Select The Right Thin Client
    A common mistake I've seen at Server Based Computing implementations is the lack of a proper Thin Client selection procedure. You should not underestimate the importance of the Thin Client. If you do Server Based Computing properly, you'll end up with the bulk of your clients being thin, so there's a lot riding on it. Make sure the device is fit for your environment. It's out of scope here to discuss how you should select the Thin Client for your environment (that would be an article by itself), but at the very least take a look at the video performance (the Achilles' heel of Thin Clients) of the devices you are considering. Don't just blindly go for the cheapest unit or try to stay true to your brand; it's way too important for that.
  • Reduce The Number Of Applications
    One part of any Server Based Computing implementation is making an inventory of all the applications you have, because you will have to install these applications on your Terminal Servers. Taking the inventory is also a good opportunity to weed out applications: you'll find that there are applications that aren't even being used anymore, are not licensed, or perform the exact same function as another. Needless to say, the fewer applications you have to support, the better (a simple script to get such an inventory started is sketched after this list).
  • People Management
    Like I said in part one of this article, the biggest cost savings you can achieve in Server Based Computing environments are in managing the environment. There will be a LOT less desktop management to do if you implement Server Based Computing properly. Depending on the country you live in, it may be impossible to lay someone off immediately after you've realized that there's no more (desktop) management for him/her to do. So you need to think about this before you start. Perhaps you can retrain these people to become Terminal Server administrators or move them to other departments. Whatever you do, think about it before it's too late.
  • Don't Push It
    Some applications, devices or other entities just aren't fit for Server Based Computing. Don't try to force them into the new Server Based Computing environment. Often these kinds of solutions are based on custom programming, cost a lot of money and are hard to manage. To rephrase a well-known saying: if the shoe doesn't fit, don't put it on...
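As a starting point for that application inventory, the sketch below reads the Uninstall keys from the Windows registry and prints what a machine believes it has installed. Run on each Terminal Server (or pointed at candidate desktops before migration), the combined lists make duplicate and forgotten applications easy to spot. It is a minimal illustration, not a complete inventory tool.

# Minimal sketch: list installed applications from the registry's Uninstall
# keys (both the native and the 32-bit-on-64-bit views). Comparing these
# lists across machines helps spot duplicate or forgotten applications.
import winreg

UNINSTALL_PATHS = [
    r"SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall",
    r"SOFTWARE\WOW6432Node\Microsoft\Windows\CurrentVersion\Uninstall",
]

def installed_applications():
    apps = set()
    for path in UNINSTALL_PATHS:
        try:
            key = winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, path)
        except OSError:
            continue                      # key does not exist on 32-bit systems
        with key:
            for i in range(winreg.QueryInfoKey(key)[0]):
                with winreg.OpenKey(key, winreg.EnumKey(key, i)) as sub:
                    try:
                        name, _ = winreg.QueryValueEx(sub, "DisplayName")
                        apps.add(name)
                    except OSError:
                        pass              # entry without a display name
    return sorted(apps)

if __name__ == "__main__":
    for app in installed_applications():
        print(app)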

Conclusion

So I guess it's time for the million dollar question: do I think you should get into Server Based Computing? Yes, you should. If you do it properly, it can save you a lot of money. Remember, Server Based Computing is almost never a 100% solution. It usually coexists with other solutions. That's no problem. As long as you stay true to the Server Based Computing formula within the guidelines I've tried to set out in this article, you'll be fine.


This article was previously published at MSTerminalServices.org.

