How Cloud Is Bringing the 'Desktop' Back to 'VDI'

If you stopped selling virtual desktops because of back-end infrastructure cost and complexity, paired with less-than-stellar performance, cloud can address both. It might be time for a second look.

March 23, 2018

By Amitabh Sinha

The “desktop” part of a “virtual desktop infrastructure” engagement often gets lost in the shuffle. It’s understandable that MSPs focus a lot of attention on the underlying infrastructure. Conventional wisdom says that if performance is poor, end users will revolt. But the fact is, when all the focus is on the “I,” and the “D” is an afterthought, those users often still end up unhappy, with IT stakeholders wondering why they spent all that money.

When IT buys PCs, they focus on how much CPU, memory and storage come with each device. Why should a virtual desktop be any different? Yet in most VDI deals, MSPs spend most of their time talking about infrastructure: servers, storage, layers, I/O and management tools. It’s exhausting and expensive.

Let’s look at a typical path to VDI.

IT’s been spending about $1,500 per PC, with that cost amortized over three to four years. But the overhead of dealing with physical desktops is high, and as the world goes mobile, tethering an employee to a bulky PC is a sure way to give competitors the advantage. In response, and in light of technical advancements, some IT teams are taking another look at virtual desktops and apps. The promise of greater IT efficiency, information security and a productive mobile workforce is certainly attractive. But in most cases, MSPs begin the discussion by helping customers translate the desktop attributes into expensive and complex data-center technologies. IT stakeholders then find themselves asking, “If I have 1,000 users, how many servers do I need? What is the CPU and memory usage rate? How many users can I fit onto a given class of server? In which data centers do I put all this infrastructure?”
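To see why those questions dominate the conversation, here is a minimal back-of-the-envelope sizing sketch for 1,000 users. All of the per-user and per-server figures below are illustrative assumptions, not numbers from this article; real sizing depends on workload profiles, overcommit policy and the hardware actually quoted.

```python
import math

# Hypothetical inputs -- none of these figures come from the article.
users = 1000
vcpus_per_user = 2        # assumed desktop configuration
ram_gb_per_user = 4
cpu_overcommit = 4        # assumed vCPU:core overcommit ratio for VDI

server_cores = 32         # assumed dual-socket host
server_ram_gb = 512

# Users per server is limited by whichever resource runs out first.
by_cpu = (server_cores * cpu_overcommit) // vcpus_per_user
by_ram = server_ram_gb // ram_gb_per_user
users_per_server = min(by_cpu, by_ram)

servers_needed = math.ceil(users / users_per_server)
print(f"{users_per_server} users/server -> {servers_needed} servers")
# 64 users/server -> 16 servers
```

Even this toy model shows how quickly the discussion turns into capacity planning rather than desktops: change any one assumption and the server count, and the quote, moves with it.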

Figuring out storage needs has been tougher still. Local storage on PCs is the cheapest available, at roughly $100 per TB. SAN/NAS can cost 25 to 100 times as much. If each of those 1,000 users had 1TB of storage on their desktop, you would need 1,000TB of SAN/NAS, which is massively expensive.
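The arithmetic behind that claim, using only the figures above ($100/TB local, SAN/NAS at 25-100x that price), looks like this:

```python
# Back-of-the-envelope storage cost comparison using the article's figures.
users = 1000
tb_per_user = 1
local_cost_per_tb = 100                 # ~$100/TB for local PC storage
san_mult_low, san_mult_high = 25, 100   # SAN/NAS at 25-100x local cost

total_tb = users * tb_per_user
local_total = total_tb * local_cost_per_tb
san_low = local_total * san_mult_low
san_high = local_total * san_mult_high

print(f"Local: ${local_total:,}  SAN/NAS: ${san_low:,} - ${san_high:,}")
# Local: $100,000  SAN/NAS: $2,500,000 - $10,000,000
```

In other words, storage that costs about $100,000 spread across physical PCs becomes a $2.5 million to $10 million line item when centralized on enterprise SAN/NAS.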

That’s how the “I” got complicated.

In response, VDI vendors came up with various ways to optimize storage. The conversation might go something like this: “Oh, you can optimize with a single image so you don’t need to have 1,000 copies of Windows OS. Now, let’s put in layers so you don’t need to have 1,000 copies of each application. Wait, what about profile-management tools to store end-user personalization? You need that, too. Oh, and you can no longer manage it all with existing management tools like SCCM and Altiris. So, your VDI infrastructure is a standalone management framework.”

All of this sounds workable on paper, but Windows wasn’t architected to operate in this manner, so customers struggle with app compatibility, corrupted profiles and application updates. At the same time, storage vendors started implementing deduplication so that those 1,000 copies of Windows and applications were automatically deduped at the storage layer. Hyperconverged infrastructure (HCI) vendors ultimately adopted deduplication, and even though HCI has started to make a dent in the cost of VDI implementations, it hasn’t gone far enough.

Now you have to think about where all this infrastructure is going to live. Which data center should it be in? How far away will your end users be from that data center? What does that mean for latency? What will the user experience be like? How much bandwidth will users require?

You’ll notice, there’s been almost no talk of the desktop in this engagement. The last straw is often when IT departments have to jump through complex infrastructural hoops to deliver a mission-critical workload to a high-priority class of user. MSPs often run into a wall when IT stakeholders decide they have more important things to do, and more value to add, than dealing with all this complexity.

Getting the ‘D’ Back

Cloud allows the focus to return to desktops; in fact, it provides an opportunity to completely re-imagine what the phrase “virtual desktops” means. The data center is any region of the public cloud you select. Essentially, the infrastructure in that region becomes invisible, at least in the sense that no one has to worry about it. Desktops can be placed close to users so they have a great experience. All IT needs to do is determine the configuration of the desktop, just as they determine the configuration of a PC.

As an MSP, you can help the IT team test drive and choose a desktop configuration running in AWS, Azure or the cloud of their choice (though beware of “cloud-washed” VDI). You help select thin-client hardware, or let employees use their own gear. Order the number of virtual-desktop units needed, then use the customer’s corporate image to create copies in the various regions where users reside. Rather than shipping imaged PCs to end users, an expensive proposition in itself, you simply email links to the desktops.

The discussion goes from, “How many servers do we need?” to “How much CPU do we need?” You as an MSP can deliver VDI efficiently and simply, freeing you to work on higher-value projects and providing users with a better experience. If you have not considered offering virtual desktops as a service to your customers, it’s time to reconsider, because the cloud has put the D back in VDI.

Amitabh Sinha has more than 20 years of experience across enterprise software, end-user computing, mobile and database software. Amitabh co-founded Workspot with Puneet Chawla and Ty Wang in August 2012. Prior to Workspot, Amitabh was the general manager for enterprise desktops and apps at Citrix Systems. In his five years at Citrix, Amitabh was vice president of product management for XenDesktop and vice president of engineering for the Advanced Solutions Group.
