Eighty percent of the time, integrating existing applications into a software-defined data center (SDDC) isn’t that complicated. That’s especially true of applications that are no more than a few years old, use open source technologies or are already cloud-enabled. It’s the other 20 percent, the legacy apps that companies simply can’t do without, that are the issue. Often, these applications were built with proprietary technologies, using out-of-date methods and code.
Integration is definitely a challenge for your customers. According to research from Enterprise Management Associates (EMA), integrating legacy and new technologies is one of the three top pain points caused by silos within IT.
In these cases, the best-case scenario is when customers are willing to start over with a more modern application, one that is built with open source technologies and is cloud-ready out of the box. But there are cases when these kinds of applications just won’t do. Think of a specialty manufacturer that built a custom solution when an off-the-shelf ERP system wouldn't work for its specialized processes. This type of specialty software is expensive and highly customized, and it would be very difficult for customers that rely on such applications to simply give them up.
How can you as a partner help?
As long as your customers' SDDC is based on open standards (and it always should be), there is a workaround. An open platform helps enable compatibility and consistency with applications running on legacy infrastructures.
The most stubborn integrations may require external tools. An EMA report found that integration in an SDDC often requires third-party vendors and plug-ins to provide the interfaces.
To help your customers choose the right tools, note which management systems both the legacy applications and the software-defined data center are using, and explore which tools will best unify the information from these two administrative layers, advises Johnnie Konstantas, director of security solutions marketing and business development at Gigamon.
“Important also is the ability to maintain visibility into the information flowing among the tiers of the application, whether in a legacy deployment scenario or a converged one,” she says. “A visibility fabric can provide this insight into the data in motion among application tiers, and ensure that troubleshooting and security access controls can be consistently applied.”
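In practice, "unifying the information from these two administrative layers" often comes down to correlating inventory or status records from both management systems. The sketch below is purely illustrative, not any specific vendor's API: the `fetch_legacy_inventory` and `fetch_sddc_inventory` stubs and all field names are hypothetical stand-ins for whatever export each real system provides.

```python
# Hypothetical sketch: merge asset records from a legacy management
# system and an SDDC manager into one unified view, keyed by hostname.
# The stub functions and fields are illustrative only; a real script
# would pull this data from each system's actual API or export.

def fetch_legacy_inventory():
    # Stand-in for the legacy management system's asset export
    return [
        {"hostname": "erp-db-01", "os": "AIX 7.2", "owner": "manufacturing"},
        {"hostname": "erp-app-01", "os": "Windows 2012", "owner": "manufacturing"},
    ]

def fetch_sddc_inventory():
    # Stand-in for the SDDC manager's virtual machine listing
    return [
        {"hostname": "erp-app-01", "cluster": "prod-a", "vcpus": 8},
        {"hostname": "web-01", "cluster": "prod-b", "vcpus": 4},
    ]

def unify_inventories(legacy, sddc):
    """Merge records by hostname and flag which layer manages each host."""
    merged = {}
    for rec in legacy:
        merged[rec["hostname"]] = {**rec, "managed_by": ["legacy"]}
    for rec in sddc:
        entry = merged.setdefault(
            rec["hostname"], {"hostname": rec["hostname"], "managed_by": []}
        )
        entry.update({k: v for k, v in rec.items() if k != "hostname"})
        entry["managed_by"].append("sddc")
    return merged

if __name__ == "__main__":
    view = unify_inventories(fetch_legacy_inventory(), fetch_sddc_inventory())
    for host, info in sorted(view.items()):
        print(host, info["managed_by"])
```

A view like this makes the gaps obvious: hosts tagged only "legacy" are invisible to the SDDC tooling, which is exactly where the third-party plug-ins and visibility tools mentioned above earn their keep.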
Another tool that can be particularly useful is the Gateway Interface Module, which connects the software-defined infrastructure to the legacy architecture. This allows the SDDC to be built on top of existing infrastructure investments.
Incorporating external tools isn’t for the faint-hearted, which is where MSPs come in.
With that said, there is a lot that IT ops and data center architects can do to move things along more quickly.
“What’s important is an agreement among the various teams that are impacted as to the architecture to be used," says Konstantas. "If they can standardize on one approach, it helps contain the complexity and make scaling and customization faster.”
For more information on the Software-Defined Data Center, please visit http://www.vmware.com/content/dam/digitalmarketing/vmware/en/pdf/techpaper/technical-whitepaper-sddc-capabilities-itoutcomes-white-paper.pdf.
Guest blogs such as this one are published monthly and are part of Talkin' Cloud's annual platinum sponsorship.