The application should dictate the best virtualization platform, not vice versa. The days when all applications were virtualized on a single hypervisor are over. The debate about hypervisor features is largely closed, especially since the big platforms have already converged to a great extent. In practice, heterogeneous virtualization environments have become established. For corporate users who fear vendor lock-in, that is good news.
Application Should Dictate the Virtualization Platform, Not Vice Versa
Many organizations are now using open-source hypervisors and Linux virtualization. The features available in this area are at least good enough, especially in Linux environments, although many workloads have yet to be virtualized. The standard recommendation in this context is: let your applications and operating systems determine the best way to virtualize them. IT managers can then reserve advanced management features for those applications that really need them. Sooner or later, the prognosis goes, all providers of virtualization management software will accept heterogeneity in the enterprise and support it in their products.
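To illustrate what managing such a heterogeneous environment can look like, the open-source libvirt library exposes a single API across several hypervisors. The sketch below (Python, using the libvirt bindings) lists the virtual machines on a local KVM host and a remote Xen host through the same code path; the remote host name is a hypothetical example.

```python
# Minimal sketch: managing a heterogeneous environment through one API.
# Requires the libvirt-python bindings; the remote host URI is hypothetical.
import libvirt

# Each hypervisor is addressed by a connection URI; the loop body is
# identical regardless of the underlying platform.
uris = [
    "qemu:///system",                    # local KVM/QEMU host
    "xen+ssh://admin@xen-host1/system",  # remote Xen host (hypothetical name)
]

for uri in uris:
    conn = libvirt.open(uri)
    try:
        for dom in conn.listAllDomains():
            state, _reason = dom.state()
            print(f"{uri}: {dom.name()} (state={state})")
    finally:
        conn.close()
```

The same pattern extends to other platforms supported by libvirt drivers, which is one way a management layer can remain hypervisor-neutral.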
Microsoft, with its free Hyper-V hypervisor, is trying to break VMware’s dominance, especially in the SMB market and at the departmental level in larger companies. The question of whether the VM container is freely available is now largely irrelevant: in fact, customers pay for the management stack, and that is to be welcomed. Whether they prefer VMware vCenter, Microsoft System Center, or any other management suite, users today can choose the hypervisor freely, including open-source options. VMware, under considerable pressure, is trying to tie customers ever more closely to its management tools. The bottom line of the forecast for 2013: management tools will become more powerful AND cheaper. The real battle for market share should take place in this segment.
---
Software-Defined Data Center and Virtualization
The concept of the software-defined data center is, from the experts’ point of view, the next logical step toward highly automated and efficient IT operations. Behind it lies essentially a natural extension of virtualization to the storage and network levels. Starting mid-year, analysts expect the first integrated solutions for software-defined computing, encompassing network and storage resources, to appear. VMware, for example, has taken a big step in this direction with the acquisition of a couple of companies and already has a powerful suite of APIs for storage integration. But Microsoft is also investing heavily in the subject and has equipped Windows Server 2012 with an array of features for storage and network virtualization.
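As a concrete, if simplified, example of what "software-defined" means at the storage level: resources are declared through an API rather than configured by hand. The sketch below uses the open-source libvirt stack rather than either vendor's suite; the pool name and path are illustrative.

```python
# Minimal sketch: defining storage in software via the libvirt API.
# Pool name and target path are illustrative; requires libvirt-python.
import libvirt

pool_xml = """
<pool type='dir'>
  <name>vm-images</name>
  <target>
    <path>/var/lib/libvirt/images/vm-images</path>
  </target>
</pool>
"""

conn = libvirt.open("qemu:///system")
try:
    # Define the pool from XML, then build and start it so that
    # volumes can later be allocated from it programmatically.
    pool = conn.storagePoolDefineXML(pool_xml, 0)
    pool.build(0)
    pool.create(0)
    pool.setAutostart(1)
finally:
    conn.close()
```

The point is not this particular API but the principle: once storage (and, analogously, networking) is addressable in code, it can be automated and orchestrated like compute.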
Hybrid data centers with a mix of virtualized local resources (on-premises) and external workloads in the cloud (off-premises) are the future, and that future is beginning now. IT managers should stop wondering whether their virtualized environment is a real private cloud or not. Instead, they should focus on supplementing internal resources with infrastructure and applications from the public cloud. In concrete terms, this means buying infrastructure-management tools and evaluating those that enable easy provisioning and automatic configuration of complex services. Business customers want IT departments to deliver services, not infrastructure.
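A minimal sketch of what "easy provisioning" can look like on the on-premises side, again assuming libvirt; the domain XML, names, paths, and resource sizes are illustrative, and a real tool would add error handling, image management, and a cloud-side counterpart.

```python
# Minimal sketch: automated VM provisioning on the on-premises side.
# All names, paths, and sizes are illustrative; requires libvirt-python.
import libvirt

DOMAIN_TEMPLATE = """
<domain type='kvm'>
  <name>{name}</name>
  <memory unit='MiB'>{mem_mib}</memory>
  <vcpu>{vcpus}</vcpu>
  <os><type arch='x86_64'>hvm</type></os>
  <devices>
    <disk type='file' device='disk'>
      <source file='{disk_path}'/>
      <target dev='vda' bus='virtio'/>
    </disk>
  </devices>
</domain>
"""

def provision(conn, name, vcpus=2, mem_mib=2048,
              disk_path="/var/lib/libvirt/images/app.qcow2"):
    """Define and start a VM from a template in a single call."""
    xml = DOMAIN_TEMPLATE.format(name=name, vcpus=vcpus,
                                 mem_mib=mem_mib, disk_path=disk_path)
    dom = conn.defineXML(xml)   # persist the configuration
    dom.create()                # boot the guest
    return dom

conn = libvirt.open("qemu:///system")
try:
    provision(conn, "app-server-01")
finally:
    conn.close()
```

Wrapping provisioning in one parameterized call is exactly the kind of automation that lets IT deliver a service ("give me an application server") instead of infrastructure.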