Originally posted on the WHIR.
Perspective is always important when picking technologies to base the future of your company’s IT on. Technologies go through generations of improvements along multiple dimensions before they reach true ubiquity.
Take the disk drive as an example. First built by IBM in 1953, the disk drive has seen each new generation replace larger, more sensitive, and more cumbersome devices, with costs dropping precipitously along the way. Not many people remember, but the earliest drives were usable only in the protected environment of a data center. Disk diameter was 14 inches, and disks were typically mounted in standalone boxes that resembled washing machines! Individual drives often required high-current AC power due to the large motors needed to spin those large disks.
Later generations progressively reached factories, offices, and homes, eventually achieving the ubiquity we see today.
This is a far cry from the sleek, quiet, and high-performance solid-state drives (SSDs) we use today, which can be considered the third generation of the disk drive’s evolution. Third-generation technologies are faster, cheaper, and better.
That’s why your company should select third-generation technologies for its key IT infrastructure.
Similar to hard drives, the cloud is a multi-generational phenomenon, and its story spans three distinct generations. With the introduction of each generation, platforms, applications, and companies have been transformed by these emerging technologies and their radical advantages.
It’s important for companies using the cloud to make the correct “bet.” By one estimate, we are on track for a volume of nearly 7 zettabytes of data across the world by the year 2020. The rise of analytics and other modern applications is directly linked to cloud-based opportunities, and companies must make the right technological bets to take advantage of this amazing growth.
The Evolution of Virtualization and How We Approach Computing
The concept of virtualization in computing dates back fifty years. But as computing transformed and evolved, server hardware became more affordable than ever, and virtualization didn’t get a chance to take off: it was economically feasible to have individual servers take on individual workloads.
The tide turned as data began to grow, and each generation of technologies brought an insatiable thirst for power, space, and cooling. Costs began to swell, and it turned out that the de facto standard of one server per workload was extremely inefficient: total resource utilization in countless data centers across the world ranked in the single digits.
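To see why single-digit utilization was so costly, consider a rough consolidation estimate. The figures below are illustrative assumptions, not measurements from the article:

```python
import math

# Illustrative consolidation math for the one server-one workload era.
# All figures are hypothetical examples chosen to show the scale of waste.
servers = 100              # physical servers, one workload each
avg_utilization = 0.08     # ~8% average utilization (single digits)
target_utilization = 0.70  # a conservative ceiling for a virtualized host

# Total useful work, expressed in "fully busy server" units.
useful_work = servers * avg_utilization  # 8.0 server-equivalents

# Hosts needed if the same workloads run as virtual machines.
hosts_needed = math.ceil(useful_work / target_utilization)

print(hosts_needed)  # 12 hosts instead of 100
```

Under these assumptions, virtualization lets roughly a dozen hosts do the work of a hundred, which is exactly the economic pressure that brought it back.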
In response, virtualization re-emerged through innovations introduced by VMware in 1999. A number of open-source projects like Xen followed, and companies such as Microsoft (with Hyper-V), Citrix, and Virtual Iron started their own efforts. Today, virtualization has enabled Virtual Desktop Infrastructure (VDI) and container technologies, which indicates we should expect sustained improvements in virtualizing systems, applications, and workloads.
VPS Hosting: The First-Generation Cloud
Virtualization opened the door for the virtual private server (VPS), which hosting providers around the world soon implemented. VPS enabled hosting providers to offer customers a step up from the notoriously poor experience of shared hosting by partitioning a physical server into separate, isolated environments.
The technology helped create a new type of offering that combines the cost advantages of shared hosting with the virtual control of dedicated hosting. DigitalOcean, Linode, and Slicehost blazed a path on this front. The VPS space has since become an arena of open competition on bargain pricing, complete with concerns about reliability, quality, and support due to that “race for the cheapest.”
Nonetheless, VPS remains one of the most important innovations in the history of web hosting as it is essentially the first generation of cloud technology. Due to its nature, though, VPS never scaled to corporate IT and never gained ubiquitous acceptance.
The Second-Generation Cloud: A Virtualized Hypervisor
Few milestones are more significant than the arrival of web-enabled applications and the infrastructure that supports them. Salesforce pioneered the concept in 1999 by delivering enterprise applications through a website, and it would not take long for more applications to gravitate toward this more dynamic model.
Virtualization was the key component enabling the enterprise to launch apps on the web. As it has evolved, it has responded to a number of resource and management requirements at the infrastructure level with advancements that still benefit companies today.
Amazon was an early pioneer in the cloud industry, along with Rackspace, Hewlett-Packard, and a few others. Amazon’s cloud (Amazon Web Services) dominates the scene today (this is not a good thing). When Amazon entered the field, it was building its own web infrastructure to address a particular set of rapidly evolving needs and to close the gap between its application engineers and infrastructure engineers. This led to a proprietary set of reliable infrastructure tools that Amazon’s application engineers could use to deploy, monitor, and control systems.
Amazon saw an opportunity to turn web-scale infrastructure into a product that customers could easily use for web-scale applications. AWS is distinguished as a second-generation cloud product by a number of factors we take for granted today: the virtualized system, the console, the ordering process, and the cloud Application Programming Interface (API).
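To make the “cloud API” hallmark concrete, here is a minimal sketch of what a provisioning request to a second-generation cloud might look like. The function, action, and parameter names are hypothetical, not any specific provider’s API; real providers wrap such requests in signed HTTP calls and SDKs:

```python
import json

def build_provision_request(instance_type, image_id, count=1):
    """Build a (hypothetical) JSON request to provision virtual servers.

    The point is that with a second-generation cloud, infrastructure
    becomes an API call rather than a ticket to an operations team.
    """
    return {
        "action": "RunInstances",
        "params": {
            "instance_type": instance_type,
            "image_id": image_id,
            "count": count,
        },
    }

# Request two small Linux servers (illustrative identifiers).
req = build_provision_request("small", "img-linux-base", count=2)
print(json.dumps(req, indent=2))
```

An application engineer can script, version, and automate calls like this, which is precisely what separated this generation from ordering hardware.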
This cloud generation enabled significant advancement in application structure and application delivery. Hovering overhead, however, is a set of shortcomings that has dogged the public cloud industry for years. For example, companies facing compliance, security, and performance needs have long struggled with the lack of control and application structuring in this cloud generation. Within a public cloud, there is no native way to deliver the capacity, flexibility, and control that modern applications require, and the various workarounds and add-on offerings have been clumsy, costly, and inefficient.
The Third-Generation Cloud: Hybrid Hosting
In time, enterprise demands led to the emergence of the hybrid cloud: a platform that can provide a multi-tiered architecture of capability. This architecture has to include a dedicated layer for control, security, and performance. It must include a cloud layer that offers cost benefits and scalability. Flexible and ever-growing storage has to be there, too, because data is always growing. Finally, a network must be dedicated and exclusive to unify this architecture.
The case for hybrid is simple. With a unifying interface that is fully programmable, modern applications and computing needs are met by delivering the features of cloud computing and dedicated servers. Hybrid cloud users get the horsepower and control they need, where they need it, while also being able to utilize the flexibility and scaling power that the cloud provides. Hybrid delivers whatever computing resource is required, wherever it is needed, and allows your computing to match your application structure and workloads. It means more efficiency, which leads to better capacity management and financial savings.
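The “right resource, right place” idea can be sketched as a simple placement rule. The workload attributes and tier names below are hypothetical, chosen only to illustrate how a hybrid platform might route workloads across its layers:

```python
def place_workload(needs_compliance, latency_sensitive, bursty):
    """Pick a hosting tier for a workload in a hybrid platform (sketch).

    Dedicated hardware serves control, security, and predictable
    performance; the cloud tier serves elastic, bursty demand --
    mirroring the layered architecture described above.
    """
    if needs_compliance or latency_sensitive:
        return "dedicated"  # control, security, raw horsepower
    if bursty:
        return "cloud"      # scale out and in with demand
    return "cloud"          # default to the cheaper elastic tier

# A regulated database stays on dedicated gear, while a seasonal
# web front end bursts into the cloud.
print(place_workload(needs_compliance=True, latency_sensitive=False, bursty=False))   # dedicated
print(place_workload(needs_compliance=False, latency_sensitive=False, bursty=True))   # cloud
```

In practice the decision involves far more inputs (cost, data gravity, licensing), but the principle is the same: the platform, not the application, absorbs the placement complexity.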
However, not all hybrid models are the same, and successful models require tremendous feats of architecture and automation. Codero has achieved these feats for customers with the industry’s first on-demand hybrid hosting platform. By introducing a scalable and customizable infrastructure, Codero gave companies of all sizes (from startup through enterprise) a computing platform right-fitted to their needs and their applications.
In many cases (like AWS), second generation cloud platforms are pseudo-hybrid in nature at best. Codero offers a true hybrid platform that delivers every element of the infrastructure to customers on demand. The management interface is flexible, tunable, and programmable, making it the ultimate solution for today’s cloud needs.
This generation of cloud is here to stay because it provides near-endless configuration and customization advantages for all customers and applications. This is why hybrid is the most discussed architecture in the business and is on its way to becoming the infrastructure of choice for all companies.
Have a Strategy
What does the future hold? Based on indicators of the needs that lie ahead and the abilities to meet them, the next generation of cloud computing will have to deliver even better value, faster. Automation, manageability, and reporting will be key factors in this next generation cloud.
Codero sees the next generation cloud as an intelligent platform with the ability to integrate with existing technology processes and management tools. It will support various business tenants and, in essence, be a flexible service platform. This platform will be unified and behave as a single, simple-to-configure entity. It will be defined by software to accommodate the needs of software and applications, and it will address needs for identity, security, and the Internet of Things. Finally, it will be free of the anchor of a single infrastructure.
The challenge can be simply stated: many organizations lack the capacity and resources to deploy and maintain complex systems. At the same time, the need for ever more complex infrastructures and applications keeps growing, as organizations contend with regulations, budgetary concerns, customer service, and the overall customer experience.
The answer to these questions will be based on simplicity and value. It’s a delicate dichotomy balanced by the tremendous value that only hybrid delivers. The hybrid cloud, the third-generation cloud, is the only smart bet for today’s IT infrastructure.