Sep 9, 2014

As Big Data Grows, Hybrid Hosting Is the Answer

The age of big data recently gave us an accidental insight in the wake of a significant earthquake in California. Data from consumer fitness trackers revealed how many people were awakened by the 3am Bay Area quake. That was a small peek into the future of the internet of things, and a much bigger peek into how big data can open the door to unexplored insights.

Now, the finding that an earthquake jolted people from their sleep isn’t exactly surprising, but consider that scientists have been applying computing power to seismic research for a very long time, and the big data use case is accelerating those efforts. In fact, one of the biggest benefits of this kind of analysis is risk evaluation: everything from fault data to soil types to population figures is integrated to inform emergency resourcing, building codes and crisis planning. The point is that many data sources have yet to be integrated into the picture and analyzed together, which is a classic big data case.
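
To make that integration concrete, here is a minimal Python sketch, with entirely hypothetical datasets and column names, of joining fault, soil and population data on a shared region key and deriving a toy risk index:

    import pandas as pd

    # Hypothetical toy datasets; real inputs would come from sensor
    # networks, geological surveys and census files.
    faults = pd.DataFrame({"region_id": [1, 2, 3],
                           "distance_to_fault_km": [0.5, 8.0, 25.0]})
    soils = pd.DataFrame({"region_id": [1, 2, 3],
                          "liquefaction_score": [0.9, 0.4, 0.2]})
    population = pd.DataFrame({"region_id": [1, 2, 3],
                               "residents": [120_000, 45_000, 300_000]})

    # Integrate the disparate sources on a shared region key.
    risk = faults.merge(soils, on="region_id").merge(population, on="region_id")

    # Toy risk index: softer soil and nearer faults weigh more,
    # scaled by the population exposed.
    risk["risk_index"] = (risk["liquefaction_score"]
                          / (1 + risk["distance_to_fault_km"])) * risk["residents"]

    print(risk.sort_values("risk_index", ascending=False))

Real seismic risk models are far more involved, but the shape of the work is the same: merge disparate sources, then derive something decision-makers can act on.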

How big data works with hybrid hosting

By now, you may know that big data is generally described by three characteristics: volume, variety and velocity. If you look at the seismic research example, you can segment the basic logical computing structures by their computing demands. As you’ll see, the big data computing model maps neatly onto the hybrid cloud computing model (a minimal sketch of this split follows the list below).

  1. Bare metal: The most obvious segment is the data backbone, where all the fun happens: crunching numbers, analyzing data and delivering the biggest value. This is a high-CPU, high-memory, high-capacity tier, a perfect fit for the bare metal server layer of a hybrid environment.
  2. Cloud: Collection points exist everywhere in the big data model. In the case of seismic research, that could mean sensors, geographical data, soil types and so on. These workhorses gather staggering amounts of small readings; they still require performance, but they aren’t processing the really big stuff. The need to collect more, and to collect newer data, means these elements have to be flexible, portable and quickly deployed. This is the cloud layer of the hybrid model: powerful, cloud-based and available as you need it.
  3. Data: It’s called big data for a reason: lots and lots of data. Live data, data from a week ago, data from a year ago, maybe even data kept forever. Every organization has different needs for data strategy and retention, sometimes dictated by the value it seeks in what it analyzes, sometimes by regulatory demands. At the end of the day, the customer decides what to keep, and that requires flexibility. Hybrid cloud is the simplest model for keeping your data portable, scalable and accessible throughout your environment.
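
Here is the promised sketch of that split, in Python with hypothetical names throughout. An in-process queue stands in for the transport between tiers; in practice that would be a network service such as a message broker:

    import queue
    import statistics

    readings = queue.Queue()          # stand-in for the transport between tiers

    def cloud_collector(sensor_id, samples):
        """Cloud tier: lightweight, disposable collectors that only
        gather and forward raw sensor readings."""
        for value in samples:
            readings.put({"sensor": sensor_id, "value": value})

    def bare_metal_cruncher():
        """Bare metal tier: the heavy number-crunching happens here."""
        values = []
        while not readings.empty():
            values.append(readings.get()["value"])
        return {"count": len(values), "mean": statistics.mean(values)}

    archive = []                      # data tier: retention is a policy decision

    cloud_collector("station-7", [0.02, 0.03, 1.40, 0.05])   # 1.40: a jolt
    summary = bare_metal_cruncher()
    archive.append(summary)           # keep a week, a year, or forever
    print(summary)

The design point is the separation itself: collectors stay small and cheap to replicate, the cruncher stays close to serious hardware, and the archive grows at whatever pace retention policy dictates.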

Benefits of bringing big data and hybrid together

Structuring your big data architecture along hybrid cloud lines yields strategic benefits as well. One of the biggest concerns of the data age is security. Hybrid infrastructure allows information to be isolated and data to stay under your control while still integrating cloud features.

Another benefit is that big data, by its very nature, grows, and hybrid scales to those needs in platform, storage and infrastructure. Hybrid computing is also elastic. Cloud bursting in a hybrid environment means spinning up new workloads on demand, so you can quickly add capacity to the big data equation or throw more powerful systems into the mix where needed. That lets an organization avoid downtime or offload resource demands as required. Hybrid infrastructure delivers an adaptable platform for big data, with full environmental management, scaling to demand and tunable performance.
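
As a rough illustration of the bursting decision, here is a toy Python scaling rule, with made-up baseline and throughput numbers, that works out how many cloud workers to add on top of an always-on bare metal baseline:

    BASELINE_WORKERS = 4          # always-on bare metal capacity (assumed)
    JOBS_PER_WORKER = 100         # rough per-worker throughput (assumed)

    def workers_needed(queued_jobs):
        """Total workers for the current backlog, never below the baseline."""
        burst = max(0, queued_jobs - BASELINE_WORKERS * JOBS_PER_WORKER)
        extra = -(-burst // JOBS_PER_WORKER)   # ceiling division
        return BASELINE_WORKERS + extra

    # A spike in incoming data bursts into cloud instances; quiet
    # periods fall back to bare metal alone.
    for backlog in (150, 900, 2500, 80):
        print(backlog, "queued jobs ->", workers_needed(backlog), "workers")

A real deployment would feed a rule like this into a provider’s autoscaling service rather than a print loop, but the economics are the same: pay for the baseline, burst only when the backlog demands it.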

As we look back at the ‘accidental’ revelation from the fitness tracker data, what is most clear is that the big data picture is unpredictable. We don’t have a crystal ball, and the data will grow in ways we can’t quite anticipate. Hybrid cloud is well suited to this because it adapts readily to growth and change. Data analysis itself evolves over time, and static environments are poorly suited to that dynamic. The flexible cloud components, powerful bare metal systems, and fluid unified storage and capacity of hybrid cloud make it the best platform for your big data needs.

