Cloud Computing Expo
Big Data – A Sea Change of Capabilities in IT
An exclusive Q&A with Matt McLarty, Vice President, Client Solutions at Layer 7 Technologies
By: Jeremy Geelan
Jul. 11, 2012 04:00 AM
"Big data represents a sea change of capabilities in IT," notes Matt McLarty, Vice President, Client Solutions at Layer 7 Technologies, in this exclusive Q&A with Cloud Expo Conference Chair Jeremy Geelan. McLarty continued: "In conjunction with mobile and cloud, I think Big Data will provide a technological makeover to the typical enterprise infrastructure, drawing a hard API border in front of core business services while blurring the line between logic and data services."
Cloud Computing Journal: Agree or disagree? - "While the IT savings aspect is compelling, the strongest benefit of cloud computing is how it enhances business agility."
Matt McLarty: Agree. We have a number of customers who use Layer 7 Gateways to protect their cloud deployments and leverage the cloud's elastic scaling model to handle seasonal or sporadic bursts of traffic dynamically. Historically, these companies would have had to forecast that demand and risk over-buying infrastructure. So there is a big cost saving, but dynamic scaling is a new capability that only comes with the cloud model.
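The elastic scaling McLarty describes can be illustrated with a minimal sketch. This is not Layer 7's implementation; the function name, per-instance capacity, and fleet limits are all assumptions chosen for the example.

```python
import math

# Illustrative only: a simple threshold-based autoscaling decision.
# capacity_per_instance and the fleet bounds are assumed values.
def desired_instances(requests_per_sec, capacity_per_instance=100,
                      min_instances=2, max_instances=20):
    """Return how many instances the observed load calls for."""
    needed = math.ceil(requests_per_sec / capacity_per_instance)
    # Clamp to the fleet limits so we neither under- nor over-provision.
    return max(min_instances, min(max_instances, needed))

# A seasonal burst: traffic jumps from 150 req/s to 1,250 req/s.
print(desired_instances(150))   # steady state -> 2
print(desired_instances(1250))  # burst -> 13
```

The point of the sketch is the contrast McLarty draws: the decision runs continuously against live traffic, rather than being a one-time capacity forecast made at purchase time.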
Cloud Computing Journal: Which of the recent big acquisitions within the Cloud and/or Big Data space have most grabbed your attention as a sign of things to come?
McLarty: What's grabbed my attention most is the fact that the Big Data - and specifically Hadoop - world is so raw that acquisition targets don't even exist. Instead, we've seen an unprecedented talent acquisition spree for anyone with Hadoop experience and data science skills. Big data represents a sea change of capabilities in IT and will have an impact on people, process and tools. In conjunction with mobile and cloud, I think Big Data will provide a technological makeover to the typical enterprise infrastructure, drawing a hard API border in front of core business services while blurring the line between logic and data services.
Cloud Computing Journal: In its recent "Sizing the Cloud" report, Forrester Research said it expects the global cloud computing market to reach $241BN in 2020, compared to $40.7BN in 2010 - is that kind of rapid growth trajectory being reflected in your own company, or in your view is the Forrester number a tad over-optimistic?
McLarty: Of course, this comes down to what people define as "cloud computing." Are traditional ASPs already being branded as cloud providers? Regardless, there are enough dimensions of growth for cloud - migration of COTS offerings to SaaS, globalization, support for mobile channels and big data - to justify an order-of-magnitude increase over a decade. It is certainly reflected in the growth of Layer 7's business, and I'm sure there are more daring projections out there in the blogosphere.
Cloud Computing Journal: Which do you think is the most important cloud computing standard still to tackle?
McLarty: I think a standard/syntax for auto-provisioning cloud services would be quite useful. As I said earlier, much of the unique value of cloud comes from the ability to provision the infrastructure dynamically. Having the ability to migrate or balance workloads across a hybrid or federated cloud would be powerful for companies, but it would undoubtedly be met by resistance from the cloud providers and from the niche companies that have built a business around such a service.
Cloud Computing Journal: Big Data has existed since the early days of computing; why, then, do you think there is such an industry buzz around it right now?
McLarty: Like many technological innovations, Big Data needed a lot of things to come together to make it appetizing to the mainstream. I remember seeing Sony HDTVs around 1990, but it wasn't until around 2005 that there was a critical mass of content, network capability and parts commoditization to make it palatable for the masses. The same thing is happening with Big Data: we now have the network bandwidth, distributed computing power and caching technology to make unstructured, fragmented data retrieval practical. And most of all we have the burning platform; we have simply outgrown our relational indexing capabilities.
Cloud Computing Journal: Do you think Big Data will only ever be used for analytical purposes, or do you envisage that it will actually enable new products?
McLarty: I believe that Big Data has the potential to augment all existing IT interactions. I would answer a slightly different question: if analytics are now available in a real-time context, how can they be used to augment other business and IT services? In the world of real-time integration - the world Layer 7 thrives in - we have seen an industry build out around Event-Driven Architecture, and consequently seen that solution area integrate with SOA. Big Data can drastically change that game, and I envision a post-Big Data enterprise integration landscape where real-time business services are analytics-enriched, exposed through secure APIs, and accessible to mobile devices, web apps, and B2B consumers.
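The "analytics-enriched business service" idea above can be sketched in a few lines. This is not a Layer 7 product API; the in-memory store, field names, and scores below are hypothetical stand-ins for whatever a real analytics pipeline (e.g. a Hadoop or streaming job) would produce.

```python
# Hypothetical analytics output, keyed by customer id - in practice this
# would be fed by a batch or streaming analytics pipeline, not hard-coded.
ANALYTICS = {"cust-42": {"churn_risk": 0.18, "segment": "enterprise"}}

def get_account(customer_id, core_record):
    """Return the core business record augmented with analytics, if any."""
    enriched = dict(core_record)  # never mutate the system-of-record data
    enriched.update(ANALYTICS.get(customer_id, {}))
    return enriched

# The API border would expose the enriched view, not the raw core record.
record = get_account("cust-42", {"name": "Acme Corp", "balance": 1200})
```

The design choice the sketch illustrates is McLarty's blurred line between logic and data services: the consumer of the API sees one record, without knowing which fields came from the transactional system and which from the analytics layer.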