The Three ‘ilities’ of Big Data
Part 1: Portability, usability & quality converge to define how well the processing power of Big Data platforms can be harnessed

When talking about Big Data, most people talk about numbers: speed of processing and how many terabytes and petabytes the platform can handle. But deriving deep insights with the potential to change business growth trajectories relies not just on quantities, processing power and speed, but also three key ilities: portability, usability and quality of the data.

Portability, usability, and quality converge to define how well the processing power of the Big Data platform can be harnessed to deliver consistent, high quality, dependable and predictable enterprise-grade insights.

Portability: Ability to transport data and insights in and out of the system

Usability: Ability to use the system to hypothesize, collaborate, analyze, and ultimately to derive insights from data

Quality: Ability to produce highly reliable and trustworthy insights from the system

Portability
Portability is measured by how easily data sources (or providers) as well as data and analytics consumers (the primary "actors" in a Big Data system) can send data to, and consume data from, the system.

Data Sources can be internal systems or data sets, external data, data providers, or the apps and APIs that generate your data. A measure of high portability is how easily data providers and producers can send data to your Big Data system as well as how effortlessly they can connect to the enterprise data system to deliver context.

Analytics consumers are the business users and developers who examine the data to uncover patterns. Consumers expect to be able to inspect their raw, intermediate, or output data not only to define and design analyses but also to visualize and interpret results. A measure of high portability for data consumers is easy access - both manually and programmatically - to raw, intermediate, and processed data. Highly portable systems enable consumers to readily trigger analytical jobs and receive notification when data or insights are available for consumption.
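The portability behaviors described above - providers pushing data in, consumers reading any stage, triggering jobs, and receiving notifications - can be sketched as a minimal client interface. All names here (DataClient, trigger_job, on_complete) are illustrative assumptions, not a real platform API:

```python
# Hypothetical sketch of a portable Big Data client interface.
class DataClient:
    def __init__(self):
        self._stages = {"raw": [], "intermediate": [], "output": []}
        self._callbacks = []

    def ingest(self, records):
        """Data providers push records into the system."""
        self._stages["raw"].extend(records)

    def read(self, stage):
        """Consumers inspect raw, intermediate, or output data."""
        return list(self._stages[stage])

    def on_complete(self, callback):
        """Register a notification for when insights are ready."""
        self._callbacks.append(callback)

    def trigger_job(self, transform):
        """Consumers trigger an analytical job; listeners are notified."""
        self._stages["output"] = [transform(r) for r in self._stages["raw"]]
        for cb in self._callbacks:
            cb(self._stages["output"])


client = DataClient()
client.ingest([1, 2, 3])
client.on_complete(lambda out: print("insights ready:", out))
client.trigger_job(lambda x: x * 10)
```

The point of the sketch is the shape of the contract, not the implementation: every stage of data is readable, and job completion is pushed to consumers rather than polled for.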

Usability
The usability of a Big Data system is the largest contributor to the perceived and actual value of that system. That's why enterprises need to consider whether their Big Data analytics investment delivers functionality that is not only capable of generating useful insights but also easy to use.

Business users need an easy way to:

  • Request analytics insights
  • Explore data and generate hypotheses
  • Self-serve and generate insights
  • Collaborate with data scientists, developers, and business users
  • Track and integrate insights into business critical systems, data apps, and strategic planning processes

Developers and data scientists need an easy way to:

  • Define analytical jobs
  • Collect, prepare, preprocess, and cleanse data for analysis
  • Add context to their data sets
  • Understand how, when, and where the data was created, how to interpret it, and who created it
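The last two items above amount to attaching provenance ("context") to a data set. A minimal sketch of what that metadata might look like follows; the field names are assumptions for illustration, not a standard schema:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Illustrative provenance record attached to a data set.
@dataclass
class DataSet:
    records: list
    created_by: str                  # who created the data
    created_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())
    source: str = "unknown"          # how/where the data was created
    interpretation: str = ""         # how consumers should read the fields

ds = DataSet(records=[{"clicks": 4}],
             created_by="etl-pipeline-7",
             source="web click stream",
             interpretation="clicks = distinct sessions per hour")
print(ds.created_by, ds.source)
```

Carrying this context alongside the records is what lets a data scientist downstream interpret a column correctly without tracking down its producer.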

Quality
The quality of a Big Data system is dependent on the quality of input data streams, data processing jobs, and output delivery systems.

Input Quality: As the number, diversity, frequency, and format of data channel sources explode, it is critical that enterprise-grade Big Data platforms track the quality and consistency of data sources. This also informs downstream alerts to consumers about changes in quality, volume, velocity, or the configuration of their data stream systems.
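One simple form of the input-quality tracking described above is a volume check against a historical baseline. This is a deliberately minimal sketch (a fixed fractional threshold on the mean); real platforms use richer statistics, but the alerting shape is the same:

```python
# Minimal input-quality check: alert when the current record count
# deviates from the historical mean by more than `tolerance`.
def volume_alert(history, current, tolerance=0.5):
    mean = sum(history) / len(history)
    deviation = abs(current - mean) / mean
    return deviation > tolerance

print(volume_alert([1000, 980, 1020], 990))   # normal volume -> False
print(volume_alert([1000, 980, 1020], 300))   # sudden drop   -> True
```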

Analytical Job Quality: A Big Data system should track and notify users about the quality of the jobs (such as MapReduce or event processing jobs) that process incoming data sets to produce intermediate or output data sets.
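Job-quality tracking can be sketched as a runner that records each job's outcome and surfaces the overall failure rate to downstream consumers. The names and structure here are assumptions for illustration:

```python
# Run each processing job, record its outcome, report the failure rate.
def run_and_track(jobs, data):
    results, statuses = [], {}
    for name, fn in jobs.items():
        try:
            results.append(fn(data))
            statuses[name] = "ok"
        except Exception as exc:
            statuses[name] = f"failed: {exc}"
    failed = sum(1 for s in statuses.values() if s != "ok")
    return results, statuses, failed / len(jobs)

jobs = {"count": len, "broken": lambda d: d / 0}  # second job always fails
_, statuses, failure_rate = run_and_track(jobs, [1, 2, 3])
print(statuses, failure_rate)
```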

Output Quality: Quality checks on the outputs from Big Data systems ensure that transactional systems, users, and apps offer dependable, high-quality insights to their end users. The output from Big Data systems needs to be analyzed for delivery predictability, statistical significance, and access according to the constraints of the transactional system.
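As one concrete example of an output-quality gate, an insight could be held back until it clears a minimum sample size and a statistical-significance test. The specific test and thresholds below (a two-sided z-test for a proportion at the 5% level, minimum of 30 samples) are assumptions for illustration:

```python
import math

# Gate an insight on sample size and a two-sided z-test against a baseline.
def significant(successes, n, baseline=0.5, min_n=30, z_crit=1.96):
    if n < min_n:
        return False                  # not enough data to trust the insight
    p = successes / n
    se = math.sqrt(baseline * (1 - baseline) / n)
    return abs(p - baseline) / se > z_crit

print(significant(80, 100))   # large, clear effect: deliver the insight
print(significant(6, 10))     # too few samples: hold it back
```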

Though we've explored how portability, usability, and quality separately influence the consistency, quality, dependability, and predictability of your data systems, remember it's the combination of the ilities that determines if your Big Data system will deliver actionable enterprise-grade insights.

This piece is the first in a three-part series on how businesses can squeeze maximum business value out of their Big Data analysis.

About Kumar Srivastava
Kumar Srivastava is the product management lead for Apigee Insights and Apigee Analytics products at Apigee. Before Apigee, he was at Microsoft where he worked on several different products such as Bing, Online Safety, Hotmail Anti-Spam and PC Safety and Security services. Prior to Microsoft, he was at Columbia University working as a graduate researcher in areas such as VOIP Spam, Social Networks and Trust, Authentication & Identity Management systems.
