
Friday, 24 July 2015

Big, Fast, and Powerful: Are You Ready for Next-Gen Data?

According to a new report from Capgemini, companies willing to spend on Big Data initiatives believe they can disrupt the competitive landscape. Despite added investment in the volume and variety of data, the velocity of information remains problematic; businesses often struggle to make critical decisions quickly enough to satisfy consumer and stakeholder demands. The bottom line? Companies need a way to harness “Fast Data,” integrate it with existing data, and arrive at actionable results in real time.

Long Gone

Current data analysis techniques focus on archeology: dig deep and uncover what’s already happened to inform future decision-making. While patient work and the right technology can unearth some truly remarkable insights, this method puts organizations a step behind: customers have already left the store, purchases have already been made, and new data has already been generated. Fast Data is the new kid on the block, the one with the potential to tap sources in real time. But without the backing of archeological information, any insights generated are ultimately shallow. Can companies get the best of both worlds?

Declared Intentions

The underlying idea is to act with intention rather than firing blind or getting by with “just enough” information. The ideal? Event-driven data in motion is correlated over time and then applied to preexisting operational knowledge gleaned from Big Data analysis. This allows companies to take action while events remain relevant rather than after the fact.
Achieving this aim requires Inference Rules that are both declaratively programmed and forward-chained. Simply put? Companies should be able to create “if-then” rules that leverage historical data and detect unusual patterns to produce multiple responses. Taking this a step further, the rules themselves can evolve based on state changes informed by real-time data processing. The result is decision-making informed by both past and present data to improve future outcomes.
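
To make that concrete, here is a minimal Python sketch of declaratively programmed, forward-chained “if-then” rules. The fact fields, rule names, and thresholds are assumptions invented for illustration; a production system would use a dedicated rules engine.

```python
# Minimal sketch of declarative, forward-chained "if-then" rules.
# All fact fields, rule names, and thresholds are illustrative
# assumptions, not taken from any specific product.

RULES = [
    # If a live reading exceeds the historical baseline, assert an anomaly.
    ("flag_anomaly",
     lambda f: f.get("reading", 0) > f.get("baseline_max", float("inf")),
     lambda f: f.update(anomaly=True)),
    # If an anomaly was asserted and the trend is rising, escalate.
    # This rule conditions on the *conclusion* of the rule above,
    # which is what lets rules evolve as state changes.
    ("escalate",
     lambda f: f.get("anomaly") and f.get("trend") == "rising",
     lambda f: f.update(action="alert_operator")),
]

def forward_chain(facts: dict, rules) -> dict:
    """Fire rules repeatedly until working memory stops changing."""
    changed = True
    while changed:
        changed = False
        for _name, condition, action in rules:
            before = dict(facts)
            if condition(facts):
                action(facts)
                changed = changed or facts != before
    return facts

# Historical analysis supplies the baseline; real-time processing
# supplies the current reading and trend.
facts = {"reading": 112.0, "baseline_max": 100.0, "trend": "rising"}
print(forward_chain(facts, RULES))
# -> {..., 'anomaly': True, 'action': 'alert_operator'}
```

Note how the second rule fires off a fact produced by the first: conclusions feed back into working memory and can trigger further rules, so decisions update as real-time state changes.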

Applicable Use

Consider electronic submersible pumps designed to extract oil at significant depths, often in remote parts of the world. These pumps are typically unmanned but contain a host of sensors to monitor temperature, motor vibration, pressure, and current. Historical data alone allows oil companies to create generally applicable maintenance schedules, but it can’t predict unexpected events, such as pump failure caused by sudden environmental impacts like storms or earthquakes.
Real-time data, such as slightly increased pressure and current, suggests the possibility of more immediate failure. When the two are combined through event processing with Inference Rules, operations centers can correlate existing conditions with historical results to determine whether failure is imminent or the pump will stabilize without assistance.
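
As a hedged illustration of that correlation step, the Python sketch below compares a sliding window of live pressure and current readings against historical per-pump baselines. The field names and the two-standard-deviation thresholds are assumptions chosen for the example, not figures from an actual deployment.

```python
# Hypothetical correlation of live pump telemetry with historical
# baselines; all field names and thresholds are illustrative.

from statistics import mean

def assess_pump(baseline: dict, window: list[dict]) -> str:
    """Classify a pump from recent readings versus historical norms."""
    pressure_drift = mean(r["pressure"] for r in window) - baseline["pressure_mean"]
    current_drift = mean(r["current"] for r in window) - baseline["current_mean"]

    # Assumed historical finding: failures were preceded by pressure
    # and current drifting upward *together*; either alone tended to
    # stabilize on its own.
    if (pressure_drift > 2 * baseline["pressure_std"]
            and current_drift > 2 * baseline["current_std"]):
        return "imminent failure: dispatch maintenance"
    if pressure_drift > 2 * baseline["pressure_std"]:
        return "watch: likely to stabilize, schedule inspection"
    return "stable"

baseline = {"pressure_mean": 250.0, "pressure_std": 5.0,
            "current_mean": 38.0, "current_std": 1.5}
window = [{"pressure": 263.0, "current": 42.0},
          {"pressure": 265.0, "current": 42.5}]
print(assess_pump(baseline, window))
# -> imminent failure: dispatch maintenance
```

Analytics on data at rest supplies the baselines; event processing on data in motion maintains the window, so the call can be made while the pump is still recoverable.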
The next iteration of Big Data revolves around speed and power—how can existing warehouses be combined with streams of events and used to empower decision-makers? Inference Rules provide a new approach for gaining an edge on data velocity by operationalizing the knowledge gained from analytics on data at rest, correlating data in motion, and taking action before an opportunity passes.
