Existing analytics tools and techniques will be helpful in making sense of big data, but the algorithms they rely on must be able to work with large amounts of potentially real-time, disparate data, and the underlying infrastructure must be capable of supporting that workload. Vendors providing analytics tools will also need to ensure that their algorithms work across distributed implementations. Because of these complexities, a new class of tools is expected to emerge to help make sense of big data.
This layer of the reference architecture includes three classes of tools, which decision makers can use independently or collectively to help steer the business:
- Reporting and dashboards: These tools provide a "user-friendly" representation of the information from various sources. Although a mainstay of the traditional data world, this area is still evolving for big data. Some of the tools being used are traditional ones that can now access the new kinds of databases collectively called NoSQL (Not Only SQL); a brief sketch of this kind of access appears after this list.
- Visualization: These tools are the next step in the evolution of reporting. The output tends to be highly interactive and dynamic. Another important distinction between reports and visualized output is animation: business users can watch the data change using a variety of visualization techniques, including mind maps, heat maps, infographics, and connection diagrams (a heat-map sketch also follows this list).
Often, reporting and visualization occur at the end of the business activity; although the data may still be imported into another tool for further computation or examination, they are typically the final step.
- Analytics and advanced analytics: These tools reach into the data warehouse and process the data for human consumption. Advanced analytics should reveal trends or events that are transformative, unique, or unprecedented for existing business practice. Predictive analytics and sentiment analytics are good examples of this science; a short predictive-analytics sketch closes this section.
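
To make the reporting item more concrete, here is a minimal sketch of how a dashboard rollup might be computed directly inside a NoSQL store. It assumes MongoDB accessed through the pymongo driver; the connection string, database, collection, and field names ("region", "amount") are illustrative assumptions, not part of the reference architecture.

```python
# Minimal sketch: summarizing documents in a NoSQL (MongoDB) store for a
# simple report. Database, collection, and field names are assumptions.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
orders = client["sales"]["orders"]

# Total order value per region -- the kind of rollup a dashboard widget shows.
pipeline = [
    {"$group": {"_id": "$region", "total": {"$sum": "$amount"}}},
    {"$sort": {"total": -1}},
]
for row in orders.aggregate(pipeline):
    print(f"{row['_id']}: {row['total']:.2f}")
```

The point is not the particular store: the aggregation runs where the documents live, and the reporting layer consumes only the summarized result.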
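
For the visualization item, the sketch below renders a heat map of hypothetical hourly web traffic by day of week using matplotlib. The data is randomly generated purely to illustrate the technique; a real tool would pull it from the sources described above.

```python
# Minimal sketch: a heat map of hourly visit counts by day of week.
# The traffic data is synthetic and exists only to drive the plot.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
traffic = rng.poisson(lam=100, size=(7, 24))  # 7 days x 24 hours

fig, ax = plt.subplots(figsize=(10, 3))
im = ax.imshow(traffic, aspect="auto", cmap="viridis")
ax.set_yticks(range(7))
ax.set_yticklabels(["Mon", "Tue", "Wed", "Thu", "Fri", "Sat", "Sun"])
ax.set_xlabel("Hour of day")
fig.colorbar(im, ax=ax, label="Visits")
plt.show()
```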
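
Finally, a minimal predictive-analytics sketch: a classifier is trained on historical records and then used to score a new one. It uses scikit-learn's logistic regression on synthetic data; the churn scenario and its features are assumptions made for illustration, not the method of any particular product.

```python
# Minimal sketch of predictive analytics: learn from history, score new cases.
# The "churn" scenario and its two features are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
# Synthetic history: [monthly_spend, support_tickets] -> churned (0 or 1)
X = rng.normal(size=(500, 2))
y = (X[:, 1] - X[:, 0] + rng.normal(scale=0.5, size=500) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)
print("holdout accuracy:", model.score(X_test, y_test))

# Probability that a new, unseen customer churns
print("churn probability:", model.predict_proba([[0.2, 1.5]])[0, 1])
```

Sentiment analytics follows the same pattern, with text features standing in for the numeric ones used here.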