
Nine of the best BI and analytics software tools for enterprises

Scott Carey | June 27, 2016
As companies look to unlock their big data, vendors such as Qlik and Tableau are offering powerful self-serve analytics platforms

Qlik's self-service tool, Sense, is essentially an intuitive visualisation platform. Input is drag and drop, so there is no need for scripting or SQL queries, and users can quickly track down key metrics using its search capabilities. Sense is enterprise-ready, with central governance and security, and data sources are fully customisable through APIs.

Qlik Sense pricing is done on a token model. Customers buy and allocate tokens to users or teams, so data volumes, reports and queries don't factor into the cost. Token pricing is available by contacting the Qlik sales team directly.

3. Splunk


Arguably the market leader in this space, Splunk offers a self-serve solution for Hadoop called Hunk. The tool generates charts, visuals and dashboards from your Hadoop data lake. Hunk can be deployed with Apache Hadoop, as well as Cloudera CDH, Hortonworks Data Platform, IBM InfoSphere BigInsights, MapR M-series, Pivotal HD and Amazon EMR or S3.

Hunk is enterprise grade, with secure access controls, audit capabilities, integration with existing authentication systems and advanced data anonymisation. You can embed Splunk reports within applications or use ODBC integrations to access Splunk data from the environment employees are already comfortable with, be it Microsoft Excel or Tableau.
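As a rough illustration of that ODBC route, here is a minimal Python sketch using pyodbc to pull Splunk search results into a script. The DSN name, credentials and saved-search name are assumptions made for the example, not details from Splunk's documentation or this article.

```python
# Hedged sketch: reading Splunk data over an ODBC DSN with pyodbc.
# The DSN, credentials and saved-search name below are illustrative assumptions.
import pyodbc

# Connect through a DSN configured for a Splunk ODBC driver (name assumed).
conn = pyodbc.connect("DSN=Splunk ODBC;UID=analyst;PWD=example-password")
cursor = conn.cursor()

# Assume the driver exposes a saved search called 'web_errors_last_24h' as a table.
cursor.execute('SELECT host, status, bytes FROM "web_errors_last_24h"')

for host, status, bytes_sent in cursor.fetchall():
    print(host, status, bytes_sent)

conn.close()
```

The same connection string could equally sit behind an Excel or Tableau data source; the point is that the data stays in Splunk while the analysis happens in whichever front end the user prefers.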

Splunk also offers an enterprise-grade guided analytics platform, a cloud analytics solution and an automated SaaS solution called Splunk Light.

Hunk is priced according to the number of TaskTracker Nodes (Compute Nodes in YARN) within your Hadoop clusters. Pricing for a one-year term licence starts at $3,450 (£2,400) per Hadoop TaskTracker Node or Compute Node, with a minimum of ten nodes.
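To put those figures in context, the short sketch below works out the indicative one-year cost for a cluster. The per-node price and ten-node minimum come from the article; the 25-node cluster size is an assumption chosen purely for illustration.

```python
# Worked example of Hunk's per-node pricing described above.
PRICE_PER_NODE_USD = 3450   # one-year term licence per TaskTracker/Compute Node
MINIMUM_NODES = 10          # minimum purchase stated in the article

def hunk_annual_cost(nodes: int) -> int:
    """Indicative one-year licence cost in USD for a Hadoop cluster."""
    billable_nodes = max(nodes, MINIMUM_NODES)  # the ten-node floor always applies
    return billable_nodes * PRICE_PER_NODE_USD

print(hunk_annual_cost(25))  # 86250 for a hypothetical 25-node cluster
print(hunk_annual_cost(4))   # 34500, the floor set by the ten-node minimum
```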

4. Trillium


If you are self-serving analytics from an ever-increasing range of data sources, data quality becomes ever more important.

Where IT would sometimes have to spend months consolidating and cleaning up data before presenting it to business users, companies like Trillium promise to reduce this process to under 30 days.

Trillium has two products for data quality control: Refine and Prepare. Both aid data preparation by connecting disparate data sources and pooling them into a single repository, in this case a Hadoop environment, under a coherent index. Refine goes a step further, polishing this data into a clean data set by merging or removing conflicting or incomplete records. The cleansed data can then be pushed on to visualisation platforms such as Tableau or Qlik View, both of which Trillium partners with, to derive insight.
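To make the record-consolidation step concrete, here is a minimal pandas sketch of the kind of cleansing described: pooling two sources, dropping unusable rows and merging incomplete duplicates. It is a generic illustration with made-up column names, not Trillium's own tooling or API.

```python
# Generic illustration of pooling sources and cleansing records (not Trillium's API).
import pandas as pd

# Two hypothetical customer extracts from different source systems.
crm = pd.DataFrame({
    "customer_id": [1, 2, 3],
    "email": ["a@example.com", None, "c@example.com"],
    "city": ["London", "Leeds", None],
})
billing = pd.DataFrame({
    "customer_id": [2, 3, 4],
    "email": ["b@example.com", "c@example.com", None],
    "city": [None, "Bristol", "York"],
})

# Pool the disparate sources into a single repository-style table.
pooled = pd.concat([crm, billing], ignore_index=True)

# Drop records with no identifier at all, then merge conflicting or incomplete
# duplicates by keeping the first non-null value per field for each customer.
pooled = pooled.dropna(subset=["customer_id"])
clean = pooled.groupby("customer_id", as_index=False).first()

print(clean)  # one consolidated row per customer, ready for Tableau or Qlik
```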

Ed Wrazen, VP of product management for big data at Trillium, told Computerworld UK: "We're seeing the type of user looking for far more agility. So they don't want a big infrastructure, they want technology that gets them to market quicker." Wrazen said that customers looking for this from the big vendors would need a plethora of tools that would have to be bolted together.

 
