At the press and analyst day of the Hadoop Summit in Dublin, three Hortonworks partners told the press what their customers want most when it comes to embracing enterprise-ready, open source big data tools.
Where Hortonworks is keen to speak about the transformational capabilities of Spark, machine learning, real-time and predictive analytics, the partners painted a slightly different picture, of enterprise customers that just want simplicity when it comes to releasing value from their data.
Corporations are looking to reduce their reliance on coding and programming when it comes to their big data strategy, especially with the current talent shortage in data science.
This means that simplicity and enterprise-ready Hadoop deployments are a natural fit for companies looking at open source big data solutions.
As Shaun Connolly, its VP of corporate strategy, put it, Hortonworks at its core is about "productising Apache tech into commoditised enterprise tech".
Stefan Voss, director of technical marketing at EMC, said: "Simplicity is the most important trend we hear from customers. You will hear [about] all of these nice, pretty new projects, and all enterprise customers struggle to integrate them all and [manage] the complexity."
EMC's Voss said that he points his customers towards integrated, enterprise-ready solutions like Hortonworks Data Platform (HDP) for data-at-rest and Hortonworks DataFlow (HDF) for data-in-motion.
This "allows the data scientist to pick and choose which tools they need based on the data streams and deploy it in a very fast manner", he said.
Voss expects to see more industry-specific Hadoop tools developed for Internet of Things (IoT) use cases and for verticals such as healthcare, offering plug-and-play complex data analytics "with the intention of hiding the complexity of the underlying structure", he said.
Chris Goodfellow, CTO of Haven OnDemand at Hewlett Packard Enterprise, agrees that his customers seek simplicity.
"A lot of our customers, especially in the more traditional enterprises, their data is all over the place and each piece of data is siloed," he said.
"You couldn't have one hundred [Hadoop] projects, with thirty people [working on] each. So making the technology accessible and simplified is a key evolution [that will enable] perhaps two or three developers to say: 'I have this specific business problem and I will apply this technology to that without having to start from scratch and deploy servers and networking.'"
Hortonworks executives spoke earlier in the day about how their customers generally fall into two camps when it comes to their data: those who want to 'renovate' and those who want to 'innovate'.
'Renovate' means taking proprietary data spread disparately across various silos and bringing it together into a single data lake. This is where Hortonworks HDP comes in.