September 7, 2012
Divining future trends in the rapidly evolving world of Big Data is tough. But a report released by Pentaho offers some interesting clues. Read on for some of the company’s findings regarding Big Data in the enterprise.
Pentaho, of course, is not a neutral party in analyzing Big Data trends. Since its main product is a suite of data integration and reporting tools, the company is deeply invested in the same market that the report discusses. Still, the findings, which are based on Pentaho’s own sales records, appear objective enough and offer some interesting insights into the direction of Big Data.
Here are the conclusions in a nutshell:
Technical IT staff are playing a key role in guiding organizations' adoption of Big Data solutions. Noting this trend, Pentaho also suggests that open source holds special appeal for the people driving decisions in this space.
An increasingly broad array of organizations, both large and small, are using Big Data analysis tools for diverse purposes. Examples cited in the report include customer behavior analytics, lead conversion analytics, security threat pattern analytics, social media marketing analytics and supply chain optimization.
Organizations are more willing to adopt Big Data solutions when the requisite tools simplify the inherent complexity of Big Data processes. Wrappers that make it easier to write scripts or maintain software are key to selling Big Data platforms to customers.
Hadoop, the technical framework on which many Big Data platforms are based, is leading the pack as the preferred solution. According to the report, “more than 70 percent of new Pentaho customers in the [second] quarter [of 2012] are deploying on Hadoop, with the remaining new customers split evenly between NoSQL databases (MongoDB and Cassandra) and analytic databases (primarily Greenplum and Vertica).” If you’re not Hadooping now, it’s probably time to start.
On a related note, the revenue generated around Hadoop is exploding, with $77 million reported in 2011 and $812 million projected by 2016. This is according to IDC, which also expects total revenue for the Big Data industry to reach $17 billion by 2016.
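For readers unfamiliar with why Hadoop became the preferred foundation, its core idea is the MapReduce programming model: a map step that emits key-value pairs, a shuffle that groups them by key, and a reduce step that aggregates each group. A minimal pure-Python sketch of the canonical word-count example illustrates the model (this is an illustration of the pattern only, not Hadoop's or Pentaho's actual API):

```python
from collections import defaultdict

def mapper(line):
    # Map phase: emit a (word, 1) pair for each word in the input line.
    for word in line.lower().split():
        yield word, 1

def reducer(word, counts):
    # Reduce phase: aggregate all counts emitted for a given word.
    return word, sum(counts)

def map_reduce(lines):
    # Shuffle phase: group mapper output by key before reducing,
    # which is what Hadoop does between its map and reduce stages.
    groups = defaultdict(list)
    for line in lines:
        for word, count in mapper(line):
            groups[word].append(count)
    return dict(reducer(word, counts) for word, counts in groups.items())

data = ["big data is big", "data tools simplify big data"]
print(map_reduce(data))
```

In a real Hadoop deployment, the mapper and reducer run in parallel across a cluster and the framework handles the shuffle; the simplifying "wrappers" the report credits with driving adoption exist precisely to hide that distributed machinery behind interfaces this simple.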
Pentaho, of course, hopes that organizations will embrace its data analysis and maintenance tools as they build their Big Data infrastructures. But whichever solutions administrators choose to adopt, it’s clear that Big Data is not only here to stay but also likely to keep growing rapidly in the months and years ahead.
Further, if Pentaho’s report is any indicator, developers interested in claiming their piece of the Big Data pie as the channel evolves should focus on creating simple, accessible tools and interfaces flexible enough to serve a range of use cases. They should also rally around Hadoop, since it has clearly become the core of the Big Data world.
Christopher Tozzi started covering the channel for The VAR Guy on a freelance basis in 2008, with an emphasis on open source, Linux, virtualization, SDN, containers, data storage and related topics. He also teaches history at a major university in Washington, D.C. He occasionally combines these interests by writing about the history of software. His book on this topic, “For Fun and Profit: A History of the Free and Open Source Software Revolution,” is forthcoming with MIT Press.