Prominent Apache Hadoop project contributor Hortonworks has this week detailed the new Hortonworks Data Platform, powered by the Hadoop software framework for data-intensive applications. With a view to promoting adoption of Apache Hadoop, the Hortonworks Data Platform seeks to provide a stable foundation and a growing ecosystem for big data. Alongside it, the company has announced "comprehensive support and training" offerings, as well as partner enablement programs aimed at developers and systems integrators.
Typically described as "massively scalable", the Hortonworks Data Platform (HDP) is built on projects including the Hadoop Distributed File System (HDFS), MapReduce, Pig, Hive, HBase, and ZooKeeper. Seeking to be comprehensive at every level, HDP now also includes HCatalog, a metadata management service for data sharing between Hadoop and other enterprise information systems; Ambari, an open source installation and management system; and open HTTP/REST APIs that make it easier for ISVs to integrate and extend Apache Hadoop.
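To give a flavour of what those open HTTP/REST APIs look like in practice, the sketch below builds a request against WebHDFS, the REST interface to HDFS that ships with Apache Hadoop. The host name, port, and directory path are placeholder assumptions for illustration, not details from the announcement.

```python
# Illustrative sketch: constructing a WebHDFS REST request to list an
# HDFS directory. Host, port, and path below are hypothetical placeholders.
from urllib.parse import urlencode

def webhdfs_url(host, port, path, op, **params):
    """Build a WebHDFS v1 URL for a file-system path and operation."""
    query = urlencode({"op": op, **params})
    return f"http://{host}:{port}/webhdfs/v1{path}?{query}"

# List the contents of /user/demo on a (hypothetical) NameNode:
url = webhdfs_url("namenode.example.com", 50070, "/user/demo", "LISTSTATUS")
print(url)
# Against a live cluster, the request could be issued with any HTTP client,
# e.g. urllib.request.urlopen(url), which returns a JSON FileStatuses payload.
```

Because the interface is plain HTTP plus JSON, third-party tools can integrate with Hadoop's file system without linking against Java client libraries, which is the point of exposing these APIs to ISVs.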
Backed by data integration players including Informatica, Hortonworks is hoping that HDP provides some answers to analyst firm IDC's proclamation that the amount of data across the globe is doubling every two years. "Apache Hadoop has established itself as a leading solution for unlocking the enormous potential of this vast amount of information, known as big data. It is critical for companies like Hortonworks to continue to develop and contribute core Apache Hadoop code back to the open source community, helping make the technology more palatable for widespread deployment across enterprise environments," said Benjamin Woo, program vice president, storage systems and lead analyst for Big Data, IDC.
As already stated, Hortonworks also announced support, training, and what it calls "enablement services". The company says that Hortonworks support subscriptions assist developers throughout the entire lifecycle of a Hadoop solution, covering design, development, and proof-of-concept work. The enablement services, in turn, are a means of helping ISVs and systems integrators deliver optimized Hadoop solutions and services.
"Hortonworks is dedicated to advancing Apache Hadoop so that it becomes the de facto platform for the next generation of data processing solutions," said Eric Baldeschwieler, CEO of Hortonworks. "The Hortonworks Data Platform is an important step in achieving this goal. By providing an open, stable, and highly extensible platform that makes it easier to integrate Apache Hadoop with existing data architectures, Hortonworks is helping organizations maximize the value from the data flowing throughout their enterprise."