Working with complex data structures just got easier with Inquidia’s new data integration component, which allows companies to create data directly in Apache Avro format.
October 9, 2014 – Chicago, IL – Inquidia Consulting has released a new software component allowing developers of big data architectures to manage data in one of the most modern data formats, Apache Avro, a high-speed binary format that supports complex Hadoop data structures. The component, designed for users of Pentaho Data Integration, can be plugged into new or existing Pentaho Data Integration deployments. The plug-in is currently available on GitHub.
“We've developed this with the native Apache libraries for optimal performance and extensibility,” said Chris Deptula, senior architect for the project. “Our customers with Big Data want to be able to organize data quickly and use complex data structures at the same time. This plug-in, using the Apache Avro format, does just that.”
The new component allows developers to easily deliver complex data structures, including hierarchical transaction data, log data, and more, to their Hadoop environments. The data is stored in a binary format for fast access, while its schema remains flexible and can evolve easily as requirements change.
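As an illustration of the kind of hierarchical structure described above, an Avro schema for a transaction record might look like the following. This sketch is not taken from the plug-in itself, and all names (`Transaction`, `com.example.sales`, the field names) are hypothetical:

```json
{
  "type": "record",
  "name": "Transaction",
  "namespace": "com.example.sales",
  "fields": [
    {"name": "transaction_id", "type": "string"},
    {"name": "timestamp", "type": "long"},
    {"name": "customer", "type": {
      "type": "record",
      "name": "Customer",
      "fields": [
        {"name": "id", "type": "string"},
        {"name": "region", "type": ["null", "string"], "default": null}
      ]
    }},
    {"name": "line_items", "type": {
      "type": "array",
      "items": {
        "type": "record",
        "name": "LineItem",
        "fields": [
          {"name": "sku", "type": "string"},
          {"name": "quantity", "type": "int"},
          {"name": "price", "type": "double"}
        ]
      }
    }}
  ]
}
```

Because an Avro file carries its schema with the data, downstream consumers can read nested records like these without out-of-band metadata, and optional fields with defaults (such as the nullable `region` above) can be added over time without breaking existing readers.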
“As we help our customers implement Big Data architectures, having the ability to create and share complex data in a flexible format is essential,” said Bryan Senseman, senior partner at Inquidia Consulting. “Clients, community and Inquidia all benefit from this knowledge, so it’s important for us to share the expertise. We're confident the Avro component will simplify big data integration.”
About Inquidia Consulting

Inquidia is an innovative professional services firm delivering full spectrum data engineering and analytics services that help our customers inquire, learn and take action with their data. We are passionate about data. Find out more at www.inquidia.com.