Our client is implementing a global big data platform based on the Hortonworks Data Platform (HDP) in order to pursue even more innovative ideas and methods in data analysis than before. The top priority was the shortest possible "time to value". Together with our client, we developed efficient, automated processes and a Hadoop architecture tailored to the specific application. A further focus was the integration of Hadoop into the overall business intelligence landscape.

 

  • New projects and use cases made operational in less than 3 hours
  • Integration of new data sources: from identification to use within a maximum of 24 hours
  • Establishment of a "Single Point of Data Access": the client's data scientists can access all existing data (Big Data, DWH, SAP, ...) through a single integrated interface (see the sketch after this list)
  • Integration of the Big Data Platform into standard operating processes
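
As an illustration of what such a single access point can look like for a data scientist, the following sketch joins a Hive table from the Hadoop cluster with a DWH table read over JDBC in one Spark session. All table names, hosts and credentials are hypothetical placeholders, not the client's actual setup.

from pyspark.sql import SparkSession

# Minimal sketch: one Spark session as the single access point for
# Hive-managed Big Data and a relational DWH (all names are placeholders).
spark = (SparkSession.builder
         .appName("single-point-of-data-access")
         .enableHiveSupport()
         .getOrCreate())

# Big Data side: a Hive table on the Hadoop cluster
claims = spark.table("insurance_lake.fire_claims")

# DWH side: a relational table pulled in over JDBC
policies = (spark.read.format("jdbc")
            .option("url", "jdbc:oracle:thin:@dwh-host:1521/DWH")
            .option("dbtable", "DWH.POLICIES")
            .option("user", "analyst")
            .option("password", "***")
            .load())

# Data scientists work with both sources in one interface
joined = claims.join(policies, on="policy_id", how="inner")
joined.groupBy("region").count().show()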

Challenge

The investment in a Big Data Platform only pays off with proper integration and efficient processes.

By means of Big Data analysis, risks can be better identified, modeled and insured. Our client uses this opportunity to develop new insurance solutions and services together with partners. The extended IT and analytical competence is applied in specific areas, for example in the early detection and trend analysis of fire damage. This requires the successful integration of a global Big Data platform that covers all areas.


Solution

An overview of all supporting processes and an integrated solution ensure the success of the project.

Our task was to create a concept for all processes around the existing Hadoop cluster: security, resource scheduling, data governance, data access and data ingestion. The implementation was carried out in a DevOps manner directly across all environments.
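
The concrete automation stack is not detailed here; purely as a sketch of the cross-environment DevOps idea, one and the same cluster blueprint could be registered on every environment through the Ambari REST API that ships with HDP, so that development, test and production stay consistent. Host names, credentials and the blueprint content below are hypothetical.

import requests

# Hypothetical Ambari endpoints for the three environments
ENVIRONMENTS = {
    "dev":  "https://ambari-dev.example.com:8443",
    "test": "https://ambari-test.example.com:8443",
    "prod": "https://ambari-prod.example.com:8443",
}

# Simplified, illustrative HDP blueprint
blueprint = {
    "Blueprints": {"blueprint_name": "hdp-analytics",
                   "stack_name": "HDP", "stack_version": "2.6"},
    "host_groups": [
        {"name": "master", "cardinality": "1",
         "components": [{"name": "NAMENODE"}, {"name": "RESOURCEMANAGER"}]},
        {"name": "worker", "cardinality": "3",
         "components": [{"name": "DATANODE"}, {"name": "NODEMANAGER"}]},
    ],
}

for env, base_url in ENVIRONMENTS.items():
    # Register the identical blueprint on each environment's Ambari server
    resp = requests.post(
        f"{base_url}/api/v1/blueprints/hdp-analytics",
        json=blueprint,
        auth=("admin", "***"),
        headers={"X-Requested-By": "ambari"},  # header required by Ambari
        verify=False,
    )
    print(env, resp.status_code)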

The optimal technical solution must be defined for each specific use case: every tool has ideal areas of application as well as limitations. We know the challenges associated with many open source products and bring deep troubleshooting expertise.

With Atlas and hooks at every relevant data access point, we provide an overview and traceability of all existing data. With Kerberos, tag-based security in Ranger, and Knox, we ensured both security and data privacy. Whether real-time or batch data, with Kafka, NiFi, Sqoop and ETL tools such as Talend we integrated data from all sources - public and internal, from ERP systems, databases and unstructured sources.
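
As a sketch of the real-time ingestion path only, an application could publish events to a Kerberos-secured Kafka topic roughly as follows, using the kafka-python client; brokers, topic name and event fields are hypothetical.

import json
from kafka import KafkaProducer  # kafka-python

# Minimal sketch: push events to a topic on the Kerberized cluster
# (broker address, service name and topic are placeholders).
producer = KafkaProducer(
    bootstrap_servers=["broker1.example.com:9092"],
    security_protocol="SASL_PLAINTEXT",       # Kerberos-secured listener
    sasl_mechanism="GSSAPI",
    sasl_kerberos_service_name="kafka",
    value_serializer=lambda event: json.dumps(event).encode("utf-8"),
)

event = {"sensor_id": "A-17", "temperature_c": 78.4, "ts": "2018-05-28T10:15:00Z"}
producer.send("fire-sensor-events", value=event)
producer.flush()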

Customer value

Efficient processes and data transparency greatly reduce the development cycles of Big Data use cases.

We integrated Hadoop into the business processes and ensured, with the right technology, that use cases can be implemented efficiently and in a focused manner. With our knowledge of classical business intelligence, relational databases and the complete Hadoop ecosystem, we support the data scientists in all phases of the planned use cases - from the idea through development to implementation and deployment in production.

Contact

Kai Niessen

T. +49 89 122219642
kai.niessen@ventum.de


Dr. Christoph Römer

T. +43 15 3534220
office@ventum.com

Philipp Wiegel

T. +49 89 122219642
philipp.wiegel@ventum.de
