Creating a Data Repository Center for Uniqa, a large insurance organization
Technologies, methods and tools: Apache Spark, Apache Hadoop, Apache Airflow, Apache Zeppelin
- Developing the necessary software and hardware architecture
- Analyzing information requirements
- Designing the solution
- Implementing the solution
- Production launch
- Delivering a comprehensive deployment plan, including risks and opportunities
- Developing platform use scenarios, e.g. a Customer Analytic Record
- Sharing customer data for the purposes of operational processes
- Performing a customer satisfaction survey
WHAT WERE THE NEEDS?
As part of its digital transformation, the insurance company needed to create a central Data Lake repository. We joined the implementation process at the conceptual design stage.
WHAT HAVE WE DELIVERED?
At the initial stage of the project, we delivered a comprehensive deployment plan, including risks and opportunities, supplemented with platform use scenarios, such as the Customer Analytic Record. We shared customer data for the purposes of operational processes, including integration with key sales systems. We also performed customer satisfaction data analysis and visualization tasks.
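To give a flavor of one of these scenarios: a Customer Analytic Record is typically a single wide row per customer, aggregated from several source systems (policies, claims, interactions). The sketch below is illustrative only, with hypothetical field names and toy data; on a platform like this one, such aggregations would run on Apache Spark over the Hadoop-based Data Lake rather than in plain Python.

```python
# Illustrative sketch only: sources and field names are hypothetical,
# not taken from the actual project.
from collections import defaultdict

customers = [
    {"customer_id": 1, "name": "Anna"},
    {"customer_id": 2, "name": "Piotr"},
]
policies = [
    {"customer_id": 1, "premium": 1200.0},
    {"customer_id": 1, "premium": 300.0},
    {"customer_id": 2, "premium": 800.0},
]
claims = [
    {"customer_id": 1, "amount": 450.0},
]

def build_car(customers, policies, claims):
    """Assemble one wide 'Customer Analytic Record' row per customer."""
    total_premium = defaultdict(float)
    policy_count = defaultdict(int)
    for p in policies:
        total_premium[p["customer_id"]] += p["premium"]
        policy_count[p["customer_id"]] += 1

    claim_count = defaultdict(int)
    for c in claims:
        claim_count[c["customer_id"]] += 1

    # One row per customer, combining attributes from all sources.
    return [
        {
            "customer_id": c["customer_id"],
            "name": c["name"],
            "policy_count": policy_count[c["customer_id"]],
            "total_premium": total_premium[c["customer_id"]],
            "claim_count": claim_count[c["customer_id"]],
        }
        for c in customers
    ]

car = build_car(customers, policies, claims)
```

The same shape of computation (group-by aggregations joined back to a customer dimension) maps directly onto Spark DataFrame operations when the data volumes require a cluster.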