The journey of data: from primary sources to beautifully visualized graphs and charts
To get a better grasp of how important data representation is, let’s take a closer look at how it is handled at a company providing insurance services.
- Any insurance company deals with vast amounts of data from a variety of sources, both internal and external: transaction systems, customer databases, multiple government agencies, and many, many more. Staying on top of such a mishmash of digits and facts is easier said than done.
- Another hot-button issue is that processed data must comply with financial regulations such as anti-money laundering (AML), Know Your Customer (KYC), and others.
But how does the process work, and which tools are used along the way? Let’s talk about it in more detail.
From the diagram above we can see that the entire data journey consists of two legs: ETL and BI reporting. ETL – Extract, Transform, Load – is the first part of the process, during which:
- data is extracted from primary sources,
- transformed into a consistent format through cleansing, validation, aggregation, merging, and normalization, and
- loaded into a centralized repository – a Data Warehouse or Data Lake – for future analysis.
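To make the three steps concrete, here is a minimal sketch of such a flow in Python. This is an illustration only, not the client’s actual pipeline: the column names and validation rules are hypothetical, and an in-memory SQLite table stands in for the warehouse.

```python
import csv
import io
import sqlite3

# Extract: read raw records from a primary source (here, an in-memory CSV)
raw_csv = io.StringIO(
    "policy_id,holder,premium\n"
    "P-001, Alice ,1200\n"
    "P-002,Bob,\n"           # missing premium -> fails validation
    "P-003,Carol,950\n"
    "P-001,Alice,1200\n"     # duplicate -> removed during cleansing
)
rows = list(csv.DictReader(raw_csv))

# Transform: cleanse (trim whitespace, deduplicate), validate
# (premium must be present), and normalize types (premium -> float)
seen, clean = set(), []
for r in rows:
    key = r["policy_id"].strip()
    if key in seen or not r["premium"].strip():
        continue
    seen.add(key)
    clean.append((key, r["holder"].strip(), float(r["premium"])))

# Load: write into the centralized repository (SQLite stands in for the DWH)
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE policies (policy_id TEXT PRIMARY KEY, holder TEXT, premium REAL)")
db.executemany("INSERT INTO policies VALUES (?, ?, ?)", clean)
print(db.execute("SELECT COUNT(*) FROM policies").fetchone()[0])  # -> 2
```

In a real pipeline each step would be handled by a dedicated tool rather than hand-written code, but the shape of the work – extract, transform, load – is the same.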
To make this journey possible for this client – a huge insurance player operating globally – we leverage on-premise and cloud ETL tools, such as:
- SSIS (SQL Server Integration Services) – an on-premise tool for automating data processing pipelines.
- Astera Centerprise – an ETL platform supporting data manipulation and transfer to a Data Warehouse.
- MS Azure Data Factory – the twin brother of SSIS from the MS Azure family, which lets you build data processing pipelines in the cloud.
- MS Azure Synapse Analytics – a very close alternative to the above Data Factory and a great fit for data integration, data warehousing, and big data analytics. What makes it stand out is that it supports PySpark (Python), Spark (Scala), Spark SQL, .NET for Apache Spark (C#), and SparkR for lightning-fast data manipulation and analysis.
- MS Azure Data Lake – a cloud-based platform providing scalable data storage.
So, what happens after data is extracted, cleansed, filtered, and loaded into the destination storage with the help of on-premise or cloud ETL tools? It needs to take some easy-to-digest shape (i.e. become readable and understandable) to bring maximum value to the client, right? And here begins the second part of this fascinating transformational journey: BI reporting.
As soon as our client’s data is loaded, we can visualize it and build reports with BI reporting tools. Those vary, as today’s landscape of Business Intelligence technologies is as diverse as ever. Yet, as we’re talking about a corporate client here, it’s no surprise we’ll stick to the Microsoft ecosystem of solutions. For the purposes of this topic, I picked the two below:
- SSRS (SQL Server Reporting Services) – an on-premise tool for generating reports in the form of graphs or charts.
- MS Power BI – Microsoft’s cloud-based business analytics tool for data visualization (it can be used on-premise as well).
Now that you know the theory, let’s see how it is applied in practice on real projects and – most importantly – what value we help generate for our client.
Use case 1: Symfa helped the client fine-tune the ETL processes for efficient insurance data management
A simple one for a start.
As I mentioned before, our US-based client provides insurance services to small and medium-sized businesses and routinely deals with vast amounts of data. The customer was looking to transform its ETL processes and create a unified codebase to manage inbound data as efficiently as possible.
How the client made the best of the BI tool: Our team works with data sent from third-party services in xlsx, csv, and txt formats. We manage the respective ETL processes with the Astera Centerprise ETL application.
After data is processed and filtered, it is uploaded to the database in compliance with the client’s business requirements. Conversely, if the client needs to send documents or reports, structured data is extracted from the database and shared with the stakeholders.
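Ingesting several file formats into one uniform record shape is the first hurdle in a setup like this. The helper below is a hypothetical Python sketch of that idea, not Astera’s actual workflow: the function name is made up, and only the text-based formats are implemented, since xlsx would additionally need a reader library such as openpyxl.

```python
import csv
import tempfile
from pathlib import Path

def read_records(path):
    """Dispatch on file extension; each format yields uniform dict records."""
    suffix = Path(path).suffix.lower()
    if suffix == ".csv":
        with open(path, newline="") as f:
            return list(csv.DictReader(f))
    if suffix == ".txt":  # assuming tab-delimited text extracts
        with open(path, newline="") as f:
            return list(csv.DictReader(f, delimiter="\t"))
    if suffix == ".xlsx":
        # would require a third-party Excel reader such as openpyxl
        raise NotImplementedError("xlsx support needs an Excel reader library")
    raise ValueError(f"unsupported format: {suffix}")

# quick demo with a throwaway csv file
with tempfile.NamedTemporaryFile("w", suffix=".csv", delete=False, newline="") as f:
    f.write("policy_id,premium\nP-001,1200\n")
print(read_records(f.name))  # -> [{'policy_id': 'P-001', 'premium': '1200'}]
```

Once every source lands in the same record shape, the downstream cleansing and loading logic only has to be written once – which is exactly the point of a unified codebase.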
Read the detailed description of this project in our portfolio.
Use case 2: How we helped the client generate informative financial reports
In this case, our task was to optimize the data flow between the client's Oracle ERP system and their custom Data Warehouse (DWH) hosted on SQL Server. As the client operates globally and has about 10,000 employees, managing financial data is a tough job.
How the client made the best of the BI tool: Here’s our approach. The client's ERP system acts as the intermediary, connecting the data sources and feeding raw data into the ETL tools. Given the complexity of the incoming data, we use two on-premise ETL applications – Astera Centerprise and SSIS. Once the data is prepared, it is loaded into the data storage to be used by the client’s financial analysts. This enables them to create custom financial reports with their available visualization tools, such as Power BI or simply Excel.
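A common pattern in this kind of ERP-to-DWH flow is to land raw feeds in a staging table, transform them in SQL, and expose a reporting view for the analysts’ BI tool. The sketch below illustrates that pattern only: the table names, columns, and figures are invented, and SQLite stands in for the SQL Server warehouse.

```python
import sqlite3

db = sqlite3.connect(":memory:")

# Staging table: raw feed as it arrives from the ERP side (made-up schema,
# amounts still stored as text the way a raw extract often delivers them)
db.execute("CREATE TABLE stg_ledger (entity TEXT, account TEXT, amount TEXT)")
db.executemany(
    "INSERT INTO stg_ledger VALUES (?, ?, ?)",
    [("US", "revenue", "1500.00"), ("US", "expense", "-400.00"), ("EU", "revenue", "900.00")],
)

# Transform + load: cast types and aggregate into the warehouse fact table
db.execute("CREATE TABLE fct_ledger (entity TEXT, account TEXT, amount REAL)")
db.execute(
    "INSERT INTO fct_ledger "
    "SELECT entity, account, SUM(CAST(amount AS REAL)) FROM stg_ledger "
    "GROUP BY entity, account"
)

# A reporting view the analysts' BI tool (Power BI, Excel) could query directly
db.execute(
    "CREATE VIEW vw_net_by_entity AS "
    "SELECT entity, SUM(amount) AS net FROM fct_ledger GROUP BY entity"
)
print(db.execute("SELECT entity, net FROM vw_net_by_entity ORDER BY entity").fetchall())
# -> [('EU', 900.0), ('US', 1100.0)]
```

Keeping the transformation in the warehouse layer means the analysts always query cleaned, typed, pre-aggregated data instead of raw ERP extracts.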
For a complete picture of our work on this project, check out our detailed case study, which provides deeper insights into the challenges we faced and the solutions we implemented.
Use case 3: How we helped the client revive their sales process through insightful marketing reports
The client’s marketing team utilizes policy data to drive sales and renewals. But before it reaches the marketers' hands, the data must be transformed.
How the client made the best of the BI tool: Previously, a third-party agency was responsible for cleansing the sales data for the client. Our objective was to replicate their workflows and identify opportunities for optimization using the client's existing tools and infrastructure (Astera Centerprise ETL application).
With our help, the client no longer needed the third-party vendor’s data cleansing services and could reallocate the freed-up resources to support other digital transformation initiatives.
This case study is also available in our portfolio. Don’t hesitate to click for more details on the project implementation!
To wrap it up
In this blog post, I told you about the ETL and reporting tools that the Symfa team uses on a range of projects for our insurance client. You may have noticed that I included cases with on-premise instruments only, and that is no coincidence. In the next part of this story, we’ll explore the cloud ETL technologies that our team uses for this same insurance customer. Meet me soon in the same place – the Symfa blog – where I’ll be back with a new article for you.