How to Ensure Hassle-free BI Reporting in a Big Corporation That Needs Their Answers Fast

7 min read
Business intelligence
Client stories
ETL/ELT
Insurance

Today I’ll walk you through three real use cases we’ve delivered for our US-based insurance clients. If you’re anything like our customers, established old-timers who want to build insightful reports with modern BI tools and visualizations, this blog post is for you.

My name is Ivan, I am a BI developer at Symfa and today I’ll be your guide into the world of corporate Business Intelligence reporting.

Table of Contents

  • What is BI reporting in the first place?
  • How we helped the client generate insightful financial reports
  • How we helped the client reinvigorate their sales process through granular marketing reports
  • The story of a failed partnership. We took over the client’s legacy BI system and are getting it ready for the big market
  • Just before you go

What is BI reporting in the first place?

Feel free to skip this part if you've been applying some sort of Business Intelligence (BI) in your organization for some time already. If not, worry not, you’re not lagging behind too much. Even if you missed a trend or two in corporate software development, we’ll fix it real quick together now.

In a nutshell, business intelligence reporting means collecting data across multiple internal and external sources, processing it and using the output to generate reports. Simple, isn’t it? 

“What’s the problem then, and why is my corporate reporting going to cost me a million?” 

Well, the essence of good reporting is the quality of the data behind it. That’s what anyone who has ever done even the smallest analytics job will tell you. Because of discrepancies in data formats, human error, inconsistencies and incomplete records, you cannot use raw data (the data loaded straight from your chosen sources) for report generation as-is. Research analysts who collect data manually have their own rules for validating what they gather (using only trusted sources, getting confirmation from two or more of them, personal judgment honed over years of practice, etc.). For a corporate data analyst, such validation methods are impossible, as:

  1. we deal with far too heavy a data volume in the first place, and 
  2. our sources are predefined by the client, and we cannot change them if the data they deliver doesn’t fit.

Instead, we use software tools to extract the data, cleanse and validate it, and load it into the client’s data storage. From there, the client’s data analysts or Symfa’s engineers can take the clean data and build reports with the available data visualization tools, or simply query the tables and export the deliverables in the required format.
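To make the cleanse-validate-load step less abstract, here’s a minimal Python sketch of what it might look like. The file names, column names and rules are invented for illustration; real projects run on dedicated ETL tools rather than hand-rolled scripts.

```python
import pandas as pd

# Hypothetical raw extract; column names and rules are illustrative only
raw = pd.read_csv("raw_policies.csv")

# Cleanse: normalize the inconsistent formats coming from different sources
raw["premium"] = pd.to_numeric(raw["premium"], errors="coerce")
raw["effective_date"] = pd.to_datetime(raw["effective_date"], errors="coerce")
raw["state"] = raw["state"].str.strip().str.upper()

# Validate: drop rows that would break downstream reports
clean = raw.dropna(subset=["policy_id", "premium", "effective_date"])
clean = clean[clean["premium"] > 0].drop_duplicates(subset="policy_id")

# Load: write the clean table to the reporting storage
clean.to_parquet("clean/policies.parquet", index=False)
```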

Like any analytical job, BI report generation relies heavily on data preparation and data transformation. Some 80% of project time goes to getting the data ready for work; the remaining 20% is the actual analytics and visualization.

In our projects, we do both data prep and report building, depending on the client’s needs. Here we’ll talk about the data prep jobs, where the clean data we deliver is then used by the client’s talents to generate granular reports and facilitate decision-making. The data is collected (or Extracted, as we call it), Transformed and then Loaded into the client’s data storage during the so-called ETL process. To enable it, we use specialized off-the-shelf software of the client’s choice, or customize the client’s existing tool if it doesn’t fully meet their needs.

The ETL tool loads data into the client’s existing data storage (a database, data warehouse, data mart or data lake), which the client selects based on the reports’ characteristics and the qualifications of the people who will use it. For example, data analysts can work perfectly well with data lakes holding unstructured data, while a general user is far better off with a good old database or a data warehouse full of well-structured data.

In the end, the data is shaped into charts, graphs, diagrams, etc. with a data visualization tool. Sometimes Excel’s visual capabilities are enough, and sometimes visualizations aren’t necessary at all. Let’s be honest, nobody goes all fancy presenting a few pages of routine figures to management on a daily or weekly basis. Yet for, say, consulting jobs, looks are paramount. So it really varies from case to case.
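When visuals are needed, sometimes a few lines really are all it takes. Here’s a toy chart built from the clean table in the earlier sketch, with the same invented column names:

```python
import pandas as pd
import matplotlib.pyplot as plt

# Toy chart: monthly gross written premium from the clean table above
df = pd.read_parquet("clean/policies.parquet")
monthly = df.groupby(df["effective_date"].dt.to_period("M"))["premium"].sum()

monthly.plot(kind="bar", title="Gross written premium by month")
plt.tight_layout()
plt.savefig("gwp_by_month.png")
```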

Now you know what BI reporting is and how it works. Let’s see exactly how we enable great BI reporting for our insurance clients, shall we?


How we helped the client generate insightful financial reports

This insurance client of ours approached us for a test drive and stayed for a full-scale digital modernization campaign (including API-led connectivity, monolith decoupling, legacy support and more). ETL and BI projects make up an impressive share of our overall work with this client, so let’s take a closer look at one of them.

Our task was to streamline the data flow between Oracle ERP and the client’s custom-made DWH sitting on SQL Server. As I mentioned before, ETL comprises three relatively simple procedures: extract, transform and load (hence the name). No big deal, but the data can be overly complex, and our client’s financial data is just that complex (to say nothing of the outrageous volumes it comes in, given that the client operates globally and employs 10,000 people).

So, here’s what we do. The client’s ERP connects to the data sources feeding it raw data on one side, and to the ETL tools on the other. Two ETL tools handle the incoming data: one extracts, the other transforms and loads. All because of the complexity and volume of the financial data! The ready-to-go data lands in the data storage, where the client’s financial analysts come every time they need an influx of fresh data for their reports. The output tables are 100% ready to be queried. If that’s not enough, the analysts connect their visualization tools to the data storage (from Informatica to Power BI to good old Excel) and build bubbles, charts, graphs, etc.
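The real pipeline runs on two commercial ETL tools, so purely to illustrate the extract and load halves, here’s a hypothetical Python sketch. The connection strings, table names and credentials are all placeholders, not the client’s actual setup:

```python
import oracledb
import pandas as pd
from sqlalchemy import create_engine

# Extract: pull a finance table from the Oracle ERP (identifiers are placeholders)
src = oracledb.connect(user="etl_user", password="***", dsn="erp-host/ERPPDB")
frame = pd.read_sql(
    "SELECT * FROM gl_journal_lines WHERE posted_date >= :d",
    src, params={"d": "2024-01-01"})

# Transform: a trivial stand-in for the heavy lifting the second ETL tool does
frame.columns = [c.lower() for c in frame.columns]

# Load: push the result into the SQL Server data warehouse
dwh = create_engine(
    "mssql+pyodbc://etl_user:***@dwh-host/FinanceDWH"
    "?driver=ODBC+Driver+17+for+SQL+Server")
frame.to_sql("stg_gl_journal_lines", dwh, if_exists="append", index=False)
```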

For this project, we also built a custom connector from scratch to reliably connect the Oracle ERP to the ETL tool. You can read more in our case study on this project.


How we helped the client reinvigorate their sales process through granular marketing reports

Sales are a pain in the neck. What makes it even worse is having to pay a third-party agency to clean your sales data for you. My guess is it cost our client a lot, since they asked us to replicate the data cleansing agency’s processes in-house. But I’m getting ahead of myself, so let’s get back to how it all started.

This insurance client approached us to streamline insurance policy remarketing.

Each bound policy is saved in a database. The automation systems collect that data from multiple databases, and the marketing team uses it for policy renewals: for example, sending email or SMS reminders to policyholders when a policy is due for renewal. The data also feeds user segmentation, revenue forecasting and more.
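To make the renewal trigger concrete, here’s an illustrative sketch of selecting the policies due for renewal. The table, columns and connection details are invented, not the client’s actual schema:

```python
from datetime import date, timedelta

import pandas as pd
from sqlalchemy import create_engine, text

engine = create_engine(
    "mssql+pyodbc://marketing:***@dwh-host/MarketingDWH"
    "?driver=ODBC+Driver+17+for+SQL+Server")
cutoff = date.today() + timedelta(days=30)

# Policies expiring within the next 30 days that haven't been renewed yet
due = pd.read_sql(
    text("SELECT policy_id, holder_email, expiry_date "
         "FROM policies WHERE expiry_date <= :cutoff AND renewed = 0"),
    engine, params={"cutoff": cutoff})

for row in due.itertuples():
    # Stand-in for the client's actual email/SMS gateway
    print(f"Remind {row.holder_email}: policy {row.policy_id} "
          f"expires on {row.expiry_date}")
```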

Before us, all of the above data was cleansed by a third-party agency. Our task was to replicate their workflows and optimize them wherever possible. The key condition was to do it using the client’s existing tools and infrastructure.

What we ended up with is pretty much the same workflow, but now it runs on the client’s side from start to finish (data collection, cleansing, validation, filtering, etc.). Our engineers upload the ready-to-use files to the client’s data warehouse, where the marketing team picks them up to drive sales and renewals.
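As an illustration of the kind of filtering and deduplication rules we replicated (the actual rules are the client’s business logic, so everything below is a made-up stand-in):

```python
import pandas as pd

# Made-up stand-in for the agency's rules we replicated in-house
leads = pd.read_csv("bound_policies_raw.csv")

# Filter: drop records the marketing team must not contact
leads = leads[~leads["opt_out"]]

# Cleanse and deduplicate: one canonical record per policyholder
leads["email"] = leads["email"].str.strip().str.lower()
leads = leads.sort_values("bound_date").drop_duplicates(subset="email", keep="last")

# This file is what gets uploaded to the client's data warehouse
leads.to_csv("remarketing_ready.csv", index=False)
```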

This case study covers the story above plus a couple of juicy details I left untold (like the built-in analytics module we made for the client’s marketing team).


The story of a failed partnership. We took over the client’s legacy BI system and are getting it ready for the big market

I’ll name no names, but the guys who were there before us left quite a mess behind.

The previous vendor tried to move the ETL system to the cloud (the visualization tool already sits in the cloud, so it made sense) but got stuck at some point. So, when we got to the project, our plan was to simplify the overly complex system by unifying its logic wherever possible.

We had hoped for some help from the previous vendor, as we had only a month to get to know the system. You can imagine the guys weren’t super enthusiastic about helping us out, so we basically studied the system through the code, or contacted the client’s BAs when the code couldn’t tell us a thing.

The challenges didn’t stop there. We weren’t simply dealing with unstructured data: the data had already been denormalized for a different type of report. However, with day-to-day learning and commitment, we managed to streamline the data flow, and now the client is planning to enter the SaaS market with this solution.

This case is available in our portfolio too. If you’re interested particularly in this BI tool for cession report generation, here’s the link.

Just before you go

Intelligent reporting isn’t all roses, either for us or for our clients. It doesn’t solve all your business challenges overnight, and sometimes it creates new ones, like the hefty development bill and the extra recruiting effort: not just finding the right ETL/BI agency, but hiring new people or training your in-house talents to work with the data. Low-code/no-code ETL platforms will eventually democratize data prep, with ad-hoc reporting becoming the new normal. For now, though, complex data still requires a lot of investment to be cleansed, verified and eventually used for insightful reporting.

This is especially true for established market majors with tons of complex data and tons of legacy software that one simply cannot throw in the trash bin to start with a clean slate. In my next article, I’ll tell you how we’re solving exactly such a case: fighting new market threats with existing legacy BI tools and a bit of engineering ingenuity. A way to go in times of economic crisis, with innovation budgets slightly cut (to put it mildly).

Stay tuned if you like content like this. Follow me on LinkedIn for regular insider posts on the challenges of a BI developer’s day-to-day life.

Credits

Ivan Sokolov

Business Intelligence Developer

Ivan is an aspiring young leader of the corporate BI universe. He's constantly challenging himself with new approaches to data processing and experimenting with tools. Ivan lives in Georgia with his wife and two kids.

