How we modernized a legacy BI application for an insurance giant. Part II

Struggling with sluggish enterprise reporting? This article is a game changer.

App Modernization
Business intelligence
ETL/ELT
Insurance
7 min read

If it takes forever for your BI application to do the job, this blog post is for you. Hi, I’m Ilya Mokin, Head of R&D at Symfa, and in this article I am going to share with you more details on our BI modernization project.

The reasons for BI application inefficiencies may abound, but for our client it came down to something simple: tons of redundant data in the database. We freed up DB capacity and made app scalability possible, significantly increasing reporting speed along the way.

New to our blog? Here we share our experiences and opinions on completed projects and software development trends. Hit the link to the first part of this blog post series, told by Anatoly Bankovsky, the Delivery Manager for this project, and join us to find out how the story unfolds.

Table of Contents

  • How the application works (for those who skipped the first part of the story)
  • The onboarding: the take off was rough, but we made it into the sky
  • Business logic intricacies that can affect your reporting speed
  • The biggest value we brought to the customer on this project
  • Why Power BI didn’t work for our client (and the alternative we proposed)
  • Time overlap makes me transfer the project ownership to a new tech lead
  • Wait for it! The third – and final – part of the article soon in our blog

How the application works (for those who skipped the first part of the story)


We prepare the data and generate reports for the employees and partners of our insurance client, who uses the application to track the revenue performance of its insurance programs.

The users log into the application via the client’s website. The app allows partners to see their performance broken down by program, subprogram and/or period, and to build easy-to-read, detailed performance charts using the app’s filters and visualization functionality.

The onboarding: the take off was rough, but we made it into the sky

When we joined the project, the infrastructure was already quite complex. We had a month’s time to:

  • fill in the blanks in the documentation together with the previous vendor
  • get to know the huge, complex ETL logic and scripts, and
  • uncover more than a few tricky workarounds made by the previous team that kept the system running somehow.

Was it mission impossible to hand over project ownership in only a month? Hard as our colleagues tried, we mostly had to figure out how to ride this beast ourselves. Frankly speaking, this isn’t a unique situation for a software project transfer. Any project more than six months old has skeletons hidden in its closet. Only a few people know about those hot, ugly fixes, so it’s natural that this knowledge gets lost during the transfer.

Business logic intricacies that can affect your reporting speed

The application did its reporting on data spanning a two-year period, aggregated from a large number of sources and all stored in one huge 7TB database. Though the user base is quite limited, the amount of data processed made report generation a challenge for the customer’s reporting tools of choice: the data load was simply too heavy.

The calculations were sluggish because of this monstrous volume, but when we joined we couldn’t delete a thing. You see, the calculation logic used data from all the previous months to build the reports.

To speed up report generation, we needed to safely get rid of those tons of redundant data. Once we stood firmly on the project’s ground, we wrote logic that takes each previous month’s data and adds it to the running total, or subtracts it if necessary. Thanks to this, we no longer need to store the full two-year volume of data, which reduced the database size to only 300GB. Now, whenever new data comes in, we upload it, do the calculations and off it goes.
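The idea behind that logic can be sketched in a few lines of Python. This is a simplified illustration only, under the assumption of a per-program running total; the real implementation lives in the ETL scripts, and the program names and amounts here are made up:

```python
from collections import defaultdict

def apply_monthly_delta(totals, monthly_rows):
    """Fold one month's rows into the running per-program totals.

    Positive amounts are added; negative ones (corrections,
    cancellations) are subtracted automatically. Once a month
    is folded in, its raw rows no longer need to be stored.
    """
    for program, amount in monthly_rows:
        totals[program] += amount
    return totals

totals = defaultdict(float)

# January: initial revenue figures per insurance program
apply_monthly_delta(totals, [("auto", 1200.0), ("home", 800.0)])
# February: new revenue plus a downward correction for "home"
apply_monthly_delta(totals, [("auto", 500.0), ("home", -100.0)])

print(dict(totals))  # {'auto': 1700.0, 'home': 700.0}
```

The key design point is that reports read only the compact running totals, so historical raw data can be dropped instead of kept around for two years.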

The biggest value we brought to the customer on this project

The database reduction allowed the customer to let more partners upload their data into the application and build performance reports.

With a DB size of 7TB, the application was becoming impossible to maintain and operate. Report building took ages, the DB could crash at any moment, and the customer was running out of Azure storage capacity. After our optimization measures, the DB grows much more slowly: even with dozens of new dealers added, it grows by 5GB per month at most. That’s acceptable, as the client still has enough Azure capacity for years ahead. When the resources do run out, we’ll clean up the five-year-old data and we’re all good.

Why Power BI didn’t work for our client (and the alternative we proposed)

Apart from the database saga, we had one more urgent issue to tackle.

At the start of the project we had a mix of visualization tools incorporated in the application: Power BI, in-house built reports and MS Reporting tools. Ideally, you should get rid of such complexity, but at this stage we focused on the major issues Power BI created. Apart from the versioning and deployment troubles Power BI caused the client, it was unfit for these specific reports. After a careful study of the client’s bottlenecks, we switched to DevExpress, a tool that makes reporting a much easier process. I’ll explain why we did it in a sec.

Long before we showed up on this project’s horizon, the architecture was already laid down. The first architect’s idea was to take Power BI and empower business analysts to build reports by themselves. Great as it looked on paper, it didn’t work. Power BI requires specialized business knowledge and the DAX language. It may seem like all roses in a sales pitch: fun, speed and drag-and-drop. In real life, it’s a bit more complicated.

The complex nature of the reports, third-party access issues (the reports are published on a website with limited access) and, of course, the sheer volume of data made report generation with Power BI a torture. Developers still had to help business users out and come up with ever more unthinkable ways to make Power BI work for this project. Power BI couldn’t handle the data volume and sometimes even required another round of data denormalization. These manipulations were too tedious and costly for everyone, so the customer let go of the beautiful idea.

We now use DevExpress: we write a couple of SQL queries on the server for a specific user with access restrictions in mind, and JavaScript creates the charts and graphs in the frontend part of the app, just like that. A developer does the job quickly, while a BA can do this with neither Power BI nor DevExpress. A very rare BA might, but that’s a whole different story.
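The server-side part of that flow can be sketched roughly as follows. This is a minimal stand-in using Python’s sqlite3, not the client’s actual stack: the schema, partner IDs and amounts are invented, and the point is only to show a parameterized query scoped to the logged-in partner before the results go off to the charting frontend:

```python
import sqlite3

# In-memory stand-in for the reporting database (hypothetical schema)
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE revenue (
    partner_id TEXT, program TEXT, period TEXT, amount REAL)""")
conn.executemany(
    "INSERT INTO revenue VALUES (?, ?, ?, ?)",
    [("p1", "auto", "2023-01", 100.0),
     ("p1", "auto", "2023-02", 150.0),
     ("p2", "home", "2023-01", 200.0)],
)

def chart_data(partner_id):
    """Server-side query scoped to one partner; the result set
    is what the frontend would turn into charts and graphs."""
    return conn.execute(
        """SELECT program, period, SUM(amount)
           FROM revenue
           WHERE partner_id = ?          -- access restriction
           GROUP BY program, period
           ORDER BY period""",
        (partner_id,),
    ).fetchall()

print(chart_data("p1"))
# [('auto', '2023-01', 100.0), ('auto', '2023-02', 150.0)]
```

Because the restriction lives in the query itself, a partner can never see another partner’s rows, regardless of what the frontend requests.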

Time overlap makes me transfer the project ownership to a new tech lead

As much as I’d love to stay on the project, the deeper the team plunged, the clearer it became: the project required a larger time overlap from the tech lead. We found a great alternative for the client – an architect located in the same time zone.


Wait for it! The third – and final – part of the article soon in our blog

In the next and final part of this Business Intelligence article series, my colleague Sebastián Cruz Bossio will share his contribution to the project’s growth. Be sure to come back to our blog to find out more about:

  • Why project audit matters when a new tech lead shows up
  • Why 5 overlooked ms can make your BI app one big failure
  • Why documentation matters (and a simple way to maintain it so it works for the whole team)

Stay tuned.

Follow us on LinkedIn and X to be the first to know about our blog updates.
