Large-Scale Remodeling of Legacy BI System

Legacy app modernization enables faster report generation and unimpeded user growth.

Highlights

7TB TO 300GB DB SIZE REDUCTION

400% REPORTING SPEED INCREASE

REVISED BI TOOLKIT

SMART BIG DATA TRANSFER SESSIONS

Customer location:
  • USA
Project duration:
  • 1 year (ongoing)

Global insurer brings its legacy app in for a quick check

An insurance company headquartered in the US and employing over 6,000 people worldwide approached Symfa for a legacy modernization project.

The client is an international insurance player. To track the revenue performance of its insurance programs, it uses a custom BI application. Via an ETL process, the application collects data from the client's databases, processes it, and transforms it into graphs, tables, and metrics.
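
For illustration only, here is a minimal C# sketch of what such an extract-and-transform step can look like. The table, column, and metric names are assumptions made for the example, not the client's actual schema or code.

using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using Microsoft.Data.SqlClient;

// Hypothetical row shape for the example; the real schema is not shown in this case study.
public record ProgramRevenue(string Program, decimal GrossPremium, decimal Claims);

public static class RevenueEtl
{
    // Extract: pull raw revenue rows from a source database.
    public static async Task<List<ProgramRevenue>> ExtractAsync(string connectionString)
    {
        var rows = new List<ProgramRevenue>();
        using var connection = new SqlConnection(connectionString);
        await connection.OpenAsync();

        const string sql = "SELECT Program, GrossPremium, Claims FROM dbo.PolicyRevenue";
        using var command = new SqlCommand(sql, connection);
        using var reader = await command.ExecuteReaderAsync();
        while (await reader.ReadAsync())
            rows.Add(new ProgramRevenue(reader.GetString(0), reader.GetDecimal(1), reader.GetDecimal(2)));

        return rows;
    }

    // Transform: roll the raw rows up into a per-program loss ratio for the dashboards.
    public static Dictionary<string, decimal> LossRatioByProgram(IEnumerable<ProgramRevenue> rows) =>
        rows.GroupBy(r => r.Program)
            .ToDictionary(
                g => g.Key,
                g => g.Sum(r => r.Claims) / Math.Max(1m, g.Sum(r => r.GrossPremium))); // crude guard against zero premium

This shows only the rough shape of the flow; on the real project the extraction also runs through SSIS packages and Azure Data Factory.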

Overly complex logic and performance issues

After a less-than-successful partnership with its previous vendor, the carrier first turned to Symfa for consulting services. Once the client had reviewed the initial rescue steps we proposed for the app architecture, it entrusted us with the system's rehabilitation.

Challenging brainchild with a story to it

This was a project with a history, so before plunging into development we scanned the application for bottlenecks.

01

Mixed architectural approach

The system mixed the monolith and microservices approaches: several stand-alone services, each with its own code base and running on a separate server, all shared a single database.

02

Scattered logic

The system logic was scattered across multiple servers, which made the excessively decoupled software hard to manage.

03

Complex ETL network

The complex network of ETL processes within the application (.NET, C#, Hangfire), SSIS packages, and several processes running in Azure Data Factory lacked proper documentation (see the scheduling sketch after this list).

04

Hampered visualization

A mix of visualization tools – Power BI, in-house-built reports, MS Reporting tools – was incorporated into the application. Power BI couldn't cope with the data load and required further data denormalization, so it had to be replaced with a different option.

05

Slow performance

Report generation took ages because of the tons of redundant data in the DB and the overly complex calculation logic.

06

Denormalized data

The project data is complex, denormalized, and poorly structured. Getting that denormalized data fit for our purposes required extra development and research capacity from our team.

07

Ad-hoc modifications

The tables and reports had accumulated custom ad hoc modifications.
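
Regarding the ETL network from point 03: one way to keep a Hangfire-based schedule visible and self-documenting is to register each ETL step as a named recurring job. The sketch below is an illustration only; the job class, job ID, and cron schedule are invented for the example and are not the client's actual configuration.

using Hangfire;

// Hypothetical ETL step class; invented for the example.
public class NightlyRevenueImportJob
{
    public void Run()
    {
        // Extract from the source databases, stage the data, and hand it off to aggregation.
    }
}

public static class EtlSchedule
{
    public static void Register()
    {
        // Every named recurring job appears in the Hangfire dashboard,
        // which doubles as living documentation of the ETL network.
        RecurringJob.AddOrUpdate<NightlyRevenueImportJob>(
            "etl-nightly-revenue-import",
            job => job.Run(),
            Cron.Daily(2)); // run nightly at 02:00 server time
    }
}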

Quick system overview & team composition

Here's a very brief overview of the system functionality and the team in charge of the project.

Over time, the team, which started with development talent only, grew to include a QA automation expert. The upgraded solution holds significant potential and is getting ready to be presented within the partners' networks.

Technologies

  • Azure
  • .NET Core
  • SSIS
  • TypeScript
  • .NET
  • DevExpress
  • SQL Server Reporting Services
  • Hangfire
  • Azure Data Factory
  • Power BI
  • Message Gears
  • Auth0
  • Twilio

DevOps and infrastructure challenges

The project has no dedicated DevOps talent on the Symfa side; all DevOps-related tasks are handled by the autonomous, cross-functional engineering team.

Despite relatively easy deployment, the DevOps processes on the project were nonstandard, with a mish-mash of technologies.

The client is an enterprise insurance player, so its security policies and configurations are naturally complex to prevent data leaks. Our DevOps approach had to be adjusted accordingly to incorporate the client's security measures.

A revival of a valuable business asset

As the application is a valuable business asset, the client's top management holds regular meetups with the team to monitor progress. The achievements we've made on the project have secured our reputation as a strong partner for seamless enterprise change management and digital transformation initiatives.

ETL: properly documented
BI toolkit: revised
DB size: reduced from 7 TB to 300 GB
Logic: unified as needed
DevOps: simplified pipelines and infrastructure
Data transfers: 15+ hour-long safe data transfer sessions
