Custom ETL Connector for Oracle ERP to Enable Reliable Data Flow

Clean, reliable, and readily available data to step up the client’s analytical game


90 million rows of data

Custom connector development

Multiple data quality checks

DevOps for faster releases

Customer location:
  • USA
Project duration:
  • 1 year (ongoing)

A Fortune 500 insurance company looking to power its diverse analytical needs

The client is a US-based insurance provider with over 5,000 employees and offices around the world. Founded about a quarter of a century ago, the insurance giant has grown a massive client base and accumulated vast amounts of data.

This strategically valuable data, however, was locked in the client’s Oracle ERP system. The existing setup allowed for only limited customization and required labor-intensive data processing workflows. Even so, many attributes and metrics important for decision-making and refining the value chain remained unavailable.

To capitalize on this data treasure trove and open the door to new insights, the client needed an effective, high-performance pipeline to extract comprehensive ERP data, clean it, and load it into the Financial Analytics Data Warehouse for further analysis. The data-heavy project called for mature ETL development expertise, one of Symfa’s core competencies.

Lack of a standard DB connection for reliable access to financial data at scale

Solving the client’s data challenges with confidence

The Symfa team embarked on the project to streamline the data flow between Oracle ERP and the client's custom data warehouse built on SQL Server. Our scope of work included:


Building a custom DB connector

To ensure a stable and reliable connection with the database, our team built a complex custom connector from the ground up. The connector works like clockwork: it automatically restarts if necessary and verifies the data whenever an error occurs.
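The restart-and-verify behavior can be sketched roughly as follows. This is an illustrative Python sketch under assumed names (`ResilientConnector`, `connect_fn`), not the production .NET connector:

```python
import time

class ResilientConnector:
    """Illustrative sketch of a self-healing DB connector: it retries
    failed connections with exponential backoff and re-runs the query
    after an error. All names here are hypothetical."""

    def __init__(self, connect_fn, max_retries=3, backoff_s=1.0):
        self.connect_fn = connect_fn   # factory returning a live connection
        self.max_retries = max_retries
        self.backoff_s = backoff_s

    def fetch(self, run_query):
        """Run a query, restarting the connection on failure."""
        last_error = None
        for attempt in range(self.max_retries):
            try:
                conn = self.connect_fn()          # (re)open the connection
                return run_query(conn)
            except ConnectionError as exc:
                last_error = exc
                # Back off before restarting, doubling the wait each time
                time.sleep(self.backoff_s * (2 ** attempt))
        raise RuntimeError("connection failed after retries") from last_error
```

Exponential backoff keeps transient outages from turning into retry storms against the source database; the real connector also verifies the returned data, which the checks described below illustrate.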


Developing a robust ETL pipeline

The ETL pipeline built by the team seamlessly extracts data from Oracle ERP, cleans it, maps it against the target model, and loads it into the Financial Analytics Data Warehouse.
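The extract → clean → map → load flow can be sketched in miniature. The production pipeline runs on Astera Centerprise and .NET, so this Python sketch with hypothetical field names is only a conceptual illustration:

```python
# Minimal sketch of the extract-clean-map-load flow.
# Field names ("account_id", "amt", etc.) are hypothetical.

def clean(rows):
    """Drop rows with a missing key and strip whitespace from strings."""
    return [
        {k: v.strip() if isinstance(v, str) else v for k, v in row.items()}
        for row in rows
        if row.get("account_id") is not None
    ]

def map_to_target(rows):
    """Rename source columns to the warehouse's target model."""
    mapping = {"account_id": "AccountKey", "amt": "Amount"}
    return [{mapping.get(k, k): v for k, v in row.items()} for row in rows]

def run_pipeline(extract, load):
    """Wire the stages together: extract, clean, map, then load."""
    load(map_to_target(clean(extract())))
```

Keeping each stage a pure function of its input makes the stages individually testable, which matters when the pipeline moves tens of millions of rows.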


Ensuring data quality

To guarantee data accuracy and completeness, the ETL solution includes multiple data quality checks. In addition, an email with links to data quality and performance reports is automatically sent after every ETL run.
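Checks of this kind typically compare extracted and loaded row counts and look for missing required fields. A hedged Python sketch (the check names and report shape are assumptions, not the client's actual checks):

```python
def quality_checks(source_count, loaded_rows, required_fields):
    """Run basic completeness checks after a load and return a report.
    Hypothetical checks for illustration only."""
    issues = []
    # Completeness: every extracted row should have been loaded
    if len(loaded_rows) != source_count:
        issues.append(
            f"row count mismatch: {source_count} extracted, "
            f"{len(loaded_rows)} loaded"
        )
    # Accuracy: required fields must be populated
    for i, row in enumerate(loaded_rows):
        for field in required_fields:
            if row.get(field) in (None, ""):
                issues.append(f"row {i}: missing {field}")
    return {"passed": not issues, "issues": issues}
```

A report like this is what the post-run emails would link to, so analysts can see at a glance whether a load is safe to build on.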


Streamlining releases with DevOps activities

To accelerate delivery and increase efficiency, the team set up a reliable CI/CD pipeline with automated release management, including automated builds and deployments, granular permissions and access levels, and more.
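An automated build-and-deploy flow of this kind is usually defined declaratively. A minimal Azure Pipelines sketch, with placeholder project paths and display names (not the client's actual pipeline definition):

```yaml
# Hypothetical Azure Pipelines definition illustrating an automated
# build-test-publish flow; all names and paths are placeholders.
trigger:
  branches:
    include:
      - main

pool:
  vmImage: 'windows-latest'

steps:
  - task: DotNetCoreCLI@2
    displayName: 'Build ETL services'
    inputs:
      command: 'build'
      projects: '**/*.csproj'

  - task: DotNetCoreCLI@2
    displayName: 'Run tests'
    inputs:
      command: 'test'
      projects: '**/*Tests.csproj'

  - task: PublishBuildArtifacts@1
    displayName: 'Publish artifacts'
    inputs:
      pathToPublish: '$(Build.ArtifactStagingDirectory)'
      artifactName: 'drop'
```

Permissions and access levels live outside the YAML, in the Azure DevOps project settings, which is how granular access control is layered on top of a pipeline like this.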


  • SQL Server
  • .NET Core
  • Astera Centerprise
  • Azure DevOps

Business value: Turning raw data into a strategic asset

With data at the core of the modern insurance business, ETL projects have risen to prominence in the client’s company as a cornerstone of its data-driven culture. Now, this robust and reliable ETL pipeline fuels the client’s far-reaching analytical plans. With the ability to transform diverse financial data into analytics-ready form, the client can inform its strategic planning and open up new revenue streams.

On the collaboration side, our engineers have become an integral part of the distributed team alongside the client’s specialists responsible for further analysis, enabling an organic data flow and providing each other with extra support whenever it is needed.

  • All the workflows and communication patterns on the project were established according to the client’s requirements.
  • All the engineers are actively involved in the project discussions during daily catch-ups in Microsoft Teams.
  • 24/7 access to Jira and progress reports (daily/weekly/monthly) adds to the project transparency, which is a must for a business operating in a strictly regulated environment like our client’s.
