
CREATING OPERATIONAL DATA STORE USING KAFKA-BASED DATA PIPELINES


CLIENT


One of the largest financial institutional banks in the U.S.

BUSINESS CHALLENGE


The client wanted to build a Kafka-based data pipeline that sources data from multiple databases and consolidates it into a single operational data store.

SOLUTION


We created a data pipeline to:

• Ingest raw data from multiple source systems in a pre-defined format

• Apply transformation services that convert raw records into business events

• Write the resulting business events to an operational data store
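The transformation step above can be sketched as a simple mapping from a raw source record to a business event. This is a minimal illustration, not the client's actual implementation: the field names, event taxonomy, and record shape are all assumptions made up for the example.

```python
import json

# Hypothetical transformation service: converts a raw source-system
# record (in its pre-defined format) into a business event destined
# for the operational data store. All field names are illustrative.
def to_business_event(raw: dict) -> dict:
    """Map a raw record to a business event."""
    return {
        "event_type": "account_updated",   # assumed event taxonomy
        "account_id": raw["id"],
        "balance": raw["bal"],
        "source_system": raw["src"],
    }

# In a Kafka-based pipeline, this function would sit inside a
# consume-transform-produce loop (or a Kafka Streams topology):
# consume from a raw-data topic, transform, and produce the event
# to an output topic feeding the operational data store.
raw_record = {"id": "A-1001", "bal": 2500.0, "src": "core_banking"}
event = to_business_event(raw_record)
print(json.dumps(event))
```

Keeping the transformation as a pure function like this makes it easy to unit-test independently of the Kafka plumbing around it.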

BENEFITS

Using these data pipelines, the client successfully:

• Synchronized data in near real time

• Fully decoupled the architecture, eliminating point-to-point integrations

• Established a scalable framework for the customer

• Designed a distributed architecture for high throughput and optimized performance