Ingest Up-to-Date Data into Your Data Warehouse in Near Real-time with CDC

Published October 28th, 2021 (updated March 8th, 2022)

Tired of dealing with redundant data in your data warehouse? Load data incrementally using change data capture to populate your data warehouse at speed without worrying about redundant or inaccurate data. 


Gone are the days when businesses could afford to analyze their data at the end of every week – or even every month. The rapidly evolving business landscape of today has made it necessary for organizations to use data warehouses to process and query their data in near real-time in order to extract quick insights and make prompt business decisions.

The need for quicker time-to-insight means that data often needs to be captured directly from transactional systems as and when it is received. That said, moving entire databases every time you need to perform analysis on your data is out of the question. Copying all of your data for each query can be resource-intensive and cause unnecessary delays, especially when your database comprises millions of records.

When you need to process data quickly, it’s important to copy or migrate only new or modified data to your data warehouse. Change Data Capture (CDC) technology can help you achieve this by identifying changes to your source dataset, capturing these changes in temporary change tables, and delivering them to the data warehouse for reporting and analysis.
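The core idea — detect changes, record them as change events, and deliver only those events downstream — can be sketched in a few lines. This is a minimal illustration, not any specific product’s implementation; the table contents and field names are invented for the example.

```python
# Minimal sketch of the CDC idea: compare the current source table against the
# last captured snapshot and emit only the changed rows as change records.

def capture_changes(previous, current):
    """Return insert/update/delete change records keyed by primary key."""
    changes = []
    for key, row in current.items():
        if key not in previous:
            changes.append({"op": "insert", "key": key, "row": row})
        elif previous[key] != row:
            changes.append({"op": "update", "key": key, "row": row})
    for key, row in previous.items():
        if key not in current:
            changes.append({"op": "delete", "key": key, "row": row})
    return changes

# Hypothetical client table: one row updated, one deleted, one inserted.
previous = {1: {"name": "Acme", "city": "Boston"}, 2: {"name": "Globex", "city": "Oslo"}}
current = {1: {"name": "Acme", "city": "Austin"}, 3: {"name": "Initech", "city": "Pune"}}

for change in capture_changes(previous, current):
    print(change)
```

Only the three change records — not the full table — would be shipped to the warehouse.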

Why Can’t Data Be Analyzed and Queried in the Source System?

It’s important to replicate data before you analyze or query it for a few reasons.

Data is often initially stored in transactional databases. Since these databases are operational in nature and have not been created specifically for analytical purposes, it can take a considerable amount of time to query data directly on them, particularly when dealing with large volumes of data.

What’s more, these operational databases are also regularly in use, which means that querying or analyzing data directly in the source tables can potentially disrupt the flow of data. If the data in these databases is transformed or manipulated right at the source, there may be no way to roll back to the original version of the data. Performing analysis in the source transactional database while new data is being entered can also cause disruptions and degrade the quality of the insights you extract.

By replicating data from transactional databases to analytical ones, you can reduce time-to-value by processing data in a system dedicated to complex querying, while preserving the source data in its original form. Change Data Capture further facilitates quick and convenient data processing by replicating only the data that isn’t already available in the destination database.

When using CDC, you also don’t have to wait until you have a significant amount of new data in your source database, since data is not transferred in batches. Instead, you can create, schedule, and orchestrate your data pipelines to ensure that only new or modified data is migrated from source to destination as soon as the change occurs, expediting the entire migration process considerably.
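One common way to implement this kind of incremental movement is a watermark: each run pulls only the rows modified since the last successful run. The sketch below illustrates the pattern under assumed table and column names (`updated_at` as the change marker); real pipelines would read from an actual database rather than an in-memory list.

```python
# Hedged sketch of a watermark-based incremental load: each pipeline run moves
# only the rows changed after the previous run's high-water mark, so data
# flows continuously instead of in large periodic batches.
from datetime import datetime

source_rows = [
    {"id": 1, "email": "a@example.com", "updated_at": datetime(2021, 10, 1)},
    {"id": 2, "email": "b@example.com", "updated_at": datetime(2021, 10, 5)},
    {"id": 3, "email": "c@example.com", "updated_at": datetime(2021, 10, 9)},
]

def incremental_load(rows, watermark):
    """Return only rows changed after the watermark, plus the new watermark."""
    fresh = [r for r in rows if r["updated_at"] > watermark]
    new_watermark = max((r["updated_at"] for r in fresh), default=watermark)
    return fresh, new_watermark

# Last run finished on Oct 2, so only rows 2 and 3 are picked up this time.
fresh, watermark = incremental_load(source_rows, datetime(2021, 10, 2))
print(len(fresh), watermark)
```

The returned watermark is persisted between runs, so the next scheduled execution starts exactly where this one left off.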

How Change Data Capture Streamlines Your Analytics

Suppose your multinational organization deals with thousands of clients in several geographic locations. When your network is so vast, you certainly need to maintain a database with the contact information of all of these clients and store their information in a centralized repository for easy accessibility. The slightest change in the information of even a single one of these clients needs to be propagated to your data warehouse to ensure that it continues to serve as a single source of truth.

Copying the data of thousands of clients just to propagate a change in one record will be extremely resource-intensive and can cause unnecessary delays in analytics. With Change Data Capture, you can ensure that only the new information is moved from the source database to the data warehouse so that your insights are based on accurate and updated data.

One of the greatest benefits of using CDC to identify, capture, and deliver changes from your source systems to your data warehouse or analytical databases is how resource-efficient the entire process is. It saves you from executing high-load queries on a periodic basis. For your business, this translates directly to reduced load times and quicker time-to-insight for better decision-making.

With Change Data Capture, you can also scale without worrying about lack of data integrity or completeness since CDC ensures that your central repository is always populated with the right data without any delays.

Using CDC also ensures that your resources won’t be choked at certain times of the day or week: instead of loading data into your data warehouse in large batches and querying large volumes of data in one go, you can process data as and when it is received. Because the process is not resource-intensive, you can also avoid spending on dedicated hardware to keep the data loading process running.

Types of CDC

There are a few different ways in which you can implement CDC in your data warehouse. Let’s take a closer look at each one of them:

Log-based Change Data Capture

Log-based Change Data Capture is a reliable way of ensuring that changes within the source system are transmitted to the data warehouse. In log-based CDC, every change to the source system, including insertions, deletions, and modifications of existing data, is recorded in a transaction log. The transaction log is then read to identify any changes that need to be propagated to the data warehouse.


The logs created in this type of CDC also ensure that normal transactions within the source system are not affected by continuously processing the data directly at the source. This process reduces latency and can help you create a centralized repository without overburdening your systems with additional processing to keep up with any transactional changes.
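Conceptually, the warehouse side of log-based CDC replays the ordered change events from the log rather than re-querying the source. The event format below is invented for illustration; real transaction logs (and the tools that read them) are database-specific.

```python
# Illustrative replay of a transaction log: the replica table is brought up to
# date by applying each recorded insert/update/delete event in order, without
# ever touching the source tables.

log = [
    {"op": "insert", "key": 1, "row": {"name": "Acme"}},
    {"op": "insert", "key": 2, "row": {"name": "Globex"}},
    {"op": "update", "key": 1, "row": {"name": "Acme Corp"}},
    {"op": "delete", "key": 2},
]

def apply_log(table, events):
    """Apply ordered change events to a dict-backed replica table."""
    for event in events:
        if event["op"] == "delete":
            table.pop(event["key"], None)
        else:  # insert and update both upsert the latest row image
            table[event["key"]] = event["row"]
    return table

replica = apply_log({}, log)
print(replica)  # {1: {'name': 'Acme Corp'}}
```

Because only the log is read, the source system keeps serving transactions undisturbed, which is exactly the latency advantage described above.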

Trigger-based Change Data Capture

This type of Change Data Capture involves creating specific functions to capture changes as they occur in the source database. For instance, an AFTER DELETE SQL trigger fires after a record has been deleted, allowing you to record the removed row in a change table.
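The following is a small runnable illustration of that pattern using SQLite (driven from Python for convenience): an AFTER DELETE trigger copies each removed row into a change table. The table and column names are made up for the example; trigger syntax varies across database engines.

```python
# Trigger-based CDC sketch: an AFTER DELETE trigger records deleted rows in a
# separate change table, which a downstream process can later read and apply.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE clients (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE clients_changes (id INTEGER, name TEXT, op TEXT);

-- OLD refers to the row being deleted from the clients table.
CREATE TRIGGER clients_after_delete AFTER DELETE ON clients
BEGIN
    INSERT INTO clients_changes VALUES (OLD.id, OLD.name, 'delete');
END;
""")

conn.execute("INSERT INTO clients VALUES (1, 'Acme')")
conn.execute("DELETE FROM clients WHERE id = 1")

print(conn.execute("SELECT * FROM clients_changes").fetchall())
# [(1, 'Acme', 'delete')]
```

Note that the trigger runs inside the source database on every delete, which is the extra per-transaction work the next paragraphs warn about.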

When using trigger-based CDC, your database creates another table to record the changes in addition to maintaining a transactional log. This can not only increase latency but also raise the risk of missing changes if triggers are not defined for certain scenarios or are accidentally disabled.

Maintaining an additional set of trigger-based changes also puts extra load on the architecture and can consume a lot of resources in case you are dealing with very large volumes of data or recording changes on multiple tables simultaneously.

Since this is usually the scenario in large organizations, you may want to test trigger-based CDC by making some changes on a smaller database and checking whether the latency and load are something you can work with long term. That said, trigger-based CDC is particularly useful in event-based architectures where you absolutely must update the data warehouse when certain changes are made to the source database.

Provider-based Change Data Capture

For provider-based change data capture, scripts are made available by database providers to monitor one or more fields of a database table. These scripts can then identify and record any changes to the table. Like trigger-based CDC, provider-based Change Data Capture also involves retrieving data from the source database as changes are made, hence putting some additional load on the system.

Optimize Your Analytics Architecture with Change Data Capture

Capturing changes in your source systems and propagating them to your data warehouse is essential to ensure that you can extract quick and accurate insights. With Astera DW Builder’s built-in CDC component, you can create elaborate data pipelines and copy all new and modified data to your data warehouse without typing a single line of code.

Astera DW Builder comes complete with hundreds of built-in transformations to maximize the value you can extract from your data. Automate your data pipelines and leverage built-in capabilities like Change Data Capture to ensure that your data warehouse is always up to date without any manual intervention.

Ready to build a self-regulating data warehouse that delivers up-to-date data to your BI & reporting architecture in near real-time? Get in touch to discuss your use case or take Astera DW Builder for a test drive today!
