Data Replication: SAP HANA to Snowflake
Several TB of Data Changes Per Day? No problem.
How HVR enables high-volume data replication from systems such as SAP into Snowflake
Snowflake is the next-generation data warehouse technology, built for the cloud. How can you most efficiently, and with the lowest possible latency, get data into Snowflake?
HVR for data replication from SAP HANA® to Snowflake.
Data Replication from SAP
Data replication and integration from most common sources to Snowflake using HVR: The details
For Snowflake integration, HVR supports:
- “Soft delete” is a transformation that marks a row as deleted on the target when it has been physically deleted on the source. The soft-deleted row can then easily be identified in the target ODS as a delete that downstream ELT/ETL must process. Without the soft delete, the alternative would be a resource-intensive query to identify which data is still present on the target but no longer on the source (see the first sketch after this list).
- The ability to include source metadata, such as the commit sequence number or the commit timestamp, based upon which downstream ELT/ETL can determine which data set to process next (also illustrated in the first sketch below).
- Support for the so-called agent plugin: a call-out to a script or program at certain points during data delivery, such as at the end of an integration cycle, when the target database is transactionally consistent. Customers often use an agent plugin to call their ELT/ETL routines (see the second sketch below).
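To make the first two points concrete, here is a minimal downstream-ELT sketch against a Snowflake ODS. Everything named in it is an assumption for illustration: the tables (stg_orders, dw_orders, elt_watermark), the soft-delete flag column (is_deleted), and the commit-timestamp column (hvr_commit_ts) all depend on how you configure the channel and its transformations.

```python
# Hypothetical downstream ELT cycle for an HVR-fed Snowflake ODS.
# All table and column names are illustrative assumptions, not
# HVR-defined names.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # hypothetical connection details
    user="elt_user",
    password="***",
    warehouse="ELT_WH",
    database="ODS",
    schema="PUBLIC",
)
cur = conn.cursor()

# Use the replicated commit timestamp to pick up exactly the changes
# that arrived since the last ELT run.
cur.execute(
    "SELECT COALESCE(MAX(last_commit_ts), '1970-01-01'::TIMESTAMP) "
    "FROM elt_watermark"
)
watermark = cur.fetchone()[0]

# Soft-deleted rows carry a flag on the target, so deletes are found
# with a cheap filter instead of a resource-intensive comparison
# against the source.
cur.execute(
    """
    DELETE FROM dw_orders
    WHERE order_id IN (
        SELECT order_id
        FROM stg_orders
        WHERE is_deleted = 1
          AND hvr_commit_ts > %s
    )
    """,
    (watermark,),
)

# Advance the watermark so the next cycle starts where this one ended.
cur.execute(
    "UPDATE elt_watermark SET last_commit_ts = "
    "(SELECT MAX(hvr_commit_ts) FROM stg_orders)"
)
conn.commit()
```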
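And for the agent plugin: it is, in essence, an executable that HVR invokes at defined points in the delivery cycle. The sketch below is a hypothetical end-of-cycle call-out that kicks off an ELT routine; the command-line contract shown (a mode argument such as "integ_end", and the run_elt_cycle.py script it launches) is assumed for illustration and is not HVR's documented interface.

```python
#!/usr/bin/env python3
# Hypothetical agent-plugin call-out. The argument convention below is
# an illustrative assumption, not HVR's documented plugin interface.
import subprocess
import sys


def main() -> int:
    mode = sys.argv[1] if len(sys.argv) > 1 else ""
    # Only act at the end of an integration cycle, when the target
    # database is transactionally consistent.
    if mode != "integ_end":
        return 0
    # Kick off the downstream ELT routine (hypothetical script name).
    return subprocess.call(["python3", "run_elt_cycle.py"])


if __name__ == "__main__":
    sys.exit(main())
```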
End-to-end data replication
Time for a Test Drive
Interested in seeing what data replication using HVR is like? I invite you to take HVR for a Test Drive.