Replicating data to a head office with limitations

Question:

Here’s my situation. We have cash registers that send transactions to the store server. With every transaction received, the store server updates tables; at the same time, other management transactions modify tables. At the end of the night we send everything to the head office so that it has all the info for all the stores. Stores and cash registers usually run only SQL Server Express (we support 2008 and up), while the central head office DB usually runs SQL Server Standard edition and up (again, we support 2008 and up).

I’m trying to revamp that process because, right now, it requests from each store all the data for any day that has been modified, and it does this for 25 tables out of hundreds. The problem is that if one record changes for a previous day, we resend that entire day’s data for all 25 tables.

These 25 tables are daily summaries that are updated live at the store; the changes are sent to the head office once a night, usually at midnight.

My first instinct was to create triggers on each of the 25 tables, until I realized that every item of a transaction would fire 25 different triggers, which quickly made the idea unattractive.

My next idea was to send each transaction once to the head office and replay everything there, but that would put too much load on the head office, since it might be receiving data from 200 stores all at once.

Ideally, I’m looking for a way to transfer only the data from the 25 tables that NEEDS to be transferred. Maybe by having the head office compare some type of checksum of every table with the store’s checksum of the same tables (a plain SQL checksum over the whole table won’t work, since the head office will always have more data than any single store).
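For illustration, something along these lines, scoped per store and per day, is what I have in mind; every name here is made up:

```sql
-- Head office side: hash its copy of one store's rows, day by day.
-- The store runs the same query (minus the StoreId filter), and the two
-- result sets are compared per BusinessDate. All names are hypothetical;
-- the column list passed to CHECKSUM must be identical on both sides.
SELECT BusinessDate,
       CHECKSUM_AGG(CHECKSUM(Dept, Qty, Amount)) AS DayChecksum
FROM dbo.DailySales
WHERE StoreId = 42
GROUP BY BusinessDate;
```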

I’m asking for different opinions in the hope of finding other ideas I can explore.

Any input is very much appreciated.
Thank you

Answer:

You might be able to do something with actual SQL Server replication, but if you’ve already got an existing routine, just modify it to filter on a new bit field in every table: Uploaded, or whatever you want to call it.

So when the register writes changes to the table, make sure it clears that bit. Or, if you can’t alter that part of the routine, put an AFTER UPDATE trigger on the table that clears it whenever a row is updated.

Then make sure that the “upload to corporate” routine sets all those bits when it copies the data up.
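A minimal sketch of that flag approach, assuming a summary table called dbo.DailySummary with a SummaryId key (both names made up), and recursive triggers left at the SQL Server default of OFF:

```sql
-- 1) Add the flag; 0 = "not yet sent to corporate".
ALTER TABLE dbo.DailySummary
    ADD Uploaded bit NOT NULL
        CONSTRAINT DF_DailySummary_Uploaded DEFAULT (0);
GO

-- 2) If you can't change the register routine, clear the flag on any update.
CREATE TRIGGER trg_DailySummary_ClearUploaded
ON dbo.DailySummary
AFTER UPDATE
AS
BEGIN
    SET NOCOUNT ON;
    -- Skip when the upload job itself is setting the flag,
    -- so we don't immediately undo its work.
    IF NOT UPDATE(Uploaded)
        UPDATE s
        SET s.Uploaded = 0
        FROM dbo.DailySummary AS s
        INNER JOIN inserted AS i ON i.SummaryId = s.SummaryId;
END;
GO

-- 3) The nightly job sends only pending rows, then marks them sent.
SELECT * FROM dbo.DailySummary WHERE Uploaded = 0;

UPDATE dbo.DailySummary SET Uploaded = 1 WHERE Uploaded = 0;
```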

Or, what might be even more useful: have a “last changed” field updated by your register uploads, and a “last uploaded to corporate” field, so you can compare the two.
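Sketched against the same hypothetical table, that variant would look something like this:

```sql
-- Two-timestamp variant (column names are assumptions).
ALTER TABLE dbo.DailySummary
    ADD LastChanged  datetime2 NOT NULL
            CONSTRAINT DF_DailySummary_LastChanged DEFAULT (SYSUTCDATETIME()),
        LastUploaded datetime2 NULL;   -- NULL = never sent to corporate
GO

-- The register routine (or an AFTER UPDATE trigger) keeps LastChanged
-- current; the nightly job picks up anything newer than its last upload.
SELECT *
FROM dbo.DailySummary
WHERE LastUploaded IS NULL
   OR LastChanged > LastUploaded;
```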

I would take the lowest-level transactions and transfer them to a central location: load first, then batch process, preferably without triggers.

I’d have create_dt and change_dt fields, which I’d use to identify deltas that need to be resent. I’d then periodically reconcile source and target.
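As a rough sketch of that delta pass, where the watermark table and every name besides create_dt/change_dt are placeholders:

```sql
-- Pull only rows created or changed since the last successful load.
-- dbo.etl_watermark and dbo.sales_txn are placeholder names.
DECLARE @last_load datetime2;

SELECT @last_load = loaded_through
FROM dbo.etl_watermark
WHERE table_name = 'sales_txn';

SELECT *
FROM dbo.sales_txn
WHERE create_dt > @last_load
   OR change_dt > @last_load;
```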

Whether you overwrite the changed records or append deltas will depend on what your till captures, your accounting practices, and the level of detail required, i.e. cash vs. accrual accounting, and whether you need to reconstruct how the data looked at a point in time (stationarity of record).

I do something similar with my data warehouse. We ship low-level data using SSIS, but we also have a step in our ETL that performs sums/counts on both systems and stores the results. If they differ, we know to reload.
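That reconciliation step amounts to running the same aggregate on both sides and comparing; roughly like this, with placeholder names:

```sql
-- Run on both the store and the head office; store and compare the results.
-- Any (StoreId, BusinessDate) whose count or sum differs gets reloaded.
SELECT StoreId,
       BusinessDate,
       COUNT(*)        AS RowCnt,
       SUM(SaleAmount) AS TotalSales
FROM dbo.DailySummary
GROUP BY StoreId, BusinessDate;
```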
