My scenario is pretty simple. I created a table, then on the server (SQL Server 2005) side I prepared it with the steps taken from this guide for the Sales.Customer schema - http://msdn.microsoft.com/en-us/library/cc305973.aspx :
- Determine whether the table has a primary key and any columns that could be used for change tracking.
- Add columns to track when and where inserts and updates are made.
- Create a tombstone table, and add a trigger to the Sales.Customer table to populate the tombstone table.
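In case it helps to see what I mean, the setup above can be sketched roughly like this (this is a simplified sketch in the spirit of the MSDN guide, not my exact script - column and trigger names are placeholders):

```sql
-- Change-tracking columns: rowversion-style anchors for updates and inserts.
ALTER TABLE Sales.Customer ADD
    UpdateTimestamp timestamp,
    InsertTimestamp binary(8) DEFAULT @@DBTS + 1;

-- Tombstone table: records deleted rows so clients can delete them too.
CREATE TABLE Sales.Customer_Tombstone (
    CustomerId int NOT NULL PRIMARY KEY,
    DeleteTimestamp binary(8) DEFAULT @@DBTS + 1
);

-- Trigger copies the keys of deleted rows into the tombstone table.
CREATE TRIGGER Customer_DeleteTrigger
ON Sales.Customer FOR DELETE
AS
BEGIN
    SET NOCOUNT ON;
    INSERT INTO Sales.Customer_Tombstone (CustomerId)
    SELECT CustomerId FROM deleted;
END;
```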
Then I took the stored procedure from this guide -
And as the last step, I set up your solution to use the gzip encoder and the CustomRemoteProviderSample : AppRemoteProviderSS2005 class.
With that in place it works just fine, but performance is bad. What I found is that batch size affects overall performance: the smaller the batch size, the longer synchronization takes. With batch size = 1000 it took about 21 minutes, and with batch size = 5000 about 17 minutes. For bigger batch sizes I receive exceptions from the gzip encoder saying it can't allocate that much memory on the device.
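For reference, the batching I'm using follows the anchor-window pattern from the Sync Services batching sample: the server advances the received anchor by at most the batch size each round, so each batch covers a bounded slice of changes. A simplified sketch (procedure and parameter names are the ones from the standard sample; my actual proc may differ, and I've left out the batch-count calculation):

```sql
-- Simplified sketch of the batch-anchor procedure from the MSDN batching sample.
CREATE PROCEDURE usp_GetNewBatchAnchor
    @sync_last_received_anchor timestamp,
    @sync_batch_size            bigint,
    @sync_max_received_anchor  timestamp OUTPUT,
    @sync_new_received_anchor  timestamp OUTPUT
AS
BEGIN
    -- Guard against a missing or invalid batch size.
    IF @sync_batch_size IS NULL OR @sync_batch_size <= 0
        SET @sync_batch_size = 1000;

    -- Upper bound of the sync window: the highest committed timestamp right now.
    IF @sync_max_received_anchor IS NULL
        SELECT @sync_max_received_anchor = MIN_ACTIVE_ROWVERSION() - 1;

    -- First sync: start from zero.
    IF @sync_last_received_anchor IS NULL
        SET @sync_last_received_anchor = 0;

    -- Advance the window by the batch size, capped at the max anchor.
    SET @sync_new_received_anchor = @sync_last_received_anchor + @sync_batch_size;
    IF @sync_new_received_anchor > @sync_max_received_anchor
        SET @sync_new_received_anchor = @sync_max_received_anchor;
END;
```

The tradeoff I'm seeing makes sense with this pattern: a small window means many round trips, while a large window means the gzip encoder has to buffer a big batch in the device's limited RAM.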
The device is an HTC 3300 (CPU: OMAP 850 at 200 MHz, RAM: 64 MB, ROM: 128 MB) running Windows Mobile 6.
The database file I've got after the initial sync on the device is about 5 MB.
Let me know if you need any other info. I can send you the creation script for my table and the stored procedure as I have them at present, if you find that useful. It would be just great if we could come up with a solution and make it faster.
Don't worry about time, check this whenever you get a chance.
And thanks for the quick response.