This checklist provides guidance for implementing a new integration on the Blendr.io platform.
- Source: incremental fetch for both new and updated records? Make sure not to fetch all records from the source on each run.
- Destination: correct create-or-update-on-existing (upsert) logic? Do not create duplicates in the destination: check for existing records and update them accordingly.
- Are updates allowed in the destination platform? E.g. invoices may sometimes not be updated after creation for accounting reasons.
- No overwrite of data in the destination that is updated by other processes? E.g. avoid overwriting manual corrections of contacts in a CRM.
- Updates of records do not remove custom fields? This is a common issue with APIs: a record may have 3 existing custom fields, and when an update sends only 1 custom field, the other custom fields are removed. This happens because custom fields are sometimes handled as one key/value object that is replaced as a whole on each update.
- Records deleted in the source are also deleted in the destination? No risk of deleting records that were created outside of the Blend? In other words, you are not deleting records that you did not create?
- The schedule of the Blend is set?
- A comparison Blend has been created that compares source and destination data to check for missing records? This Blend should send out an email if a difference is found.
- Errors are handled and not just ignored?
- In case of a 2-way sync: avoid endless loops. An update from A to B causes a timestamp update in B, which triggers a new update back to A. Solve this by comparing the record field by field before updating (compare the new record with the existing record in the destination) and skipping the update if nothing has changed.
- Data quality in the source: no duplicates in the source where the destination expects a unique value (e.g. email addresses of contacts are not unique in the source but are unique in the destination).
- Data quality in the destination: no duplicates are created and no required fields are left empty.
- A unique key is used to match records between both platforms when syncing data (e.g. email address)? If no key is unique (e.g. the email address is not unique in one or both platforms), store the ID from the source platform as an external ID in the destination platform.
- Estimate the total number of API calls (read + write) and compare it with API quotas and rate limits. Take paging into account.
- The Blend is tested? All edge cases are tested (e.g. records with missing data)?
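The incremental-fetch item above can be sketched as follows. This is a minimal sketch, not Blendr.io's own API: `fetch_page`, its `updated_since` parameter, and the checkpoint callbacks are hypothetical stand-ins for whatever the source API and the Blend's data store provide.

```python
from datetime import datetime, timezone

def fetch_changed_records(fetch_page, last_run: datetime):
    """Yield only records created or updated after last_run, page by page."""
    page = 1
    while True:
        # Hypothetical source call: filter server-side instead of fetching everything.
        records = fetch_page(updated_since=last_run, page=page)
        if not records:
            break
        yield from records
        page += 1

def run_sync(fetch_page, load_checkpoint, save_checkpoint, process):
    last_run = load_checkpoint()          # e.g. a timestamp kept in a data store
    started = datetime.now(timezone.utc)  # capture BEFORE fetching, to avoid gaps
    for record in fetch_changed_records(fetch_page, last_run):
        process(record)
    save_checkpoint(started)              # the next run picks up from here
```

Capturing the checkpoint before the fetch means records changed during the run are fetched again next time, which is safe as long as the destination side is an upsert.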
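The upsert and unique-key items can be combined in one sketch. The `dest` client and its methods (`find_by_external_id`, `find_by_email`, `create`, `update`) are assumptions standing in for a real destination connector; the source record's own ID is stored as an external ID so matching keeps working when the email address is not unique.

```python
def record_fields(record):
    # Map only the fields this Blend owns; never send fields it did not set.
    return {"email": record["email"], "name": record["name"]}

def upsert_contact(dest, record):
    """Create or update a contact in the destination without duplicating it."""
    # Prefer the stored external ID; fall back to a business key such as email.
    existing = dest.find_by_external_id(record["id"])
    if existing is None:
        existing = dest.find_by_email(record["email"])
    if existing is None:
        dest.create({**record_fields(record), "external_id": record["id"]})
    else:
        dest.update(existing["id"], record_fields(record))
```

Running this twice for the same source record updates the existing destination record instead of creating a second one.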
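For the custom-fields pitfall, a safe pattern is to fetch the existing custom-field object first and merge the changes into it before writing, so an update carrying one field does not wipe the others. A minimal sketch:

```python
def merged_custom_fields(existing: dict, changes: dict) -> dict:
    """Merge changed custom fields into the existing set before updating.

    Many APIs treat custom fields as a single key/value object that is
    replaced wholesale on update; sending only the changed keys would
    silently delete the rest. Merging preserves them.
    """
    return {**existing, **changes}
```

Example: merging `{"b": 9}` into `{"a": 1, "b": 2, "c": 3}` keeps `a` and `c` while updating `b`.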
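The delete item ("do not delete records you did not create") can be enforced by only deleting destination records that carry the external ID this Blend wrote. Again a sketch with a hypothetical `dest` client:

```python
def delete_if_owned(dest, source_deleted_ids):
    """Propagate source deletions, but only for records this Blend created.

    Records created outside the Blend have no stored external ID and are
    therefore never touched.
    """
    for src_id in source_deleted_ids:
        existing = dest.find_by_external_id(src_id)
        if existing is not None:
            dest.delete(existing["id"])
```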
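The 2-way-sync loop check reduces to a field-by-field comparison before writing. A minimal sketch, assuming the synced field names are known:

```python
def needs_update(new: dict, existing: dict, synced_fields: list) -> bool:
    """Compare the incoming record with the destination record field by field.

    Returning False when nothing changed breaks the A -> B -> A echo loop in a
    two-way sync: no write means no new timestamp, so no update bounces back.
    """
    return any(new.get(f) != existing.get(f) for f in synced_fields)
```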
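Finally, the API-call estimate can be sketched as paged reads plus writes per record; the page size and writes-per-record figures are assumptions to fill in from the actual APIs:

```python
import math

def estimated_api_calls(record_count: int, page_size: int, writes_per_record: int = 1) -> int:
    """Rough total of API calls per run: paged read calls plus write calls."""
    reads = math.ceil(record_count / page_size)
    return reads + record_count * writes_per_record

# e.g. 10,000 changed records, pages of 100, one write each:
# 100 read calls + 10,000 write calls = 10,100 calls per run.
```

Compare this figure against the quotas and rate limits of both platforms for the chosen schedule.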