When it comes to data loads, I have run into errors in the input data several times.
To address that, we tried a different approach: rather than simply reading the file and updating Clarity objects, with nothing more than an XOG output to show for it, we structured the load a little bit differently.
We basically have:
- a custom object, with an attachment field and a few other fields reporting when the object was created, by whom, and statistics about the data load
- a custom sub-object, where every record from the file is stored
- three processes:
  - Load: reads the Excel file attached to the custom object and populates the sub-object
  - Check: validates the data in the sub-object, so that controls can be performed more easily with SQL (much less cumbersome than the checks you can apply in GEL on a string field, for example). This process marks every record with a green or red flag, plus a message reporting all of the errors found on that record.
  - Update: updates the Clarity objects by reading only the green records (an option to discard them all is recommended when it's an "all or nothing" load, which is most likely the case with budget data and the like).
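In Clarity the Check step would be an SQL update run by a process, but the flagging logic itself is simple to sketch. The snippet below is a language-neutral illustration in Python; the field names (`project_code`, `start_date`, `amount`) and the validation rules are purely hypothetical stand-ins for your own business checks.

```python
from datetime import datetime

def check_record(record):
    """Validate one staged record; return ('green', '') or ('red', message)."""
    errors = []
    # Hypothetical rules -- substitute the checks your load actually needs.
    if not record.get("project_code"):
        errors.append("missing project code")
    try:
        datetime.strptime(record.get("start_date", ""), "%Y-%m-%d")
    except ValueError:
        errors.append("invalid start date")
    try:
        if float(record.get("amount", "")) < 0:
            errors.append("negative amount")
    except ValueError:
        errors.append("amount is not a number")
    # One flag per record, plus a single message collecting every error.
    if errors:
        return "red", "; ".join(errors)
    return "green", ""

records = [
    {"project_code": "PRJ001", "start_date": "2024-01-15", "amount": "1000"},
    {"project_code": "", "start_date": "15/01/2024", "amount": "abc"},
]
for r in records:
    r["flag"], r["message"] = check_record(r)
```

The key design point survives the translation: every record gets exactly one flag and one cumulative error message, so the user can fix all problems in a single pass instead of discovering them one at a time.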
From the "user experience" side, you can simply load the Excel file as an attachment (rather than placing it on a share), use the three actions, and check for yourself what is going on via the standard list view of the sub-object.
This technique can easily be adapted when the file is not attached by a person but is sent via FTP or some other means: you just need a job which:
- scans the file system
- creates an instance of the custom object, with the file as an attachment, for every file to be loaded
- starts a process, which in turn starts the three processes mentioned above
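The scanning job above can be sketched as follows. This is only an outline: `create_instance` and `start_process` are hypothetical stand-ins for the Clarity calls (e.g. XOG writes and process-start actions) that would create the custom-object instance with its attachment and launch the orchestrating process, and the inbox path is an assumption.

```python
import os

def scan_and_load(inbox, create_instance, start_process):
    """Scan the drop folder and kick off one tracked load per file.

    create_instance(path) -> instance_id and start_process(instance_id)
    stand in for the real Clarity integration calls; they are not part
    of any actual Clarity API named here.
    """
    started = []
    for name in sorted(os.listdir(inbox)):
        path = os.path.join(inbox, name)
        if not os.path.isfile(path):
            continue
        instance_id = create_instance(path)  # attach the file to a new instance
        start_process(instance_id)           # runs Load -> Check -> Update
        started.append(name)
    return started
```

Because each file becomes its own custom-object instance, every FTP delivery ends up tracked with the same statistics and error reporting as a manually attached file.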
You then have all of the loads tracked with their statistics, plus an error message for any problem you get.
P.S.: if you get an error, you can amend the record in the sub-object and re-execute the Check and Update processes, instead of starting over from another Excel file...