The “Data Integrity” feature is used not only to validate source files but also to validate target files. Many users run “Data Integrity” against production files to discover what values may have crept into them.
The “Master File Display” can be used to create formatted reports of specific accounts for detailed validation. It can also produce a formatted report of all source and target files for auditing purposes, showing how every file appeared at the time of the data conversion. These reports can be referenced later if any question arises about specific values at the time the conversion was performed; this is an “auditing” suggestion.
The “Report Writer” can be used on target and source files to produce all the reports a user has predefined, as well as ad-hoc reports that arise during the transformation process.
Report Writer is used to create the balancing reports, which can produce up to 9 levels of totals. The reports are also useful when certain fields are being changed, such as a General Ledger Chart of Accounts: the report can show totals on the old values as well as on the new values.
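As a minimal sketch of the balancing idea (the account numbers, remap table, and amounts below are hypothetical illustrations, not Gladstone output), totals can be accumulated on both the old and the new account values and then compared, so the grand totals must agree even though every account number changed:

```python
from collections import defaultdict

# Hypothetical GL remap: old chart-of-accounts number -> new number.
REMAP = {"1000": "10-100", "1010": "10-110", "2000": "20-100"}

# Hypothetical detail records: (old account, amount).
records = [("1000", 250.00), ("1000", 125.50), ("1010", 80.00), ("2000", 40.25)]

old_totals = defaultdict(float)
new_totals = defaultdict(float)
for old_acct, amount in records:
    old_totals[old_acct] += amount
    new_totals[REMAP[old_acct]] += amount

# The grand totals must balance even though the account numbers changed.
assert round(sum(old_totals.values()), 2) == round(sum(new_totals.values()), 2)
```

A real balancing report would repeat this accumulation at each of the subtotal levels; the principle at every level is the same pairing of old-value and new-value totals.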
The “Prove” report grew out of a past conversion in which the user had allocated six months to validate a single application. One reason so much time had been allocated was that a single element had specifications, in table format, with 6 input decision fields covering about 400 permutations of those fields. Instead of “spot checking” certain accounts, 100% of the accounts were validated in two weeks. When the conversion is run, all of the target and source elements are written out to a file. The data conversion is the one time a user has access to both source and target elements.
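The mechanism behind that 100% validation can be sketched as follows. The field names, decision table, and file layout here are hypothetical stand-ins: source and target elements captured during conversion are written side by side, and each row is checked against the expected permutation rather than spot-checked by hand.

```python
import csv
import io

# Hypothetical source/target element pairs captured during the conversion run.
rows = [
    {"account": "10001", "source_status": "A", "target_status": "ACTIVE"},
    {"account": "10002", "source_status": "C", "target_status": "CLOSED"},
]

# Hypothetical decision table: input-field permutation -> expected target value.
DECISION = {"A": "ACTIVE", "C": "CLOSED"}

# Write every account's source and target values side by side, flagging each row.
buf = io.StringIO()
writer = csv.DictWriter(
    buf, fieldnames=["account", "source_status", "target_status", "ok"]
)
writer.writeheader()
failures = []
for row in rows:
    ok = DECISION.get(row["source_status"]) == row["target_status"]
    if not ok:
        failures.append(row["account"])
    writer.writerow({**row, "ok": ok})
```

Because every account appears in the file, the check covers all permutations of the decision fields instead of a sample.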
The “Exception Report” is primarily used to call attention to exception cases as they occur, although it can also be used to report on anything a user desires. The report is helpful to programmers and is easily customized: skeleton code is provided and can be modified to fit a specific situation, and the mechanisms for printing, sorting, etc. are already in place.
Auditors also find this report beneficial because, if and when hard coding is required to change data, the report can easily list the before and after values. Auditors do not mind data being changed during conversion as long as an audit trail is left.
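A minimal sketch of such an audit trail, assuming a hypothetical hard-coding rule (the field names and values below are invented for illustration): each hard-coded change records the account, the field, and the before/after values so auditors can reconcile every alteration.

```python
def hard_code_region(record, audit):
    """Apply a hypothetical hard-coded fix, recording before/after for auditors."""
    before = record["region"]
    if before == "99":            # hypothetical bad value found in production
        record["region"] = "01"   # hypothetical corrected value
        audit.append((record["account"], "region", before, record["region"]))
    return record

audit_trail = []
records = [
    {"account": "10001", "region": "99"},
    {"account": "10002", "region": "02"},
]
records = [hard_code_region(r, audit_trail) for r in records]
```

The `audit_trail` list is what the Exception Report would print: one line per changed field, with the value before and after the change.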
Automated Validation offers a user the ability to validate the “Expanded Specifications” from the mapping for any account. Using Gladstone DataMap, a user can view the actual specifications in one window and, in another window, the actual data from the target field and the many source fields the expanded specifications use to populate the target. This includes the account number, the field data names, and the actual data. The user can then “register” whether the expanded specification is functioning properly. During data mapping, if a user chooses to validate an “Expanded Specifications” element, a verify button is activated.
For expanded specifications that already exist in conversions with “Automated Validation,” this validation is performed manually. The feature also validates all “Direct Move” and “Default Value” specifications. To validate defaults, the only item in the output should be the default value.

“Direct” moves involve two scenarios. In the first, the same number of records in the source are converted to the target; here the source and target values and their number of occurrences should be equal, and if they are not, the “Direct Move” is invalid. In the second, the records on the source are not converted one-for-one to the target; here the balancing reports that identify the dropped records (the delta between source and target) must be used to reconcile the difference and determine whether the field is valid.

This feature allows a source key other than the target file key to be used to match the source records. A cross-reference file is used to match the “Key” of the target file with the alternate key used for the source file.

The feature uses Gladstone’s “Meta Data Plus” system to validate dates; “Meta Data Plus” creates an error report of bad dates. It also uses “Meta Data Plus” to validate balances. That validation consists of determining whether the field is numeric — no other validation can be performed on balances — and the system creates an error report of non-numeric fields.
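The checks above can be sketched in a few lines. This is a hedged illustration, not Gladstone’s implementation: the key names and the cross-reference table are hypothetical, and the only claim mirrored from the text is that a direct move is valid when values and occurrence counts match, and that balance validation is limited to a numeric test.

```python
from collections import Counter

def validate_direct_move(source_vals, target_vals):
    """A 'Direct Move' is valid only when source and target contain the same
    values with the same number of occurrences."""
    return Counter(source_vals) == Counter(target_vals)

def is_numeric_balance(text):
    """Balance validation: the only check possible is that the field is numeric."""
    try:
        float(text)
        return True
    except ValueError:
        return False

# Alternate-key matching: a cross-reference maps each target key to the
# alternate source key (all keys here are hypothetical).
xref = {"T-100": "S-9001", "T-200": "S-9002"}
source = {"S-9001": "BLUE", "S-9002": "GREEN"}
target = {"T-100": "BLUE", "T-200": "GREEN"}

matched = all(source[xref[key]] == value for key, value in target.items())
```

In the one-for-one scenario, `validate_direct_move` failing corresponds to an invalid direct move; in the dropped-record scenario, the counts differ by design and the balancing reports reconcile the delta instead.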