Duplicate record found when updating master data

A duplicate records error may occur when extracting or loading master data. Usually a reload from the PSA or from the source system solves the issue, but for a daily load run by a process chain, a once-and-for-all solution is needed.

If the data is updated only to master data, you can select the following option:

Open the InfoPackage –> Update tab –> Error handling –> Valid records update, reporting possible

But if the data is updated to both master data and an ODS object, this option is not available, or you get the error message below:

You want to load data from a DataSource into an ODS object, which requires serialization.
You are not able to filter out incorrect records and write them to a separate error request. Error handling is reset to “No Update; No Reporting”.

The following method works:

Open the InfoPackage –> Processing tab –> Only PSA, Update Subsequently in Data Targets

You can also change a setting in the source system:

RSA2 –> find the dropdown list called “duplicate record” –> set it to 0: do not allow duplicates

Removing duplicates at the data package level (for example, in a start routine) also helps. Sample:

* Keep the most recent record on top
SORT DATA_PACKAGE BY [field 1] ASCENDING
                     [field 2] ASCENDING.

* Remove duplicates that are not needed
DELETE ADJACENT DUPLICATES FROM DATA_PACKAGE
       COMPARING [field 1] [field 2].
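
As a concrete illustration, here is a minimal start routine sketch. The field names MATERIAL and CHANGED are hypothetical placeholders; substitute the key field and change timestamp of your own transfer structure.

* Minimal sketch of a start routine (hypothetical field names).
* Assumes DATA_PACKAGE contains the key field MATERIAL and a
* change timestamp CHANGED delivered by the DataSource.

* Sort so that the most recent record per key comes first
SORT DATA_PACKAGE BY material ASCENDING
                     changed  DESCENDING.

* DELETE ADJACENT DUPLICATES keeps the first row of each group,
* i.e. the most recent record per MATERIAL
DELETE ADJACENT DUPLICATES FROM DATA_PACKAGE
       COMPARING material.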

If the error occurs in a data mart delta load that updates both master data and other data targets, the default delta InfoPackage does not offer the PSA option, so you need to create your own InfoPackage.

Also, the program RSDMD_CHECKPRG_ALL can be used to correct inconsistencies in SID entries.
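
The report is normally run manually in SE38/SA38. If you want to trigger it from a custom program instead, a plain SUBMIT works; this is only a sketch and does not assume any particular selection parameter names.

* Sketch: call the repair report from your own program and let the
* user fill in its selection screen (e.g. the InfoObject to check).
SUBMIT rsdmd_checkprg_all VIA SELECTION-SCREEN AND RETURN.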

You can also try transaction RSRV (Analysis and Repair of BW Objects), which includes checks for master data and SID tables.
