
Hybris Datahub 5.3 - Composition Issue

In one of our projects, where we use Datahub to load data from SAP into Hybris, we found a strange issue. We load customer data that includes a flag (say, shipTogether), and it loads properly the first time. But if the value later changes at the source (SAP), the changed value does not always show up in Hybris: sometimes the corrected value is updated and sometimes it is not. The reason is quite simple and depends on the canonical item status and life cycle. The life cycle of canonical items is as follows:
1. For a new data set, canonical items are created.
2. If a canonical item is composed successfully with no errors, it is assigned the status SUCCESS. Only canonical items with status SUCCESS are published to a target system; any other items are ignored during publication. There can be only one canonical item with status SUCCESS per integration key in the same pool.
2A. If an item is not composed properly, its status is set to ERROR.
3. If an updated data set is now sent with the same integration key as a previous one, the previous data set is merged with the new data set, and both the old and the new data sets change status to ARCHIVED. In the merged data set, the value of any field that differs between the old and the new set is not deterministic.
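The life cycle above can be sketched as a small simulation. This is purely illustrative, hypothetical Java (not actual Datahub classes); the status names mirror Datahub's, and the merge is made deterministic here only so the sketch is testable, with a comment marking where the real behavior diverges:

```java
import java.util.*;

// Hypothetical simulation of the Datahub canonical item life cycle.
enum Status { SUCCESS, ERROR, ARCHIVED }

class CanonicalItem {
    final String integrationKey;
    final Map<String, String> attributes;
    Status status;

    CanonicalItem(String integrationKey, Map<String, String> attributes) {
        this.integrationKey = integrationKey;
        this.attributes = attributes;
        this.status = Status.SUCCESS; // assume composition succeeded
    }
}

class Pool {
    // At most one SUCCESS item per integration key; older items get archived.
    private final Map<String, List<CanonicalItem>> itemsByKey = new HashMap<>();

    CanonicalItem compose(String key, Map<String, String> attrs) {
        List<CanonicalItem> existing =
                itemsByKey.computeIfAbsent(key, k -> new ArrayList<>());
        Map<String, String> merged = new HashMap<>();
        for (CanonicalItem old : existing) {
            if (old.status == Status.SUCCESS) {
                merged.putAll(old.attributes); // carry over old values...
                old.status = Status.ARCHIVED;  // ...and archive the old item
            }
        }
        // Here the new values win; in real Datahub, which value survives a
        // conflict between the old and new set is not deterministic.
        merged.putAll(attrs);
        CanonicalItem item = new CanonicalItem(key, merged);
        existing.add(item);
        return item;
    }
}
```

Running two loads with the same integration key shows the first item flip from SUCCESS to ARCHIVED while the second becomes the only SUCCESS item for that key.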
[Figure: ABC.png — canonical item life cycle]

Solution: Since we faced this issue on Datahub 5.3, which does not ship the Datahub clean-up jar, we had two alternatives.
1. Write a PL/SQL trigger on the status column: when the value changes to SUCCESS, change it to DELETE. The item is then not considered for merging.
2. The solution we actually delivered: include the current timestamp in the integration key. The integration key is then unique for each data set, so the merging issue never occurs.
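The second approach can be sketched as below. This is a hypothetical helper, not Datahub code (in practice the timestamp goes into the raw item's integration-key definition); the sequence counter is an extra assumption added here so keys stay unique even within the same millisecond:

```java
import java.time.Instant;
import java.util.concurrent.atomic.AtomicLong;

// Hypothetical helper: make each data set's integration key unique by
// appending the current timestamp, so Datahub never merges two data sets.
public final class IntegrationKeys {
    // Guards against two loads landing in the same millisecond.
    private static final AtomicLong SEQ = new AtomicLong();

    private IntegrationKeys() {}

    static String unique(String businessKey) {
        // e.g. "CUST-1" -> "CUST-1|1700000000000-1"
        return businessKey + "|" + Instant.now().toEpochMilli()
                + "-" + SEQ.incrementAndGet();
    }
}
```

The trade-off to keep in mind: with a unique key per load, composition never updates an existing canonical item, so archived items accumulate in the pool, which matters on 5.3 where there is no clean-up jar.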
Please provide feedback if any. Till then, happy coding.
