Hi, I'm using DSS 6.0.1. A project that we've been running repeatedly via a manual scenario trigger appears to have developed a schema change somehow, which causes the data in that column to fail, although the overall job completes as expected.
Parking the "schema change" issue, which I'm following up on separately, I thought I'd add a preliminary "Check project consistency" step to the scenario and set the subsequent steps to run only "If no prior step failed", so we catch this type of issue in the future.
In my test project I created two HDFS datasets linked by a Group recipe (DSS engine). I then directly edited a single column in the output dataset to change its storage type from bigint to string. If I manually run Check consistency from the Flow Actions menu, it colours the Group recipe red as expected, detecting "Incompatible schema", which is what I want. But if I run the scenario as described above, the scenario does not fail; it completes with no reported errors.
Am I using the "Check project consistency" scenario step correctly, and why does the manual check detect the problem when the scenario step does not?
Thanks to your explanations, we were able to reproduce the issue you are facing.
It has been reported to our product team with all the details that you provided.
As a workaround, even though it requires manual operations, I would suggest clicking on one of the input datasets in your Flow and using the "Propagate schema across Flow from here" tool, which is accessible through the right-click contextual menu.
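Until the built-in step behaves as expected, another option is a custom Python scenario step that compares each dataset's current schema against a saved baseline and raises an exception on any drift, which does fail the scenario. The sketch below is illustrative, not an official Dataiku recipe: the baseline dict and dataset names are assumptions, and only the commented lines hint at the real `dataikuapi` calls (`DSSClient`, `get_project`, `get_dataset`, `get_schema`) you would use to fetch the live schema.

```python
def schema_columns(schema):
    """Reduce a DSS-style schema dict ({'columns': [{'name', 'type'}, ...]})
    to a comparable mapping of column name -> storage type."""
    return {c["name"]: c["type"] for c in schema.get("columns", [])}

def schema_drift(baseline, current):
    """Return a list of human-readable differences between two schemas."""
    base, cur = schema_columns(baseline), schema_columns(current)
    diffs = []
    for name in sorted(set(base) | set(cur)):
        if name not in cur:
            diffs.append("column '%s' removed" % name)
        elif name not in base:
            diffs.append("column '%s' added" % name)
        elif base[name] != cur[name]:
            diffs.append("column '%s' changed %s -> %s" % (name, base[name], cur[name]))
    return diffs

if __name__ == "__main__":
    # In a real scenario step you would fetch the current schema via the
    # public API instead of hard-coding it, e.g. (hypothetical names):
    #   client = dataikuapi.DSSClient(host, api_key)
    #   current = client.get_project("MYPROJECT").get_dataset("grouped_output").get_schema()
    baseline = {"columns": [{"name": "id", "type": "bigint"}]}
    current = {"columns": [{"name": "id", "type": "string"}]}
    drift = schema_drift(baseline, current)
    if drift:
        # Raising makes the scenario step fail, so "If no prior step
        # failed" conditions on later steps behave as intended.
        raise Exception("Schema drift detected: " + "; ".join(drift))
```

Storing the baseline in a project variable or a managed folder keeps the check self-contained, and the raised exception is what actually stops the scenario.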