Bug #4151 (Closed)
Exported RIO and ROML doc run-pairs have same KeplerLSID
Description
Exported RIO and ROML doc run-pairs have the same KeplerLSID. To reproduce, export a run from the WRM and then open up the KAR to see this. If this is a WRM problem, reassign to me.
Updated by ben leinfelder over 15 years ago
Now generating a new LSID when the reportInstance is assembled from provenance data.
Updated by Derik Barseghian over 15 years ago
In the case where I don't do anything with reporting, I get ROML.xml and ROML.N.xml files. The files are identical and have the same LSID. Is this supposed to happen?
Updated by ben leinfelder over 15 years ago
for the time being, both ROMLs will be included.
Updated by Derik Barseghian over 15 years ago
This breaks the ability to export more than one run, since each run will have a ROML.xml, and KARBuilder raises an error about duplicate entries. It sounds like the SANParks use case doesn't need more than one run exported, so delaying the fix is okay, but shouldn't we just retarget the bug instead of closing it? Retargeting to 2.0.
Updated by ben leinfelder over 15 years ago
I thought we made it so that if they had the same LSID and the same name, we'd only include the last one that was added as a KAR entry. Did that change somewhere along the line in the last week or two?
Updated by Derik Barseghian over 15 years ago
I thought that too; I'm not aware of any changes to that code.
I think this problem only occurs when you're trying to export runs that use different workflows; multiple runs of the same workflow export into a KAR fine.
Updated by ben leinfelder over 15 years ago
Okay, that makes more sense: same KAR entry name (ROML.xml) but different LSIDs.
That'll definitely need to be addressed...
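The collision described here can be sketched as follows. This is a hypothetical model, not Kepler's actual KARBuilder code: it assumes entries with the same name and the same LSID are deduplicated by keeping the last one added, while the same name with a different LSID is a genuine duplicate-entry error.

```python
def add_entries(entries):
    """Build a KAR entry table from (name, lsid) pairs.

    Same name + same LSID: the last entry added wins (deduplication).
    Same name + different LSID: a real duplicate-entry conflict.
    """
    kar = {}
    for name, lsid in entries:
        if name in kar and kar[name] != lsid:
            # Neither entry can silently replace the other, so the
            # build fails, as described in the comments above.
            raise ValueError(f"duplicate KAR entry: {name}")
        kar[name] = lsid
    return kar

# Two runs of the same workflow dedupe cleanly:
add_entries([("ROML.xml", "urn:lsid:a"), ("ROML.xml", "urn:lsid:a")])

# Runs of two different workflows both produce ROML.xml and collide:
# add_entries([("ROML.xml", "urn:lsid:a"), ("ROML.xml", "urn:lsid:b")])
```

The entry names and LSID strings here are illustrative only.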
Updated by ben leinfelder over 15 years ago
I've prepended the workflow name to the ROML.xml entry name that is generated.
While it's still not guaranteed to be unique, perhaps it's a first step?
Can you try it out, Derik?
Updated by ben leinfelder over 15 years ago
Reading the history, I believe we can close this bug (again).