Bug #5303
closed
The workflow that archives sensor data into Metacat can upload data even when no new data has been generated
Added by Jing Tao almost 14 years ago.
Updated almost 14 years ago.
Description
I used a sensor simulator and span to generate some data into a DataTurbine server.
Then I terminated the sensor simulator and span, so no new data would be generated.
The workflow is supposed to upload data only on the first run; a second run should upload nothing.
However, I can keep running the workflow, and every time data and metadata are uploaded to the Metacat server.
Changing bug from the REAP to the Kepler product.
After partially fixing bug 5319, I found that running the workflow keeps uploading the last single data point into Metacat even when no new data was generated.
The last uploaded time, 2011-02-25 10:48:06, was passed to DataturbinActor2, which outputs
2011-02-25 10:48:07 even though no metadata or data has that time stamp.
Since DataturbinActor2 produces output, it drives the workflow to upload this package:
Dataset for sensor:"sensor0" at site:"gpp" for time period "2011-02-25 10:48:06" and "2011-02-25 10:48:06"
The root cause is that the TimeDifference class did not compare the time of the last archive with the timestamp of the current latest data. We added this comparison, and the bug is fixed.
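The comparison that was missing can be sketched as a simple timestamp guard. All names below are hypothetical illustrations, not the actual Kepler TimeDifference API: archiving proceeds only when the newest available sample is strictly newer than the last archived timestamp.

```python
from datetime import datetime

def should_archive(last_archived: datetime, latest_data: datetime) -> bool:
    """Return True only when strictly newer data exists since the last
    archive run (hypothetical sketch of the missing check)."""
    return latest_data > last_archived

last = datetime(2011, 2, 25, 10, 48, 6)

# No new samples since the last run: the newest timestamp equals `last`,
# so the workflow should skip the upload instead of re-archiving it.
print(should_archive(last, last))

# A genuinely newer sample arrived: the upload should proceed.
print(should_archive(last, datetime(2011, 2, 25, 10, 49, 0)))
```

Without the strict comparison, passing the last uploaded time back into the actor makes every run look like it has fresh data, which matches the repeated uploads described above.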
Original Bugzilla ID was 5303