Bug #5303
Closed
The workflow that archives sensor data into Metacat can upload data even when no new data has been generated
Description
I used the sensor simulator and SPAN to generate some data into a DataTurbine server.
Then I terminated the sensor simulator and SPAN, so no new data would be generated.
Supposedly the workflow should upload data only on the first run; a second run should upload no data.
However, I can keep running the workflow, and every time data and metadata are uploaded to the Metacat server.
Updated by Derik Barseghian almost 14 years ago
Changing bug from the REAP product to the Kepler product.
Updated by Jing Tao over 13 years ago
After partially fixing bug 5319, I found that the run keeps uploading the last single data point into Metacat even though no new data was generated.
The last uploaded time, 2011-02-25 10:48:06, was passed to DataturbinActor2, which outputs
2011-02-25 10:48:07 even though no metadata or data has that timestamp.
Since DataturbinActor2 produces output, it drives the workflow to upload this package:
Dataset for sensor:"sensor0" at site:"gpp" for time period "2011-02-25 10:48:06" and "2011-02-25 10:48:06"
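For illustration, here is a minimal sketch of the behavior described above, not the actual Kepler/DataTurbine actor code; the class and method names are hypothetical. The point is that the actor simply advances the start time by one second past the last archived time and emits it, without checking whether any newer data actually exists, so downstream actors still build and upload a package.

```java
import java.text.SimpleDateFormat;
import java.util.Date;

public class BuggyNextPeriod {
    private static final SimpleDateFormat FORMAT =
            new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");

    // Returns the start of the next archiving period, e.g.
    // "2011-02-25 10:48:06" -> "2011-02-25 10:48:07".
    public static String nextStartTime(String lastUploadedTime) throws Exception {
        Date last = FORMAT.parse(lastUploadedTime);
        Date next = new Date(last.getTime() + 1000L); // always one second later
        // BUG: no check that the source actually holds data after 'next',
        // so the workflow is driven to upload an empty package anyway.
        return FORMAT.format(next);
    }

    public static void main(String[] args) throws Exception {
        System.out.println(nextStartTime("2011-02-25 10:48:06")); // 2011-02-25 10:48:07
    }
}
```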
Updated by Jing Tao over 13 years ago
The issue is that the TimeDifference class didn't compare the last archiving time with the time of the current latest data. We added this check, and the bug is fixed.
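A minimal sketch of the kind of check this comment describes, assuming a hypothetical helper method rather than the actual TimeDifference implementation: the workflow should archive only when the newest data point is strictly later than the last archived time.

```java
import java.text.SimpleDateFormat;
import java.util.Date;

public class TimeDifferenceCheck {
    private static final SimpleDateFormat FORMAT =
            new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");

    // True only if the source holds data newer than what was already archived.
    public static boolean hasNewData(Date lastArchivedTime, Date latestDataTime) {
        if (latestDataTime == null) {
            return false;   // nothing available in the source at all
        }
        if (lastArchivedTime == null) {
            return true;    // first run: everything counts as new
        }
        return latestDataTime.after(lastArchivedTime);
    }

    public static void main(String[] args) throws Exception {
        Date lastArchived = FORMAT.parse("2011-02-25 10:48:06");
        Date latestData   = FORMAT.parse("2011-02-25 10:48:06"); // no newer data
        // With this check the workflow skips the upload instead of
        // re-uploading the last data point on every run.
        System.out.println(hasNewData(lastArchived, latestData)); // prints "false"
    }
}
```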