Bug #5303

closed

The workflow which archives sensor data into Metacat can upload data even when no new data was generated

Added by Jing Tao almost 14 years ago. Updated over 13 years ago.

Status: Resolved
Priority: Normal
Assignee:
Category: sensor-view
Target version:
Start date: 02/11/2011
Due date:
% Done: 0%
Estimated time:
Bugzilla-Id: 5303

Description

I used the sensor simulator and SPAN to generate some data into a DataTurbine server.
Then I terminated the sensor simulator and SPAN, so no new data would be generated anymore.

Supposedly the workflow should upload data only the first time; a second run should upload no data.

However, I can keep running the workflow, and every time, data and metadata are uploaded to the Metacat server.

Actions #1

Updated by Derik Barseghian almost 14 years ago

changing bugs from REAP to Kepler product

Actions #2

Updated by Jing Tao over 13 years ago

After partially fixing bug 5319, I found that the workflow keeps uploading the last single data point into Metacat even when no new data was generated.

The last uploaded time, 2011-02-25 10:48:06, was passed to DataturbinActor2, and it will output 2011-02-25 10:48:07 even though no metadata or data has that time stamp.

Since DataturbinActor2 produces an output, it drives the workflow to upload this package:

Dataset for sensor:"sensor0" at site:"gpp" for time period "2011-02-25 10:48:06" and "2011-02-25 10:48:06"
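
A minimal sketch in Java of the behavior described above (class and method names are hypothetical, not the actual Kepler/REAP actor code): the actor unconditionally advances the last uploaded time by one second and emits it, so the downstream upload stage fires even when the DataTurbine server holds no newer data.

import java.sql.Timestamp;

public class TimeAdvanceSketch {
    // Last time a package was archived to Metacat.
    static Timestamp lastUploadedTime = Timestamp.valueOf("2011-02-25 10:48:06");

    // Problematic behavior: always emit a "next" start time, one second later,
    // without checking whether any data exists at or after that time.
    static Timestamp nextStartTime() {
        return new Timestamp(lastUploadedTime.getTime() + 1000L); // 2011-02-25 10:48:07
    }

    public static void main(String[] args) {
        // Because nextStartTime() always produces a value, the workflow is
        // driven to build and upload a package for that (empty) time range.
        System.out.println("Upload triggered for start time: " + nextStartTime());
    }
}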

Actions #3

Updated by Jing Tao over 13 years ago

The issue is that the TimeDifference class doesn't compare the last archiving time with the time of the current latest data. We added this comparison and the bug is fixed.
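
A minimal sketch in Java of the comparison described above, assuming a hypothetical TimeDifference-style check (method and variable names are illustrative, not the actual Kepler source): an upload is triggered only when the newest data point in DataTurbine is strictly later than the last archived time.

import java.sql.Timestamp;

public class TimeDifferenceSketch {

    // Returns true only if the channel holds data newer than the last archived time.
    static boolean hasNewData(Timestamp lastArchivedTime, Timestamp latestDataTime) {
        return latestDataTime != null && latestDataTime.after(lastArchivedTime);
    }

    public static void main(String[] args) {
        Timestamp lastArchived = Timestamp.valueOf("2011-02-25 10:48:06");
        Timestamp latestData   = Timestamp.valueOf("2011-02-25 10:48:06"); // no new data generated

        if (hasNewData(lastArchived, latestData)) {
            System.out.println("New data found; build and upload the package to Metacat.");
        } else {
            System.out.println("No new data since the last archive; skip the upload.");
        }
    }
}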

Actions #4

Updated by Redmine Admin over 11 years ago

Original Bugzilla ID was 5303
