Kepler: Issues (Ecoinformatics Redmine, https://projects.ecoinformatics.org/ecoinfo/; feed updated 2012-05-09T22:30:44Z)
Bug #5605 (Resolved): Kepler scheduler doesn't work for local workflows in Windows OS
https://projects.ecoinformatics.org/ecoinfo/issues/5605 (2012-05-09T22:30:44Z, Jing Tao <tao@nceas.ucsb.edu>)
<p>Before the sensor-view workshop, Derik found that the Kepler scheduler doesn't work for local workflows on Windows. It works on Mac and Linux.</p>

Bug #5602 (Resolved): Timezone confusion using the scheduler
https://projects.ecoinformatics.org/ecoinfo/issues/5602 (2012-05-07T23:52:44Z, Derik Barseghian <barseghian@nceas.ucsb.edu>)
<p>It appears at least some parts of the workflow scheduling system don't account for different timezones. When you schedule a workflow in Kepler, you don't specify a timezone, so presumably this is local time. The server also appears to use local time, which requires you to know where the server resides and to schedule using its time. Instead we should probably translate to GMT, and expose timezone codes in the GUI.</p>

Bug #5251 (Resolved): using File->Open MoML... always opens model into Sensor Site view
https://projects.ecoinformatics.org/ecoinfo/issues/5251 (2010-12-02T19:28:18Z, Derik Barseghian <barseghian@nceas.ucsb.edu>)
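The timezone translation proposed in Bug #5602 above could be sketched as follows, assuming a Java 8+ `java.time` API; the class and method names are hypothetical, not part of the Kepler codebase:

```java
import java.time.LocalDateTime;
import java.time.ZoneId;
import java.time.ZoneOffset;
import java.time.ZonedDateTime;

public class ScheduleTimeUtil {
    // Interpret a schedule time entered in the client's local timezone,
    // then translate it to UTC/GMT before sending it to the scheduler server.
    public static ZonedDateTime toUtc(LocalDateTime entered, ZoneId clientZone) {
        return entered.atZone(clientZone).withZoneSameInstant(ZoneOffset.UTC);
    }

    public static void main(String[] args) {
        // 9:00 on 2012-05-07 in America/Los_Angeles (PDT, UTC-7) is 16:00 UTC.
        System.out.println(toUtc(LocalDateTime.of(2012, 5, 7, 9, 0),
                                 ZoneId.of("America/Los_Angeles")));
    }
}
```

If the server stores and compares schedule times in UTC, neither side needs to know the other's timezone; the GUI can convert back to the user's zone for display.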
<p>If, in the sensor-view suite, I open a regular XML model (e.g. 00-StatisticalSummary.xml) using the File->Open MoML... menu item, the new window is in the Sensor Site view, with a filtered component tree. <br />It should just open into the Workflow view. This is on Mac and Linux at r26412.</p>

Bug #5223 (Resolved): Store the last timestamp of data stored for a sensor to the local database
https://projects.ecoinformatics.org/ecoinfo/issues/5223 (2010-10-20T21:33:34Z, manish <manishmkanand@gmail.com>)
<p>Once the data set for a specific time interval is stored in Metacat, we need to store the last timestamp of that interval, so that we know from when to start retrieving the next datasets for the same sensor when we make a read connection to the DataTurbine server. This is achieved by storing the information in the local MySQL database, where it will be read when the next chunking needs to be done for the dataset.</p>

Bug #5222 (Resolved): Store data set file and associated EML file into Metacat
https://projects.ecoinformatics.org/ecoinfo/issues/5222 (2010-10-20T21:30:28Z, manish <manishmkanand@gmail.com>)
<p>Given a data set file path and the EML file, store the data and EML file in Metacat. It seems we can use the existing EcoGridWriter to achieve this.</p>

Bug #5221 (Resolved): Create EML string
https://projects.ecoinformatics.org/ecoinfo/issues/5221 (2010-10-20T21:28:40Z, manish <manishmkanand@gmail.com>)
<p>Given a dataset file path string and a SensorML string, create the corresponding EML string.</p>

Bug #5220 (Resolved): Create data chunks having same metadata info for a time interval
https://projects.ecoinformatics.org/ecoinfo/issues/5220 (2010-10-20T21:27:36Z, manish <manishmkanand@gmail.com>)
<p>Given a sensorId and a series of timestamps (since a previous date) that reflect when the metadata for the sensor changed, the data set needs to be chunked into distinct files. The output is a file with the naming convention "site(x)_logger(y)_sensor(z)_01102010-01302010.txt". This file is created at some local location, and the complete path of the file is sent as the string output. "01102010" denotes the timestamp as 01 (month), 10 (day), 2010 (year). The interval "01102010-01302010" denotes the period during which the metadata for sensor(z) was unchanged, starting from "01102010" and lasting until "01302010"; the dataset is chunked so that each file reflects a single metadata version for its duration.</p>

Bug #5217 (Resolved): Retrieve the metadata of a sensor at a specific timestamp
https://projects.ecoinformatics.org/ecoinfo/issues/5217 (2010-10-20T21:17:18Z, manish <manishmkanand@gmail.com>)
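A minimal sketch of the file-naming convention from Bug #5220 above, assuming the "(x)", "(y)", "(z)" placeholders become plain integer ids and the date stamps use a month-day-year (`MMddyyyy`) pattern; the helper class and method names are hypothetical:

```java
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;

public class ChunkFileNamer {
    // Month-day-year stamp, e.g. 2010-01-10 -> "01102010".
    private static final DateTimeFormatter STAMP = DateTimeFormatter.ofPattern("MMddyyyy");

    // Build a chunk file name such as "site1_logger2_sensor3_01102010-01302010.txt".
    public static String chunkFileName(int siteId, int loggerId, int sensorId,
                                       LocalDate start, LocalDate end) {
        return String.format("site%d_logger%d_sensor%d_%s-%s.txt",
                siteId, loggerId, sensorId, STAMP.format(start), STAMP.format(end));
    }
}
```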
<p>Given a sensor id and a timestamp, retrieve the metadata of the sensor at that specific timestamp. This information is required to build the SensorML record.</p>

Bug #5216 (Resolved): Retrieve the timestamps for all the metadata changes that occurred since a ...
https://projects.ecoinformatics.org/ecoinfo/issues/5216 (2010-10-20T21:14:30Z, manish <manishmkanand@gmail.com>)
<p>This bug implements the feature that takes a sensor id and the last timestamp since data was stored in Metacat for that sensor, and returns all the timestamps at which the metadata changed. These timestamps are the points where the data sets need to be chunked.</p>

Bug #5214 (Resolved): Retrieve the last timestamp for which a sensor's data was stored in Metacat
https://projects.ecoinformatics.org/ecoinfo/issues/5214 (2010-10-20T21:10:24Z, manish <manishmkanand@gmail.com>)
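The chunking implied by Bug #5216 above can be sketched as follows: given the last archived timestamp and the sorted metadata-change timestamps returned for a sensor, split the time range into one interval per metadata version. This is a hypothetical sketch (timestamps as epoch milliseconds), not the issue's actual implementation:

```java
import java.util.ArrayList;
import java.util.List;

public class ChunkIntervals {
    // Split [lastStored, now) at every metadata-change time that falls inside it.
    // metadataChanges must be sorted ascending; each returned {start, end} pair
    // covers a span with a single metadata version.
    public static List<long[]> split(long lastStored, List<Long> metadataChanges, long now) {
        List<long[]> chunks = new ArrayList<>();
        long start = lastStored;
        for (long change : metadataChanges) {
            if (change <= start || change >= now) continue; // ignore out-of-range points
            chunks.add(new long[] {start, change});
            start = change;
        }
        chunks.add(new long[] {start, now}); // final chunk up to the present
        return chunks;
    }
}
```

Each resulting interval then becomes one chunk file with a single metadata version, per Bug #5220.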
<p>This bug implements the feature to retrieve the last timestamp for which a sensor's data was stored in Metacat. For each sensor, the data is stored in Metacat, and we need to keep track of the timestamp of the last data stored there. Currently, we propose to store this information in a MySQL database: a table sensorTimeInfo(sensorId, lastTime) will be kept updated with the last timestamp for each sensor whose data has been stored in Metacat.</p>

Bug #5213 (Resolved): Query DataTurbine server to retrieve the sensors
https://projects.ecoinformatics.org/ecoinfo/issues/5213 (2010-10-20T21:06:25Z, manish <manishmkanand@gmail.com>)
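A sketch of the bookkeeping described in Bug #5214 above. The SQL comment reflects the proposed sensorTimeInfo(sensorId, lastTime) table; the in-memory map stands in for the MySQL table so the logic is self-contained, and everything beyond the table and column names is a hypothetical assumption:

```java
import java.util.HashMap;
import java.util.Map;

public class SensorTimeTracker {
    // Proposed MySQL schema (column types are an assumption; only the names
    // sensorTimeInfo, sensorId, and lastTime come from the issue):
    //   CREATE TABLE sensorTimeInfo (
    //       sensorId VARCHAR(64) PRIMARY KEY,
    //       lastTime BIGINT NOT NULL
    //   );
    // In-memory stand-in for the table; a real implementation would issue
    // SELECT / INSERT ... ON DUPLICATE KEY UPDATE statements over JDBC.
    private final Map<String, Long> table = new HashMap<>();

    // Last timestamp archived to Metacat for this sensor; 0 means "never archived".
    public long lastTime(String sensorId) {
        return table.getOrDefault(sensorId, 0L);
    }

    // Record a newly archived timestamp, never moving the watermark backwards.
    public void recordArchived(String sensorId, long timestamp) {
        table.merge(sensorId, timestamp, Math::max);
    }
}
```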
<p>This bug implements the feature to query a DataTurbine server to retrieve all the associated sensors. The following information shall be retrieved for each sensor: (i) the sensor id; (ii) the logger id to which the sensor is attached; and (iii) the site id to which the sensor belongs. A naming convention can be followed where each sensor is named site(x)_logger(y)_sensor(z), where x, y, and z are the site id, logger id, and sensor id.</p>

Bug #4757 (Resolved): transfer sensor data from DataTurbine to Metacat
https://projects.ecoinformatics.org/ecoinfo/issues/4757 (2010-02-05T18:48:31Z, Daniel Crawl <danielcrawl@gmail.com>)
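The naming convention from Bug #5213 above can be recovered from a channel name with a small parser; this is a hypothetical sketch, not Kepler's actual DataTurbine code:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class SensorName {
    private static final Pattern NAME =
        Pattern.compile("site(\\d+)_logger(\\d+)_sensor(\\d+)");

    public final int siteId, loggerId, sensorId;

    public SensorName(int siteId, int loggerId, int sensorId) {
        this.siteId = siteId;
        this.loggerId = loggerId;
        this.sensorId = sensorId;
    }

    // Parse a channel name such as "site4_logger2_sensor7" into its three ids.
    public static SensorName parse(String channelName) {
        Matcher m = NAME.matcher(channelName);
        if (!m.matches()) {
            throw new IllegalArgumentException("Unexpected channel name: " + channelName);
        }
        return new SensorName(Integer.parseInt(m.group(1)),
                              Integer.parseInt(m.group(2)),
                              Integer.parseInt(m.group(3)));
    }
}
```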
<p>Periodically archive sensor data on the DataTurbine server into Metacat. Also verify that archived sensor data can be searched and read in Kepler.</p>

Bug #4756 (Resolved): allow saving of sensor workflow output
https://projects.ecoinformatics.org/ecoinfo/issues/4756 (2010-02-05T18:46:59Z, Daniel Crawl <danielcrawl@gmail.com>)
<p>Sensor workflows that run on the server can convert raw data into derived data; the derived data should be stored back into DataTurbine so that it can be accessed by other workflows or archived into Metacat. Probably the best mechanism for this is to update the DataTurbine actor to allow writing. Also, verify that the EcogridWriter actor still works (writes to Metacat).</p>

Bug #4752 (Resolved): create a sensor plots GUI
https://projects.ecoinformatics.org/ecoinfo/issues/4752 (2010-02-05T18:37:17Z, Daniel Crawl <danielcrawl@gmail.com>)
<p>Display recent data collected by sensors. Let the user select which sensors to display, possibly grouped by sensor type; different sensors may have different sampling rates, so show them appropriately. Also, see whether RDV is useful for this (possibly embedding RDV within Kepler?).</p>
<p>See figure 5 in <a class="external" href="https://kepler-project.org/developers/incubation/kepler-engineering-view-for-reap/engineering-view-plans">https://kepler-project.org/developers/incubation/kepler-engineering-view-for-reap/engineering-view-plans</a></p>

Bug #4742 (Resolved): create engineering view model type
https://projects.ecoinformatics.org/ecoinfo/issues/4742 (2010-02-05T18:22:22Z, Daniel Crawl <danielcrawl@gmail.com>)
<p>The engineering view is a separate tool from the normal workflow editor. Add the engineering view to the File -> New submenu; when selected, it should open a new window showing the engineering view tabs. The actor tree should only contain components (actors and annotations) specific to the engineering view. A second part of this bug is to develop the ontology for those components.</p>