Bug #4197
Closed
Waterflow TPC demo - tracking bug
Description
Will be using this as a demonstration of the TPC reporting system. Logging development here (might overlap with the trac system).
Updated by ben leinfelder over 15 years ago
Opening from the exported KAR is causing some problems - that's in another bug.
There are two files for this TPC (in the kruger SVN repo):
tpc02-water-flow-base.xml
tpc02-water-flow-high.xml
Updated by ben leinfelder over 15 years ago
In the tpc02-water-flow-base.xml workflow, the table rendering in reporting no longer uses the R data2html method or the browser launch (I've left that part of the workflow intact for comparison as I hone the reporting-based version).
TODO (workflow-side): finesse the data going into the report - "month" is an integer (1=Jan, 2=Feb, etc.) and "avg" has 8 decimal places. We may also want to capitalize the column headers.
TODO (reporting-side): Shade the header row. Lighten the borders around the cells.
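The two workflow-side clean-ups above can be sketched as a pair of small helpers. This is illustrative only - the actual fixes were done on the R side of the workflow, and the class and method names here are hypothetical:

```java
import java.time.Month;
import java.time.format.TextStyle;
import java.util.Locale;

// Hypothetical helpers sketching the two report clean-ups:
// month-as-integer -> month name, and trimming "avg" to 4 decimal places.
public class ReportFormat {

    // Map a 1-12 month integer to its English name (1 = January, ...).
    static String monthName(int month) {
        return Month.of(month).getDisplayName(TextStyle.FULL, Locale.ENGLISH);
    }

    // Round an average that carries 8 decimal places down to 4.
    static double roundAvg(double avg) {
        return Math.round(avg * 10000.0) / 10000.0;
    }

    public static void main(String[] args) {
        System.out.println(monthName(1));         // January
        System.out.println(roundAvg(3.14159265)); // 3.1416
    }
}
```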
Updated by ben leinfelder over 15 years ago
-column headers are capitalized
-header rows are shaded
-overflowing cell contents are truncated
TODO: remove browser call
TODO: add status
TODO: round to more reasonable significant digits
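The cell-truncation behavior noted above could look something like the following. This is a hedged sketch with a hypothetical class and method name - the real logic lives in the reporting renderer:

```java
// Hypothetical sketch of "overflowing cell contents are truncated":
// clip a cell's text to a character budget and mark the cut with "...".
public class CellTruncate {

    static String truncate(String cell, int maxChars) {
        if (cell.length() <= maxChars) {
            return cell;
        }
        // Reserve three characters for the "..." marker.
        return cell.substring(0, Math.max(0, maxChars - 3)) + "...";
    }

    public static void main(String[] args) {
        System.out.println(truncate("short", 10));                  // short
        System.out.println(truncate("a very long cell value", 10)); // a very ...
    }
}
```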
Updated by Derik Barseghian over 15 years ago
I'm getting a lot of error messages on the console when I open this workflow. It seems to run ok though. (This is while using our current version of hsqldb). Are others seeing these errors?
[run] ERROR (org.ecoinformatics.seek.dataquery.DBTablesGenerator:generateDBTextTable:481) The error in generateDBTable is
[run] java.sql.SQLException: Table already exists: T1941157596 in statement [CREATE TEXT TABLE T1941157596]
[run] at org.hsqldb.jdbc.jdbcUtil.sqlException(Unknown Source)
[run] at org.hsqldb.jdbc.jdbcStatement.fetchResult(Unknown Source)
[run] at org.hsqldb.jdbc.jdbcStatement.execute(Unknown Source)
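One conventional way to avoid this "Table already exists" failure is to drop any stale text table before re-creating it. The sketch below only builds the SQL strings; the table name, column list, and class name are illustrative and not what DBTablesGenerator actually uses:

```java
// Sketch: guard a CREATE TEXT TABLE with a preceding DROP ... IF EXISTS.
// Note HSQLDB 1.x puts IF EXISTS after the table name.
public class TextTableSql {

    static String dropIfExists(String table) {
        return "DROP TABLE " + table + " IF EXISTS";
    }

    static String createTextTable(String table, String columns) {
        return "CREATE TEXT TABLE " + table + " (" + columns + ")";
    }

    public static void main(String[] args) {
        System.out.println(dropIfExists("T1941157596"));
        System.out.println(createTextTable("T1941157596", "month INTEGER, avg DOUBLE"));
    }
}
```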
Updated by Derik Barseghian over 15 years ago
On my Eclipse build, which is using hsqldb 1.8, I get the same errors, plus some additional ones. Provenance gives an "ERROR: RecordingException: Unable to insert into data table: out of memory", and while the workflow seems to work (the web page comes up), in the wrapping-up phase I get a bunch of java.lang.OutOfMemoryError: Java heap space errors. So I wonder whether the overhead of Eclipse or my locally different version of hsql is to blame. Is any other Eclipse user seeing this too?
Updated by ben leinfelder over 15 years ago
The EML actors throw those exceptions - that is expected.
As for the newer HSQLDB - I will take a look.
Updated by Derik Barseghian over 15 years ago
I'm not seeing the memory errors today when using Eclipse with the currently checked in hsql, nor in Eclipse with hsql 1.8. Either I was just low on memory yesterday, or maybe I failed to kill .kepler between my tests and that caused some issue. r19734 upgrades kepler to hsqldb 1.8.0.10. Let me know if you notice any issues with this.
Updated by ben leinfelder over 15 years ago
-hsqldb 1.8 looks good.
-removed the browser call
-added naive "status" example - needs to be meaningful
Jim will do the rounding of the Average column. Oh - and maybe make the month a string instead of a number?
Updated by ben leinfelder over 15 years ago
tpc02-water-flow-high.xml: using record-based data table rendering
-"nil" values are showing as a large number - need a bug fix for that
-including status actor
-river and month are both integers - not sure where those are being looked up, but the values need to be in the dataframe before the RecordToken is created and "reported on"
Updated by ben leinfelder over 15 years ago
Changes made by Jim:
-month and river are now strings (rather than factors, i.e. ints).
-rounding done to 4 places.
Changes made by me:
-"nil" values will be preserved for IntToken
-DoubleToken "nil" values will show as "NaN"
-boolean "nil", unfortunately, will show as "false"
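The nil-display rules above can be sketched as a small mapping. The class and method names are hypothetical (Ptolemy's actual Token classes carry their own nil flags), and note that a later comment revises the DoubleToken case to stay "nil":

```java
// Hedged sketch of the nil-rendering rules for report cells:
// IntToken nil -> "nil", DoubleToken nil -> "NaN", BooleanToken nil -> "false".
public class NilDisplay {

    static String displayInt(Integer v) {
        return v == null ? "nil" : v.toString();
    }

    static String displayDouble(Double v) {
        return v == null ? "NaN" : v.toString();
    }

    // Unfortunately a nil boolean is indistinguishable from false here.
    static String displayBool(Boolean v) {
        return v == null ? "false" : v.toString();
    }

    public static void main(String[] args) {
        System.out.println(displayInt(null));    // nil
        System.out.println(displayDouble(null)); // NaN
        System.out.println(displayBool(null));   // false
    }
}
```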
Updated by ben leinfelder over 15 years ago
For the tutorial:
-we will use the "base" workflow with production data. Most slides and hands-on activities will use this workflow.
-the "high" workflow will be switched to use DEV for its data, so that we can illustrate how adding new data tables to an existing datapackage will be picked up automatically the next time the scheduled workflow executes.
Updated by ben leinfelder over 15 years ago
"nil" DoubleTokens will remain "nil" (rather than NaN)
Updated by ben leinfelder over 15 years ago
When running from the command line, the "Plot base flow" R actor is not fired and no tokens are being recorded (as you'd expect).
When run in the GUI, however, everything is normal.
Updated by ben leinfelder over 15 years ago
When I add another port named "trigger" to the "Plot base flow" actor and connect it to a similar output port on the upstream actor (which should already be triggering the downstream actor to fire), I get the expected firing results!
Updated by ben leinfelder over 15 years ago
We were/are having intermittent problems rendering images created in Kepler on the report - this is deep in the FOP/xmlgraphics library and does not happen on the Mac.
Choosing to use a Mac as the execution engine for the training demo.
Updated by ben leinfelder over 15 years ago
added workflow for the HIGH base flow that points to DEV server so that we can demo the dynamic data loading.
Updated by ben leinfelder over 15 years ago
added KAR for the base workflow that includes a report layout
Updated by ben leinfelder over 15 years ago
I've gotten this working in the trunk now - after merging and patching up some new errors.
Ran the base workflow KAR from my command line, which automatically uploaded it to my local Metacat, where I could see the generated report (PDF).
There are still some blocking bugs, however (added).
Updated by ben leinfelder about 15 years ago
Now using OrderedRecordTokens and OrderedRecordAssembler to do things "correctly" (no Ptolemy overrides).
Updated by ben leinfelder about 15 years ago
The bovine TB TPC is also a good candidate for "testing" these features out. Just ran the KAR for that from the command line and it looked good.
Updated by ben leinfelder about 15 years ago
The water flow workflow is also looking successful from the command line