Kepler: Issues
https://projects.ecoinformatics.org/ecoinfo/
2011-03-01T23:46:41Z
Ecoinformatics Redmine
Bug #5333 (New): 2.2 rc3: getenv("") doesn't work for mac installation.
https://projects.ecoinformatics.org/ecoinfo/issues/5333
2011-03-01T23:46:41Z
jianwu jianwu <jianwu@sdsc.edu>
<p>getenv retrieves the value of an environment variable in Kepler. It works for me in my installation on Windows. On Mac, it doesn't work if I start Kepler using Kepler.app; it only works if I start it using Kepler.app/Contents/Resources/Java/kepler.sh.</p>

Bug #5314 (New): Closing last workflow window closes Kepler application
https://projects.ecoinformatics.org/ecoinfo/issues/5314
2011-02-18T13:58:24Z
Michal Owsiak <michalo@man.poznan.pl>
<p>This is not compatible with the Mac OS X convention for closing windows.</p>
<p>After the last window is closed, the application should remain running unless the user explicitly presses Command-Q or chooses Kepler -> Quit.</p>

Bug #5276 (New): component search performance is poor
https://projects.ecoinformatics.org/ecoinfo/issues/5276
2011-01-24T22:21:07Z
Derik Barseghian <barseghian@nceas.ucsb.edu>
<p>Searching for remote KARs through the Components pane is very slow: the search downloads and processes karXML documents, each generally about twice as large as the KAR it represents. During the search the GUI is locked, and there is no progress indicator to show that a search is in progress.</p>
<p>The query and related code probably need to be refactored to only download and utilize the bare minimum to show the results. This refactoring may necessitate some changes to what actions are possible on items in the results tree.</p>
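A minimal sketch of one way to address the locked GUI and missing progress indicator described above, assuming a Swing front end: run the download on a background thread via SwingWorker and show an indeterminate progress bar while results stream in. `RemoteRepository` and `searchMinimal` are hypothetical names, not Kepler API; `searchMinimal` stands in for a query that returns only the fields needed to render the results tree rather than full karXML documents.

```java
import java.util.List;
import javax.swing.DefaultListModel;
import javax.swing.JProgressBar;
import javax.swing.SwingWorker;

public class ResponsiveSearchSketch {

    // Hypothetical stand-in for a repository client that returns minimal
    // per-KAR metadata instead of full karXML documents.
    static class RemoteRepository {
        static List<String> searchMinimal(String query) {
            return List.of(query + "-result-1", query + "-result-2");
        }
    }

    static void runSearch(String query, JProgressBar progress,
                          DefaultListModel<String> results) {
        progress.setIndeterminate(true);       // visible "search in progress" cue
        new SwingWorker<Void, String>() {
            @Override protected Void doInBackground() {
                for (String hit : RemoteRepository.searchMinimal(query)) {
                    publish(hit);              // stream hits as they arrive
                }
                return null;
            }
            @Override protected void process(List<String> chunks) {
                chunks.forEach(results::addElement);  // runs on the EDT
            }
            @Override protected void done() {
                progress.setIndeterminate(false);
            }
        }.execute();
    }
}
```

The design point is that the event dispatch thread never blocks on the network, so the GUI stays usable and the progress bar can animate for the duration of the search.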
<p>Jing and Matt will discuss this bug.</p>

Bug #5173 (New): Develop clear criteria for when code in Kepler CORE should be in its own module.
https://projects.ecoinformatics.org/ecoinfo/issues/5173
2010-09-08T23:24:18Z
David Welker <david.v.welker@gmail.com>
<p>As of now, we have broken Kepler up into a number of modules. But this decomposition has been ad hoc, without a consistent set of guidelines for when code should live in its own module and when it should be grouped with related code in a common module.</p>
<p>The lack of consistent guidelines makes the modules harder to understand. Following more consistent guidelines would make Kepler easier to understand.</p>

Bug #5129 (In Progress): adding and removing configuration properties could be made easier
https://projects.ecoinformatics.org/ecoinfo/issues/5129
2010-08-06T21:21:25Z
Derik Barseghian <barseghian@nceas.ucsb.edu>
<p>Now that we've shipped 2.0, if a developer wants to add a new configuration property to a config file, they must first check whether the user's config file has the property before accessing it, and add it if not. This is a minor hassle if your code uses the property in a number of places, any of which might be reached first during the user's Kepler session: you have to do the check-and-possibly-add at each place. Instead, the configuration manager could, during startup, run through the config files of all active modules and add any missing properties to the user's copies in KeplerData. Handling property removal would be harder: if we remove a property and a user then reverts to an older version of Kepler, this configuration manager method would have to know to add the property back for that version.</p>

Bug #4869 (In Progress): changes made to workflows during dialogs before committing (Edit Paramet...
https://projects.ecoinformatics.org/ecoinfo/issues/4869
2010-03-04T19:04:24Z
Oliver Soong <soong@nceas.ucsb.edu>
<p>When I make changes to an actor through the dialog, it seems like the changes are made to the workflow immediately, before the "Commit" button is clicked.</p>
<p>For example, create a workflow with an EML 2 Dataset actor. Make sure Kepler has not already authenticated with KNB. Now edit the EML 2 Dataset actor to reference a data package that requires authentication. The KNB login window will immediately pop up, even though no changes have been committed. Dismiss the window in some way. Now click the Help button and close the help window. The dialog will have disappeared. Open the actor dialog again, and the change will be there.</p>
<p>I should point out that the "Cancel" button does seem to work, but I suspect (based on debugging and on comments in the code) that it works by reverting changes that have already been made.</p>
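A minimal sketch of the commit-time pattern this behavior suggests: buffer dialog edits and apply them to the model only when Commit is pressed, so Cancel can simply discard the buffer instead of reverting live changes. The class and method names are hypothetical, not the actual Ptolemy/Kepler dialog code.

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class DeferredEditBuffer {
    private final Map<String, String> pending = new LinkedHashMap<>();

    // Record an edit from the dialog without touching the workflow model.
    public void stage(String parameter, String value) {
        pending.put(parameter, value);
    }

    // Return the staged edits for the caller to apply to the model, then clear.
    public Map<String, String> commit() {
        Map<String, String> applied = new LinkedHashMap<>(pending);
        pending.clear();
        return applied;
    }

    // Nothing to revert: the model was never modified.
    public void cancel() {
        pending.clear();
    }

    public boolean hasPendingEdits() {
        return !pending.isEmpty();
    }
}
```

Under this pattern, side effects such as the KNB login prompt would be deferred until commit time, since the actor never sees the new parameter value before then.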
<p>I've seen this for sure on EML 2 Dataset and RExpression, since changes to one trigger visual feedback and I've mucked around in the code of the other. I suspect this is a general problem, though, and it may be a result of the underlying Ptolemy design.</p>

Bug #4300 (New): "Animate at Runtime" checkbox stays checked when director is replaced
https://projects.ecoinformatics.org/ecoinfo/issues/4300
2009-08-07T23:27:12Z
Timothy McPhillips <mcphillips@ecoinformatics.org>
<p>If you enable run-time animation of a workflow and then swap in a different director, the "Animate at Runtime" menu item remains checked. However, the next run of the workflow will not be animated; apparently the newly inserted director does not know about the animation?</p>

Bug #4273 (New): Ctrl+E operates on mouse position, not selection
https://projects.ecoinformatics.org/ecoinfo/issues/4273
2009-07-27T20:50:27Z
Oliver Soong <soong@nceas.ucsb.edu>
<p>This is a UI concern, but if I select an actor, move the mouse over a different actor, and then press Ctrl+E, Kepler opens the settings for the actor under the mouse, not the selected actor.</p>

Bug #4046 (New): ComadTest should report more details when detecting an error
https://projects.ecoinformatics.org/ecoinfo/issues/4046
2009-05-01T20:05:58Z
Timothy McPhillips <mcphillips@ecoinformatics.org>
<p>The ComadTest actor is used to create automated tests of COMAD features. Because it is often useful to include several instances of ComadTest in the same test workflow, the ComadTest actor should report its name when it throws an exception. Ideally it would also indicate how the data stream it received during the current workflow run differs from what it received during training, perhaps by reporting the element name and line number of the first mismatch in the trace.</p>

Bug #3671 (New): Configurable workspace directory for holding workflows, data, and run products
https://projects.ecoinformatics.org/ecoinfo/issues/3671
2008-11-13T19:04:28Z
Timothy McPhillips <mcphillips@ecoinformatics.org>
<p>In bug 3558 I requested that a new directory be created on the user's system for each workflow run, and that outputs of the run, trace files, etc, be placed there. In bug 3585 I asked for an API that would make it easy for actors to write output files to this 'run' directory.</p>
<p>But where should these run directories themselves go? I believe we should allow users to specify a directory for holding their 'workspace' in a location of their choosing. In the workspace could go a directory for holding the workflows they develop and use for a particular project (we've done this before in the Kepler/ppod release, but the directory location was fixed), another directory for holding workflow runs, etc.</p>
<p>One alternative would be to hide all this somewhere inside .kepler in the user's home directory. However, I don't think this is the best approach for two reasons. First, the point is to make it easy for users to find their workflows, data, and workflow run products, and to load the latter into other tools for visualization and further analysis. The .kepler directory is hidden and should be used for things that would distract the user if made more prominent. Second, in practice the .kepler directory is frequently deleted (sometimes when installing a new version of Kepler, for example). A user's work should not be deleted at such times, so .kepler should be used only for things that can be regenerated as needed (e.g. data caches).</p>
<p>Another alternative would be to store everything discussed here in a database. However, many workflows (a) generate large numbers of large data files that would be awkward to place in a database, and (b) users often want immediate file-system access to these output files anyway because the other tools they use to review and further analyze their results expect the data to be stored in files. There shouldn't be an extra step of exporting workflow run products from a database to a directory of files after each workflow run in such cases.</p>
<p>I also think users should have the option of creating multiple workspaces, each with their own directories of workflows and runs. A workspace browser in Kepler could make it easy to view workflows or runs from a particular workspace or all of them at once.</p>
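A minimal sketch of the workspace layout proposed above, under assumed names: a user-chosen workspace root containing a runs directory, with one subdirectory created per workflow run. None of the directory names or the `createRunDir` method come from Kepler itself; they only illustrate the structure.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class WorkspaceLayoutSketch {

    // Resolve and create <workspace>/runs/<workflow>/<runId>, creating any
    // missing parent directories along the way.
    static Path createRunDir(Path workspaceRoot, String workflowName, String runId)
            throws IOException {
        Path runDir = workspaceRoot.resolve("runs")
                                   .resolve(workflowName)
                                   .resolve(runId);
        return Files.createDirectories(runDir);
    }
}
```

Because the workspace root is an ordinary user-visible directory rather than a hidden .kepler path, run products survive reinstalls and are directly reachable by other analysis tools, which is the main argument made above.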
<p>Note that all this has ramifications for distributed execution. Following execution on multiple nodes, the files expected to be found in a local run directory will need to be copied automatically from each compute node.</p>

Bug #3560 (New): Color-code contents of CollectionDisplay
https://projects.ecoinformatics.org/ecoinfo/issues/3560
2008-10-24T00:53:33Z
Timothy McPhillips <mcphillips@ecoinformatics.org>
<p>The CollectionDisplay actor provides a live, XML-formatted view of the data stream arriving at the actor in a COMAD workflow. The window contents would be easier to understand if they were color-coded to distinguish Collections, Data elements, metadata/annotations, and provenance records, as suggested for the Trace File view of the provenance browser in bug 3555 (the color-coding should be the same for both).</p>

Bug #3552 (In Progress): Annotation elements in trace file do not appear in details pane of prove...
https://projects.ecoinformatics.org/ecoinfo/issues/3552
2008-10-22T20:07:51Z
Timothy McPhillips <mcphillips@ecoinformatics.org>
<p>The provenance browser shows the details for the selected data or collection element of a trace in the lower left-hand panel. When the element selected has been annotated with one or more Metadata elements, these appear as name-value pairs under the heading "Annotations" in that panel. However, if an <strong>Annotation</strong> element has been applied to the selected element, it is not displayed.</p>
<p>Annotation and Metadata elements should both appear in the details panel, probably under two distinct headings, 'Metadata' and 'Annotations'.</p>
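A minimal sketch of the two-heading display proposed above. The `Kind` and `Item` types are hypothetical stand-ins for however the trace model actually represents Metadata and Annotation elements; only the grouping idea is taken from the report.

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class DetailsPaneGrouping {
    enum Kind { METADATA, ANNOTATION }

    // Hypothetical name-value item attached to a trace element.
    static class Item {
        final Kind kind;
        final String name;
        final String value;
        Item(Kind kind, String name, String value) {
            this.kind = kind; this.name = name; this.value = value;
        }
    }

    // Group a trace element's attached items under the two proposed headings,
    // preserving insertion order for display.
    static Map<String, List<Item>> groupForDisplay(List<Item> attached) {
        Map<String, List<Item>> sections = new LinkedHashMap<>();
        sections.put("Metadata", new ArrayList<>());
        sections.put("Annotations", new ArrayList<>());
        for (Item item : attached) {
            String heading = (item.kind == Kind.METADATA) ? "Metadata" : "Annotations";
            sections.get(heading).add(item);
        }
        return sections;
    }
}
```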
<p>(Note that the distinction between metadata and annotations in COMAD is that the former is reserved for things that have always been true about the item it is associated with, while the latter can be used for any purpose. Consequently, Metadata elements cannot be deleted or replaced during a COMAD workflow run, while Annotation elements can be.)</p>

Bug #3546 (New): Automatically load trace for a completed run into the provenance browser
https://projects.ecoinformatics.org/ecoinfo/issues/3546
2008-10-22T17:40:53Z
Timothy McPhillips <mcphillips@ecoinformatics.org>
<p>At present, viewing the trace of a workflow run via the provenance browser (the one in the provenance-apps module) requires either running the provenance browser from the command line or navigating to the trace for the latest workflow run in the MyTraces subtree in the Traces panel of the Workspace pane in Kepler. I almost always want to see the trace immediately on running a workflow.</p>
<p>Could we provide an option to load the trace of the current run automatically when it completes?</p>

Bug #3149 (New): cannot add arbitrary jars when exporting archive
https://projects.ecoinformatics.org/ecoinfo/issues/3149
2008-02-14T19:31:46Z
Dan Higgins <higgins@nceas.ucsb.edu>
<p>The "Export Archive (kar)" popup menu asks whether the class file of an actor should be included in the KAR, but there is no way to add other classes or jars, i.e., jar libraries that the actor may need. (One specific example: the Python actor needs jython.jar to execute.)</p>
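A minimal sketch of the requested feature, assuming the KAR is written with java.util.jar (a KAR is a jar-format archive): copy each user-chosen jar into the archive under a lib/ prefix. The `addJars` method and the entry naming are illustrative assumptions, not Kepler's actual KAR export code.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.jar.JarEntry;
import java.util.jar.JarOutputStream;

public class KarExtraJarsSketch {

    // Append each extra jar to an open KAR stream as lib/<filename>.
    static void addJars(JarOutputStream kar, List<Path> extraJars) throws IOException {
        for (Path jar : extraJars) {
            kar.putNextEntry(new JarEntry("lib/" + jar.getFileName()));
            Files.copy(jar, kar);   // write the jar's bytes as the entry body
            kar.closeEntry();
        }
    }
}
```

The dialog proposed in this bug would collect the `extraJars` list from the user before the export runs.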
<p>We need to add a dialog that lets the user add an arbitrary number of additional jars to a KAR file.</p>

Bug #3143 (New): dataFrame_R cache problem under Parallels
https://projects.ecoinformatics.org/ecoinfo/issues/3143
2008-02-11T20:26:29Z
Derik Barseghian <barseghian@nceas.ucsb.edu>
<p>I'm currently getting the following error with the demos/R/dataFrame_R.xml workflow when trying to run it under Parallels:</p>
<pre>
Error in file(file, "r") : unable to open connection
In addition: Warning message:
cannot open file '////.PSF//.Home//.kepler//cache////cachedata//urn.lsid.localhost.7a976669.0.0', reason 'Invalid argument'
Execution halted
</pre>
<p>All other R demos work fine. I've tried deleting .kepler and restarting to no avail. Others have reported success under Parallels.</p>