Kepler: Issues (Ecoinformatics Redmine)
https://projects.ecoinformatics.org/ecoinfo/
Updated: 2010-02-18T20:18:18Z
Bug #4809 (New): Kepler storing authenticated search results
https://projects.ecoinformatics.org/ecoinfo/issues/4809
2010-02-18T20:18:18Z, Oliver Soong <soong@nceas.ucsb.edu>
<p>From a clean cache, start Kepler and search using authenticated sources. Authenticate and Kepler will return search results. Restart Kepler and perform the same search again. Kepler will not ask for credentials. Try instantiating a dataset that requires authentication, and Kepler will ask for credentials. Restart Kepler and perform a search for a different term and Kepler will ask for credentials.</p>
<p>It looks like the search results are cached. Since access to new data requires authentication, new searches require authentication, and access to previously cached authenticated data does not, this may not be a problem.</p>

Bug #4807 (Resolved): GetMetadata does nothing
https://projects.ecoinformatics.org/ecoinfo/issues/4807
2010-02-18T19:49:26Z, Oliver Soong <soong@nceas.ucsb.edu>
<p>Search for any data, right-click on a search result, and select the only option, "GetMetadata". This does not seem to do anything, and there are no errors on the console. The label should probably also contain a space.</p>

Bug #4805 (Resolved): searching for data gives an NPE
https://projects.ecoinformatics.org/ecoinfo/issues/4805
2010-02-17T23:52:15Z, Oliver Soong <soong@nceas.ucsb.edu>
<p>I searched for Kruger on both KNB and DEV using both authenticated and unauthenticated searches. I authenticated as necessary, and I got an NPE:</p>
<p>java.lang.NullPointerException<br /> at ptolemy.vergil.basic.KeplerDocumentationAttribute.createInstanceFromExisting(KeplerDocumentationAttribute.java:195)<br /> at org.ecoinformatics.seek.datasource.eml.eml2.Eml200DataSource.generateDocumentationForInstance(Eml200DataSource.java:1070)<br /> at org.ecoinformatics.seek.datasource.eml.eml2.EML2MetadataSpecification.transformResultset(EML2MetadataSpecification.java:252)<br /> at org.ecoinformatics.seek.datasource.eml.eml2.EML2MetadataSpecification.addResultsetRecordsToContainer(EML2MetadataSpecification.java:406)<br />ERROR (org.ecoinformatics.seek.datasource.eml.eml2.Eml200DataSource:generateDocumentationForInstance:1078) error encountered whilst generating default documentation for actor instance: null</p>

Bug #4801 (In Progress): out of memory
https://projects.ecoinformatics.org/ecoinfo/issues/4801
2010-02-17T22:13:10Z, Oliver Soong <soong@nceas.ucsb.edu>
<p>ERROR: RecordingException: Unable to query data table: out of memory</p>
<p>I opened tpc01, ran it, closed it, opened tpc03-herbs, ran it, closed it, opened tpc03-large-herbivores, ran it, closed it, opened tpc03-woody, ran it, closed it, opened tpc09, ran it, and hit the above out of memory error. I have 68 runs in the wrm, but 63 of them are various iterations of the small test workflow for bug 4789. I also have those 5 Kruger KARs in a local repository and all the data for them is cached. I can start Kepler and run any of those workflows by themselves. I started Kepler with an ant run, so the jvm should have a 512MB memory max.</p>
<p>Kruger workflows: <a class="external" href="https://code.ecoinformatics.org/code/kruger/trunk/workflows">https://code.ecoinformatics.org/code/kruger/trunk/workflows</a><br />Kruger: r439<br />Kepler: wrp r23080</p>
<p>It may be worth looking at bug 4642.</p>

Bug #4671 (Resolved): cannot export workflow runs when workflow has actors with <> in the names
https://projects.ecoinformatics.org/ecoinfo/issues/4671
2010-01-15T01:25:58Z, Oliver Soong <soong@nceas.ucsb.edu>
<p>Title says it all. This is at r22494.</p>

Bug #4654 (Resolved): searching KNB through Kepler returns many console errors and caching fails
https://projects.ecoinformatics.org/ecoinfo/issues/4654
2010-01-08T01:44:18Z, Oliver Soong <soong@nceas.ucsb.edu>
<p>This is at r22407 under Linux. Things seem to work, but the cache content is all blank (the files are all of size 0).</p>

Bug #4653 (Resolved): error message says "data cloumn"
https://projects.ecoinformatics.org/ecoinfo/issues/4653
2010-01-08T01:13:53Z, Oliver Soong <soong@nceas.ucsb.edu>
<p>The error message reads: "The data cloumn didn't match head column".</p>

Bug #4652 (Resolved): clicking rows in the wrm no longer updates the report designer/viewer
https://projects.ecoinformatics.org/ecoinfo/issues/4652
2010-01-08T00:55:38Z, Oliver Soong <soong@nceas.ucsb.edu>
<p>If I have multiple entries in the wrm and I click on different rows, the report designer and viewer no longer automatically update their views. This is at r22407 under Linux.</p>

Bug #4642 (New): memory usage & slowdowns
https://projects.ecoinformatics.org/ecoinfo/issues/4642
2009-12-19T03:45:53Z, Oliver Soong <soong@nceas.ucsb.edu>
<p>I just hit a big slowdown caused by OOM problems. This bug is mostly a place to put down some of the stuff I found out. I used jmap to produce histograms when Kepler was crawling and immediately after a fresh restart. When Kepler was slow, there was a single workflow open with 4 actors and the Check System Settings window. The fresh Kepler retained the wrm and cache content, but discarded the 4 actors and all the accumulated memory leaking cruft.</p>
<p>A few things jump out at me, and I'd say I'm pretty uninformed. I've formatted as Object: stale #, fresh #.</p>
<p>org.kepler.util.WorkflowRun: 39206, 29<br />javax.swing.JMenuItem: 3411, 96<br />java.util.HashMap: 689643, 22885<br />org.kepler.objectmanager.lsid.KeplerLSID: 120115, 1339<br />java.util.LinkedList: 95565, 4468<br />ptolemy.kernel.util.Location: 1837, 45</p>
<p>Interestingly enough, I have 28 wrm entries. I think something's up with the wrm, but a lot of GUI objects also seem to be hanging around, so there may be other things going on.</p>
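For comparisons like the one above, a small standalone helper can diff two "jmap -histo" dumps by instance count. This is a hypothetical sketch, not part of Kepler; the class and method names are mine, and it assumes the standard "rank: instances bytes classname" histogram line format.

```java
import java.util.LinkedHashMap;
import java.util.Map;

/** Hypothetical helper: diff two "jmap -histo" outputs by instance count (not Kepler code). */
public class HistoDiff {

    /** Parse histogram lines of the form "   1:   689643   33102864  java.util.HashMap". */
    static Map<String, Long> parseHisto(String histo) {
        Map<String, Long> counts = new LinkedHashMap<>();
        for (String line : histo.split("\n")) {
            String[] parts = line.trim().split("\\s+");
            // Expect exactly: rank ("N:"), instance count, byte count, class name.
            if (parts.length == 4 && parts[0].endsWith(":")) {
                counts.put(parts[3], Long.parseLong(parts[1]));
            }
        }
        return counts;
    }

    /** Instance-count delta (stale minus fresh) for every class in the stale dump. */
    static Map<String, Long> diff(Map<String, Long> stale, Map<String, Long> fresh) {
        Map<String, Long> delta = new LinkedHashMap<>();
        stale.forEach((cls, n) -> delta.put(cls, n - fresh.getOrDefault(cls, 0L)));
        return delta;
    }

    public static void main(String[] args) {
        String stale = "   1:   689643   33102864  java.util.HashMap\n"
                     + "   2:   120115    2882760  org.kepler.objectmanager.lsid.KeplerLSID\n";
        String fresh = "   1:    22885    1098480  java.util.HashMap\n";
        // Prints the per-class instance-count growth between the two dumps.
        System.out.println(diff(parseHisto(stale), parseHisto(fresh)));
    }
}
```

Feeding it the stale and fresh dumps would surface the HashMap and KeplerLSID growth directly instead of eyeballing two histograms side by side.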
<p>And on a side note, jps -> jmap -> jhat produces some pretty cool results.</p>

Bug #4626 (Resolved): NPE when saying "yes" to adding a semantic type while saving a KAR
https://projects.ecoinformatics.org/ecoinfo/issues/4626
2009-12-11T02:27:34Z, Oliver Soong <soong@nceas.ucsb.edu>
<p>I'm at r22155. File->Save Archive (KAR) and answer yes to adding a semantic type. I'm getting an NPE.</p>
<p>Exception in thread "AWT-EventQueue-0" java.lang.NullPointerException<br /> at sun.misc.MetaIndex.mayContain(MetaIndex.java:225)</p>

Bug #4540 (Resolved): authentication problems with dev.nceas.ucsb.edu
https://projects.ecoinformatics.org/ecoinfo/issues/4540
2009-11-10T18:42:02Z, Oliver Soong <soong@nceas.ucsb.edu>
<p>If I start Kepler and open a workflow that accesses the dev metacat, I get some error messages and this on the console:</p>
<p>ERROR (org.ecoinformatics.seek.datasource.EcogridDataCacheItem:getDataItemFromEcoGrid:330) EcogridDataCacheItem - error connecting to Ecogrid<br />AxisFault<br /> faultCode: {http://schemas.xmlsoap.org/soap/envelope/}Server.userException<br /> faultSubcode:<br /> faultString: java.rmi.RemoteException: Please specify an identifier</p>
<p>The DEV server no longer shows up in my data sources list.</p>

Bug #4521 (Resolved): KNB data error checking is overly aggressive
https://projects.ecoinformatics.org/ecoinfo/issues/4521
2009-11-04T01:03:36Z, Oliver Soong <soong@nceas.ucsb.edu>
<p>On r21314, I can no longer access the Kruger data. I believe this is due to some overly aggressive error checking on the data. The data packages (which are not directly under my control) have thrown errors on the console. I think this is due to negative numbers and zeroes in columns that are nominally whole numbers. In the past, these were non-fatal, but this is now a major problem.</p>
<p>This change happened somewhere between r21218 and r21310.</p>
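To illustrate the distinction being reported (a hypothetical sketch; this is not Kepler's actual DataTypeResolver code, and both method names are mine): a strict reading of a "whole number" column rejects zeroes and negatives outright, while a lenient check accepts any integer and treats the mismatch as non-fatal.

```java
/** Hypothetical sketch of strict vs. lenient numeric validation (not Kepler's DataTypeResolver). */
public class WholeNumberCheck {

    /** Strict: a "whole number" column admits only positive integers; "0" and "-3" fail. */
    static boolean isStrictWholeNumber(String value) {
        try {
            return Long.parseLong(value.trim()) > 0;
        } catch (NumberFormatException e) {
            return false; // not numeric at all
        }
    }

    /** Lenient: accept any integer, letting zeroes and negatives through as non-fatal. */
    static boolean isAnyInteger(String value) {
        try {
            Long.parseLong(value.trim());
            return true;
        } catch (NumberFormatException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        for (String v : new String[] {"12", "0", "-3"}) {
            System.out.println(v + ": strict=" + isStrictWholeNumber(v)
                                 + " lenient=" + isAnyInteger(v));
        }
    }
}
```

Under the strict check, a column containing "0" or "-3" aborts parsing of the whole package; under the lenient one it would merely be flagged, which matches the older, non-fatal behavior described above.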
<p>The error message is specifically "Unable to parse the MetaData: Error parsing the eml package: Exception in DataTypeResolver"</p>

Bug #4429 (New): EML 2 Dataset requires File Extension Filter parameter
https://projects.ecoinformatics.org/ecoinfo/issues/4429
2009-09-30T20:43:38Z, Oliver Soong <soong@nceas.ucsb.edu>
<p>I'm not 100% sure this isn't just a lack of documentation. When using EML 2 Dataset on a compressed data table (so using the As UnCompressed File Name Data Output Format), EML 2 Dataset will not function properly without a value in the File Extension Filter parameter. There does not seem to be a way to get all the files, regardless of extension, nor does there seem to be a way to get multiple extension types.</p>
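One way the filter could behave (a hypothetical sketch; the class and method names are mine, not Kepler's actual API): treat a blank File Extension Filter as match-all, and a non-blank value as a suffix restriction.

```java
import java.util.List;
import java.util.stream.Collectors;

/** Hypothetical sketch of a match-all-when-blank extension filter (not Kepler's actual code). */
public class ExtensionFilter {

    /** Return files ending in the given extension; a blank or null filter matches every file. */
    static List<String> filter(List<String> files, String extensionFilter) {
        if (extensionFilter == null || extensionFilter.trim().isEmpty()) {
            return files; // blank filter: all files, regardless of extension
        }
        String suffix = "." + extensionFilter.trim();
        return files.stream()
                    .filter(f -> f.endsWith(suffix))
                    .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<String> files = List.of("a.txt", "b.csv", "c");
        System.out.println(filter(files, ""));    // all three files
        System.out.println(filter(files, "csv")); // only b.csv
    }
}
```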
<p>Personally, I'd be happy if the actor returned all files when the File Extension Filter parameter is blank.</p>

Bug #4184 (Resolved): EML 2 Dataset disregarding selectedEntity from XML
https://projects.ecoinformatics.org/ecoinfo/issues/4184
2009-06-25T01:12:44Z, Oliver Soong <soong@nceas.ucsb.edu>
<p>1. Drag an EML 2 Dataset actor onto the canvas and change the selectedEntity field from the initial value.<br />2. Save the workflow.<br />3. Close and re-open the workflow.<br />4. The selectedEntity value has reverted to the initial value.</p>

Bug #4142 (Resolved): EML 2 Dataset caches restricted datasets, allows access across sessions
https://projects.ecoinformatics.org/ecoinfo/issues/4142
2009-06-09T19:05:42Z, Oliver Soong <soong@nceas.ucsb.edu>
<p>I authenticated in order to search for and add an EML 2 Dataset actor for a restricted data table. If I save the workflow, close Kepler, and re-open it, I can run the workflow and access the data without authenticating. This seems slightly unsafe, although it may be intended.</p>