handle null Boolean in SM.archived field
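A minimal sketch of the null-safe handling described above, assuming the DataONE SystemMetadata type (the helper class name is made up for illustration):

```java
import org.dataone.service.types.v1.SystemMetadata;

public class ArchivedFlagUtil {
    /** Treat a null archived flag as "not archived" instead of unboxing it and risking an NPE. */
    public static boolean isArchived(SystemMetadata sysMeta) {
        Boolean archived = sysMeta.getArchived(); // may be null on older records
        return archived != null && archived.booleanValue();
    }
}
```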
include sample data package for generating annotations. This is the classic Datos Meteorologicos set, but with Matthew Jones as the creator so that we can look up his ORCID in the ORCID sandbox environment. https://projects.ecoinformatics.org/ecoinfo/issues/6267
use Matthew Jones as the test creator since he has an ORCID in the ORCID staging environment.
augment annotation indexing test/sample to include orcid annotation. https://projects.ecoinformatics.org/ecoinfo/issues/6267 https://projects.ecoinformatics.org/ecoinfo/issues/6423
attribute the datapackage to the creator (using orcid if we can find it). https://projects.ecoinformatics.org/ecoinfo/issues/6267 https://projects.ecoinformatics.org/ecoinfo/issues/6423
add test for BioPortal annotator service.
refactor web service calls to BioPortal and ORCID outside of the annotator class. Test with the ORCID sandbox server. Include the ORCID URI for the annotations being generated (we can index these and drive our searches on these values down the road). Related to https://projects.ecoinformatics.org/ecoinfo/issues/6423 and some semtools tasks.
remove leading '?' in the query parameter for MN.query() implementation. We want it to match CN behavior/expectations and comply with the DataONE specification for the interface. https://projects.ecoinformatics.org/ecoinfo/issues/6488
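A sketch of that normalization (the helper name is illustrative, not the actual Metacat code): strip a leading '?' from the query parameters so MN.query() sees the same form the CN expects.

```java
public class QueryStringUtil {
    /** Drop a leading '?' if the client included it in the query string. */
    public static String stripLeadingQuestionMark(String query) {
        if (query != null && query.startsWith("?")) {
            return query.substring(1);
        }
        return query;
    }
}
```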
Use OBOE-SBC ontology for looking up concepts (it contains subclasses of our OBOE Characteristic and Standard superclasses). Restrict annotations to only subclasses that fit the OBOE model. Correct the xpointer and individual naming conventions so they are unique, but express the exact entity/attribute being annotated.
remove my api key. oops
add comment/pointer to BioPortal annotation service.
Include method to look up annotation classes from BioPortal. We still have OBOE-SBC in there, and they have the SWEET ontology. The suggestions returned are not perfect, but they can be better than nothing. Ideally, we'd only query a few ontologies so we don't end up using terms from medical ontologies that aren't really appropriate for our domain. https://projects.ecoinformatics.org/ecoinfo/issues/6256
Add xpointer FragmentSelectors to each annotation. Split the attribute label into tokens to attempt matching to OBOE concepts.
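One way the token splitting could look (a sketch only; class and method names are hypothetical): break the EML attribute label on whitespace, underscores, hyphens, and camelCase boundaries before trying each token against the OBOE concept lookup.

```java
import java.util.Arrays;
import java.util.List;

public class LabelTokenizer {
    /** Split an attribute label like "airTemp_max" into ["air", "Temp", "max"]. */
    public static List<String> tokenize(String attributeLabel) {
        // insert a space at lower-to-upper camelCase transitions, then split on separators
        String spaced = attributeLabel.replaceAll("([a-z])([A-Z])", "$1 $2");
        return Arrays.asList(spaced.split("[\\s_\\-]+"));
    }
}
```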
include code to generate random annotations for UI testing. Effective, but can be confusing to see so many unrelated concepts on duplicate EML packages.
include characteristic_sm field with SPARQL query
include SSLVerify* directives for client certificates and a pointer for getting the DataONE chain files.
Added an explanation of "metacat context" to the Metacat Themes docs based on questions asked by an actual user following our instructions in the docs.
Edited the docs to include more details about creating a custom theme
Remove the code to lookup alias dn in the getGroups method.
Rather than modifying the env directly, use context.addToEnvironment(). This fixes a bug where the alias login doesn't work in a non-TLS env.
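A sketch of that pattern, assuming the standard JNDI LDAP API: the properties are added to the live context via addToEnvironment() rather than by mutating the original environment Hashtable, then the connection is re-authenticated. Method and variable names here are illustrative.

```java
import javax.naming.Context;
import javax.naming.NamingException;
import javax.naming.ldap.InitialLdapContext;

public class LdapAliasLogin {
    /** Re-bind the existing context as the alias DN instead of rebuilding the env map. */
    public static void bindAsAlias(InitialLdapContext ctx, String aliasDn, String password)
            throws NamingException {
        // applied to the live context so the non-TLS code path picks the values up
        ctx.addToEnvironment(Context.SECURITY_PRINCIPAL, aliasDn);
        ctx.addToEnvironment(Context.SECURITY_CREDENTIALS, password);
        // re-authenticate with the updated environment
        ctx.reconnect(ctx.getConnectControls());
    }
}
```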
first pass at generating annotations from EML attribute information. uses the OpenAnnotation model that the metacat-index tests assume which allows us to populate dynamic index fields for the annotation class[es]. There is still much to be done with finding appropriate concepts for each attribute. https://projects.ecoinformatics.org/ecoinfo/issues/6256
switch to indexing the standard since it is more likely we will be able to determine this from our existing EML attribute information. https://projects.ecoinformatics.org/ecoinfo/issues/6253
Edited the replicaPolicies script to print out a list of IDs that have a different authoritative member node, the number of successes, and failures at the end.
Add comments to bash script to explain its function and dependencies
Added a bash script to call /replicaPolicies/{pid} via the DataONE API for all objects in a MN or a list of ids.
Add the test class for the pisco account.
Remove the test method for the pisco account since it may be failing because of the firewall issue.
Add the login test of the pisco account.
Add the pisco account.
Do a more thorough check that the characteristic annotation was successfully indexed as expected (semtools)
switch to the OpenAnnotation (OA) model for annotating datapackages with measurements/characteristics (semtools)
support content from all serverLocations when summarizing entity info (semtools)
bump the poms to 2.4.2
merge from trunk: these OpenLayers resources were not committed!
merge from branch: more notes on 2.4.1 release in the readme
Add a pisco test account.
allow "+" in solr query syntax. https://projects.ecoinformatics.org/ecoinfo/issues/6435
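One way to keep a literal '+' intact (illustrative only; this may not be exactly how Metacat handles it): protect the character before URL-decoding so it is not collapsed into a space, since '+' is the required-term operator in Solr syntax.

```java
import java.io.UnsupportedEncodingException;
import java.net.URLDecoder;

public class SolrQueryParam {
    /** Decode a raw query string while preserving literal '+' characters. */
    public static String decodePreservingPlus(String rawQuery) throws UnsupportedEncodingException {
        return URLDecoder.decode(rawQuery.replace("+", "%2B"), "UTF-8");
    }
}
```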
Add a comment to warn the users not to change the password file path to a production one since it will be deleted.
clear test password file so subsequent runs will not fail -- tests assume blank file to begin with.
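A sketch of that setup step (the class name is hypothetical): truncate the password file before the tests run so they start from the blank file they expect.

```java
import java.io.File;
import java.io.FileWriter;
import java.io.IOException;

public class PasswordFileTestSetup {
    /** Empty the test password file; opening a FileWriter without append truncates it. */
    public static void clearPasswordFile(String path) throws IOException {
        new FileWriter(new File(path), false).close();
    }
}
```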
use local release of OpenLayers api so that it works over https with our secure deployments (openlayers.org does not offer the api from their servers using https).
include read events when re-indexing obsoleted objects. https://projects.ecoinformatics.org/ecoinfo/issues/6424
Change the attributes of the loginForm that is generated by the logout action.
Use the userManagement variable.
Set the userManagementURL property.
Set onSubmit to "false" and the method to "post" for the loginForm and logoutForm.
remove metacarta map layer -- their WMS service is no longer responding.
update to use 2.4.1 so the trunk has all artifacts for upgrades.
simple upgrade scripts for version 2.4.1
In the authenticate method, the login can still succeed even if Metacat can't get the user info.
Change a log message.
In the getAliasedName method, set the referral handling to "ignore". Since the alias name is the local referral, we need to set it to ignore.
Change the password.
test that obsoleted objects remain indexed, but are marked as obsoleted. https://projects.ecoinformatics.org/ecoinfo/issues/6424
recursively submit obsoleted objects for indexing when instructed. https://projects.ecoinformatics.org/ecoinfo/issues/6424
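Roughly, the recursion looks like the sketch below, assuming the DataONE SystemMetadata type; the lookup and submit calls are stubs standing in for the real metacat-index plumbing.

```java
import org.dataone.service.types.v1.Identifier;
import org.dataone.service.types.v1.SystemMetadata;

public class ObsoletedChainIndexer {

    /** Submit a pid for indexing and, if asked, every revision it obsoletes. */
    public void submitWithObsoleted(Identifier pid, boolean followRevisions) {
        SystemMetadata sysMeta = lookupSystemMetadata(pid);
        submitToIndex(pid, sysMeta);
        Identifier obsoletes = sysMeta.getObsoletes();
        if (followRevisions && obsoletes != null) {
            // walk down the revision chain so obsoleted versions are indexed too
            submitWithObsoleted(obsoletes, true);
        }
    }

    // stubs standing in for the real system metadata store and index task queue
    private SystemMetadata lookupSystemMetadata(Identifier pid) {
        throw new UnsupportedOperationException("wire up to the system metadata store");
    }

    private void submitToIndex(Identifier pid, SystemMetadata sysMeta) {
        throw new UnsupportedOperationException("wire up to the index task queue");
    }
}
```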
First pass at a class for summarizing attribute information for analysis. (semtools) https://projects.ecoinformatics.org/ecoinfo/issues/6256
add note about archive correction
merge recent upgrade changes from 2.4 branch
use the UI 1.4 branch prior to the 2.4 branch.
uncomment other tests in suite.
look up guid when done setting access by docid so we can sync and refresh accesspolicy on MN and CN.
additional logging for set access
add a few more checks while debugging test
Use D1client for communication with CN (for integration test)
get guid from online id for call to SyncAccessPolicy
setAccessAction: get guid from passed in id for calls to SyncAccessPolicy, HazelcastService.refreshSystemMetadataEntry
example of how we can look up pid (guid) given a metacat docid.
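For reference, the lookup is roughly as sketched below; it assumes Metacat's IdentifierManager exposes getGUID(docid, rev), so check the exact signature in the version you are running.

```java
import edu.ucsb.nceas.metacat.IdentifierManager;

public class GuidLookupExample {
    /** Map a legacy Metacat docid/revision pair to its DataONE identifier (guid). */
    public static String lookupGuid(String docid, int rev) throws Exception {
        // assumed API: IdentifierManager.getGUID(String docid, int rev)
        return IdentifierManager.getInstance().getGUID(docid, rev);
    }
}
```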
Changed some of the fonts and styles of the Metacat docs for easier reading and fixed a bug where the Metacat admin "configure" buttons were not working
remove sensorML from the catalog since we don't actually ship it (yet?)
add test for invalid dryad content -- should be rejected because it is not schema valid.
clean up dryad test doc - only use D1 api. https://projects.ecoinformatics.org/ecoinfo/issues/6419
add generated diagrams for stats proposal -- seems to be our practice for other documentation pages.
test for inserting dryad instance doc. https://projects.ecoinformatics.org/ecoinfo/issues/6419
add test to check sync of access policies of data object referenced in EML 2.0.1 docs
initial basic test of inserting dryad metadata. NOTE: uses metacat api, not dataone api. https://projects.ecoinformatics.org/ecoinfo/issues/6419
do not delete the lib/schema directory on fullclean now that we actually have content in there in SVN. https://projects.ecoinformatics.org/ecoinfo/issues/6419
Add an example Dryad Metadata Profile instance document to test inserts of this schema type.
Add Darwin Core schema support to xml_catalog, and insert it on upgrade as well. The schemas are cached in lib/schema/dwc, and Matt and Ben noted that the tdwg_basetypes.xsd and tdwg_dwctypes.xsd are part of the same namespace, but are xs:include'd rather than imported via namespace.
In lieu of pulling schemas from URL endpoints for DataONE, Darwin Core, and Dryad, we're caching them in Metacat so we have stable copies. Remove the ant targets used to pull them.
Add the three Darwin Core schemas required by the Dryad Metadata Profile (via imports).
include a few tests for isEqual method. https://projects.ecoinformatics.org/ecoinfo/issues/6407
Add cached versions of the DataONE, Dryad, Dublin core, and Darwin Core schemas to Metacat. Remove schemaLocation attributes so that we rely on the local catalog and don't use (potentially changing) URL endpoints.
Use client.MetacatClient instance for all metacat api calls
Test syncing of access policies when updated with legacy metacat api
Change isEqual to private so it can be used by test suite
Add DataONE, Dublin Core, and Dryad schemas during the 2.4.0 upgrade, and be sure to remove the appropriate entries before inserting to avoid duplicate rows.
Add schema support for the DataONE, Dublin Core, and Dryad schemas. Schemas get downloaded into lib/schema prior to the jar and dist targets, and get loaded into xml_catalog on installation.
move the postgres changes to the oracle version -- update note about not attempting to restore because no Oracle MNs exist.
do not include "sm" alias in the SET clause.
allow statements starting with 'WITH'
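Presumably this lives in the script runner's statement check; a minimal sketch (names are made up) that treats CTE queries the same as plain SELECTs:

```java
public class SqlStatementCheck {
    /** Recognize read-only statements, including CTE queries that start with WITH. */
    public static boolean isQueryStatement(String sql) {
        String normalized = sql.trim().toUpperCase();
        return normalized.startsWith("SELECT") || normalized.startsWith("WITH");
    }
}
```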
comment out the select statements so they do not run during real upgrade.
use rangeOfDates | singleDateTime to populate the beginDate and endDate index fields. https://projects.ecoinformatics.org/ecoinfo/issues/6285
switch to ezid 1.0.0 release and pull from dev-testing.dataone.org Maven repo.
loosen the restriction on which archive flags we set to false -- if we have an obsoleted_by value then it need not be marked as archived.
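A rough sketch of the rule as stated above (the helper name is hypothetical, and this is one reading of the note): a record that carries an obsoletedBy value is eligible to have its archived flag cleared, with no further conditions.

```java
import org.dataone.service.types.v1.SystemMetadata;

public class ArchiveFlagRule {
    /** True when the archived flag can be reset to false for this record. */
    public static boolean canClearArchivedFlag(SystemMetadata sysMeta) {
        boolean archived = sysMeta.getArchived() != null && sysMeta.getArchived();
        // being obsoleted by a newer revision is sufficient reason not to stay archived
        return archived && sysMeta.getObsoletedBy() != null;
    }
}
```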
add [partial] upgrade to the oracle script -- does not look for any records that the CN deleted because there are no Oracle-backed MNs at this time.
add comment (and commented out code) for possibly inspecting the /dirtySysMeta call for archive=true flag. https://projects.ecoinformatics.org/ecoinfo/issues/6417
use '/var/metacat/users/password.xml' as the default password file path to: a) indicate it is for managing users and b) that it uses XML serialization. https://projects.ecoinformatics.org/ecoinfo/issues/6320
add a link to the authentication interface page so users can more easily find information on how to add users to the auth file.
only index event information for known events. https://projects.ecoinformatics.org/ecoinfo/issues/6346