CGSpace Notes

Documenting day-to-day work on the CGSpace repository.

October, 2020

2020-10-06

  • Add tests for the new /items POST handlers to the DSpace 6.x branch of my dspace-statistics-api
  • Trying to test the changes Atmire sent last week but I had to re-create my local database from a recent CGSpace dump
    • During the FlywayDB migration I got an error:
2020-10-06 21:36:04,138 ERROR org.hibernate.engine.jdbc.spi.SqlExceptionHelper @ Batch entry 0 update public.bitstreamformatregistry set description='Electronic publishing', internal='FALSE', mimetype='application/epub+zip', short_description='EPUB', support_level=1 where bitstream_format_id=78 was aborted: ERROR: duplicate key value violates unique constraint "bitstreamformatregistry_short_description_key"
  Detail: Key (short_description)=(EPUB) already exists.  Call getNextException to see other errors in the batch.
2020-10-06 21:36:04,138 WARN  org.hibernate.engine.jdbc.spi.SqlExceptionHelper @ SQL Error: 0, SQLState: 23505
2020-10-06 21:36:04,138 ERROR org.hibernate.engine.jdbc.spi.SqlExceptionHelper @ ERROR: duplicate key value violates unique constraint "bitstreamformatregistry_short_description_key"
  Detail: Key (short_description)=(EPUB) already exists.
2020-10-06 21:36:04,142 ERROR org.hibernate.engine.jdbc.batch.internal.BatchingBatch @ HHH000315: Exception executing batch [could not execute batch]
2020-10-06 21:36:04,143 ERROR org.dspace.storage.rdbms.DatabaseRegistryUpdater @ Error attempting to update Bitstream Format and/or Metadata Registries
org.hibernate.exception.ConstraintViolationException: could not execute batch
	at org.hibernate.exception.internal.SQLStateConversionDelegate.convert(SQLStateConversionDelegate.java:129)
	at org.hibernate.exception.internal.StandardSQLExceptionConverter.convert(StandardSQLExceptionConverter.java:49)
	at org.hibernate.engine.jdbc.spi.SqlExceptionHelper.convert(SqlExceptionHelper.java:124)
	at org.hibernate.engine.jdbc.batch.internal.BatchingBatch.performExecution(BatchingBatch.java:122)
	at org.hibernate.engine.jdbc.batch.internal.BatchingBatch.doExecuteBatch(BatchingBatch.java:101)
	at org.hibernate.engine.jdbc.batch.internal.AbstractBatchImpl.execute(AbstractBatchImpl.java:161)
	at org.hibernate.engine.jdbc.internal.JdbcCoordinatorImpl.executeBatch(JdbcCoordinatorImpl.java:207)
	at org.hibernate.engine.spi.ActionQueue.executeActions(ActionQueue.java:390)
	at org.hibernate.engine.spi.ActionQueue.executeActions(ActionQueue.java:304)
	at org.hibernate.event.internal.AbstractFlushingEventListener.performExecutions(AbstractFlushingEventListener.java:349)
	at org.hibernate.event.internal.DefaultFlushEventListener.onFlush(DefaultFlushEventListener.java:56)
	at org.hibernate.internal.SessionImpl.flush(SessionImpl.java:1195)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.hibernate.context.internal.ThreadLocalSessionContext$TransactionProtectionWrapper.invoke(ThreadLocalSessionContext.java:352)
	at com.sun.proxy.$Proxy162.flush(Unknown Source)
	at org.dspace.core.HibernateDBConnection.commit(HibernateDBConnection.java:83)
	at org.dspace.core.Context.commit(Context.java:435)
	at org.dspace.core.Context.complete(Context.java:380)
	at org.dspace.administer.MetadataImporter.loadRegistry(MetadataImporter.java:164)
	at org.dspace.storage.rdbms.DatabaseRegistryUpdater.updateRegistries(DatabaseRegistryUpdater.java:72)
	at org.dspace.storage.rdbms.DatabaseRegistryUpdater.afterMigrate(DatabaseRegistryUpdater.java:121)
	at org.flywaydb.core.internal.command.DbMigrate$3.doInTransaction(DbMigrate.java:250)
	at org.flywaydb.core.internal.util.jdbc.TransactionTemplate.execute(TransactionTemplate.java:72)
	at org.flywaydb.core.internal.command.DbMigrate.migrate(DbMigrate.java:246)
	at org.flywaydb.core.Flyway$1.execute(Flyway.java:959)
	at org.flywaydb.core.Flyway$1.execute(Flyway.java:917)
	at org.flywaydb.core.Flyway.execute(Flyway.java:1373)
	at org.flywaydb.core.Flyway.migrate(Flyway.java:917)
	at org.dspace.storage.rdbms.DatabaseUtils.updateDatabase(DatabaseUtils.java:663)
	at org.dspace.storage.rdbms.DatabaseUtils.updateDatabase(DatabaseUtils.java:575)
	at org.dspace.storage.rdbms.DatabaseUtils.updateDatabase(DatabaseUtils.java:551)
	at org.dspace.core.Context.<clinit>(Context.java:103)
	at org.dspace.app.util.AbstractDSpaceWebapp.register(AbstractDSpaceWebapp.java:74)
	at org.dspace.app.util.DSpaceWebappListener.contextInitialized(DSpaceWebappListener.java:31)
	at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:5197)
	at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5720)
	at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:183)
	at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:1016)
	at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:992)
  • I checked the database migrations with dspace database info and they were all OK
    • Then I restarted Tomcat and it started up OK…
  • There were two issues I had reported to Atmire last month:
    • Importing items from the command line throws a NullPointerException from com.atmire.dspace.cua.CUASolrLoggerServiceImpl for every item, but the item still gets imported
    • No results for author name in Listings and Reports, despite there being hits in Discovery search
  • To test the first one I imported a very simple CSV file with one item with minimal data
    • There is a new error now (but the item does get imported):
$ dspace metadata-import -f /tmp/2020-10-06-import-test.csv -e aorth@mjanja.ch
Loading @mire database changes for module MQM
Changes have been processed
-----------------------------------------------------------
New item:
 + New owning collection (10568/3): ILRI articles in journals
 + Add    (dc.contributor.author): Orth, Alan
 + Add    (dc.date.issued): 2020-09-01
 + Add    (dc.title): Testing CUA import NPE

1 item(s) will be changed

Do you want to make these changes? [y/n] y
-----------------------------------------------------------
New item: aff5e78d-87c9-438d-94f8-1050b649961c (10568/108548)
 + New owning collection  (10568/3): ILRI articles in journals
 + Added   (dc.contributor.author): Orth, Alan
 + Added   (dc.date.issued): 2020-09-01
 + Added   (dc.title): Testing CUA import NPE
Tue Oct 06 22:06:14 CEST 2020 | Query:containerItem:aff5e78d-87c9-438d-94f8-1050b649961c
Error while updating
org.apache.solr.client.solrj.impl.HttpSolrServer$RemoteSolrException: Expected mime type application/octet-stream but got text/html. <!doctype html><html lang="en"><head><title>HTTP Status 404 – Not Found</title><style type="text/css">body {font-family:Tahoma,Arial,sans-serif;} h1, h2, h3, b {color:white;background-color:#525D76;} h1 {font-size:22px;} h2 {font-size:16px;} h3 {font-size:14px;} p {font-size:12px;} a {color:black;} .line {height:1px;background-color:#525D76;border:none;}</style></head><body><h1>HTTP Status 404 – Not Found</h1><hr class="line" /><p><b>Type</b> Status Report</p><p><b>Message</b> The requested resource [/solr/update] is not available</p><p><b>Description</b> The origin server did not find a current representation for the target resource or is not willing to disclose that one exists.</p><hr class="line" /><h3>Apache Tomcat/7.0.104</h3></body></html>
        at org.apache.solr.client.solrj.impl.HttpSolrServer.executeMethod(HttpSolrServer.java:512)
        at org.apache.solr.client.solrj.impl.HttpSolrServer.request(HttpSolrServer.java:210)
        at org.apache.solr.client.solrj.impl.HttpSolrServer.request(HttpSolrServer.java:206)
        at org.apache.solr.client.solrj.request.AbstractUpdateRequest.process(AbstractUpdateRequest.java:124)
        at org.apache.solr.client.solrj.SolrServer.commit(SolrServer.java:168)
        at com.atmire.dspace.cua.CUASolrLoggerServiceImpl$5.visit(SourceFile:1131)
        at com.atmire.dspace.cua.CUASolrLoggerServiceImpl.visitEachStatisticShard(SourceFile:212)
        at com.atmire.dspace.cua.CUASolrLoggerServiceImpl.update(SourceFile:1104)
        at com.atmire.dspace.cua.CUASolrLoggerServiceImpl.update(SourceFile:1093)
        at org.dspace.statistics.StatisticsLoggingConsumer.consume(SourceFile:104)
        at org.dspace.event.BasicDispatcher.consume(BasicDispatcher.java:177)
        at org.dspace.event.BasicDispatcher.dispatch(BasicDispatcher.java:123)
        at org.dspace.core.Context.dispatchEvents(Context.java:455)
        at org.dspace.core.Context.commit(Context.java:424)
        at org.dspace.core.Context.complete(Context.java:380)
        at org.dspace.app.bulkedit.MetadataImport.main(MetadataImport.java:1399)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.dspace.app.launcher.ScriptLauncher.runOneCommand(ScriptLauncher.java:229)
        at org.dspace.app.launcher.ScriptLauncher.main(ScriptLauncher.java:81)
  • Also, I tested Listings and Reports and there are still no hits for “Orth, Alan” as a contributor, despite there being dozens of items in the repository and the Solr query generated by Listings and Reports actually returning hits:
2020-10-06 22:23:44,116 INFO org.apache.solr.core.SolrCore @ [search] webapp=/solr path=/select params={q=*:*&fl=handle,search.resourcetype,search.resourceid,search.uniqueid&start=0&fq=NOT(withdrawn:true)&fq=NOT(discoverable:false)&fq=search.resourcetype:2&fq=author_keyword:Orth,\+A.+OR+author_keyword:Orth,\+Alan&fq=dateIssued.year:[2013+TO+2021]&rows=500&wt=javabin&version=2} hits=18 status=0 QTime=10 
  • Solr returns hits=18 for the L&R query, but no results are shown in the L&R UI
  • I sent all this feedback to Atmire…

2020-10-07

  • Udana from IWMI had asked about stats discrepancies in reports they had generated in previous months or years
    • I told him that we very often purge bot hits, so the stats can change drastically
    • Also, I told him that it is not possible to compare stats with previous exports and that the stats should be taken with a grain of salt
  • Testing POSTing items to the DSpace 6 REST API
    • We need to authenticate to get a JSESSIONID cookie first:
$ http -f POST https://dspacetest.cgiar.org/rest/login email=aorth@fuuu.com 'password=fuuuu'
$ http https://dspacetest.cgiar.org/rest/status Cookie:JSESSIONID=EABAC9EFF942028AA52DFDA16DBCAFDE
  • Then we post an item in JSON format to /rest/collections/{uuid}/items:
$ http POST https://dspacetest.cgiar.org/rest/collections/f10ad667-2746-4705-8b16-4439abe61d22/items Cookie:JSESSIONID=EABAC9EFF942028AA52DFDA16DBCAFDE < item-object.json
  • Format of JSON is:
{ "metadata": [
    {
      "key": "dc.title",
      "value": "Testing REST API post",
      "language": "en_US"
    },
    {
      "key": "dc.contributor.author",
      "value": "Orth, Alan",
      "language": "en_US"
    },
    {
      "key": "dc.date.issued",
      "value": "2020-09-01",
      "language": "en_US"
    }
  ],
  "archived":"false",
  "withdrawn":"false"
}
  • What is unclear to me is the archived parameter; it seems to do nothing… perhaps it is only used for the /items endpoint when printing information about an item
    • If I submit to a collection that has a workflow, even as a super admin and with “archived=false” in the JSON, the item enters the workflow (“Awaiting editor’s attention”)
    • If I submit to a new collection without a workflow the item gets archived immediately
    • I created some notes to share with Salem and Leroy for future reference when we start discussing POSTing items to the REST API
  • I created an account for Salem on DSpace Test and added it to the submitters group of an ICARDA collection with no other workflow steps so we can see what happens
    • We are curious to see if he gets a UUID when posting from MEL
  • I did some tests by adding his account to certain workflow steps and trying to POST the item
  • Member of collection “Submitters” step:
    • HTTP Status 401 – Unauthorized
    • The request has not been applied because it lacks valid authentication credentials for the target resource.
  • Member of collection “Accept/Reject” step:
    • Same error…
  • Member of collection “Accept/Reject/Edit Metadata” step:
    • Same error…
  • Member of collection Administrators with no other workflow steps…:
    • Posts straight to archive
  • Member of collection Administrators with empty “Accept/Reject/Edit Metadata” step:
    • Posts straight to archive
  • Member of collection Administrators with populated “Accept/Reject/Edit Metadata” step:
    • Does not post straight to archive, goes to workflow
  • Note that community administrators have no role in item submission other than being able to create/manage collection groups
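  • As a sketch of the whole DSpace 6 submission flow in Python (the endpoints, JSON shape, and JSESSIONID cookie are from the notes above; the requests library and the credentials are assumptions):

```python
def build_item_payload(title, author, date_issued, lang="en_US"):
    """Build the minimal item JSON for POST /rest/collections/{uuid}/items."""
    fields = [
        ("dc.title", title),
        ("dc.contributor.author", author),
        ("dc.date.issued", date_issued),
    ]
    return {
        "metadata": [{"key": k, "value": v, "language": lang} for k, v in fields],
        # These appear to be ignored on POST; the collection's workflow
        # configuration decides whether the item is archived
        "archived": "false",
        "withdrawn": "false",
    }

def post_item(base, email, password, collection_uuid, payload):
    """Log in (form POST, which sets a JSESSIONID cookie) and POST the item."""
    import requests  # third-party; assumed available

    session = requests.Session()
    session.post(f"{base}/login", data={"email": email, "password": password})
    return session.post(f"{base}/collections/{collection_uuid}/items", json=payload)
```

For example: post_item("https://dspacetest.cgiar.org/rest", "aorth@fuuu.com", "fuuuu", "f10ad667-2746-4705-8b16-4439abe61d22", build_item_payload("Testing REST API post", "Orth, Alan", "2020-09-01"))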

2020-10-08

  • I did some testing of the DSpace 5 REST API because Salem and I were curious
    • The authentication is a little different (uses a serialized JSON object instead of a form and the token is an HTTP header instead of a cookie):
$ http POST http://localhost:8080/rest/login email=aorth@fuuu.com 'password=ddddd'
$ http http://localhost:8080/rest/status rest-dspace-token:d846f138-75d3-47ba-9180-b88789a28099
$ http POST http://localhost:8080/rest/collections/1549/items rest-dspace-token:d846f138-75d3-47ba-9180-b88789a28099 < item-object.json
  • The item submission works exactly the same as in DSpace 6:
  1. The submitting user must be a collection admin
  2. If the collection has a workflow the item will enter it and the API returns an item ID
  3. If the collection does not have a workflow then the item is committed to the archive and you get a Handle

2020-10-09

  • Skype with Peter about AReS and CGSpace
    • We discussed removing Atmire Listings and Reports from DSpace 6 because we can probably make the same reports in AReS and this module is the one that is currently holding us back from the upgrade
    • We discussed allowing partners to submit content via the REST API, and perhaps charging an extra fee for it due to the burden it incurs with unfinished submissions, manual duplicate checking, developer support, etc
    • He was excited about the possibility of using my statistics API for more things on AReS as well as item view pages
  • Also I fixed a bunch of the CRP mappings in the AReS value mapper and started a fresh re-indexing

2020-10-12

  • Looking at CGSpace’s Solr statistics for 2020-09 and I see:
    • RTB website BOT: 212916
    • Java/1.8.0_66: 3122
    • Mozilla/5.0 (compatible; um-LN/1.0; mailto: techinfo@ubermetrics-technologies.com; Windows NT 6.1; WOW64; rv:40.0) Gecko/20100101 Firefox/40.1: 614
    • omgili/0.5 +http://omgili.com: 272
    • Mozilla/5.0 (compatible; TrendsmapResolver/0.1): 199
    • Vizzit: 160
    • Scoop.it: 151
  • I’m confused because a pattern matching “bot” has existed in the default DSpace spider agents file forever…
    • I see 259,000 hits in CGSpace’s 2020 Solr core when I search for this: userAgent:/.*[Bb][Oo][Tt].*/
      • This includes 228,000 for RTB website BOT and 18,000 for ILRI Livestock Website Publications importer BOT
    • I made a few requests to DSpace Test with the RTB user agent to see if it gets logged or not:
$ http --print Hh https://dspacetest.cgiar.org/rest/bitstreams/dfa1d9c3-75d3-4380-a9d3-4c8cbbed2d21/retrieve User-Agent:"RTB website BOT"
$ http --print Hh https://dspacetest.cgiar.org/rest/bitstreams/dfa1d9c3-75d3-4380-a9d3-4c8cbbed2d21/retrieve User-Agent:"RTB website BOT"
$ http --print Hh https://dspacetest.cgiar.org/rest/bitstreams/dfa1d9c3-75d3-4380-a9d3-4c8cbbed2d21/retrieve User-Agent:"RTB website BOT"
$ http --print Hh https://dspacetest.cgiar.org/rest/bitstreams/dfa1d9c3-75d3-4380-a9d3-4c8cbbed2d21/retrieve User-Agent:"RTB website BOT"
  • After a few minutes I saw these four hits in Solr… WTF
    • So is there some issue with DSpace’s parsing of the spider agent files?
    • I added RTB website BOT to the ilri pattern file, restarted Tomcat, and made four more requests to the bitstream
    • These four requests were recorded in Solr too, WTF!
    • It seems like the patterns aren’t working at all…
    • I decided to try something drastic and removed all pattern files, adding only a single pattern, bot, to make sure this was not a syntax or precedence issue
    • Now even those four requests were recorded in Solr, WTF!
    • I will try one last thing, to put a single entry with the exact pattern RTB website BOT in a single spider agents pattern file…
    • Nope! Still records the hits… WTF
    • As a last resort I tried to use the vanilla DSpace 6 example file
    • And the hits still get recorded… WTF
    • So now I’m wondering if this is because of our custom Atmire shit?
    • I will have to test on a vanilla DSpace instance I guess before I can complain to the dspace-tech mailing list
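  • For reference, the count above can be reproduced with a plain Solr select; a sketch of the query parameters as a pure helper (the core and port are the local ones from these notes):

```python
def bot_hit_params(pattern=r"/.*[Bb][Oo][Tt].*/", rows=0):
    """Solr select params counting hits whose userAgent matches a regex.

    With rows=0 Solr returns only the count (numFound), no documents.
    """
    return {"q": f"userAgent:{pattern}", "rows": rows, "wt": "json"}
```

Sending these to http://localhost:8083/solr/statistics/select (for example with requests.get(url, params=bot_hit_params())) should return the numFound corresponding to the ~259,000 hits above.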
  • I re-factored the check-spider-hits.sh script to read patterns from a text file rather than sed’s stdout, and to properly search for spaces in patterns that use \s because Lucene’s search syntax doesn’t support it (and spaces work just fine)
  • I added [Ss]pider to the Tomcat Crawler Sessions Manager Valve regex because this can catch a few more generic bots and force them to use the same Tomcat JSESSIONID
  • I added a few of the patterns from above to our local agents list and ran the check-spider-hits.sh on CGSpace:
$ ./check-spider-hits.sh -f dspace/config/spiders/agents/ilri -s statistics -u http://localhost:8083/solr -p
Purging 228916 hits from RTB website BOT in statistics
Purging 18707 hits from ILRI Livestock Website Publications importer BOT in statistics
Purging 2661 hits from ^Java\/[0-9]{1,2}.[0-9] in statistics
Purging 199 hits from [Ss]pider in statistics
Purging 2326 hits from ubermetrics in statistics
Purging 888 hits from omgili\.com in statistics
Purging 1888 hits from TrendsmapResolver in statistics
Purging 3546 hits from Vizzit in statistics
Purging 2127 hits from Scoop\.it in statistics

Total number of bot hits purged: 261258
$ ./check-spider-hits.sh -f dspace/config/spiders/agents/ilri -s statistics-2019 -u http://localhost:8083/solr -p
Purging 2952 hits from TrendsmapResolver in statistics-2019
Purging 4252 hits from Vizzit in statistics-2019
Purging 2976 hits from Scoop\.it in statistics-2019

Total number of bot hits purged: 10180
$ ./check-spider-hits.sh -f dspace/config/spiders/agents/ilri -s statistics-2018 -u http://localhost:8083/solr -p
Purging 1702 hits from TrendsmapResolver in statistics-2018
Purging 1062 hits from Vizzit in statistics-2018
Purging 920 hits from Scoop\.it in statistics-2018

Total number of bot hits purged: 3684
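  • The \s handling in check-spider-hits.sh amounts to rewriting the patterns before querying, since Lucene’s query syntax has no \s escape but a literal space matches just fine; a minimal sketch:

```python
def lucene_safe_pattern(pattern: str) -> str:
    r"""Rewrite a spider-agent regex for use in a Lucene userAgent query.

    Lucene has no \s character class, but a literal space works, so
    replace \s+ and \s with a single space.
    """
    return pattern.replace(r"\s+", " ").replace(r"\s", " ")
```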

2020-10-13

  • Skype with Peter about AReS again
    • We decided to use Title Case for our countries on CGSpace to minimize the need for mapping on AReS
    • We did some work to add a dozen more mappings for strange and incorrect CRPs on AReS
  • I can update the country metadata in PostgreSQL like this:
dspace=> BEGIN;
dspace=> UPDATE metadatavalue SET text_value=INITCAP(text_value) WHERE resource_type_id=2 AND metadata_field_id=228;
UPDATE 51756
dspace=> COMMIT;
  • I will need to pay special attention to Côte d’Ivoire, Bosnia and Herzegovina, and a few others though… maybe it’s better to do a search and replace using fix-metadata-values.csv
    • Export a list of distinct values from the database:
dspace=> \COPY (SELECT DISTINCT(text_value) as "cg.coverage.country" FROM metadatavalue WHERE resource_type_id=2 AND metadata_field_id=228) TO /tmp/2020-10-13-countries.csv WITH CSV HEADER;
COPY 195
  • Then use OpenRefine and make a new column for corrections, then use this GREL to convert to title case: value.toTitlecase()
    • I still had to double check everything to catch some corner cases (Andorra, Timor-leste, etc)
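  • The corner cases exist because INITCAP (like GREL’s toTitlecase()) capitalizes after every non-letter, including apostrophes, and capitalizes small words like “and”; a sketch of the correction pass with a hypothetical exception list (the real corrections went into fix-metadata-values.csv):

```python
# Hypothetical exception list; extend it as corner cases turn up
CORRECTIONS = {
    "Côte D'Ivoire": "Côte d'Ivoire",
    "Bosnia And Herzegovina": "Bosnia and Herzegovina",
}

def title_case_country(value: str) -> str:
    # str.title() behaves like PostgreSQL's INITCAP here: it capitalizes
    # after any non-letter, so "CÔTE D'IVOIRE" becomes "Côte D'Ivoire"
    naive = value.title()
    return CORRECTIONS.get(naive, naive)
```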
  • For the input forms I found out how to do a complicated search and replace in vim:
:'<,'>s/\<\(pair\|displayed\|stored\|value\|AND\)\@!\(\w\)\(\w*\|\)\>/\u\2\L\3/g
  • It uses a negative lookahead (aka “lookaround” in PCRE?) to match words that are not “pair”, “displayed”, etc because we don’t want to edit the XML tags themselves…
    • I had to fix a few manually after doing this, as above with PostgreSQL
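  • Python’s re module can express the same idea, with (?!…) playing the role of vim’s \@! negative lookahead (a sketch; the stop words are the ones from the command above, matched case-sensitively as in vim):

```python
import re

# Words that must not be title-cased because they are XML tag names/keywords
STOP_WORDS = ("pair", "displayed", "stored", "value", "AND")

def title_case_xml_text(s: str) -> str:
    """Title-case every word except the stop words above."""
    pattern = re.compile(r"\b(?!(?:%s)\b)(\w)(\w*)\b" % "|".join(STOP_WORDS))
    # vim's \u\2\L\3: uppercase the first letter, lowercase the rest
    return pattern.sub(lambda m: m.group(1).upper() + m.group(2).lower(), s)
```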

2020-10-14

  • I discussed the title casing of countries with Abenet and she suggested we also apply title casing to regions
    • I exported the list of regions from the database:
dspace=> \COPY (SELECT DISTINCT(text_value) as "cg.coverage.region" FROM metadatavalue WHERE resource_type_id=2 AND metadata_field_id=227) TO /tmp/2020-10-14-regions.csv WITH CSV HEADER;
COPY 34
  • I did the same as the countries in OpenRefine for the database values and in vim for the input forms
  • After testing the replacements locally I ran them on CGSpace:
$ ./fix-metadata-values.py -i /tmp/2020-10-13-CGSpace-countries.csv -db dspace -u dspace -p 'fuuu' -f cg.coverage.country -t 'correct' -m 228
$ ./fix-metadata-values.py -i /tmp/2020-10-14-CGSpace-regions.csv -db dspace -u dspace -p 'fuuu' -f cg.coverage.region -t 'correct' -m 227
  • Then I started a full re-indexing:
$ time chrt -b 0 ionice -c2 -n7 nice -n19 dspace index-discovery -b

real    88m21.678s
user    7m59.182s
sys     2m22.713s
  • I added a dozen or so more mappings to fix some country outliers on AReS
    • I will start a fresh harvest there once the Discovery update is done on CGSpace
  • I also adjusted my fix-metadata-values.py and delete-metadata-values.py scripts to work on DSpace 6 where there is no more resource_type_id field
    • I will need to do it on a few more scripts as well, but I’ll do that after we migrate to DSpace 6 because those scripts are less important
  • I found a new setting in DSpace 6’s usage-statistics.cfg about case insensitive matching of bots that defaults to false, so I enabled it in our DSpace 6 branch
    • I am curious to see if that resolves the strange issues I noticed yesterday about bot matching of patterns in the spider agents file completely not working

2020-10-15

  • Re-deploy latest code on both CGSpace and DSpace Test to get the input forms changes
    • Run system updates and reboot each server (linode18 and linode26)
    • I had to restart Tomcat seven times on CGSpace before all Solr stats cores came up OK
  • Skype with Peter and Abenet about AReS and CGSpace
    • We agreed to lower case the AGROVOC subjects on CGSpace to make it harmonized with MELSpace and WorldFish
    • We agreed to separate the AGROVOC from the other center- and CRP-specific subjects so that the search and tag clouds are cleaner and more useful
    • We added a filter for journal title
  • I enabled anonymous access to the “Export search metadata” option on DSpace Test
    • If I search for author containing “Orth, Alan” or “Orth Alan” the export search metadata returns HTTP 400
    • If I search for author containing “Orth” it exports a CSV properly…
  • I created some issues on the OpenRXV repository
  • Atmire responded about the Listings and Reports and Content and Usage Statistics issues with DSpace 6 that I reported last week
    • They said that the CUA issue was a mistake and should be fixed in a minor version bump
    • They asked me to confirm if the L&R version bump from last week did not solve the issue there (which I had tested locally, but not on DSpace Test)
    • I will test them both again on DSpace Test and report back
  • I posted a message on Yammer to inform all our users about the changes to countries, regions, and AGROVOC subjects
  • I modified all AGROVOC subjects to be lower case in PostgreSQL and then exported a list of the top 1500 to update the controlled vocabulary in our submission form:
dspace=> BEGIN;
dspace=> UPDATE metadatavalue SET text_value=LOWER(text_value) WHERE resource_type_id=2 AND metadata_field_id=57;
UPDATE 335063
dspace=> COMMIT;
dspace=> \COPY (SELECT DISTINCT text_value as "dc.subject", count(text_value) FROM metadatavalue WHERE resource_type_id=2 AND metadata_field_id=57 GROUP BY "dc.subject" ORDER BY count DESC LIMIT 1500) TO /tmp/2020-10-15-top-1500-agrovoc-subject.csv WITH CSV HEADER;
COPY 1500
  • Use my agrovoc-lookup.py script to validate subject terms against the AGROVOC REST API, extract matches with csvgrep, and then update and format the controlled vocabulary:
$ csvcut -c 1 /tmp/2020-10-15-top-1500-agrovoc-subject.csv | tail -n 1500 > /tmp/subjects.txt
$ ./agrovoc-lookup.py -i /tmp/subjects.txt -o /tmp/subjects.csv -d
$ csvgrep -c 4 -m 0 -i /tmp/subjects.csv | csvcut -c 1 | sed '1d' > dspace/config/controlled-vocabularies/dc-subject.xml
# apply formatting in XML file
$ tidy -xml -utf8 -iq -m -w 0 dspace/config/controlled-vocabularies/dc-subject.xml
  • Then I started a full re-indexing on CGSpace:
$ time chrt -b 0 ionice -c2 -n7 nice -n19 dspace index-discovery -b

real    88m21.678s
user    7m59.182s
sys     2m22.713s

2020-10-18

  • Macaroni Bros wrote to me to ask why some of their CCAFS harvesting is failing
    • They are scraping HTML from /browse responses like this:

https://cgspace.cgiar.org/browse?type=crpsubject&value=Climate+Change%2C+Agriculture+and+Food+Security&XML&rpp=5000

  • They are using the user agent “CCAFS Website Publications importer BOT” so they are getting rate limited by nginx
  • Ideally they would use the REST find-by-metadata-field endpoint, but it is really slow for large result sets (like twenty minutes!):
$ curl -f -H "User-Agent: CCAFS Website Publications importer BOT" -H "Content-Type: application/json" -X POST "https://dspacetest.cgiar.org/rest/items/find-by-metadata-field?limit=100" -d '{"key":"cg.contributor.crp", "value":"Climate Change, Agriculture and Food Security","language": "en_US"}'
  • For now I will whitelist their user agent so that they can continue scraping /browse
  • I figured out that the mappings for AReS are stored in Elasticsearch
    • There is a Kibana interface running on port 5601 that can help explore the values in the index
    • I can interact with Elasticsearch by sending requests, for example to delete an item by its _id:
$ curl -XPOST "localhost:9200/openrxv-values/_delete_by_query" -H 'Content-Type: application/json' -d'
{
  "query": {
    "match": {
      "_id": "64j_THMBiwiQ-PKfCSlI"
    }
  }
}
'
  • I added a new find/replace:
$ curl -XPOST "localhost:9200/openrxv-values/_doc?pretty" -H 'Content-Type: application/json' -d'
{
  "find": "ALAN1",
  "replace": "ALAN2"
}
'
  • I see it in Kibana, and I can search it in Elasticsearch, but I don’t see it in OpenRXV’s mapping values dashboard
  • Now I deleted everything in the openrxv-values index:
$ curl -XDELETE http://localhost:9200/openrxv-values
  • Then I tried posting it again:
$ curl -XPOST "localhost:9200/openrxv-values/_doc?pretty" -H 'Content-Type: application/json' -d'
{
  "find": "ALAN1",
  "replace": "ALAN2"
}
'
  • But I still don’t see it in AReS
  • Interesting! I added a find/replace manually in AReS and now I see the one I POSTed…
  • I fixed a few bugs in the Simple and Extended PDF reports on AReS
    • Add missing ISI Journal and Type to Simple PDF report
    • Fix DOIs in Simple PDF report
    • Add missing “https://hdl.handle.net” to Handles in Extended PDF report
  • Testing Atmire CUA and L&R based on their feedback from a few days ago
    • I no longer get the NullPointerException from CUA when importing metadata on the command line (!)
    • Listings and Reports now shows results for simple queries that I tested (!), though it seems that there are some new JavaScript libraries I need to allow in nginx
  • I sent a mail to the dspace-tech mailing list asking about the error with DSpace 6’s “Export Search Metadata” function
    • If I search for an author like “Orth, Alan” it gives an HTTP 400, but if I search for “Orth” alone it exports a CSV
    • I replicated the same issue on demo.dspace.org

2020-10-19

  • Last night I learned how to POST mappings to Elasticsearch for AReS:
$ curl -XDELETE http://localhost:9200/openrxv-values
$ curl -XPOST http://localhost:9200/openrxv-values/_doc/_bulk -H "Content-Type: application/json" --data-binary @./mapping.json
  • The JSON file looks like this, with one instruction on each line:
{"index":{}}
{ "find": "CRP on Dryland Systems - DS", "replace": "Dryland Systems" }
{"index":{}}
{ "find": "FISH", "replace": "Fish" }
  • Adjust the report templates on AReS based on some of Peter’s feedback
  • I wrote a quick Python script to filter and convert the old AReS mappings to Elasticsearch’s Bulk API format:
#!/usr/bin/env python3

import json
import re

# Skip replacements that end with an acronym like:
#
#   International Livestock Research Institute - ILRI
#
acronym_pattern = re.compile(r"[A-Z]+$")

with open('/tmp/mapping.json', 'r') as f:
    data = json.load(f)

# Iterate over the old mapping file, which is in format "find": "replace", i.e.:
#
#   "alan": "ALAN"
#
# And convert to proper dictionaries for import into Elasticsearch's Bulk API:
#
#   { "find": "alan", "replace": "ALAN" }
#
for find, replace in data.items():
    # Skip all upper and all lower case strings because they are indicative of
    # some AGROVOC or other mappings we no longer want to do
    if find.isupper() or find.islower() or replace.isupper() or replace.islower():
        continue

    if acronym_pattern.search(replace) is not None:
        continue

    mapping = {"find": find, "replace": replace}

    # Print command for Elasticsearch
    print('{"index":{}}')
    print(json.dumps(mapping))
  • It filters all upper and lower case strings as well as any replacements that end in an acronym like “- ILRI”, reducing the number of mappings from around 4,000 to about 900
  • I deleted the existing openrxv-values Elasticsearch core and then POSTed it:
$ ./convert-mapping.py > /tmp/elastic-mappings.txt
$ curl -XDELETE http://localhost:9200/openrxv-values
$ curl -XPOST http://localhost:9200/openrxv-values/_doc/_bulk -H "Content-Type: application/json" --data-binary @/tmp/elastic-mappings.txt
  • Then in AReS I didn’t see the mappings in the dashboard until I added a new one manually, after which they all appeared
    • I started a new harvesting
  • I checked the CIMMYT DSpace repository and I see they have the REST API enabled
    • The data doesn’t look too bad actually: they have countries in title case, AGROVOC in upper case, CRPs, etc
    • According to their OAI they have 6,500 items in the repository
    • I would be interested to explore the possibility to harvest them…
  • Bosede said they were having problems with the “Access” step during item submission
    • I looked at the Munin graphs for PostgreSQL and both connections and locks look normal so I’m not sure what it could be
    • I restarted the PostgreSQL service just to see if that would help
    • She said she was still experiencing the issue…
  • I ran the dspace cleanup -v process on CGSpace and got an error:
Error: ERROR: update or delete on table "bitstream" violates foreign key constraint "bundle_primary_bitstream_id_fkey" on table "bundle"
  Detail: Key (bitstream_id)=(192921) is still referenced from table "bundle".
  • The solution is, as always:
$ psql -d dspace -U dspace -c 'update bundle set primary_bitstream_id=NULL where primary_bitstream_id in (192921);'
UPDATE 1
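  • Since this comes up every time, the fix can be generated straight from the error’s Detail line (a sketch; the SQL matches the one-liner above):

```python
import re

def unset_primary_bitstream_sql(error_detail: str) -> str:
    """Build the usual cleanup fix from the foreign-key error's Detail line."""
    ids = re.findall(r"\(bitstream_id\)=\((\d+)\)", error_detail)
    return ("update bundle set primary_bitstream_id=NULL "
            "where primary_bitstream_id in (%s);" % ", ".join(ids))
```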
  • After looking at the CGSpace Solr stats for 2020-10 I found some hits to purge:
$ ./check-spider-hits.sh -f /tmp/agents -s statistics -u http://localhost:8083/solr -p

Purging 2474 hits from ShortLinkTranslate in statistics
Purging 2568 hits from RI\/1\.0 in statistics
Purging 1851 hits from ILRI Livestock Website Publications importer BOT in statistics
Purging 1282 hits from curl in statistics

Total number of bot hits purged: 8174
  • Add “Infographic” to types in input form
  • Looking into the spider agent issue from last week, where hits seem to be logged regardless of ANY spider agent patterns being loaded
    • I changed the following two options:
      • usage-statistics.logBots = false
      • usage-statistics.bots.case-insensitive = true
    • Then I made several requests with a bot user agent:
$ http --print Hh https://dspacetest.cgiar.org/rest/bitstreams/dfa1d9c3-75d3-4380-a9d3-4c8cbbed2d21/retrieve User-Agent:"RTB website BOT"
$ curl -s 'http://localhost:8083/solr/statistics/update?softCommit=true'
  • And I saw three hits in Solr with isBot: true!!!
    • I made a few more requests with user agent “fumanchu” and it logs them with isBot: false
    • I made a request with user agent “Delphi 2009” which is in the ilri pattern file, and it was logged with isBot: true
    • I made a few more requests and confirmed that if a pattern is in the list it gets logged with isBot: true despite the fact that usage-statistics.logBots is false…
    • So WTF, this means that it knows they are from a bot, but it logs them anyway
    • Is this an issue with Atmire’s modules?
    • I sent them feedback on the ticket

2020-10-21

  • Peter needs to do some reporting on gender across the entirety of CGSpace so he asked me to tag a bunch of items with the AGROVOC “gender” subject (in CGIAR Gender Platform community, all ILRI items with subject “gender” or “women”, all CCAFS with “gender and social inclusion” etc)
    • First I exported the Gender Platform community and tagged all the items there with “gender” in OpenRefine
    • Then I exported all of CGSpace and extracted just the ILRI and other center-specific tags with csvcut:
$ export JAVA_OPTS="-Dfile.encoding=UTF-8 -Xmx2048m"
$ dspace metadata-export -f /tmp/cgspace.csv
$ csvcut -c 'id,dc.subject[],dc.subject[en_US],cg.subject.ilri[],cg.subject.ilri[en_US],cg.subject.alliancebiovciat[],cg.subject.alliancebiovciat[en_US],cg.subject.bioversity[en_US],cg.subject.ccafs[],cg.subject.ccafs[en_US],cg.subject.ciat[],cg.subject.ciat[en_US],cg.subject.cip[],cg.subject.cip[en_US],cg.subject.cpwf[en_US],cg.subject.iita,cg.subject.iita[en_US],cg.subject.iwmi[en_US]' /tmp/cgspace.csv > /tmp/cgspace-subjects.csv
  • Then I went through all center subjects looking for “WOMEN” or “GENDER” and checking if they were missing the associated AGROVOC subject
    • To reduce the size of the CSV file I removed all center subject columns after filtering them, and I flagged all rows that I changed so I could upload a CSV with only the items that were modified
    • In total it was about 1,100 items that I tagged across the Gender Platform community and elsewhere
    • Also, I ran the CSVs through my csv-metadata-quality checker to do basic sanity checks, which ended up removing a few dozen duplicated subjects
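  • The OpenRefine pass can be approximated in Python: flag any row whose center subject columns mention GENDER or WOMEN and whose AGROVOC subjects lack “gender” (a sketch with simplified, hypothetical column handling; DSpace CSVs join multi-value fields with ||):

```python
AGROVOC_COL = "dc.subject[en_US]"
TRIGGERS = ("GENDER", "WOMEN")

def tag_gender(rows, center_cols):
    """Yield only the rows that changed, with 'gender' appended to AGROVOC.

    rows is an iterable of dicts as read by csv.DictReader from a
    metadata-export CSV; center_cols are the center subject column names.
    """
    for row in rows:
        center = "||".join(row.get(c, "") for c in center_cols).upper()
        subjects = [s for s in row.get(AGROVOC_COL, "").split("||") if s]
        if any(t in center for t in TRIGGERS) and "gender" not in subjects:
            row[AGROVOC_COL] = "||".join(subjects + ["gender"])
            yield row
```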

2020-10-22

  • Bosede was getting this error on CGSpace yesterday:
Authorization denied for action WORKFLOW_STEP_1 on COLLECTION:1072 by user 1759
  • Collection 1072 appears to be IITA Miscellaneous
    • The submit step is defined, but has no users or groups
    • I added the IITA submitters there and told Bosede to try again
  • Add two new blocks to list the top communities and collections on AReS