- Add tests for the new `/items` POST handlers to the DSpace 6.x branch of my [dspace-statistics-api](https://github.com/ilri/dspace-statistics-api/tree/v6_x)
- It took a bit of extra work because I had to learn how to mock the responses for when Solr is not available
```
2020-10-06 21:36:04,143 ERROR org.dspace.storage.rdbms.DatabaseRegistryUpdater @ Error attempting to update Bitstream Format and/or Metadata Registries
org.hibernate.exception.ConstraintViolationException: could not execute batch
at org.hibernate.exception.internal.SQLStateConversionDelegate.convert(SQLStateConversionDelegate.java:129)
at org.hibernate.exception.internal.StandardSQLExceptionConverter.convert(StandardSQLExceptionConverter.java:49)
at org.hibernate.engine.jdbc.spi.SqlExceptionHelper.convert(SqlExceptionHelper.java:124)
at org.hibernate.engine.jdbc.batch.internal.BatchingBatch.performExecution(BatchingBatch.java:122)
at org.hibernate.engine.jdbc.batch.internal.BatchingBatch.doExecuteBatch(BatchingBatch.java:101)
at org.hibernate.engine.jdbc.batch.internal.AbstractBatchImpl.execute(AbstractBatchImpl.java:161)
at org.hibernate.engine.jdbc.internal.JdbcCoordinatorImpl.executeBatch(JdbcCoordinatorImpl.java:207)
at org.hibernate.engine.spi.ActionQueue.executeActions(ActionQueue.java:390)
at org.hibernate.engine.spi.ActionQueue.executeActions(ActionQueue.java:304)
at org.hibernate.event.internal.AbstractFlushingEventListener.performExecutions(AbstractFlushingEventListener.java:349)
at org.hibernate.event.internal.DefaultFlushEventListener.onFlush(DefaultFlushEventListener.java:56)
at org.hibernate.internal.SessionImpl.flush(SessionImpl.java:1195)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.hibernate.context.internal.ThreadLocalSessionContext$TransactionProtectionWrapper.invoke(ThreadLocalSessionContext.java:352)
at com.sun.proxy.$Proxy162.flush(Unknown Source)
at org.dspace.core.HibernateDBConnection.commit(HibernateDBConnection.java:83)
at org.dspace.core.Context.commit(Context.java:435)
at org.dspace.core.Context.complete(Context.java:380)
at org.dspace.administer.MetadataImporter.loadRegistry(MetadataImporter.java:164)
at org.dspace.storage.rdbms.DatabaseRegistryUpdater.updateRegistries(DatabaseRegistryUpdater.java:72)
at org.dspace.storage.rdbms.DatabaseRegistryUpdater.afterMigrate(DatabaseRegistryUpdater.java:121)
at org.flywaydb.core.internal.command.DbMigrate$3.doInTransaction(DbMigrate.java:250)
at org.flywaydb.core.internal.util.jdbc.TransactionTemplate.execute(TransactionTemplate.java:72)
at org.flywaydb.core.internal.command.DbMigrate.migrate(DbMigrate.java:246)
at org.flywaydb.core.Flyway$1.execute(Flyway.java:959)
at org.flywaydb.core.Flyway$1.execute(Flyway.java:917)
at org.flywaydb.core.Flyway.execute(Flyway.java:1373)
at org.flywaydb.core.Flyway.migrate(Flyway.java:917)
at org.dspace.storage.rdbms.DatabaseUtils.updateDatabase(DatabaseUtils.java:663)
at org.dspace.storage.rdbms.DatabaseUtils.updateDatabase(DatabaseUtils.java:575)
at org.dspace.storage.rdbms.DatabaseUtils.updateDatabase(DatabaseUtils.java:551)
at org.dspace.core.Context.<clinit>(Context.java:103)
at org.dspace.app.util.AbstractDSpaceWebapp.register(AbstractDSpaceWebapp.java:74)
at org.dspace.app.util.DSpaceWebappListener.contextInitialized(DSpaceWebappListener.java:31)
at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:5197)
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5720)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:183)
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:1016)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:992)
```
- I checked the database migrations with `dspace database info` and they were all OK
- Then I restarted the Tomcat again and it started up OK...
- There were two issues I had reported to Atmire last month:
- Importing items from the command line throws a `NullPointerException` from `com.atmire.dspace.cua.CUASolrLoggerServiceImpl` for every item, but the item still gets imported
- No results for author name in Listings and Reports, despite there being hits in Discovery search
- To test the first one I imported a very simple CSV file containing a single item with minimal metadata
- There is a new error now (but the item does get imported):
```
New item: aff5e78d-87c9-438d-94f8-1050b649961c (10568/108548)
+ New owning collection (10568/3): ILRI articles in journals
+ Added (dc.contributor.author): Orth, Alan
+ Added (dc.date.issued): 2020-09-01
+ Added (dc.title): Testing CUA import NPE
Tue Oct 06 22:06:14 CEST 2020 | Query:containerItem:aff5e78d-87c9-438d-94f8-1050b649961c
Error while updating
org.apache.solr.client.solrj.impl.HttpSolrServer$RemoteSolrException: Expected mime type application/octet-stream but got text/html. <!doctype html><html lang="en"><head><title>HTTP Status 404 – Not Found</title><style type="text/css">body{font-family:Tahoma,Arial,sans-serif;}h1,h2,h3,b{color:white;background-color:#525D76;}h1{font-size:22px;}h2{font-size:16px;}h3{font-size:14px;}p{font-size:12px;}a{color:black;}.line{height:1px;background-color:#525D76;border:none;}</style></head><body><h1>HTTP Status 404 – Not Found</h1><hr class="line" /><p><b>Type</b> Status Report</p><p><b>Message</b> The requested resource [/solr/update] is not available</p><p><b>Description</b> The origin server did not find a current representation for the target resource or is not willing to disclose that one exists.</p><hr class="line" /><h3>Apache Tomcat/7.0.104</h3></body></html>
at org.apache.solr.client.solrj.impl.HttpSolrServer.executeMethod(HttpSolrServer.java:512)
at org.apache.solr.client.solrj.impl.HttpSolrServer.request(HttpSolrServer.java:210)
at org.apache.solr.client.solrj.impl.HttpSolrServer.request(HttpSolrServer.java:206)
at org.apache.solr.client.solrj.request.AbstractUpdateRequest.process(AbstractUpdateRequest.java:124)
at org.apache.solr.client.solrj.SolrServer.commit(SolrServer.java:168)
at com.atmire.dspace.cua.CUASolrLoggerServiceImpl$5.visit(SourceFile:1131)
at com.atmire.dspace.cua.CUASolrLoggerServiceImpl.visitEachStatisticShard(SourceFile:212)
at com.atmire.dspace.cua.CUASolrLoggerServiceImpl.update(SourceFile:1104)
at com.atmire.dspace.cua.CUASolrLoggerServiceImpl.update(SourceFile:1093)
at org.dspace.statistics.StatisticsLoggingConsumer.consume(SourceFile:104)
at org.dspace.event.BasicDispatcher.consume(BasicDispatcher.java:177)
at org.dspace.event.BasicDispatcher.dispatch(BasicDispatcher.java:123)
at org.dspace.core.Context.dispatchEvents(Context.java:455)
at org.dspace.core.Context.commit(Context.java:424)
at org.dspace.core.Context.complete(Context.java:380)
at org.dspace.app.bulkedit.MetadataImport.main(MetadataImport.java:1399)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.dspace.app.launcher.ScriptLauncher.runOneCommand(ScriptLauncher.java:229)
at org.dspace.app.launcher.ScriptLauncher.main(ScriptLauncher.java:81)
```
- Also, I tested Listings and Reports and there are still no hits for "Orth, Alan" as a contributor, despite there being dozens of items in the repository and the Solr query generated by Listings and Reports actually returning hits:
- Then we post an item in JSON format to `/rest/collections/{uuid}/items`:
```
$ http POST https://dspacetest.cgiar.org/rest/collections/f10ad667-2746-4705-8b16-4439abe61d22/items Cookie:JSESSIONID=EABAC9EFF942028AA52DFDA16DBCAFDE <item-object.json
```
- The format of the JSON is:
```
{ "metadata": [
{
"key": "dc.title",
"value": "Testing REST API post",
"language": "en_US"
},
{
"key": "dc.contributor.author",
"value": "Orth, Alan",
"language": "en_US"
},
{
"key": "dc.date.issued",
"value": "2020-09-01",
"language": "en_US"
}
],
"archived":"false",
"withdrawn":"false"
}
```
- What is unclear to me is the `archived` parameter; it seems to do nothing... perhaps it is only used by the `/items` endpoint when printing information about an item
- If I submit to a collection that has a workflow, even as a super admin and with "archived=false" in the JSON, the item enters the workflow ("Awaiting editor's attention")
- If I submit to a new collection without a workflow the item gets archived immediately
- I created [some notes](https://gist.github.com/alanorth/40fc3092aefd78f978cca00e8abeeb7a) to share with Salem and Leroy for future reference when we start discussing POSTing items to the REST API
- I created an account for Salem on DSpace Test and added it to the submitters group of an ICARDA collection with no other workflow steps so we can see what happens
- We are curious to see if he gets a UUID when posting from MEL
- We discussed removing Atmire Listings and Reports from DSpace 6 because we can probably make the same reports in AReS and this module is the one that is currently holding us back from the upgrade
- We discussed allowing partners to submit content via the REST API, and perhaps charging an extra fee for it due to the burden it incurs: unfinished submissions, manual duplicate checking, developer support, etc.
- He was excited about the possibility of using my statistics API for more things on AReS as well as item view pages
- Also I fixed a bunch of the CRP mappings in the AReS value mapper and started a fresh re-indexing
## 2020-10-12
- Looking at CGSpace's Solr statistics for 2020-09 I see:
  - `RTB website BOT`: 212916
  - `Java/1.8.0_66`: 3122
  - `Mozilla/5.0 (compatible; um-LN/1.0; mailto: techinfo@ubermetrics-technologies.com; Windows NT 6.1; WOW64; rv:40.0) Gecko/20100101 Firefox/40.1`: 614
- After a few minutes I saw these four hits in Solr... WTF
- So is there some issue with DSpace's parsing of the spider agent files?
- I added `RTB website BOT` to the ilri pattern file, restarted Tomcat, and made four more requests to the bitstream
- These four requests were recorded in Solr too, WTF!
- It seems like the patterns aren't working at all...
- I decided to try something drastic: I removed all pattern files and added only a single pattern, `bot`, to make sure this was not a syntax or precedence issue
- Now even those four requests were recorded in Solr, WTF!
- I will try one last thing, to put a single entry with the exact pattern `RTB website BOT` in a single spider agents pattern file...
- Nope! Still records the hits... WTF
- As a last resort I tried to use the vanilla [DSpace 6 `example` file](https://github.com/DSpace/DSpace/blob/dspace-6_x/dspace/config/spiders/agents/example)
- And the hits still get recorded... WTF
- So now I'm wondering if this is because of our custom Atmire shit?
- I will have to test on a vanilla DSpace instance I guess before I can complain to the dspace-tech mailing list
- I re-factored the `check-spider-hits.sh` script to read patterns from a text file rather than sed's stdout, and to properly search for spaces in patterns that use `\s` because Lucene's search syntax doesn't support it (and spaces work just fine)
- I added `[Ss]pider` to the Tomcat Crawler Session Manager Valve regex because this can catch a few more generic bots and force them to use the same Tomcat JSESSIONID
- I added a few of the patterns from above to our local agents list and ran the `check-spider-hits.sh` on CGSpace:
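- For the record, the invocation is roughly like this (the options and the patterns file path here are from memory, not the exact command I ran):

```
$ ./check-spider-hits.sh -f /tmp/agent-patterns.txt -s statistics -u http://localhost:8081/solr -p
```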
- We decided to use Title Case for our countries on CGSpace to minimize the need for mapping on AReS
- We did some work to add a dozen more mappings for strange and incorrect CRPs on AReS
- I can update the country metadata in PostgreSQL like this:
```
dspace=> BEGIN;
dspace=> UPDATE metadatavalue SET text_value=INITCAP(text_value) WHERE resource_type_id=2 AND metadata_field_id=228;
UPDATE 51756
dspace=> COMMIT;
```
- I will need to pay special attention to Côte d'Ivoire, Bosnia and Herzegovina, and a few others though... it's probably better to do a search and replace for those using `fix-metadata-values.py` with a CSV of corrections (see the sketch below)
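- A sketch of what that would look like (the script options and file names here are from memory, so double-check them):

```
$ ./fix-metadata-values.py -db dspace -u dspace -p 'dspacepass' -f cg.coverage.country -i /tmp/2020-10-13-fix-countries.csv -t correct -m 228
```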
- Export a list of distinct values from the database:
```
dspace=> \COPY (SELECT DISTINCT(text_value) as "cg.coverage.country" FROM metadatavalue WHERE resource_type_id=2 AND metadata_field_id=228) TO /tmp/2020-10-13-countries.csv WITH CSV HEADER;
COPY 195
```
- Then, in OpenRefine, make a new column for corrections and use this GREL expression to convert to title case: `value.toTitlecase()`
- I still had to double check everything to catch some corner cases (Andorra, Timor-leste, etc)
- For the input forms I found out how to do a complicated search and replace in vim:
- It uses a [negative lookahead](https://jbodah.github.io/blog/2016/11/01/positivenegative-lookaheadlookbehind-vim/) (aka "lookaround" in PCRE?) to match words that are *not* "pair", "displayed", etc because we don't want to edit the XML tags themselves...
- I had to fix a few manually after doing this, as above with PostgreSQL
## 2020-10-14
- I discussed the title casing of countries with Abenet and she suggested we also apply title casing to regions
- I exported the list of regions from the database:
```
dspace=> \COPY (SELECT DISTINCT(text_value) as "cg.coverage.region" FROM metadatavalue WHERE resource_type_id=2 AND metadata_field_id=227) TO /tmp/2020-10-14-regions.csv WITH CSV HEADER;
COPY 34
```
- I did the same as the countries in OpenRefine for the database values and in vim for the input forms
- After testing the replacements locally I ran them on CGSpace:
- I added a dozen or so more mappings to fix some country outliers on AReS
- I will start a fresh harvest there once the Discovery update is done on CGSpace
- I also adjusted my `fix-metadata-values.py` and `delete-metadata-values.py` scripts to work on DSpace 6 where there is no more `resource_type_id` field
- I will need to do it on a few more scripts as well, but I'll do that after we migrate to DSpace 6 because those scripts are less important
- I found a new setting in DSpace 6's `usage-statistics.cfg` about case insensitive matching of bots that defaults to false, so I enabled it in our DSpace 6 branch
- I am curious to see if that resolves the strange issues I noticed yesterday about bot matching of patterns in the spider agents file completely not working
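- For reference, the setting lives in `dspace/config/modules/usage-statistics.cfg`; I believe the property is named as below, but it's worth double-checking against the file:

```
$ grep case-insensitive dspace/config/modules/usage-statistics.cfg
usage-statistics.bots.case-insensitive = true
```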
- Re-deploy latest code on both CGSpace and DSpace Test to get the input forms changes
- Run system updates and reboot each server (linode18 and linode26)
- I had to restart Tomcat seven times on CGSpace before all Solr stats cores came up OK
- Skype with Peter and Abenet about AReS and CGSpace
- We agreed to lower case the AGROVOC subjects on CGSpace to harmonize them with MELSpace and WorldFish
- We agreed to separate the AGROVOC from the other center- and CRP-specific subjects so that the search and tag clouds are cleaner and more useful
- We added a filter for journal title
- I enabled anonymous access to the "Export search metadata" option on DSpace Test
- If I search for author containing "Orth, Alan" or "Orth Alan" the export search metadata returns HTTP 400
- If I search for author containing "Orth" it exports a CSV properly...
- I created issues on the OpenRXV repository:
- [Can't download templates that have spaces in their file name](https://github.com/ilri/OpenRXV/issues/42)
- [Can't search for text values with a space in "Mapping Values" interface](https://github.com/ilri/OpenRXV/issues/43)
- Atmire responded about the Listings and Reports and Content and Usage Statistics issues with DSpace 6 that I reported last week
- They said that the CUA issue was a mistake and should be fixed in a minor version bump
- They asked me to confirm if the L&R version bump from last week did not solve the issue there (which I had tested locally, but not on DSpace Test)
- I will test them both again on DSpace Test and report back
- I posted a message on Yammer to inform all our users about the changes to countries, regions, and AGROVOC subjects
- I modified all AGROVOC subjects to be lower case in PostgreSQL and then exported a list of the top 1500 to update the controlled vocabulary in our submission form:
```
dspace=> BEGIN;
dspace=> UPDATE metadatavalue SET text_value=LOWER(text_value) WHERE resource_type_id=2 AND metadata_field_id=57;
UPDATE 335063
dspace=> COMMIT;
dspace=> \COPY (SELECT DISTINCT text_value as "dc.subject", count(text_value) FROM metadatavalue WHERE resource_type_id=2 AND metadata_field_id=57 GROUP BY "dc.subject" ORDER BY count DESC LIMIT 1500) TO /tmp/2020-10-15-top-1500-agrovoc-subject.csv WITH CSV HEADER;
COPY 1500
```
- Use my `agrovoc-lookup.py` script to validate subject terms against the AGROVOC REST API, extract matches with `csvgrep`, and then update and format the controlled vocabulary:
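- A rough sketch of that workflow (my script's options and the column names are illustrative; `csvgrep` and `csvcut` are from csvkit):

```
$ ./agrovoc-lookup.py -i /tmp/2020-10-15-top-1500-agrovoc-subject.csv -o /tmp/2020-10-15-agrovoc-results.csv
$ csvgrep -c 'number of matches' -r '^[1-9]' /tmp/2020-10-15-agrovoc-results.csv | csvcut -c 'dc.subject' > /tmp/2020-10-15-subjects-matched.csv
```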
- Interesting! I added a find/replace manually in AReS and now I see the one I POSTed...
- I fixed a few bugs in the Simple and Extended PDF reports on AReS
- Add missing ISI Journal and Type to Simple PDF report
- Fix DOIs in Simple PDF report
- Add missing "https://hdl.handle.net" to Handles in Extented PDF report
- Testing Atmire CUA and L&R based on their feedback from a few days ago
- I no longer get the NullPointerException from CUA when importing metadata on the command line (!)
- Listings and Reports now shows results for simple queries that I tested (!), though it seems that there are some new JavaScript libraries I need to allow in nginx
- I sent a mail to the dspace-tech mailing list asking about the error with DSpace 6's "Export Search Metadata" function
- If I search for an author like "Orth, Alan" it gives an HTTP 400, but if I search for "Orth" alone it exports a CSV
- I replicated the same issue on demo.dspace.org
## 2020-10-19
- Last night I learned how to POST mappings to Elasticsearch for AReS:
- The JSON file looks like this, with one instruction on each line:
```
{"index":{}}
{ "find": "CRP on Dryland Systems - DS", "replace": "Dryland Systems" }
{"index":{}}
{ "find": "FISH", "replace": "Fish" }
```
- Adjust the report templates on AReS based on some of Peter's feedback
- I wrote a quick Python script to filter and convert the old AReS mappings to [Elasticsearch's Bulk API](https://www.elastic.co/guide/en/elasticsearch/reference/current/docs-bulk.html) format:
```python
#!/usr/bin/env python3
import json
import re
f = open('/tmp/mapping.json', 'r')
data = json.load(f)
# Iterate over old mapping file, which is in format "find": "replace", ie:
#
# "alan": "ALAN"
#
# And convert to proper dictionaries for import into Elasticsearch's Bulk API:
#
# { "find": "alan", "replace": "ALAN" }
#
for find, replace in data.items():
    # Skip all upper and all lower case strings because they are indicative of
    # some AGROVOC or other mappings we no longer want to do
    if find.isupper() or find.islower() or replace.isupper() or replace.islower():
        continue

    # Skip replacements with acronyms like:
    #
    #   International Livestock Research Institute - ILRI
    #
    if re.search(r"[A-Z]{2,}$", replace):
        continue

    # Print each mapping in Elasticsearch Bulk API format: an action line
    # followed by the mapping document itself
    print('{"index":{}}')
    print(json.dumps({"find": find, "replace": replace}))

f.close()
```
- It filters all upper and lower case strings as well as any replacements that end in an acronym like "- ILRI", reducing the number of mappings from around 4,000 to about 900
- I deleted the existing `openrxv-values` Elasticsearch core and then POSTed it:
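- Roughly like this (the host, port, and NDJSON file path are from memory; the bulk file must end with a newline):

```
$ curl -s -XDELETE 'http://localhost:9200/openrxv-values'
$ curl -s -XPOST -H 'Content-Type: application/x-ndjson' 'http://localhost:9200/openrxv-values/_bulk' --data-binary '@/tmp/elastic-mappings.txt'
```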
- And I saw three hits in Solr with `isBot: true`!!!
- I made a few more requests with user agent "fumanchu" and it logs them with `isBot: false`...
- I made a request with user agent "Delphi 2009" which is in the ilri pattern file, and it was logged with `isBot: true`
- I made a few more requests and confirmed that if a pattern is in the list it gets logged with `isBot: true` despite the fact that `usage-statistics.logBots` is false...
- So WTF, this means that it *knows* they are from a bot, but it logs them anyway
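- I was checking Solr with a query along these lines (the host, port, and core name may differ):

```
$ curl -s 'http://localhost:8081/solr/statistics/select?q=userAgent:fumanchu&fl=time,userAgent,isBot&rows=10&wt=json&indent=true'
```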
- Peter needs to do some reporting on gender across the entirety of CGSpace so he asked me to tag a bunch of items with the AGROVOC "gender" subject (in CGIAR Gender Platform community, all ILRI items with subject "gender" or "women", all CCAFS with "gender and social inclusion" etc)
- First I exported the Gender Platform community and tagged all the items there with "gender" in OpenRefine
- Then I exported all of CGSpace and extracted just the ILRI and other center-specific tags with `csvcut`:
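- Something like this, though the exact column names are from memory and probably not complete:

```
$ csvcut -c 'id,collection,cg.subject.ilri[en_US],cg.subject.ccafs[en_US],cg.subject.cip[en_US]' /tmp/2020-10-cgspace-export.csv > /tmp/2020-10-center-subjects.csv
```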
- Then I went through all center subjects looking for "WOMEN" or "GENDER" and checking if they were missing the associated AGROVOC subject
- To reduce the size of the CSV file I removed all center subject columns after filtering them, and I flagged all rows that I changed so I could upload a CSV with only the items that were modified
- In total it was about 1,100 items that I tagged across the Gender Platform community and elsewhere
- Also, I ran the CSVs through my `csv-metadata-quality` checker to do basic sanity checks, which ended up removing a few dozen duplicated subjects
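- That check is basically a one-liner (the file names here are illustrative, and other options may apply):

```
$ csv-metadata-quality -i /tmp/2020-10-gender-tagged.csv -o /tmp/2020-10-gender-tagged-cleaned.csv
```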