CGSpace Notes

Documenting day-to-day work on the CGSpace repository.

July, 2018

2018-07-01

  • I want to upgrade DSpace Test to DSpace 5.8 so I took a backup of its current database just in case:

    $ pg_dump -b -v -o --format=custom -U dspace -f dspace-2018-07-01.backup dspace
    
  • During the mvn package stage on the 5.8 branch I kept getting issues with Java running out of memory:

    There is insufficient memory for the Java Runtime Environment to continue.
    
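  • One workaround I’d try first (a sketch; 1024m is an assumption, the right heap size depends on the build host’s RAM) is to give Maven a larger heap before re-running the build:

```shell
# Hypothetical fix: raise Maven's maximum JVM heap before re-running `mvn package`
export MAVEN_OPTS="-Xmx1024m -Dfile.encoding=UTF-8"
echo "$MAVEN_OPTS"
```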
Read more →

June, 2018

2018-06-04

  • Test the DSpace 5.8 module upgrades from Atmire (#378)
    • There seems to be a problem with the CUA and L&R versions in pom.xml because they are using SNAPSHOT and it doesn’t build
  • I added the new CCAFS Phase II Project Tag PII-FP1_PACCA2 and merged it into the 5_x-prod branch (#379)
  • I proofed and tested the ILRI author corrections that Peter sent back to me this week:

    $ ./fix-metadata-values.py -i /tmp/2018-05-30-Correct-660-authors.csv -db dspace -u dspace -p 'fuuu' -f dc.contributor.author -t correct -m 3 -n
    
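  • For reference, the input CSV pairs each original value with its correction, with column names matching the -f and -t arguments (a sketch; the author name below is invented for illustration):

```csv
dc.contributor.author,correct
"Doe, J.","Doe, Jane"
```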
  • I think a sane proofing workflow in OpenRefine is to apply the custom text facets for check/delete/remove and illegal characters that I developed in March, 2018

  • Time to index ~70,000 items on CGSpace:

    $ time schedtool -D -e ionice -c2 -n7 nice -n19 [dspace]/bin/dspace index-discovery -b                                  
    
    real    74m42.646s
    user    8m5.056s
    sys     2m7.289s
    
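  • For the record, the wrapper commands only lower the job’s priority, the indexing command itself is unchanged (flag meanings per their man pages: schedtool -D -e runs the command under the idle CPU scheduling policy, ionice -c2 -n7 uses the best-effort I/O class at its lowest priority, and nice -n19 is the lowest conventional niceness). A minimal runnable demo of the same pattern with plain nice:

```shell
# Run a command at the lowest CPU niceness; the wrapped command is unchanged
nice -n 19 echo "indexing would run here at low priority"
# -> indexing would run here at low priority
```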
Read more →

May, 2018

2018-05-01

Read more →

April, 2018

2018-04-01

  • I tried to test something on DSpace Test but noticed that it’s been down since god knows when
  • Catalina logs at least show some memory errors yesterday:
Read more →

February, 2018

2018-02-01

  • Peter gave feedback on the dc.rights proof of concept that I had sent him last week
  • We don’t need to distinguish between internal and external works, so that makes it just a simple list
  • Yesterday I figured out how to monitor DSpace sessions using JMX
  • I copied the logic in the jmx_tomcat_dbpools plugin provided by Ubuntu’s munin-plugins-java package and used the stuff I discovered about JMX in 2018-01
Read more →

January, 2018

2018-01-02

  • Uptime Robot noticed that CGSpace went down and up a few times last night, for a few minutes each time
  • I didn’t get any load alerts from Linode and the REST and XMLUI logs don’t show anything out of the ordinary
  • The nginx logs show HTTP 200s until 02/Jan/2018:11:27:17 +0000 when Uptime Robot got an HTTP 500
  • In dspace.log around that time I see many errors like “Client closed the connection before file download was complete”
  • And just before that I see this:

    Caused by: org.apache.tomcat.jdbc.pool.PoolExhaustedException: [http-bio-127.0.0.1-8443-exec-980] Timeout: Pool empty. Unable to fetch a connection in 5 seconds, none available[size:50; busy:50; idle:0; lastwait:5000].
    
  • Ah hah! So the pool was actually empty!

  • I need to increase that; let’s try bumping it up from 50 to 75

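  • In DSpace 5 the pool size is controlled by db.maxconnections in dspace.cfg (property name as in stock DSpace 5; worth verifying against the local config before deploying):

```
# dspace.cfg: maximum number of connections in the database pool
db.maxconnections = 75
```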
  • After that one client got an HTTP 499 but then the rest were HTTP 200, so I don’t know what the hell Uptime Robot saw

  • I notice this error quite a few times in dspace.log:

    2018-01-02 01:21:19,137 ERROR org.dspace.app.xmlui.aspect.discovery.SidebarFacetsTransformer @ Error while searching for sidebar facets
    org.dspace.discovery.SearchServiceException: org.apache.solr.search.SyntaxError: Cannot parse 'dateIssued_keyword:[1976+TO+1979]': Encountered " "]" "] "" at line 1, column 32.
    
  • And there are many of these errors every day for the past month:

    $ grep -c "Error while searching for sidebar facets" dspace.log.*
    dspace.log.2017-11-21:4
    dspace.log.2017-11-22:1
    dspace.log.2017-11-23:4
    dspace.log.2017-11-24:11
    dspace.log.2017-11-25:0
    dspace.log.2017-11-26:1
    dspace.log.2017-11-27:7
    dspace.log.2017-11-28:21
    dspace.log.2017-11-29:31
    dspace.log.2017-11-30:15
    dspace.log.2017-12-01:15
    dspace.log.2017-12-02:20
    dspace.log.2017-12-03:38
    dspace.log.2017-12-04:65
    dspace.log.2017-12-05:43
    dspace.log.2017-12-06:72
    dspace.log.2017-12-07:27
    dspace.log.2017-12-08:15
    dspace.log.2017-12-09:29
    dspace.log.2017-12-10:35
    dspace.log.2017-12-11:20
    dspace.log.2017-12-12:44
    dspace.log.2017-12-13:36
    dspace.log.2017-12-14:59
    dspace.log.2017-12-15:104
    dspace.log.2017-12-16:53
    dspace.log.2017-12-17:66
    dspace.log.2017-12-18:83
    dspace.log.2017-12-19:101
    dspace.log.2017-12-20:74
    dspace.log.2017-12-21:55
    dspace.log.2017-12-22:66
    dspace.log.2017-12-23:50
    dspace.log.2017-12-24:85
    dspace.log.2017-12-25:62
    dspace.log.2017-12-26:49
    dspace.log.2017-12-27:30
    dspace.log.2017-12-28:54
    dspace.log.2017-12-29:68
    dspace.log.2017-12-30:89
    dspace.log.2017-12-31:53
    dspace.log.2018-01-01:45
    dspace.log.2018-01-02:34
    
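  • To spot the worst days at a glance, the same grep -c output can be sorted numerically on the count field (sample lines are inlined below so the sketch runs standalone; on the server you would pipe the grep output directly):

```shell
# Rank log files by error count, highest first; the count is field 2 after the ":"
printf 'dspace.log.2017-12-15:104\ndspace.log.2017-12-19:101\ndspace.log.2017-11-25:0\n' \
  | sort -t: -k2 -rn | head -n 1
# -> dspace.log.2017-12-15:104
```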
  • Danny wrote to ask for help renewing the wildcard ilri.org certificate and I advised that we should probably use Let’s Encrypt if it’s just a handful of domains

Read more →

December, 2017

2017-12-01

  • Uptime Robot noticed that CGSpace went down
  • The logs say “Timeout waiting for idle object”
  • PostgreSQL activity says there are 115 connections currently
  • The list of connections to XMLUI and REST API for today:
Read more →

November, 2017

2017-11-01

  • The CORE developers responded to say they are looking into their bot not respecting our robots.txt

2017-11-02

  • Today there have been no hits by CORE and no alerts from Linode (coincidence?)

    # grep -c "CORE" /var/log/nginx/access.log
    0
    
  • Generate list of authors on CGSpace for Peter to go through and correct:

    dspace=# \copy (select distinct text_value, count(*) as count from metadatavalue where metadata_field_id = (select metadata_field_id from metadatafieldregistry where element = 'contributor' and qualifier = 'author') AND resource_type_id = 2 group by text_value order by count desc) to /tmp/authors.csv with csv;
    COPY 54701
    
Read more →

October, 2017

2017-10-01

  • Peter emailed to point out that many items in the ILRI archive collection have multiple handles:

    http://hdl.handle.net/10568/78495||http://hdl.handle.net/10568/79336
    
  • There appears to be a pattern, but I’ll have to look a bit closer and try to clean them up automatically, either in SQL or in OpenRefine
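  • One way to look for them in SQL (a sketch against the stock DSpace 5 schema, mirroring the metadatafieldregistry lookup used for the authors export above; verify the field and the ‘||’ pattern before changing anything):

```sql
-- Find items whose Handle URI field contains two handles joined by "||"
SELECT resource_id, text_value
FROM metadatavalue
WHERE metadata_field_id = (
    SELECT metadata_field_id FROM metadatafieldregistry
    WHERE element = 'identifier' AND qualifier = 'uri'
)
AND resource_type_id = 2
AND text_value LIKE '%||%';
```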

  • Add Katherine Lutz to the groups for content submission and edit steps of the CGIAR System collections

Read more →