---
title: "November, 2020"
date: 2020-11-01T13:11:54+02:00
author: "Alan Orth"
categories: ["Notes"]
---

## 2020-11-01

- Continue processing the statistics-2019 Solr core with the AtomicStatisticsUpdateCLI tool on DSpace Test (an invocation sketch follows this list)
  - So far we've spent at least fifty hours processing the statistics and statistics-2019 cores... wow.
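
For reference, a sketch of how we invoke the tool via DSpace's `dsrun` launcher; the Atmire class path and the thread/core flags here are assumptions recalled from earlier runs, not a documented interface:

```console
$ chrt -b 0 dspace dsrun com.atmire.statistics.util.update.atomic.AtomicStatisticsUpdateCLI -t 12 -c statistics-2019
```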

## 2020-11-02

- Talk to Moayad and fix a few issues on OpenRXV:
  - Incorrect views and downloads (caused by Elasticsearch's default result set size of 10; see the sketch after this list)
  - Invalid share link
  - Missing "https://" for Handles in the Simple Excel report (caused by using the handle instead of the uri)
  - Sorting the list of items by views
- I resumed the processing of the statistics-2018 Solr core after it had spent 20 hours to get to 60%
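
The views/downloads bug is the classic Elasticsearch default: a search returns only ten hits unless you ask for more. A minimal sketch of the kind of request that avoids it, assuming an index named `openrxv-items` (the index name and endpoint here are assumptions):

```console
$ curl -s 'http://localhost:9200/openrxv-items/_search' -H 'Content-Type: application/json' -d'
{
  "size": 100,
  "track_total_hits": true,
  "query": { "match_all": {} }
}'
```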

## 2020-11-04

- After 29 hours the statistics-2017 core finished processing, so I started the statistics-2016 core on DSpace Test

## 2020-11-05

- Peter sent me corrections and deletions for the author affiliations
  - I quickly proofed them for UTF-8 issues in OpenRefine and csv-metadata-quality, tested them locally, and then applied them on CGSpace (the input CSV shape is sketched below):

```console
$ ./fix-metadata-values.py -i 2020-11-05-fix-862-affiliations.csv -db dspace -u dspace -p 'fuuu' -f cg.contributor.affiliation -t 'correct' -m 211
$ ./delete-metadata-values.py -i 2020-11-05-delete-29-affiliations.csv -db dspace -u dspace -p 'fuuu' -f cg.contributor.affiliation -m 211
```
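
The fix CSV pairs each existing value with its replacement, with column names matching the `-f` and `-t` arguments; the rows below are hypothetical, just to show the shape:

```console
$ head -n2 2020-11-05-fix-862-affiliations.csv
cg.contributor.affiliation,correct
"Intl Livestock Research Inst","International Livestock Research Institute"
```
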
- Then I started a Discovery re-index on CGSpace (chrt -b 0 runs it in the batch scheduling class, ionice -c2 -n7 gives it low I/O priority, and nice -n19 low CPU priority, so it doesn't starve Tomcat):

```console
$ time chrt -b 0 ionice -c2 -n7 nice -n19 dspace index-discovery -b

real    92m24.993s
user    8m11.858s
sys     2m26.931s
```

## 2020-11-06

- Restart the AtomicStatisticsUpdateCLI processing of the statistics-2016 core on DSpace Test after 20 hours...
  - This phase finished after five hours, so I started it on the statistics-2015 core

## 2020-11-07

- Atmire responded about the issue with duplicate values in owningComm, containerCommunity, etc.
  - I told them to please look into it and use some of our credits if need be
- The statistics-2015 core finished after 20 hours, so I started the statistics-2014 core

## 2020-11-08

  • Add "Data Paper" to types on CGSpace
  • Add "SCALING CLIMATE-SMART AGRICULTURE" to CCAFS subjects on CGSpace
  • Add "ANDEAN ROOTS AND TUBERS" to CIP subjects on CGSpace
  • Add CGIAR System subjects to Discovery sidebar facets on CGSpace
    • Also add the System subject to item view on CGSpace
  • The statistics-2014 core finished processing after five hours, so I started processing the statistics-2013 core on DSpace Test
  • Since I was going to restart CGSpace and update the Discovery indexes anyways I decided to check for any straggling upper case AGROVOC entries and lower case them:

```console
dspace=# BEGIN;
dspace=# UPDATE metadatavalue SET text_value=LOWER(text_value) WHERE resource_type_id=2 AND metadata_field_id=57 AND text_value ~ '[[:upper:]]';
UPDATE 164
dspace=# COMMIT;
```
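
To sanity-check before committing, the same predicate can be run as a count; a sketch using the field ID from the update above:

```console
dspace=# SELECT COUNT(*) FROM metadatavalue WHERE resource_type_id=2 AND metadata_field_id=57 AND text_value ~ '[[:upper:]]';
```
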
- Run system updates on CGSpace (linode18) and reboot it
  - I had to restart Tomcat once after the machine started up to get all Solr statistics cores to load properly
- After about ten more hours the rest of the Solr statistics cores finished processing on DSpace Test, so I started optimizing them in the Solr admin UI (see the curl equivalent below)
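
Optimizing doesn't require the UI; Solr's update handler accepts it directly. A sketch for one yearly core (repeat per core):

```console
$ curl "http://localhost:8081/solr/statistics-2019/update?optimize=true"
```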

## 2020-11-10

- I am noticing that CGSpace doesn't have any statistics showing for years before 2020, but all cores are loaded successfully in the Solr Admin UI... strange
  - I restarted Tomcat and I see in the Solr Admin UI that the statistics-2015 core failed to load
  - Looking in the DSpace log I see:

```
2020-11-10 08:43:59,634 INFO  org.dspace.statistics.SolrLogger @ Loading core with name: statistics-2015
2020-11-10 08:43:59,687 INFO  org.dspace.statistics.SolrLogger @ Loading core with name: statistics-2018
2020-11-10 08:43:59,707 INFO  org.dspace.statistics.SolrLogger @ Loading core with name: statistics-2015
2020-11-10 08:44:00,004 WARN  org.dspace.core.ConfigurationManager @ Requested configuration module: atmire-datatables not found
2020-11-10 08:44:00,005 WARN  org.dspace.core.ConfigurationManager @ Requested configuration module: atmire-datatables not found
2020-11-10 08:44:00,005 WARN  org.dspace.core.ConfigurationManager @ Requested configuration module: atmire-datatables not found
2020-11-10 08:44:00,325 INFO  org.dspace.statistics.SolrLogger @ Created core with name: statistics-2015
```

- It seems that the core gets probed twice... perhaps a threading issue?
  - The only thing I can think of is the acceptorThreadCount parameter in Tomcat's server.xml, which has been set to 2 since 2018-01 (we started sharding the Solr statistics cores in 2019-01, and that's when this problem arose)
  - I will try reducing that to 1
  - Wow, now it's even worse:

```
2020-11-10 08:51:03,007 INFO  org.dspace.statistics.SolrLogger @ Created core with name: statistics-2018
2020-11-10 08:51:03,008 INFO  org.dspace.statistics.SolrLogger @ Loading core with name: statistics-2015
2020-11-10 08:51:03,137 INFO  org.dspace.statistics.SolrLogger @ Loading core with name: statistics-2018
2020-11-10 08:51:03,153 INFO  org.dspace.statistics.SolrLogger @ Loading core with name: statistics-2015
2020-11-10 08:51:03,289 INFO  org.dspace.statistics.SolrLogger @ Created core with name: statistics-2015
2020-11-10 08:51:03,289 INFO  org.dspace.statistics.SolrLogger @ Loading core with name: statistics-2010
2020-11-10 08:51:03,475 INFO  org.dspace.statistics.SolrLogger @ Created core with name: statistics-2010
2020-11-10 08:51:03,475 INFO  org.dspace.statistics.SolrLogger @ Loading core with name: statistics-2016
2020-11-10 08:51:03,730 INFO  org.dspace.statistics.SolrLogger @ Created core with name: statistics-2016
2020-11-10 08:51:03,731 INFO  org.dspace.statistics.SolrLogger @ Loading core with name: statistics-2017
2020-11-10 08:51:03,992 INFO  org.dspace.statistics.SolrLogger @ Created core with name: statistics-2017
2020-11-10 08:51:03,992 INFO  org.dspace.statistics.SolrLogger @ Loading core with name: statistics-2011
2020-11-10 08:51:04,178 INFO  org.dspace.statistics.SolrLogger @ Created core with name: statistics-2011
2020-11-10 08:51:04,178 INFO  org.dspace.statistics.SolrLogger @ Loading core with name: statistics-2012
```

- Could it be because we have two Tomcat connectors?
  - I restarted Tomcat a few more times before all cores loaded, and there are still no stats before 2020-01... hmmmmm
- I added a lowercase formatter to OpenRXV so that we can lowercase AGROVOC subjects during harvesting

## 2020-11-11

- Atmire responded with a quote for the work to fix the duplicate owningComm, etc. in our Solr data
  - I told them to proceed, as it's within our budget of credits
  - They will write a processor for DSpace 6 to remove the duplicates
- I did some tests to add a usage statistics chart to the item views on DSpace Test
  - It is inspired by Salem's work on WorldFish's repository: it hits the dspace-statistics-api for the current item and displays a graph (a hedged request sketch follows this list)
  - I got it working very easily for all-time statistics with Chart.js, but I think I will need to use Highcharts or something else, because Chart.js renders to an HTML5 canvas and doesn't allow theming via CSS (so our Bootstrap brand colors for each theme won't work)
  - I think I'll pursue this after the DSpace 6 upgrade...
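
The chart itself only needs the per-item totals from the dspace-statistics-api; a hedged sketch of such a request, where the host, path, item ID, and response shape are all assumptions for illustration:

```console
$ curl -s 'https://cgspace.cgiar.org/rest/statistics/item/12345'
{"id": 12345, "views": 21, "downloads": 4}
```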

## 2020-11-12

- I was looking at Solr again, trying to find a way to get community and collection stats by faceting on owningComm and owningColl, and it actually seems to work
  - The duplicated values in the multi-value fields don't seem to affect the counts, as I had previously thought they would (though we should still get rid of them)
  - One major difference between the raw numbers I was looking at and Atmire's numbers is that Atmire's code filters "Internal" IP addresses...
  - Also, instead of doing isBot:false I think I should do -isBot:true, because it's not a given that all documents will have this field at all, but we can definitely exclude the ones that have it set to true (a query sketch follows)
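
A minimal sketch of the negative filter, assuming the standard DSpace statistics fields (type:2 restricts to item views):

```console
$ curl -s "http://localhost:8081/solr/statistics/select" --data-urlencode "q=type:2 AND -isBot:true" --data-urlencode "rows=0"
```
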
- First we get the total number of communities with stats (using calcdistinct):

```
facet=true&facet.field=owningComm&facet.mincount=1&facet.limit=1&facet.offset=0&stats=true&stats.field=owningComm&stats.calcdistinct=true&shards=http://localhost:8081/solr/statistics,http://localhost:8081/solr/statistics-2019,http://localhost:8081/solr/statistics-2018,http://localhost:8081/solr/statistics-2017,http://localhost:8081/solr/statistics-2016,http://localhost:8081/solr/statistics-2015,http://localhost:8081/solr/statistics-2014,http://localhost:8081/solr/statistics-2013,http://localhost:8081/solr/statistics-2012,http://localhost:8081/solr/statistics-2011,http://localhost:8081/solr/statistics-2010
```

- Then get the stats themselves, iterating 100 items at a time with limit and offset (see the loop sketch below):

```
facet=true&facet.field=owningComm&facet.mincount=1&facet.limit=100&facet.offset=0&shards=http://localhost:8081/solr/statistics,http://localhost:8081/solr/statistics-2019,http://localhost:8081/solr/statistics-2018,http://localhost:8081/solr/statistics-2017,http://localhost:8081/solr/statistics-2016,http://localhost:8081/solr/statistics-2015,http://localhost:8081/solr/statistics-2014,http://localhost:8081/solr/statistics-2013,http://localhost:8081/solr/statistics-2012,http://localhost:8081/solr/statistics-2011,http://localhost:8081/solr/statistics-2010
```
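
Wrapped in a loop, the pagination might look like this; the query and offset range are assumptions for illustration, and the shards parameter from above is omitted for brevity:

```bash
# page through the owningComm facet 100 values at a time (sketch)
for offset in $(seq 0 100 1200); do
  curl -s 'http://localhost:8081/solr/statistics/select' \
    --data-urlencode 'q=type:2' \
    --data-urlencode 'facet=true' \
    --data-urlencode 'facet.field=owningComm' \
    --data-urlencode 'facet.mincount=1' \
    --data-urlencode 'facet.limit=100' \
    --data-urlencode "facet.offset=${offset}" \
    --data-urlencode 'rows=0'
done
```
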
- I was surprised to see 10,000,000 docs with isBot:true when I was testing on DSpace Test...
  - This has got to be a mistake of some kind, as I see 4 million in 2014 that are from dns:localhost.; perhaps that's from when we didn't have useProxies set up correctly?
  - I don't see the same thing on CGSpace... I wonder what happened?
  - Perhaps they got re-tagged during the DSpace 6 upgrade, somehow during the Solr migration? Hmmmmm. We definitely have to be careful with isBot:true in the future and not automatically purge these!!! (A facet sketch for investigating them follows.)
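
To see where those isBot:true documents come from, faceting on the dns field is a quick check; a sketch against one yearly core:

```console
$ curl -s "http://localhost:8081/solr/statistics-2014/select" --data-urlencode "q=isBot:true" --data-urlencode "facet=true" --data-urlencode "facet.field=dns" --data-urlencode "facet.mincount=1" --data-urlencode "rows=0"
```
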
- I noticed 120,000+ hits from monit, FeedBurner, and Blackboard SafeAssign in 2014, 2015, 2016, 2017, etc...
  - I hadn't seen monit before, but the others have been in DSpace's spider agents lists for some time, so they probably only appear in the older stats cores
  - The issue with purging these using check-spider-hits.sh is that it can't do case-insensitive regexes, and some metacharacters like \s don't work, so I added case-sensitive patterns to a local agents file and purged them with the script
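
The case-sensitivity limitation is inherited from Lucene's regex syntax, which is case-sensitive and doesn't support Perl-style classes like \s; character classes are the workaround. A hypothetical check for one agent against a yearly core:

```console
$ curl -s "http://localhost:8081/solr/statistics-2014/select" --data-urlencode "q=userAgent:/.*[Mm]onit.*/" --data-urlencode "rows=0"
```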