---
title: "November, 2020"
date: 2020-11-01T13:11:54+02:00
author: "Alan Orth"
categories: ["Notes"]
---
## 2020-11-01
- Continue with processing the statistics-2019 Solr core with the AtomicStatisticsUpdateCLI tool on DSpace Test
- So far we've spent at least fifty hours processing the statistics and statistics-2019 cores... wow.
<!--more-->
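- For reference, a typical invocation of the tool looks roughly like this (a sketch based on earlier runs; the thread count and core name are parameters we vary per run):
```
$ chrt -b 0 dspace dsrun com.atmire.statistics.util.update.atomic.AtomicStatisticsUpdateCLI -t 12 -c statistics-2019
```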
## 2020-11-02
- Talk to Moayad and fix a few issues on OpenRXV:
  - Incorrect views and downloads (caused by Elasticsearch's default result set size of 10; see the sketch at the end of this section)
  - Invalid share link
  - Missing "https://" for Handles in the Simple Excel report (caused by using the `handle` instead of the `uri`)
  - Sorting the list of items by views
- I resumed the processing of the statistics-2018 Solr core after it had spent 20 hours getting to 60%
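- Regarding the views and downloads issue above: Elasticsearch only returns ten hits per query by default, so anything that aggregates them needs an explicit `size` (a minimal sketch; the index name is an assumption):
```
$ curl -s 'http://localhost:9200/openrxv-items/_search' -H 'Content-Type: application/json' -d '{"size": 10000, "query": {"match_all": {}}}'
```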
## 2020-11-04
- After 29 hours the statistics-2017 core finished processing so I started the statistics-2016 core on DSpace Test
## 2020-11-05
- Peter sent me corrections and deletions for the author affiliations
- I quickly proofed them for UTF-8 issues in OpenRefine and csv-metadata-quality and then tested them locally and then applied them on CGSpace:
```
$ ./fix-metadata-values.py -i 2020-11-05-fix-862-affiliations.csv -db dspace -u dspace -p 'fuuu' -f cg.contributor.affiliation -t 'correct' -m 211
$ ./delete-metadata-values.py -i 2020-11-05-delete-29-affiliations.csv -db dspace -u dspace -p 'fuuu' -f cg.contributor.affiliation -m 211
```
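- The CSVs for these scripts pair a column named after the metadata field with a column of replacements, matching the `-f` and `-t` flags above (a sketch with a made-up example row for illustration):
```
cg.contributor.affiliation,correct
"Internationl Livestock Research Institute","International Livestock Research Institute"
```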
- Then I started a Discovery re-index on CGSpace:
```
$ time chrt -b 0 ionice -c2 -n7 nice -n19 dspace index-discovery -b
real 92m24.993s
user 8m11.858s
sys 2m26.931s
```
## 2020-11-06
- Restart the AtomicStatisticsUpdateCLI processing of the statistics-2016 core on DSpace Test after 20 hours...
- This phase finished after five hours so I started it on the statistics-2015 core
## 2020-11-07
- Atmire responded about the issue with duplicate values in `owningComm`, `containerCommunity`, etc.
- I told them to please look into it and use some of our credits if need be
- The statistics-2015 core finished after 20 hours so I started the statistics-2014 core
## 2020-11-08
- Add "Data Paper" to types on CGSpace
- Add "SCALING CLIMATE-SMART AGRICULTURE" to CCAFS subjects on CGSpace
- Add "ANDEAN ROOTS AND TUBERS" to CIP subjects on CGSpace
- Add CGIAR System subjects to Discovery sidebar facets on CGSpace
- Also add the System subject to item view on CGSpace
- The statistics-2014 core finished processing after five hours, so I started processing the statistics-2013 core on DSpace Test
- Since I was going to restart CGSpace and update the Discovery indexes anyway, I decided to check for any straggling uppercase AGROVOC entries and lowercase them:
```
dspace=# BEGIN;
dspace=# UPDATE metadatavalue SET text_value=LOWER(text_value) WHERE resource_type_id=2 AND metadata_field_id=57 AND text_value ~ '[[:upper:]]';
UPDATE 164
dspace=# COMMIT;
```
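- A quick sanity check before committing such an update is to count the matching rows with the same predicate first:
```
dspace=# SELECT COUNT(*) FROM metadatavalue WHERE resource_type_id=2 AND metadata_field_id=57 AND text_value ~ '[[:upper:]]';
```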
- Run system updates on CGSpace (linode18) and reboot it
- I had to restart Tomcat once after the machine started up to get all Solr statistics cores to load properly
- After about ten more hours the rest of the Solr statistics cores finished processing on DSpace Test, and I started optimizing them in the Solr Admin UI
## 2020-11-10
- I am noticing that CGSpace doesn't have any statistics showing for years before 2020, but all cores are loaded successfully in Solr Admin UI... strange
- I restarted Tomcat and I see in Solr Admin UI that the statistics-2015 core failed to load
- Looking in the DSpace log I see:
```
2020-11-10 08:43:59,634 INFO org.dspace.statistics.SolrLogger @ Loading core with name: statistics-2015
2020-11-10 08:43:59,687 INFO org.dspace.statistics.SolrLogger @ Loading core with name: statistics-2018
2020-11-10 08:43:59,707 INFO org.dspace.statistics.SolrLogger @ Loading core with name: statistics-2015
2020-11-10 08:44:00,004 WARN org.dspace.core.ConfigurationManager @ Requested configuration module: atmire-datatables not found
2020-11-10 08:44:00,005 WARN org.dspace.core.ConfigurationManager @ Requested configuration module: atmire-datatables not found
2020-11-10 08:44:00,005 WARN org.dspace.core.ConfigurationManager @ Requested configuration module: atmire-datatables not found
2020-11-10 08:44:00,325 INFO org.dspace.statistics.SolrLogger @ Created core with name: statistics-2015
```
- Seems that the core gets probed twice... perhaps a threading issue?
- The only thing I can think of is the `acceptorThreadCount` parameter in Tomcat's server.xml, which has been set to 2 since 2018-01 (we started sharding the Solr statistics cores in 2019-01 and that's when this problem arose)
- I will try reducing that to 1
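- For reference, the setting lives on the `<Connector>` element in Tomcat's server.xml, something like this (a sketch; the port, protocol, and timeout here are placeholders, not our actual config):
```
<Connector port="8443" protocol="org.apache.coyote.http11.Http11NioProtocol"
           acceptorThreadCount="1"
           connectionTimeout="20000" />
```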
- Wow, now it's even worse:
```
2020-11-10 08:51:03,007 INFO org.dspace.statistics.SolrLogger @ Created core with name: statistics-2018
2020-11-10 08:51:03,008 INFO org.dspace.statistics.SolrLogger @ Loading core with name: statistics-2015
2020-11-10 08:51:03,137 INFO org.dspace.statistics.SolrLogger @ Loading core with name: statistics-2018
2020-11-10 08:51:03,153 INFO org.dspace.statistics.SolrLogger @ Loading core with name: statistics-2015
2020-11-10 08:51:03,289 INFO org.dspace.statistics.SolrLogger @ Created core with name: statistics-2015
2020-11-10 08:51:03,289 INFO org.dspace.statistics.SolrLogger @ Loading core with name: statistics-2010
2020-11-10 08:51:03,475 INFO org.dspace.statistics.SolrLogger @ Created core with name: statistics-2010
2020-11-10 08:51:03,475 INFO org.dspace.statistics.SolrLogger @ Loading core with name: statistics-2016
2020-11-10 08:51:03,730 INFO org.dspace.statistics.SolrLogger @ Created core with name: statistics-2016
2020-11-10 08:51:03,731 INFO org.dspace.statistics.SolrLogger @ Loading core with name: statistics-2017
2020-11-10 08:51:03,992 INFO org.dspace.statistics.SolrLogger @ Created core with name: statistics-2017
2020-11-10 08:51:03,992 INFO org.dspace.statistics.SolrLogger @ Loading core with name: statistics-2011
2020-11-10 08:51:04,178 INFO org.dspace.statistics.SolrLogger @ Created core with name: statistics-2011
2020-11-10 08:51:04,178 INFO org.dspace.statistics.SolrLogger @ Loading core with name: statistics-2012
```
- Could it be because we have two Tomcat connectors?
- I restarted Tomcat a few more times before all cores loaded, and still there are no stats before 2020-01... hmmmmm (see the core status sketch below)
- I added a [lowercase formatter to OpenRXV](https://github.com/ilri/OpenRXV/commit/3816b9b3f3d9182d2ba1a899c1017c5895a59dee) so that we can lowercase AGROVOC subjects during harvesting
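- A faster way to check which statistics cores actually loaded than clicking around the Admin UI is Solr's CoreAdmin STATUS API (a sketch):
```
$ curl -s 'http://localhost:8081/solr/admin/cores?action=STATUS&wt=json' | grep -o '"name":"statistics[^"]*"'
```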
## 2020-11-11
- Atmire responded with a quote for the work to fix the duplicate `owningComm`, etc. in our Solr data
- I told them to proceed, as it's within our budget of credits
- They will write a processor for DSpace 6 to remove the duplicates
- I did some tests to add a usage statistics chart to the item views on DSpace Test
- It is inspired by Salem's work on WorldFish's repository, and it hits the dspace-statistics-api for the current item and displays a graph (see the sketch at the end of this section)
- I got it working very easily for all-time statistics with Chart.js, but I think I will need to use Highcharts or something else because Chart.js is HTML5 canvas and doesn't allow theming via CSS (so our Bootstrap brand colors for each theme won't work)
- Hmm, Highcharts is not licensed under an open source license, so I will not use it
- Perhaps I'll use Chartist with the popover plugin...
- I think I'll pursue this after the DSpace 6 upgrade...
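- For reference, the chart gets everything it needs from a single request to the dspace-statistics-api, which returns a small JSON object with the item's views and downloads (a sketch; the item ID is a placeholder):
```
$ curl -s 'https://cgspace.cgiar.org/rest/statistics/item/110447'
```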
## 2020-11-12
- I was looking at Solr again trying to find a way to get community and collection stats by faceting on `owningComm` and `owningColl` and it seems to work actually
- The duplicated values in the multi-value fields don't seem to affect the counts, as I had thought previously (though we should still get rid of them)
- One major difference between the raw numbers I was looking at and Atmire's numbers is that Atmire's code filters "Internal" IP addresses...
- Also, instead of doing `isBot:false` I think I should do `-isBot:true`, because it's not a given that all documents will have this field set to false, but we can definitely exclude the ones that have it set to true (see the example after the queries below)
- First we get the total number of communities with stats (using `stats.calcdistinct`):
```
facet=true&facet.field=owningComm&facet.mincount=1&facet.limit=1&facet.offset=0&stats=true&stats.field=owningComm&stats.calcdistinct=true&shards=http://localhost:8081/solr/statistics,http://localhost:8081/solr/statistics-2019,http://localhost:8081/solr/statistics-2018,http://localhost:8081/solr/statistics-2017,http://localhost:8081/solr/statistics-2016,http://localhost:8081/solr/statistics-2015,http://localhost:8081/solr/statistics-2014,http://localhost:8081/solr/statistics-2013,http://localhost:8081/solr/statistics-2012,http://localhost:8081/solr/statistics-2011,http://localhost:8081/solr/statistics-2010
```
- Then we get the stats themselves, iterating 100 items at a time with `facet.limit` and `facet.offset`:
```
facet=true&facet.field=owningComm&facet.mincount=1&facet.limit=100&facet.offset=0&shards=http://localhost:8081/solr/statistics,http://localhost:8081/solr/statistics-2019,http://localhost:8081/solr/statistics-2018,http://localhost:8081/solr/statistics-2017,http://localhost:8081/solr/statistics-2016,http://localhost:8081/solr/statistics-2015,http://localhost:8081/solr/statistics-2014,http://localhost:8081/solr/statistics-2013,http://localhost:8081/solr/statistics-2012,http://localhost:8081/solr/statistics-2011,http://localhost:8081/solr/statistics-2010
```
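- Putting this together with the bot filter discussed above, a query for item views might look like this (a sketch, omitting the shards parameter):
```
q=type:2&fq=-isBot:true&fq=statistics_type:view&facet=true&facet.field=owningComm&facet.mincount=1&facet.limit=100&facet.offset=0
```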
- I was surprised to see 10,000,000 docs with `isBot:true` when I was testing on DSpace Test...
- This has got to be a mistake of some kind, as I see 4 million in 2014 that are from `dns:localhost.`; perhaps that's when we didn't have `useProxies` set up correctly?
- I don't see the same thing on CGSpace... I wonder what happened?
- Perhaps they got re-tagged during the DSpace 6 upgrade, somehow during the Solr migration? Hmmmmm. Definitely have to be careful with `isBot:true` in the future and not automatically purge these!!!
- I noticed 120,000+ hits from monit, FeedBurner, and Blackboard Safeassign in 2014, 2015, 2016, 2017, etc...
- I hadn't seen monit before, but the others have been in DSpace's spider agent lists for some time, so they probably only appear in older stats cores
- The issue with purging these using `check-spider-hits.sh` is that it can't do case-insensitive regexes and some metacharacters like `\s` don't work, so I added case-sensitive patterns to a local agents file and purged them with the script
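- For reference, what the script does under the hood is essentially a Solr delete-by-query per pattern, something like this (a sketch against one yearly core):
```
$ curl -s 'http://localhost:8081/solr/statistics-2014/update?softCommit=true' -H 'Content-Type: text/xml' --data-binary '<delete><query>userAgent:/.*monit.*/</query></delete>'
```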
## 2020-11-15
- Upgrade CGSpace to DSpace 6.3
- First build, update, and migrate the database:
```
$ dspace cleanup -v
$ git checkout origin/6_x-dev-atmire-modules
$ npm install -g yarn
$ chrt -b 0 mvn -U -Dmirage2.on=true -Dmirage2.deps.included=false -P \!dspace-lni,\!dspace-rdf,\!dspace-sword,\!dspace-swordv2,\!dspace-jspui clean package
$ sudo su - postgres
$ psql dspace -c 'CREATE EXTENSION pgcrypto;'
$ psql dspace -c "DELETE FROM schema_version WHERE version IN ('5.8.2015.12.03.3');"
$ exit
$ rm -rf /home/cgspace/config/spring
$ ant update
$ dspace database info
$ dspace database migrate
$ sudo systemctl start tomcat7
```
- After starting Tomcat, DSpace should start up OK and begin Discovery indexing, but I also want to upgrade from PostgreSQL 9.6 to 10
- I installed and configured PostgreSQL 10 using the Ansible playbooks, then migrated the database manually:
```
# systemctl stop tomcat7
# pg_ctlcluster 9.6 main stop
# tar -cvzpf var-lib-postgresql-9.6.tar.gz /var/lib/postgresql/9.6
# tar -cvzpf etc-postgresql-9.6.tar.gz /etc/postgresql/9.6
# pg_ctlcluster 10 main stop
# pg_dropcluster 10 main
# pg_upgradecluster 9.6 main
# pg_dropcluster 9.6 main
# systemctl start postgresql
# dpkg -l | grep postgresql | grep 9.6 | awk '{print $2}' | xargs dpkg -r
```
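- It's also worth re-analyzing the new cluster afterwards, since the planner statistics may need rebuilding after the upgrade (a sketch):
```
# sudo -u postgres psql dspace -c 'SELECT version();'
# sudo -u postgres vacuumdb --all --analyze-in-stages
```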
- Then I ran all system updates and rebooted the server...
- After the server came back up I re-ran the Ansible playbook to make sure all configs and services were updated
- I disabled the dspace-statistics-api for now, because it won't work until I migrate all the Solr statistics anyway
- Start a full Discovery re-indexing:
```
$ time chrt -b 0 ionice -c2 -n7 nice -n19 dspace index-discovery -b
```
- I updated all the Ansible infrastructure and DSpace branches to be the DSpace 6 ones
- I will wait until the Discovery indexing is finished to start doing the Solr statistics migration
<!-- vim: set sw=2 ts=2: -->