diff --git a/content/posts/2024-02.md b/content/posts/2024-02.md
index 7ae9c6f30..65393d4e4 100644
--- a/content/posts/2024-02.md
+++ b/content/posts/2024-02.md
@@ -107,4 +107,8 @@ $ xmllint --xpath '//node/isComposedBy/node()' dspace/config/controlled-vocabula
$ cat /tmp/authors /tmp/ifpri-authors | sort -u > /tmp/new-authors
```
+## 2024-02-28
+
+- I figured out a way to add a new Angular component to handle all our relation fields (rough sketch below)
+
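+A rough sketch of the idea (the component, selector, and input names are my
+own placeholders, not the actual implementation): one generic component takes
+the relation field name as an input and renders each value as a search link.
+
+```typescript
+import { Component, Input } from '@angular/core';
+
+// Hypothetical generic relation-field component: one template serves all of
+// our cg.* relation fields, parameterized by the field it should render.
+// It would be declared in a module that imports CommonModule and RouterModule.
+@Component({
+  selector: 'ds-relation-field',
+  template: `
+    <div *ngFor="let value of values">
+      <a [routerLink]="['/search']" [queryParams]="{ query: field + ':' + value }">
+        {{ value }}
+      </a>
+    </div>
+  `,
+})
+export class RelationFieldComponent {
+  @Input() field = '';          // e.g. 'cg.contributor.affiliation'
+  @Input() values: string[] = [];
+}
+```
+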
diff --git a/docs/2015-11/index.html b/docs/2015-11/index.html
index 9499986f5..bca66b241 100644
--- a/docs/2015-11/index.html
+++ b/docs/2015-11/index.html
@@ -34,7 +34,7 @@ Last week I had increased the limit from 30 to 60, which seemed to help, but now
$ psql -c 'SELECT * from pg_stat_activity;' | grep idle | grep -c cgspace
78
"/>
-
+
diff --git a/docs/2015-12/index.html b/docs/2015-12/index.html
index 0d35e97a3..db1dfa681 100644
--- a/docs/2015-12/index.html
+++ b/docs/2015-12/index.html
@@ -36,7 +36,7 @@ Replace lzop with xz in log compression cron jobs on DSpace Test—it uses less
-rw-rw-r-- 1 tomcat7 tomcat7 387K Nov 18 23:59 dspace.log.2015-11-18.lzo
-rw-rw-r-- 1 tomcat7 tomcat7 169K Nov 18 23:59 dspace.log.2015-11-18.xz
"/>
-
+
diff --git a/docs/2016-01/index.html b/docs/2016-01/index.html
index b90678c85..da66cd10d 100644
--- a/docs/2016-01/index.html
+++ b/docs/2016-01/index.html
@@ -28,7 +28,7 @@ Move ILRI collection 10568/12503 from 10568/27869 to 10568/27629 using the move_
I realized it is only necessary to clear the Cocoon cache after moving collections—rather than reindexing—as no metadata has changed, and therefore no search or browse indexes need to be updated.
Update GitHub wiki for documentation of maintenance tasks.
"/>
-
+
diff --git a/docs/2016-02/index.html b/docs/2016-02/index.html
index 3402764ce..39d4699e6 100644
--- a/docs/2016-02/index.html
+++ b/docs/2016-02/index.html
@@ -38,7 +38,7 @@ I noticed we have a very interesting list of countries on CGSpace:
Not only are there 49,000 countries, we have some blanks (25)…
Also, lots of things like “COTE D`LVOIRE” and “COTE D IVOIRE”
"/>
-
+
diff --git a/docs/2016-03/index.html b/docs/2016-03/index.html
index b2baaf173..8f2cf0348 100644
--- a/docs/2016-03/index.html
+++ b/docs/2016-03/index.html
@@ -28,7 +28,7 @@ Looking at issues with author authorities on CGSpace
For some reason we still have the index-lucene-update cron job active on CGSpace, but I’m pretty sure we don’t need it as of the latest few versions of Atmire’s Listings and Reports module
Reinstall my local (Mac OS X) DSpace stack with Tomcat 7, PostgreSQL 9.3, and Java JDK 1.7 to match environment on CGSpace server
"/>
-
+
diff --git a/docs/2016-04/index.html b/docs/2016-04/index.html
index 41e12e6c8..c4e3f6fe5 100644
--- a/docs/2016-04/index.html
+++ b/docs/2016-04/index.html
@@ -32,7 +32,7 @@ After running DSpace for over five years I’ve never needed to look in any
This will save us a few gigs of backup space we’re paying for on S3
Also, I noticed the checker log has some errors we should pay attention to:
"/>
-
+
diff --git a/docs/2016-05/index.html b/docs/2016-05/index.html
index ad00274d7..df5028933 100644
--- a/docs/2016-05/index.html
+++ b/docs/2016-05/index.html
@@ -34,7 +34,7 @@ There are 3,000 IPs accessing the REST API in a 24-hour period!
# awk '{print $1}' /var/log/nginx/rest.log | uniq | wc -l
3168
"/>
-
+
diff --git a/docs/2016-06/index.html b/docs/2016-06/index.html
index 2fd4ce8b6..50e06a83f 100644
--- a/docs/2016-06/index.html
+++ b/docs/2016-06/index.html
@@ -34,7 +34,7 @@ This is their publications set: http://ebrary.ifpri.org/oai/oai.php?verb=ListRec
You can see the others by using the OAI ListSets verb: http://ebrary.ifpri.org/oai/oai.php?verb=ListSets
Working on second phase of metadata migration, looks like this will work for moving CPWF-specific data in dc.identifier.fund to cg.identifier.cpwfproject and then the rest to dc.description.sponsorship
"/>
-
+
diff --git a/docs/2016-07/index.html b/docs/2016-07/index.html
index 26d0453fb..31f4a07ec 100644
--- a/docs/2016-07/index.html
+++ b/docs/2016-07/index.html
@@ -44,7 +44,7 @@ dspacetest=# select text_value from metadatavalue where metadata_field_id=3 and
In this case the select query was showing 95 results before the update
"/>
-
+
diff --git a/docs/2016-08/index.html b/docs/2016-08/index.html
index a705168da..9ce5dc507 100644
--- a/docs/2016-08/index.html
+++ b/docs/2016-08/index.html
@@ -42,7 +42,7 @@ $ git checkout -b 55new 5_x-prod
$ git reset --hard ilri/5_x-prod
$ git rebase -i dspace-5.5
"/>
-
+
diff --git a/docs/2016-09/index.html b/docs/2016-09/index.html
index eab5f13b7..4c9e17189 100644
--- a/docs/2016-09/index.html
+++ b/docs/2016-09/index.html
@@ -34,7 +34,7 @@ It looks like we might be able to use OUs now, instead of DCs:
$ ldapsearch -x -H ldaps://svcgroot2.cgiarad.org:3269/ -b "dc=cgiarad,dc=org" -D "admigration1@cgiarad.org" -W "(sAMAccountName=admigration1)"
"/>
-
+
diff --git a/docs/2016-10/index.html b/docs/2016-10/index.html
index e6c06733b..4b4183712 100644
--- a/docs/2016-10/index.html
+++ b/docs/2016-10/index.html
@@ -42,7 +42,7 @@ I exported a random item’s metadata as CSV, deleted all columns except id
0000-0002-6115-0956||0000-0002-3812-8793||0000-0001-7462-405X
"/>
-
+
diff --git a/docs/2016-11/index.html b/docs/2016-11/index.html
index 98b563e90..4aeedad5a 100644
--- a/docs/2016-11/index.html
+++ b/docs/2016-11/index.html
@@ -26,7 +26,7 @@ Add dc.type to the output options for Atmire’s Listings and Reports module
Add dc.type to the output options for Atmire’s Listings and Reports module (#286)
"/>
-
+
diff --git a/docs/2016-12/index.html b/docs/2016-12/index.html
index 9efa8d6e4..940083167 100644
--- a/docs/2016-12/index.html
+++ b/docs/2016-12/index.html
@@ -46,7 +46,7 @@ I see thousands of them in the logs for the last few months, so it’s not r
I’ve raised a ticket with Atmire to ask
Another worrying error from dspace.log is:
"/>
-
+
diff --git a/docs/2017-01/index.html b/docs/2017-01/index.html
index 3e0f44e17..81c41c029 100644
--- a/docs/2017-01/index.html
+++ b/docs/2017-01/index.html
@@ -28,7 +28,7 @@ I checked to see if the Solr sharding task that is supposed to run on January 1s
I tested on DSpace Test as well and it doesn’t work there either
I asked on the dspace-tech mailing list because it seems to be broken, and actually now I’m not sure if we’ve ever had the sharding task run successfully over all these years
"/>
-
+
diff --git a/docs/2017-02/index.html b/docs/2017-02/index.html
index 26ed058ac..5140ddff4 100644
--- a/docs/2017-02/index.html
+++ b/docs/2017-02/index.html
@@ -50,7 +50,7 @@ DELETE 1
Create issue on GitHub to track the addition of CCAFS Phase II project tags (#301)
Looks like we’ll be using cg.identifier.ccafsprojectpii as the field name
"/>
-
+
diff --git a/docs/2017-03/index.html b/docs/2017-03/index.html
index 164bc74c4..9b10ae24f 100644
--- a/docs/2017-03/index.html
+++ b/docs/2017-03/index.html
@@ -54,7 +54,7 @@ Interestingly, it seems DSpace 4.x’s thumbnails were sRGB, but forcing reg
$ identify ~/Desktop/alc_contrastes_desafios.jpg
/Users/aorth/Desktop/alc_contrastes_desafios.jpg JPEG 464x600 464x600+0+0 8-bit CMYK 168KB 0.000u 0:00.000
"/>
-
+
diff --git a/docs/2017-04/index.html b/docs/2017-04/index.html
index 782579eec..9d022bd01 100644
--- a/docs/2017-04/index.html
+++ b/docs/2017-04/index.html
@@ -40,7 +40,7 @@ Testing the CMYK patch on a collection with 650 items:
$ [dspace]/bin/dspace filter-media -f -i 10568/16498 -p "ImageMagick PDF Thumbnail" -v >& /tmp/filter-media-cmyk.txt
"/>
-
+
diff --git a/docs/2017-05/index.html b/docs/2017-05/index.html
index 45651f67b..fe83fb9d3 100644
--- a/docs/2017-05/index.html
+++ b/docs/2017-05/index.html
@@ -18,7 +18,7 @@
-
+
diff --git a/docs/2017-06/index.html b/docs/2017-06/index.html
index 9dd2f7228..1b4f7dbb6 100644
--- a/docs/2017-06/index.html
+++ b/docs/2017-06/index.html
@@ -18,7 +18,7 @@
-
+
diff --git a/docs/2017-07/index.html b/docs/2017-07/index.html
index 197fcaf1e..312913a4e 100644
--- a/docs/2017-07/index.html
+++ b/docs/2017-07/index.html
@@ -36,7 +36,7 @@ Merge changes for WLE Phase II theme rename (#329)
Looking at extracting the metadata registries from ICARDA’s MEL DSpace database so we can compare fields with CGSpace
We can use PostgreSQL’s extended output format (-x) plus sed to format the output into quasi XML:
"/>
-
+
diff --git a/docs/2017-08/index.html b/docs/2017-08/index.html
index c8331d1a7..15d410f2c 100644
--- a/docs/2017-08/index.html
+++ b/docs/2017-08/index.html
@@ -60,7 +60,7 @@ This was due to newline characters in the dc.description.abstract column, which
I exported a new CSV from the collection on DSpace Test and then manually removed the characters in vim using g/^$/d
Then I cleaned up the author authorities and HTML characters in OpenRefine and sent the file back to Abenet
"/>
-
+
diff --git a/docs/2017-09/index.html b/docs/2017-09/index.html
index 9d6a6e555..991f00896 100644
--- a/docs/2017-09/index.html
+++ b/docs/2017-09/index.html
@@ -32,7 +32,7 @@ Linode sent an alert that CGSpace (linode18) was using 261% CPU for the past two
Ask Sisay to clean up the WLE approvers a bit, as Marianne’s user account is both in the approvers step as well as the group
"/>
-
+
diff --git a/docs/2017-10/index.html b/docs/2017-10/index.html
index 92b93d818..a679cefd1 100644
--- a/docs/2017-10/index.html
+++ b/docs/2017-10/index.html
@@ -34,7 +34,7 @@ http://hdl.handle.net/10568/78495||http://hdl.handle.net/10568/79336
There appears to be a pattern but I’ll have to look a bit closer and try to clean them up automatically, either in SQL or in OpenRefine
Add Katherine Lutz to the groups for content submission and edit steps of the CGIAR System collections
"/>
-
+
diff --git a/docs/2017-11/index.html b/docs/2017-11/index.html
index 61d2c8ba5..8ebe56287 100644
--- a/docs/2017-11/index.html
+++ b/docs/2017-11/index.html
@@ -48,7 +48,7 @@ Generate list of authors on CGSpace for Peter to go through and correct:
dspace=# \copy (select distinct text_value, count(*) as count from metadatavalue where metadata_field_id = (select metadata_field_id from metadatafieldregistry where element = 'contributor' and qualifier = 'author') AND resource_type_id = 2 group by text_value order by count desc) to /tmp/authors.csv with csv;
COPY 54701
"/>
-
+
diff --git a/docs/2017-12/index.html b/docs/2017-12/index.html
index f7d64e0d6..3972bc306 100644
--- a/docs/2017-12/index.html
+++ b/docs/2017-12/index.html
@@ -30,7 +30,7 @@ The logs say “Timeout waiting for idle object”
PostgreSQL activity says there are 115 connections currently
The list of connections to XMLUI and REST API for today:
"/>
-
+
diff --git a/docs/2018-01/index.html b/docs/2018-01/index.html
index 0ff931d11..0b031c338 100644
--- a/docs/2018-01/index.html
+++ b/docs/2018-01/index.html
@@ -150,7 +150,7 @@ dspace.log.2018-01-02:34
Danny wrote to ask for help renewing the wildcard ilri.org certificate and I advised that we should probably use Let’s Encrypt if it’s just a handful of domains
"/>
-
+
diff --git a/docs/2018-02/index.html b/docs/2018-02/index.html
index d40f7f4e8..b36676b2c 100644
--- a/docs/2018-02/index.html
+++ b/docs/2018-02/index.html
@@ -30,7 +30,7 @@ We don’t need to distinguish between internal and external works, so that
Yesterday I figured out how to monitor DSpace sessions using JMX
I copied the logic in the jmx_tomcat_dbpools provided by Ubuntu’s munin-plugins-java package and used the stuff I discovered about JMX in 2018-01
"/>
-
+
diff --git a/docs/2018-03/index.html b/docs/2018-03/index.html
index e9b2453ec..fa40e9e73 100644
--- a/docs/2018-03/index.html
+++ b/docs/2018-03/index.html
@@ -24,7 +24,7 @@ Export a CSV of the IITA community metadata for Martin Mueller
Export a CSV of the IITA community metadata for Martin Mueller
"/>
-
+
diff --git a/docs/2018-04/index.html b/docs/2018-04/index.html
index 70cf0d0c1..81d04afae 100644
--- a/docs/2018-04/index.html
+++ b/docs/2018-04/index.html
@@ -26,7 +26,7 @@ Catalina logs at least show some memory errors yesterday:
I tried to test something on DSpace Test but noticed that it’s down since god knows when
Catalina logs at least show some memory errors yesterday:
"/>
-
+
diff --git a/docs/2018-05/index.html b/docs/2018-05/index.html
index 17f8aef86..f86a7a747 100644
--- a/docs/2018-05/index.html
+++ b/docs/2018-05/index.html
@@ -38,7 +38,7 @@ http://localhost:3000/solr/statistics/update?stream.body=%3Ccommit/%3E
Then I reduced the JVM heap size from 6144 back to 5120m
Also, I switched it to use OpenJDK instead of Oracle Java, as well as re-worked the Ansible infrastructure scripts to support hosts choosing which distribution they want to use
"/>
-
+
diff --git a/docs/2018-06/index.html b/docs/2018-06/index.html
index 874535b48..a70cf05dc 100644
--- a/docs/2018-06/index.html
+++ b/docs/2018-06/index.html
@@ -58,7 +58,7 @@ real 74m42.646s
user 8m5.056s
sys 2m7.289s
"/>
-
+
diff --git a/docs/2018-07/index.html b/docs/2018-07/index.html
index 9b61d92c2..d3c88b45c 100644
--- a/docs/2018-07/index.html
+++ b/docs/2018-07/index.html
@@ -36,7 +36,7 @@ During the mvn package stage on the 5.8 branch I kept getting issues with java r
There is insufficient memory for the Java Runtime Environment to continue.
"/>
-
+
diff --git a/docs/2018-08/index.html b/docs/2018-08/index.html
index 69162fe38..4e66272bf 100644
--- a/docs/2018-08/index.html
+++ b/docs/2018-08/index.html
@@ -46,7 +46,7 @@ Anyways, perhaps I should increase the JVM heap from 5120m to 6144m like we did
The server only has 8GB of RAM so we’ll eventually need to upgrade to a larger one because we’ll start starving the OS, PostgreSQL, and command line batch processes
I ran all system updates on DSpace Test and rebooted it
"/>
-
+
diff --git a/docs/2018-09/index.html b/docs/2018-09/index.html
index b8813eef2..419c00e94 100644
--- a/docs/2018-09/index.html
+++ b/docs/2018-09/index.html
@@ -30,7 +30,7 @@ I’ll update the DSpace role in our Ansible infrastructure playbooks and ru
Also, I’ll re-run the postgresql tasks because the custom PostgreSQL variables are dynamic according to the system’s RAM, and we never re-ran them after migrating to larger Linodes last month
I’m testing the new DSpace 5.8 branch in my Ubuntu 18.04 environment and I’m getting those autowire errors in Tomcat 8.5.30 again:
"/>
-
+
diff --git a/docs/2018-10/index.html b/docs/2018-10/index.html
index cfefe0297..25b70c6a5 100644
--- a/docs/2018-10/index.html
+++ b/docs/2018-10/index.html
@@ -26,7 +26,7 @@ I created a GitHub issue to track this #389, because I’m super busy in Nai
Phil Thornton got an ORCID identifier so we need to add it to the list on CGSpace and tag his existing items
I created a GitHub issue to track this #389, because I’m super busy in Nairobi right now
"/>
-
+
diff --git a/docs/2018-11/index.html b/docs/2018-11/index.html
index cadcea9d2..989efa63a 100644
--- a/docs/2018-11/index.html
+++ b/docs/2018-11/index.html
@@ -36,7 +36,7 @@ Send a note about my dspace-statistics-api to the dspace-tech mailing list
Linode has been sending mails a few times a day recently that CGSpace (linode18) has had high CPU usage
Today these are the top 10 IPs:
"/>
-
+
diff --git a/docs/2018-12/index.html b/docs/2018-12/index.html
index ca3b7a925..22cd4eaaf 100644
--- a/docs/2018-12/index.html
+++ b/docs/2018-12/index.html
@@ -36,7 +36,7 @@ Then I ran all system updates and restarted the server
I noticed that there is another issue with PDF thumbnails on CGSpace, and I see there was another Ghostscript vulnerability last week
"/>
-
+
diff --git a/docs/2019-01/index.html b/docs/2019-01/index.html
index 4fdf185ea..fbd03b298 100644
--- a/docs/2019-01/index.html
+++ b/docs/2019-01/index.html
@@ -50,7 +50,7 @@ I don’t see anything interesting in the web server logs around that time t
357 207.46.13.1
903 54.70.40.11
"/>
-
+
diff --git a/docs/2019-02/index.html b/docs/2019-02/index.html
index f318c02cb..2b5b1f99a 100644
--- a/docs/2019-02/index.html
+++ b/docs/2019-02/index.html
@@ -72,7 +72,7 @@ real 0m19.873s
user 0m22.203s
sys 0m1.979s
"/>
-
+
diff --git a/docs/2019-03/index.html b/docs/2019-03/index.html
index bee9b3b32..d0ab2e03f 100644
--- a/docs/2019-03/index.html
+++ b/docs/2019-03/index.html
@@ -46,7 +46,7 @@ Most worryingly, there are encoding errors in the abstracts for eleven items, fo
I think I will need to ask Udana to re-copy and paste the abstracts with more care using Google Docs
"/>
-
+
diff --git a/docs/2019-04/index.html b/docs/2019-04/index.html
index 033117c8f..90fd673b5 100644
--- a/docs/2019-04/index.html
+++ b/docs/2019-04/index.html
@@ -64,7 +64,7 @@ $ ./fix-metadata-values.py -i /tmp/2019-02-21-fix-4-regions.csv -db dspace -u ds
$ ./delete-metadata-values.py -i /tmp/2019-02-21-delete-2-countries.csv -db dspace -u dspace -p 'fuuu' -m 228 -f cg.coverage.country -d
$ ./delete-metadata-values.py -i /tmp/2019-02-21-delete-1-region.csv -db dspace -u dspace -p 'fuuu' -m 231 -f cg.coverage.region -d
"/>
-
+
diff --git a/docs/2019-05/index.html b/docs/2019-05/index.html
index 181d1fc3c..454f0da53 100644
--- a/docs/2019-05/index.html
+++ b/docs/2019-05/index.html
@@ -48,7 +48,7 @@ DELETE 1
But after this I tried to delete the item from the XMLUI and it is still present…
"/>
-
+
diff --git a/docs/2019-06/index.html b/docs/2019-06/index.html
index 1da538b42..847ae14ba 100644
--- a/docs/2019-06/index.html
+++ b/docs/2019-06/index.html
@@ -34,7 +34,7 @@ Run system updates on CGSpace (linode18) and reboot it
Skype with Marie-Angélique and Abenet about CG Core v2
"/>
-
+
diff --git a/docs/2019-07/index.html b/docs/2019-07/index.html
index 6c937e314..774177a9f 100644
--- a/docs/2019-07/index.html
+++ b/docs/2019-07/index.html
@@ -38,7 +38,7 @@ CGSpace
Abenet had another similar issue a few days ago when trying to find the stats for 2018 in the RTB community
"/>
-
+
diff --git a/docs/2019-08/index.html b/docs/2019-08/index.html
index b4caf98f1..af3a75f74 100644
--- a/docs/2019-08/index.html
+++ b/docs/2019-08/index.html
@@ -46,7 +46,7 @@ After rebooting, all statistics cores were loaded… wow, that’s luck
Run system updates on DSpace Test (linode19) and reboot it
"/>
-
+
diff --git a/docs/2019-09/index.html b/docs/2019-09/index.html
index 72c1756e9..6ebd03046 100644
--- a/docs/2019-09/index.html
+++ b/docs/2019-09/index.html
@@ -72,7 +72,7 @@ Here are the top ten IPs in the nginx XMLUI and REST/OAI logs this morning:
7249 2a01:7e00::f03c:91ff:fe18:7396
9124 45.5.186.2
"/>
-
+
diff --git a/docs/2019-10/index.html b/docs/2019-10/index.html
index 500dc51c0..0d9c379da 100644
--- a/docs/2019-10/index.html
+++ b/docs/2019-10/index.html
@@ -18,7 +18,7 @@
-
+
diff --git a/docs/2019-11/index.html b/docs/2019-11/index.html
index 9f05f35e6..ef87ea202 100644
--- a/docs/2019-11/index.html
+++ b/docs/2019-11/index.html
@@ -58,7 +58,7 @@ Let’s see how many of the REST API requests were for bitstreams (because t
# zcat --force /var/log/nginx/rest.log.*.gz | grep -E "[0-9]{1,2}/Oct/2019" | grep -c -E "/rest/bitstreams"
106781
"/>
-
+
diff --git a/docs/2019-12/index.html b/docs/2019-12/index.html
index a3240eaf7..8182b585e 100644
--- a/docs/2019-12/index.html
+++ b/docs/2019-12/index.html
@@ -46,7 +46,7 @@ Make sure all packages are up to date and the package manager is up to date, the
# dpkg -C
# reboot
"/>
-
+
diff --git a/docs/2020-01/index.html b/docs/2020-01/index.html
index 27166a3f8..3bd85cccd 100644
--- a/docs/2020-01/index.html
+++ b/docs/2020-01/index.html
@@ -56,7 +56,7 @@ I tweeted the CGSpace repository link
"/>
-
+
diff --git a/docs/2020-02/index.html b/docs/2020-02/index.html
index 08b9b1442..f7184bc4e 100644
--- a/docs/2020-02/index.html
+++ b/docs/2020-02/index.html
@@ -38,7 +38,7 @@ The code finally builds and runs with a fresh install
"/>
-
+
diff --git a/docs/2020-03/index.html b/docs/2020-03/index.html
index 556d051a9..18983077e 100644
--- a/docs/2020-03/index.html
+++ b/docs/2020-03/index.html
@@ -42,7 +42,7 @@ You need to download this into the DSpace 6.x source and compile it
"/>
-
+
diff --git a/docs/2020-04/index.html b/docs/2020-04/index.html
index a1cea4623..001626022 100644
--- a/docs/2020-04/index.html
+++ b/docs/2020-04/index.html
@@ -48,7 +48,7 @@ The third item now has a donut with score 1 since I tweeted it last week
On the same note, the one item Abenet pointed out last week now has a donut with score of 104 after I tweeted it last week
"/>
-
+
diff --git a/docs/2020-05/index.html b/docs/2020-05/index.html
index 0f6233ef8..0d5f7b83f 100644
--- a/docs/2020-05/index.html
+++ b/docs/2020-05/index.html
@@ -34,7 +34,7 @@ I see that CGSpace (linode18) is still using PostgreSQL JDBC driver version 42.2
"/>
-
+
diff --git a/docs/2020-06/index.html b/docs/2020-06/index.html
index fa3758189..73637822d 100644
--- a/docs/2020-06/index.html
+++ b/docs/2020-06/index.html
@@ -36,7 +36,7 @@ I sent Atmire the dspace.log from today and told them to log into the server to
In other news, I checked the statistics API on DSpace 6 and it’s working
I tried to build the OAI registry on the freshly migrated DSpace 6 on DSpace Test and I get an error:
"/>
-
+
diff --git a/docs/2020-07/index.html b/docs/2020-07/index.html
index 7b9373fe2..f5fa96507 100644
--- a/docs/2020-07/index.html
+++ b/docs/2020-07/index.html
@@ -38,7 +38,7 @@ I restarted Tomcat and PostgreSQL and the issue was gone
Since I was restarting Tomcat anyways I decided to redeploy the latest changes from the 5_x-prod branch and I added a note about COVID-19 items to the CGSpace frontpage at Peter’s request
"/>
-
+
diff --git a/docs/2020-08/index.html b/docs/2020-08/index.html
index 810d273b6..2709a3eaf 100644
--- a/docs/2020-08/index.html
+++ b/docs/2020-08/index.html
@@ -36,7 +36,7 @@ It is class based so I can easily add support for other vocabularies, and the te
"/>
-
+
diff --git a/docs/2020-09/index.html b/docs/2020-09/index.html
index 6909d32cb..a0c67568d 100644
--- a/docs/2020-09/index.html
+++ b/docs/2020-09/index.html
@@ -48,7 +48,7 @@ I filed a bug on OpenRXV: https://github.com/ilri/OpenRXV/issues/39
I filed an issue on OpenRXV to make some minor edits to the admin UI: https://github.com/ilri/OpenRXV/issues/40
"/>
-
+
diff --git a/docs/2020-10/index.html b/docs/2020-10/index.html
index 25b8d9aad..027392e17 100644
--- a/docs/2020-10/index.html
+++ b/docs/2020-10/index.html
@@ -44,7 +44,7 @@ During the FlywayDB migration I got an error:
"/>
-
+
diff --git a/docs/2020-11/index.html b/docs/2020-11/index.html
index b0a1e7429..816ea800b 100644
--- a/docs/2020-11/index.html
+++ b/docs/2020-11/index.html
@@ -32,7 +32,7 @@ So far we’ve spent at least fifty hours to process the statistics and stat
"/>
-
+
diff --git a/docs/2020-12/index.html b/docs/2020-12/index.html
index 5b0a8ce82..4063fddd1 100644
--- a/docs/2020-12/index.html
+++ b/docs/2020-12/index.html
@@ -36,7 +36,7 @@ I started processing those (about 411,000 records):
"/>
-
+
diff --git a/docs/2021-01/index.html b/docs/2021-01/index.html
index 44151bfa6..c4d19d845 100644
--- a/docs/2021-01/index.html
+++ b/docs/2021-01/index.html
@@ -50,7 +50,7 @@ For example, this item has 51 views on CGSpace, but 0 on AReS
"/>
-
+
diff --git a/docs/2021-02/index.html b/docs/2021-02/index.html
index 0adb5a8af..422fa7e27 100644
--- a/docs/2021-02/index.html
+++ b/docs/2021-02/index.html
@@ -60,7 +60,7 @@ $ curl -s 'http://localhost:9200/openrxv-items-temp/_count?q=*&pretty
}
}
"/>
-
+
diff --git a/docs/2021-03/index.html b/docs/2021-03/index.html
index cdd9a17cf..b224b745f 100644
--- a/docs/2021-03/index.html
+++ b/docs/2021-03/index.html
@@ -34,7 +34,7 @@ Also, we found some issues building and running OpenRXV currently due to ecosyst
"/>
-
+
diff --git a/docs/2021-04/index.html b/docs/2021-04/index.html
index e1c4735ba..224510539 100644
--- a/docs/2021-04/index.html
+++ b/docs/2021-04/index.html
@@ -44,7 +44,7 @@ Perhaps one of the containers crashed, I should have looked closer but I was in
"/>
-
+
diff --git a/docs/2021-05/index.html b/docs/2021-05/index.html
index be37789d0..946671fec 100644
--- a/docs/2021-05/index.html
+++ b/docs/2021-05/index.html
@@ -36,7 +36,7 @@ I looked at the top user agents and IPs in the Solr statistics for last month an
I will add the RI/1.0 pattern to our DSpace agents overload and purge them from Solr (we had previously seen this agent with 9,000 hits or so in 2020-09), but I think I will leave the Microsoft Word one… as that’s an actual user…
"/>
-
+
diff --git a/docs/2021-06/index.html b/docs/2021-06/index.html
index 39360a58d..95f3c627c 100644
--- a/docs/2021-06/index.html
+++ b/docs/2021-06/index.html
@@ -36,7 +36,7 @@ I simply started it and AReS was running again:
"/>
-
+
diff --git a/docs/2021-07/index.html b/docs/2021-07/index.html
index ededf4c2a..083459e67 100644
--- a/docs/2021-07/index.html
+++ b/docs/2021-07/index.html
@@ -30,7 +30,7 @@ Export another list of ALL subjects on CGSpace, including AGROVOC and non-AGROVO
localhost/dspace63= > \COPY (SELECT DISTINCT LOWER(text_value) AS subject, count(*) FROM metadatavalue WHERE dspace_object_id in (SELECT dspace_object_id FROM item) AND metadata_field_id IN (119, 120, 127, 122, 128, 125, 135, 203, 208, 210, 215, 123, 236, 242, 187) GROUP BY subject ORDER BY count DESC) to /tmp/2021-07-01-all-subjects.csv WITH CSV HEADER;
COPY 20994
"/>
-
+
diff --git a/docs/2021-08/index.html b/docs/2021-08/index.html
index dccd98e42..4d271deb5 100644
--- a/docs/2021-08/index.html
+++ b/docs/2021-08/index.html
@@ -32,7 +32,7 @@ Update Docker images on AReS server (linode20) and reboot the server:
I decided to upgrade linode20 from Ubuntu 18.04 to 20.04
"/>
-
+
diff --git a/docs/2021-09/index.html b/docs/2021-09/index.html
index c6b067c12..35e921002 100644
--- a/docs/2021-09/index.html
+++ b/docs/2021-09/index.html
@@ -48,7 +48,7 @@ The syntax Moayad showed me last month doesn’t seem to honor the search qu
"/>
-
+
diff --git a/docs/2021-10/index.html b/docs/2021-10/index.html
index 0e0680572..7b98934c9 100644
--- a/docs/2021-10/index.html
+++ b/docs/2021-10/index.html
@@ -46,7 +46,7 @@ $ wc -l /tmp/2021-10-01-affiliations.txt
So we have 1879/7100 (26.46%) matching already
"/>
-
+
diff --git a/docs/2021-11/index.html b/docs/2021-11/index.html
index 97507be04..2870e876c 100644
--- a/docs/2021-11/index.html
+++ b/docs/2021-11/index.html
@@ -32,7 +32,7 @@ First I exported all the 2019 stats from CGSpace:
$ ./run.sh -s http://localhost:8081/solr/statistics -f 'time:2019-*' -a export -o statistics-2019.json -k uid
$ zstd statistics-2019.json
"/>
-
+
diff --git a/docs/2021-12/index.html b/docs/2021-12/index.html
index 0caa25712..41bdbb06f 100644
--- a/docs/2021-12/index.html
+++ b/docs/2021-12/index.html
@@ -40,7 +40,7 @@ Purging 455 hits from WhatsApp in statistics
Total number of bot hits purged: 3679
"/>
-
+
diff --git a/docs/2022-01/index.html b/docs/2022-01/index.html
index f658a2324..4e3231301 100644
--- a/docs/2022-01/index.html
+++ b/docs/2022-01/index.html
@@ -24,7 +24,7 @@ Start a full harvest on AReS
Start a full harvest on AReS
"/>
-
+
diff --git a/docs/2022-02/index.html b/docs/2022-02/index.html
index a26bf379c..3b54c83b0 100644
--- a/docs/2022-02/index.html
+++ b/docs/2022-02/index.html
@@ -38,7 +38,7 @@ We agreed to try to do more alignment of affiliations/funders with ROR
"/>
-
+
diff --git a/docs/2022-03/index.html b/docs/2022-03/index.html
index ce8880931..c2856a0f9 100644
--- a/docs/2022-03/index.html
+++ b/docs/2022-03/index.html
@@ -34,7 +34,7 @@ $ ./ilri/check-duplicates.py -i /tmp/tac4.csv -db dspace -u dspace -p 'fuuu&
$ csvcut -c id,filename ~/Downloads/2022-03-01-CGSpace-TAC-ICW-batch4-701-980.csv > /tmp/tac4-filenames.csv
$ csvjoin -c id /tmp/2022-03-01-tac-batch4-701-980.csv /tmp/tac4-filenames.csv > /tmp/2022-03-01-tac-batch4-701-980-filenames.csv
"/>
-
+
diff --git a/docs/2022-04/index.html b/docs/2022-04/index.html
index 9dff992bd..1c3db57e1 100644
--- a/docs/2022-04/index.html
+++ b/docs/2022-04/index.html
@@ -18,7 +18,7 @@
-
+
diff --git a/docs/2022-05/index.html b/docs/2022-05/index.html
index a9be893f1..c048a0525 100644
--- a/docs/2022-05/index.html
+++ b/docs/2022-05/index.html
@@ -66,7 +66,7 @@ If I query Solr for time:2022-04* AND dns:*msnbot* AND dns:*.msn.com. I see a ha
I purged 93,974 hits from these IPs using my check-spider-ip-hits.sh script
"/>
-
+
diff --git a/docs/2022-06/index.html b/docs/2022-06/index.html
index 1ee9a7ff3..8b38a296f 100644
--- a/docs/2022-06/index.html
+++ b/docs/2022-06/index.html
@@ -48,7 +48,7 @@ There seem to be many more of these:
"/>
-
+
diff --git a/docs/2022-07/index.html b/docs/2022-07/index.html
index f97d26ad4..d11f586b9 100644
--- a/docs/2022-07/index.html
+++ b/docs/2022-07/index.html
@@ -34,7 +34,7 @@ Also, the trgm functions I’ve used before are case insensitive, but Levens
"/>
-
+
diff --git a/docs/2022-08/index.html b/docs/2022-08/index.html
index 69e8c7150..3b1d54f24 100644
--- a/docs/2022-08/index.html
+++ b/docs/2022-08/index.html
@@ -24,7 +24,7 @@ Our request to add CC-BY-3.0-IGO to SPDX was approved a few weeks ago
Our request to add CC-BY-3.0-IGO to SPDX was approved a few weeks ago
"/>
-
+
diff --git a/docs/2022-09/index.html b/docs/2022-09/index.html
index 969354364..acc0e3f88 100644
--- a/docs/2022-09/index.html
+++ b/docs/2022-09/index.html
@@ -46,7 +46,7 @@ I also fixed a few bugs and improved the region-matching logic
"/>
-
+
diff --git a/docs/2022-10/index.html b/docs/2022-10/index.html
index 50c9d4d15..d7d0496eb 100644
--- a/docs/2022-10/index.html
+++ b/docs/2022-10/index.html
@@ -36,7 +36,7 @@ I filed an issue to ask about Java 11+ support
"/>
-
+
diff --git a/docs/2022-11/index.html b/docs/2022-11/index.html
index bc79cca0d..d27630afd 100644
--- a/docs/2022-11/index.html
+++ b/docs/2022-11/index.html
@@ -44,7 +44,7 @@ I want to make sure they use groups instead of individuals where possible!
I reverted the Cocoon autosave change because it was more of a nuisance that Peter can’t upload CSVs from the web interface, and it’s a very low-severity security issue anyway
"/>
-
+
diff --git a/docs/2022-12/index.html b/docs/2022-12/index.html
index a146ce315..9c8556e31 100644
--- a/docs/2022-12/index.html
+++ b/docs/2022-12/index.html
@@ -36,7 +36,7 @@ I exported the CCAFS and IITA communities, extracted just the country and region
Add a few more authors to my CSV with author names and ORCID identifiers and tag 283 items!
Replace “East Asia” with “Eastern Asia” region on CGSpace (UN M.49 region)
"/>
-
+
diff --git a/docs/2023-01/index.html b/docs/2023-01/index.html
index 4306a99a7..0320b3e67 100644
--- a/docs/2023-01/index.html
+++ b/docs/2023-01/index.html
@@ -34,7 +34,7 @@ I see we have some new ones that aren’t in our list if I combine with this
"/>
-
+
diff --git a/docs/2023-02/index.html b/docs/2023-02/index.html
index 2aa57b9a8..507b246a9 100644
--- a/docs/2023-02/index.html
+++ b/docs/2023-02/index.html
@@ -32,7 +32,7 @@ I want to try to expand my use of their data to journals, publishers, volumes, i
"/>
-
+
diff --git a/docs/2023-03/index.html b/docs/2023-03/index.html
index e856c0bbf..338fef960 100644
--- a/docs/2023-03/index.html
+++ b/docs/2023-03/index.html
@@ -28,7 +28,7 @@ Remove cg.subject.wle and cg.identifier.wletheme from CGSpace input form after c
iso-codes 4.13.0 was released, which incorporates my changes to the common names for Iran, Laos, and Syria
I finally got through with porting the input form from DSpace 6 to DSpace 7
"/>
-
+
diff --git a/docs/2023-04/index.html b/docs/2023-04/index.html
index ebc12ae91..399c2b155 100644
--- a/docs/2023-04/index.html
+++ b/docs/2023-04/index.html
@@ -36,7 +36,7 @@ I also did a check for missing country/region mappings with csv-metadata-quality
Start a harvest on AReS
"/>
-
+
diff --git a/docs/2023-05/index.html b/docs/2023-05/index.html
index 25047bdae..2226d7e31 100644
--- a/docs/2023-05/index.html
+++ b/docs/2023-05/index.html
@@ -46,7 +46,7 @@ Also I found at least two spelling mistakes, for example “decison support
Work on cleaning, proofing, and uploading twenty-seven records for IFPRI to CGSpace
"/>
-
+
diff --git a/docs/2023-06/index.html b/docs/2023-06/index.html
index 0ddb93880..52d81dae8 100644
--- a/docs/2023-06/index.html
+++ b/docs/2023-06/index.html
@@ -44,7 +44,7 @@ From what I can see we need to upgrade the MODS schema from 3.1 to 3.7 and then
"/>
-
+
diff --git a/docs/2023-07/index.html b/docs/2023-07/index.html
index 0629401d1..4adc08f7b 100644
--- a/docs/2023-07/index.html
+++ b/docs/2023-07/index.html
@@ -18,7 +18,7 @@
-
+
diff --git a/docs/2023-08/index.html b/docs/2023-08/index.html
index c301e3179..88d2fa523 100644
--- a/docs/2023-08/index.html
+++ b/docs/2023-08/index.html
@@ -34,7 +34,7 @@ I did some minor cleanups myself and applied them to CGSpace
Start working on some batch uploads for IFPRI
"/>
-
+
diff --git a/docs/2023-09/index.html b/docs/2023-09/index.html
index 049f2b822..0cd1c3376 100644
--- a/docs/2023-09/index.html
+++ b/docs/2023-09/index.html
@@ -26,7 +26,7 @@ Start a harvest on AReS
Export CGSpace to check for missing Initiative collection mappings
Start a harvest on AReS
"/>
-
+
diff --git a/docs/2023-10/index.html b/docs/2023-10/index.html
index 16bedd581..333cc4ec7 100644
--- a/docs/2023-10/index.html
+++ b/docs/2023-10/index.html
@@ -36,7 +36,7 @@ We can be on the safe side by using only abstracts for items that are licensed u
"/>
-
+
diff --git a/docs/2023-11/index.html b/docs/2023-11/index.html
index 208869b04..e4799a47d 100644
--- a/docs/2023-11/index.html
+++ b/docs/2023-11/index.html
@@ -42,7 +42,7 @@ I improved the filtering and wrote some Python using pandas to merge my sources
Export CGSpace to check missing Initiative collection mappings
Start a harvest on AReS
"/>
-
+
diff --git a/docs/2023-12/index.html b/docs/2023-12/index.html
index 63b7046d0..b2f5ce2fd 100644
--- a/docs/2023-12/index.html
+++ b/docs/2023-12/index.html
@@ -18,7 +18,7 @@
-
+
diff --git a/docs/2024-01/index.html b/docs/2024-01/index.html
index 79065ad1d..e4ef19708 100644
--- a/docs/2024-01/index.html
+++ b/docs/2024-01/index.html
@@ -6,41 +6,27 @@
-
-
+
-
-
+
+
-
-
+
-
+
@@ -48,11 +34,11 @@ Work on IFPRI ISNAR archive cleanup
{
"@context": "http://schema.org",
"@type": "BlogPosting",
- "headline": "January, 2024",
+ "headline": "February, 2024",
"url": "https://alanorth.github.io/cgspace-notes/2024-01/",
- "wordCount": "2215",
- "datePublished": "2024-01-02T10:08:00+03:00",
- "dateModified": "2024-02-05T11:09:40+03:00",
+ "wordCount": "551",
+ "datePublished": "2024-01-05T11:10:00+03:00",
+ "dateModified": "2024-02-27T17:18:35+03:00",
"author": {
"@type": "Person",
"name": "Alan Orth"
@@ -65,7 +51,7 @@ Work on IFPRI ISNAR archive cleanup
-
dspace=# BEGIN;
+BEGIN
+dspace=*# UPDATE metadatavalue SET text_value=LOWER(text_value) WHERE dspace_object_id IN (SELECT uuid FROM item) AND metadata_field_id=187 AND text_value ~ '[[:upper:]]';
+UPDATE 180
+dspace=*# COMMIT;
+COMMIT
+
2024-02-06
-
I’m not quite sure what we need to do for the Handle server
-
For now I just ran the dspace make-handle-config script and diffed it with the one from DSpace 6
-
I sent the bundle to the Handle admins to make sure it’s OK before we do the migration
+
Discuss IWMI using the CGSpace REST API for their new website
+
Export the IWMI community to extract their ORCID identifiers:
-
-
Continue testing and debugging the cgspace-java-helpers on DSpace 7
-
Work on IFPRI ISNAR archive cleanup
-
-
2024-01-03
-
-
I haven’t heard from the Handle admins, so I’m preparing a backup solution using nginx streams
-
This seems to work in my simple tests (this must be outside the http {} block):
Help Peter proof and upload 252 items from the 2023 Gender conference to CGSpace
-
Meeting with IFPRI to discuss their migration to CGSpace
-
-
We agreed to add two new fields, one for IFPRI project and one for IFPRI publication ranking
-
Most likely we will use cg.identifier.project as a general field and consolidate other project fields there
-
Not sure which field to use for the publication rank…
-
-
-
-
2024-01-05
-
-
Proof and upload 51 items in bulk for IFPRI
-
I did a big cleanup of user groups in anticipation of complaints about slow workflow tasks, etc., in DSpace 7
-
-
I removed ILRI editors from all the dozens of CCAFS community and collection groups, and I should do the same for other CRPs since they are closed for two years now
-
-
-
-
2024-01-06
-
-
Migrate CGSpace to DSpace 7
-
-
2024-01-07
-
-
High load on the server and UptimeRobot saying the frontend is flapping
-
-
I noticed tons of logs from pm2 in the systemd journal, so I disabled those in the systemd unit because they are available from pm2’s log directory anyway
-
I also noticed the same for Solr, so I disabled stdout for that systemd unit as well
-
-
-
I spent a lot of time bringing back the nginx rate limits we used in DSpace 6 and it seems to have helped
-
I see some client doing weird HEAD requests to search pages:
Export list of publishers for Peter to select some amount to use as a controlled vocabulary:
+
Maria asked me about the “missing” item from last week again
+
+
I can see it when I use the Admin search, but not in her workflow
+
It was submitted by TIP so I checked that user’s workspace and found it there
+
After depositing, it went into the workflow so Maria should be able to see it now
-
localhost/dspace7= ☘ \COPY (SELECT DISTINCT text_value AS "dcterms.publisher", count(*) FROM metadatavalue WHERE dspace_object_id in (SELECT dspace_object_id FROM item) AND metadata_field_id = 178 GROUP BY "dcterms.publisher" ORDER BY count DESC) to /tmp/2024-01-publishers.csv WITH CSV HEADER;
-COPY 4332
+
+
+
2024-02-09
+
+
Minor edits to CGSpace submission form
+
Upload 55 ISNAR book chapters to CGSpace from Peter
+
+
2024-02-19
+
+
Looking into the collection mapping issue on CGSpace
+
Minor work on OpenRXV to fix a bug in the ng-select drop-downs
+
Minor work on the DSpace 7 nginx configuration to allow requesting robots.txt and sitemaps without hitting rate limits
+
+
2024-02-21
+
+
Minor updates on OpenRXV, including one bug fix for missing mapped collections
+
+
Salem had to re-work the harvester for DSpace 7 since the mapped collections and parent collection list are separate! (See the sketch after this list.)
+
+
+
+
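A minimal sketch of the fetch-and-merge (owningCollection and mappedCollections are real HAL links on the DSpace 7 item resource; the surrounding code is illustrative of the problem, not OpenRXV’s actual harvester, and pagination is ignored):

```typescript
// In DSpace 7 an item's owning collection and its mapped collections are
// separate REST links, so a harvester must request both and merge them.
interface Collection {
  uuid: string;
  name: string;
}

async function getItemCollections(restBase: string, itemUuid: string): Promise<Collection[]> {
  const itemUrl = `${restBase}/api/core/items/${itemUuid}`;

  // The owning collection is a single object...
  const owning: Collection = await (await fetch(`${itemUrl}/owningCollection`)).json();

  // ...while mapped collections come back as an embedded list
  const mappedPage = await (await fetch(`${itemUrl}/mappedCollections`)).json();
  const mapped: Collection[] = mappedPage._embedded?.mappedCollections ?? [];

  return [owning, ...mapped];
}
```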
2024-02-22
+
+
Discuss tagging of datasets and re-work the submission form to encourage use of the DOI field for any item that has a DOI, and the normal URL field if not
+
+
The “cg.identifier.dataurl” field will be used for “related” datasets
+
I still have to check and move some metadata for existing datasets
+
+
+
+
2024-02-23
+
+
This morning Tomcat died due to an OOM kill from the kernel:
+
+
kernel: Out of memory: Killed process 698 (java) total-vm:14151300kB, anon-rss:9665812kB, file-rss:320kB, shmem-rss:0kB, UID:997 pgtables:20436kB oom_score_adj:0
-
Address some feedback on DSpace 7 from users, including filing some issues on GitHub
+
I don’t see any abnormal pattern in my Grafana graphs, for JVM or system load… very weird
+
I updated the submission form on CGSpace to include the new changes to URLs for datasets
The Alliance TIP team was having issues posting to one collection via the legacy DSpace 6 REST API
-
-
In the DSpace logs I see the same issue that they had last month:
+
I also updated about 80 datasets to move the URLs to the correct field
-
ERROR unknown unknown org.dspace.rest.Resource @ Something get wrong. Aborting context in finally statement.
-
2024-01-09
+
2024-02-25
-
I restarted Tomcat to see if it helps the REST issue
-
After talking with Peter about publishers we decided to get a clean list of the top ~100 publishers and then make sure all CGIAR centers, Initiatives, and Impact Platforms are there as well
-
-
I exported a list from PostgreSQL and then filtered by count > 40 in OpenRefine and then extracted the metadata values:
+
This morning Tomcat died while I was doing a CSV export, with an OOM kill from the kernel:
Export a list of ORCID identifiers from PostgreSQL to look them up on ORCID and update our controlled vocabulary:
-
-
localhost/dspace7= ☘ \COPY (SELECT DISTINCT(text_value) FROM metadatavalue WHERE dspace_object_id IN (SELECT uuid FROM item) AND metadata_field_id=247) to /tmp/2024-01-09-orcid-identifiers.txt;
-localhost/dspace7= ☘ \q
-$ cat ~/src/git/DSpace/dspace/config/controlled-vocabularies/cg-creator-identifier.xml /tmp/2024-01-09-orcid-identifiers.txt | grep -oE '[A-Z0-9]{4}-[A-Z0-9]{4}-[A-Z0-9]{4}-[A-Z0-9]{4}' | sort -u > /tmp/2024-01-09-orcids.txt
-$ ./ilri/resolve_orcids.py -i /tmp/2024-01-09-orcids.txt -o /tmp/2024-01-09-orcids-names.txt -d
+
kernel: Out of memory: Killed process 720768 (java) total-vm:14079976kB, anon-rss:9301684kB, file-rss:152kB, shmem-rss:0kB, UID:997 pgtables:19488kB oom_score_adj:0
-
Then I updated existing ORCID identifiers in CGSpace:
+
I don’t know why this is happening so often recently…
Bizu seems to be having issues due to belonging to too many groups
+
2024-02-27
-
I see some messages from Solr in the DSpace log:
+
IFPRI sent me a list of authors to add to our list for now, until we can find a better way of doing it
+
+
I extracted the existing authors from our controlled vocabulary and combined them with IFPRI’s:
-
2024-01-09 06:23:35,893 ERROR unknown unknown org.dspace.authorize.AuthorizeServiceImpl @ Failed getting getting community/collection admin status for bahhhhh@cgiar.org The search error is: Error from server at http://localhost:8983/solr/search: org.apache.solr.search.SyntaxError: Cannot parse 'search.resourcetype:Community AND (admin:eef481147-daf3-4fd2-bb8d-e18af8131d8c OR admin:g80199ef9-bcd6-4961-9512-501dea076607 OR admin:g4ac29263-cf0c-48d0-8be7-7f09317d50ec OR admin:g0e594148-a0f6-4f00-970d-6b7812f89540 OR admin:g0265b87a-2183-4357-a971-7a5b0c7add3a OR admin:g371ae807-f014-4305-b4ec-f2a8f6f0dcfa OR admin:gdc5cb27c-4a5a-45c2-b656-a399fded70de OR admin:ge36d0ece-7a52-4925-afeb-6641d6a348cc OR admin:g15dc1173-7ddf-43cf-a89a-77a7f81c4cfc OR admin:gc3a599d3-c758-46cd-9855-c98f6ab58ae4 OR admin:g3d648c3e-58c3-4342-b500-07cba10ba52d OR admin:g82bf5168-65c1-4627-8eb4-724fa0ea51a7 OR admin:ge751e973-697d-419c-b59b-5a5644702874 OR admin:g44dd0a80-c1e6-4274-9be4-9f342d74928c OR admin:g4842f9c2-73ed-476a-a81a-7167d8aa7946 OR admin:g5f279b3f-c2ce-4c75-b151-1de52c1a540e OR admin:ga6df8adc-2e1d-40f2-8f1e-f77796d0eecd OR admin:gfdfc1621-382e-437a-8674-c9007627565c OR admin:g15cd114a-0b89-442b-a1b4-1febb6959571 OR admin:g12aede99-d018-4c00-b4d4-a732541d0017 OR admin:gc59529d7-002a-4216-b2e1-d909afd2d4a9 OR admin:gd0806714-bc13-460d-bedd-121bdd5436a4 OR admin:gce70739a-8820-4d56-b19c-f191855479e4 OR admin:g7d3409eb-81e3-4156-afb1-7f02de22065f OR admin:g54bc009e-2954-4dad-8c30-be6a09dc5093 OR admin:gc5e1d6b7-4603-40d7-852f-6654c159dec9 OR admin:g0046214d-c85b-4f12-a5e6-2f57a2c3abb0 OR admin:g4c7b4fd0-938f-40e9-ab3e-447c317296c1 OR admin:gcfae9b69-d8dd-4cf3-9a4e-d6e31ff68731 OR ... admin:g20f366c0-96c0-4416-ad0b-46884010925f)': too many boolean clauses The search resourceType filter was: search.resourcetype:Community
-
We previously had this issue in 2020-01 and 2020-02 with DSpace 5 and DSpace 6
-
At the time the solution was to increase the maxBooleanClauses in Solr and to disable access rights awareness, but I don’t think we want to do the second one now
-
I saw many users of Solr in other applications increasing this to obscenely high numbers, so I think we should be OK to increase it from 1024 to 2048
-
-
-
Re-visiting the DSpace user groomer to delete inactive users
-
I decided to do the same on CGSpace and it worked without errors
-
I finished working on the controlled vocabulary for publishers
-
-
2024-01-10
-
-
I spent some time deleting old groups on CGSpace
-
I looked into the use of the cg.identifier.ciatproject field and found there are only a handful of uses, with some even seeming to be a mistake:
-
-
localhost/dspace7= ☘ SELECT DISTINCT text_value AS "cg.identifier.ciatproject", count(*) FROM metadatavalue WHERE dspace_object_id in (SELECT dspace_object_id FROM item) AND metadata
-_field_id = 232 GROUP BY "cg.identifier.ciatproject" ORDER BY count DESC;
- cg.identifier.ciatproject │ count
-───────────────────────────┼───────
- D145 │ 4
- LAM_LivestockPlus │ 2
- A215 │ 1
- A217 │ 1
- A220 │ 1
- A223 │ 1
- A224 │ 1
- A227 │ 1
- A229 │ 1
- A230 │ 1
- CLIMATE CHANGE MITIGATION │ 1
- LIVESTOCK │ 1
-(12 rows)
-
-Time: 240.041 ms
-
-
I think we can move those to a new cg.identifier.project if we create one
-
The cg.identifier.cpwfproject field is similarly sparse, but the CCAFS ones are widely used
-
-
2024-01-12
-
-
Export a list of affiliations to do some cleanup:
-
-
localhost/dspace7= ☘ \COPY (SELECT DISTINCT text_value AS "cg.contributor.affiliation", count(*) FROM metadatavalue WHERE dspace_object_id in (SELECT dspace_object_id FROM item) AND metadata_field_id = 211 GROUP BY "cg.contributor.affiliation" ORDER BY count DESC) to /tmp/2024-01-affiliations.csv WITH CSV HEADER;
-COPY 11719
-
-
I first did some clustering and editing in OpenRefine, then I’ll import those back into CGSpace and then do another export
-
Troubleshooting the statistics pages that aren’t working on DSpace 7
-
-
On a hunch, I queried for Solr statistics documents that did not have an id matching the 36-character UUID pattern:
I see that we had 31,744 statistic events yesterday, and 799 have no id!
-
I asked about this on Slack and will file an issue on GitHub if someone else also finds such records
-
-
Several people said they have them, so it’s a bug of some sort in DSpace, not our configuration
-
-
-
-
2024-01-13
-
-
Yesterday alone we had 37,000 unique IPs making requests to nginx
-
-
I looked up the ASNs and found 6,000 IPs from this network in Amazon Singapore: 47.128.0.0/14
-
-
-
-
2024-01-15
-
-
Investigating the CSS selector warning that I’ve seen in PM2 logs:
-
-
0|dspace-ui | 1 rules skipped due to selector errors:
-0|dspace-ui | .custom-file-input:lang(en)~.custom-file-label -> unmatched pseudo-class :lang
-
-
It seems to be a bug in Angular, as this selector comes from Bootstrap 4.6.x and is not invalid
-
-
But that led me to a more interesting issue with inlineCritical optimization for styles in Angular SSR that might be responsible for causing high load in the frontend
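For reference, a sketch of where that option lives in a stock Angular Universal server.ts (inlineCriticalCss is a real ngExpressEngine option, enabled by default since Angular 12; whether disabling it actually fixes the load, and how dspace-angular wires it, is an assumption I still need to test):

```typescript
import 'zone.js/node';
import { ngExpressEngine } from '@nguniversal/express-engine';
import * as express from 'express';
import { AppServerModule } from './src/main.server'; // path as in the CLI template

const server = express();

// Disable critical-CSS inlining (Critters) during SSR to test whether it is
// responsible for the high CPU usage in the frontend.
server.engine('html', ngExpressEngine({
  bootstrap: AppServerModule,
  inlineCriticalCss: false,
}));
```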
Looking these IPs up I see there are 18,000 coming from Comcast, 10,000 from AT&T, 4,110 from Charter, 3,500 from Cox and dozens of other residential IPs
-
-
I highly doubt these are home users browsing CGSpace… seems super fishy
-
Also, over 1,000 IPs from SpaceX Starlink in the last week. RIGHT
-
I will temporarily add a few new datacenter ISP network blocks to our rate limit:
-
-
16509 Amazon-02
-
701 UUNET
-
8075 Microsoft
-
15169 Google
-
14618 Amazon-AES
-
396982 Google Cloud
-
-
-
The load on the server immediately dropped
-
-
-
-
2024-01-17
-
-
It turns out AS701 (UUNET) is Verizon Business, which is used as an ISP for many staff at IFPRI
-
-
This was causing them to see HTTP 429 “too many requests” errors on CGSpace
-
I removed this ASN from the rate limiting
-
-
-
-
2024-01-18
-
-
Start looking at Solr stats again
-
-
I found one statistics record that has 22,000 of the same collection in owningColl and 22,000 of the same community in owningComm
-
The record is from 2015 and I think it would be easier to delete it than fix it:
This adds about 400 new identifiers to the controlled vocabulary
-
I consolidated our various project identifier fields for closed programs into one cg.identifier.project:
-
-
cg.identifier.ccafsproject
-
cg.identifier.ccafsprojectpii
-
cg.identifier.ciatproject
-
cg.identifier.cpwfproject
-
-
-
I prefixed the existing 2,644 metadata values with “CCAFS”, “CIAT”, or “CPWF” so we can figure out where they came from if need be, and deleted the old fields from the metadata registry
-
-
2024-01-26
-
-
Minor work on dspace-angular to clean up component styles
-
Add cg.identifier.publicationRank to CGSpace metadata registry and submission form
-
-
2024-01-29
-
-
Rework the nginx bot and network limits slightly to remove some old patterns/networks and remove Google
-
-
The Google Scholar team contacted me to ask why their requests were timing out (well…)
-
-
+
I figured out a way to add a new Angular component to handle all our relation fields