diff --git a/content/posts/2024-02.md b/content/posts/2024-02.md
index 846927c6e..1926290f1 100644
--- a/content/posts/2024-02.md
+++ b/content/posts/2024-02.md
@@ -62,4 +62,27 @@ $ ./ilri/resolve_orcids.py -i /tmp/iwmi-orcids.txt -o /tmp/iwmi-orcids-names.csv
 - Minor work on OpenRXV to fix a bug in the ng-select drop downs
 - Minor work on the DSpace 7 nginx configuration to allow requesting robots.txt and sitemaps without hitting rate limits
 
+## 2024-02-21
+
+- Minor updates on OpenRXV, including one bug fix for missing mapped collections
+  - Salem had to re-work the harvester for DSpace 7 since the mapped collections and parent collection list are separate!
+
+## 2024-02-22
+
+- Discuss tagging of datasets and re-work the submission form to encourage use of DOI field for any item that has a DOI, and the normal URL field if not
+  - The "cg.identifier.dataurl" field will be used for "related" datasets
+  - I still have to check and move some metadata for existing datasets
+
+## 2024-02-23
+
+- This morning Tomcat died due to an OOM kill from the kernel:
+
+```console
+kernel: Out of memory: Killed process 698 (java) total-vm:14151300kB, anon-rss:9665812kB, file-rss:320kB, shmem-rss:0kB, UID:997 pgtables:20436kB oom_score_adj:0
+```
+
+- I don't see any abnormal pattern in my Grafana graphs, for JVM or system load... very weird
+- I updated the submission form on CGSpace to include the new changes to URLs for datasets
+  - I also updated about 80 datasets to move the URLs to the correct field
+
diff --git a/docs/2015-11/index.html b/docs/2015-11/index.html index bea107e4b..9499986f5 100644 --- a/docs/2015-11/index.html +++ b/docs/2015-11/index.html @@ -34,7 +34,7 @@ Last week I had increased the limit from 30 to 60, which seemed to help, but now $ psql -c 'SELECT * from pg_stat_activity;' | grep idle | grep -c cgspace 78 "/> - + diff --git a/docs/2015-12/index.html b/docs/2015-12/index.html index 810791519..0d35e97a3 100644 --- a/docs/2015-12/index.html +++ b/docs/2015-12/index.html @@ -36,7 +36,7 @@ Replace lzop with xz in log compression cron jobs on DSpace Test—it uses less -rw-rw-r-- 1 tomcat7 tomcat7 387K Nov 18 23:59 dspace.log.2015-11-18.lzo -rw-rw-r-- 1 tomcat7 tomcat7 169K Nov 18 23:59 dspace.log.2015-11-18.xz "/> - + diff --git a/docs/2016-01/index.html b/docs/2016-01/index.html index 409a0c947..b90678c85 100644 --- a/docs/2016-01/index.html +++ b/docs/2016-01/index.html @@ -28,7 +28,7 @@ Move ILRI collection 10568/12503 from 10568/27869 to 10568/27629 using the move_ I realized it is only necessary to clear the Cocoon cache after moving collections—rather than reindexing—as no metadata has changed, and therefore no search or browse indexes need to be updated. Update GitHub wiki for documentation of maintenance tasks. 
"/> - + diff --git a/docs/2016-02/index.html b/docs/2016-02/index.html index 7312e2e8d..3402764ce 100644 --- a/docs/2016-02/index.html +++ b/docs/2016-02/index.html @@ -38,7 +38,7 @@ I noticed we have a very interesting list of countries on CGSpace: Not only are there 49,000 countries, we have some blanks (25)… Also, lots of things like “COTE D`LVOIRE” and “COTE D IVOIRE” "/> - + diff --git a/docs/2016-03/index.html b/docs/2016-03/index.html index 722893afa..b2baaf173 100644 --- a/docs/2016-03/index.html +++ b/docs/2016-03/index.html @@ -28,7 +28,7 @@ Looking at issues with author authorities on CGSpace For some reason we still have the index-lucene-update cron job active on CGSpace, but I’m pretty sure we don’t need it as of the latest few versions of Atmire’s Listings and Reports module Reinstall my local (Mac OS X) DSpace stack with Tomcat 7, PostgreSQL 9.3, and Java JDK 1.7 to match environment on CGSpace server "/> - + diff --git a/docs/2016-04/index.html b/docs/2016-04/index.html index c176aa762..41e12e6c8 100644 --- a/docs/2016-04/index.html +++ b/docs/2016-04/index.html @@ -32,7 +32,7 @@ After running DSpace for over five years I’ve never needed to look in any This will save us a few gigs of backup space we’re paying for on S3 Also, I noticed the checker log has some errors we should pay attention to: "/> - + diff --git a/docs/2016-05/index.html b/docs/2016-05/index.html index bf3b93320..ad00274d7 100644 --- a/docs/2016-05/index.html +++ b/docs/2016-05/index.html @@ -34,7 +34,7 @@ There are 3,000 IPs accessing the REST API in a 24-hour period! # awk '{print $1}' /var/log/nginx/rest.log | uniq | wc -l 3168 "/> - + diff --git a/docs/2016-06/index.html b/docs/2016-06/index.html index f533dc5d4..2fd4ce8b6 100644 --- a/docs/2016-06/index.html +++ b/docs/2016-06/index.html @@ -34,7 +34,7 @@ This is their publications set: http://ebrary.ifpri.org/oai/oai.php?verb=ListRec You can see the others by using the OAI ListSets verb: http://ebrary.ifpri.org/oai/oai.php?verb=ListSets Working on second phase of metadata migration, looks like this will work for moving CPWF-specific data in dc.identifier.fund to cg.identifier.cpwfproject and then the rest to dc.description.sponsorship "/> - + diff --git a/docs/2016-07/index.html b/docs/2016-07/index.html index d7d7cd2cc..26d0453fb 100644 --- a/docs/2016-07/index.html +++ b/docs/2016-07/index.html @@ -44,7 +44,7 @@ dspacetest=# select text_value from metadatavalue where metadata_field_id=3 and In this case the select query was showing 95 results before the update "/> - + diff --git a/docs/2016-08/index.html b/docs/2016-08/index.html index 71d594a2d..a705168da 100644 --- a/docs/2016-08/index.html +++ b/docs/2016-08/index.html @@ -42,7 +42,7 @@ $ git checkout -b 55new 5_x-prod $ git reset --hard ilri/5_x-prod $ git rebase -i dspace-5.5 "/> - + diff --git a/docs/2016-09/index.html b/docs/2016-09/index.html index 6cd58550f..eab5f13b7 100644 --- a/docs/2016-09/index.html +++ b/docs/2016-09/index.html @@ -34,7 +34,7 @@ It looks like we might be able to use OUs now, instead of DCs: $ ldapsearch -x -H ldaps://svcgroot2.cgiarad.org:3269/ -b "dc=cgiarad,dc=org" -D "admigration1@cgiarad.org" -W "(sAMAccountName=admigration1)" "/> - + diff --git a/docs/2016-10/index.html b/docs/2016-10/index.html index dfede9c8b..e6c06733b 100644 --- a/docs/2016-10/index.html +++ b/docs/2016-10/index.html @@ -42,7 +42,7 @@ I exported a random item’s metadata as CSV, deleted all columns except id 0000-0002-6115-0956||0000-0002-3812-8793||0000-0001-7462-405X "/> - + diff --git 
a/docs/2016-11/index.html b/docs/2016-11/index.html index 8a397ad2c..98b563e90 100644 --- a/docs/2016-11/index.html +++ b/docs/2016-11/index.html @@ -26,7 +26,7 @@ Add dc.type to the output options for Atmire’s Listings and Reports module Add dc.type to the output options for Atmire’s Listings and Reports module (#286) "/> - + diff --git a/docs/2016-12/index.html b/docs/2016-12/index.html index b9f5c046a..9efa8d6e4 100644 --- a/docs/2016-12/index.html +++ b/docs/2016-12/index.html @@ -46,7 +46,7 @@ I see thousands of them in the logs for the last few months, so it’s not r I’ve raised a ticket with Atmire to ask Another worrying error from dspace.log is: "/> - + diff --git a/docs/2017-01/index.html b/docs/2017-01/index.html index 174095efb..3e0f44e17 100644 --- a/docs/2017-01/index.html +++ b/docs/2017-01/index.html @@ -28,7 +28,7 @@ I checked to see if the Solr sharding task that is supposed to run on January 1s I tested on DSpace Test as well and it doesn’t work there either I asked on the dspace-tech mailing list because it seems to be broken, and actually now I’m not sure if we’ve ever had the sharding task run successfully over all these years "/> - + diff --git a/docs/2017-02/index.html b/docs/2017-02/index.html index 59c4c67f9..26ed058ac 100644 --- a/docs/2017-02/index.html +++ b/docs/2017-02/index.html @@ -50,7 +50,7 @@ DELETE 1 Create issue on GitHub to track the addition of CCAFS Phase II project tags (#301) Looks like we’ll be using cg.identifier.ccafsprojectpii as the field name "/> - + diff --git a/docs/2017-03/index.html b/docs/2017-03/index.html index 3e66c9492..164bc74c4 100644 --- a/docs/2017-03/index.html +++ b/docs/2017-03/index.html @@ -54,7 +54,7 @@ Interestingly, it seems DSpace 4.x’s thumbnails were sRGB, but forcing reg $ identify ~/Desktop/alc_contrastes_desafios.jpg /Users/aorth/Desktop/alc_contrastes_desafios.jpg JPEG 464x600 464x600+0+0 8-bit CMYK 168KB 0.000u 0:00.000 "/> - + diff --git a/docs/2017-04/index.html b/docs/2017-04/index.html index ec100b675..782579eec 100644 --- a/docs/2017-04/index.html +++ b/docs/2017-04/index.html @@ -40,7 +40,7 @@ Testing the CMYK patch on a collection with 650 items: $ [dspace]/bin/dspace filter-media -f -i 10568/16498 -p "ImageMagick PDF Thumbnail" -v >& /tmp/filter-media-cmyk.txt "/> - + diff --git a/docs/2017-05/index.html b/docs/2017-05/index.html index 1f5114edf..45651f67b 100644 --- a/docs/2017-05/index.html +++ b/docs/2017-05/index.html @@ -18,7 +18,7 @@ - + diff --git a/docs/2017-06/index.html b/docs/2017-06/index.html index 415960343..9dd2f7228 100644 --- a/docs/2017-06/index.html +++ b/docs/2017-06/index.html @@ -18,7 +18,7 @@ - + diff --git a/docs/2017-07/index.html b/docs/2017-07/index.html index c5baad7dd..197fcaf1e 100644 --- a/docs/2017-07/index.html +++ b/docs/2017-07/index.html @@ -36,7 +36,7 @@ Merge changes for WLE Phase II theme rename (#329) Looking at extracting the metadata registries from ICARDA’s MEL DSpace database so we can compare fields with CGSpace We can use PostgreSQL’s extended output format (-x) plus sed to format the output into quasi XML: "/> - + diff --git a/docs/2017-08/index.html b/docs/2017-08/index.html index eacbc26ab..c8331d1a7 100644 --- a/docs/2017-08/index.html +++ b/docs/2017-08/index.html @@ -60,7 +60,7 @@ This was due to newline characters in the dc.description.abstract column, which I exported a new CSV from the collection on DSpace Test and then manually removed the characters in vim using g/^$/d Then I cleaned up the author authorities and HTML characters in OpenRefine and 
sent the file back to Abenet "/> - + diff --git a/docs/2017-09/index.html b/docs/2017-09/index.html index 921aec376..9d6a6e555 100644 --- a/docs/2017-09/index.html +++ b/docs/2017-09/index.html @@ -32,7 +32,7 @@ Linode sent an alert that CGSpace (linode18) was using 261% CPU for the past two Ask Sisay to clean up the WLE approvers a bit, as Marianne’s user account is both in the approvers step as well as the group "/> - + diff --git a/docs/2017-10/index.html b/docs/2017-10/index.html index 3f71f4441..92b93d818 100644 --- a/docs/2017-10/index.html +++ b/docs/2017-10/index.html @@ -34,7 +34,7 @@ http://hdl.handle.net/10568/78495||http://hdl.handle.net/10568/79336 There appears to be a pattern but I’ll have to look a bit closer and try to clean them up automatically, either in SQL or in OpenRefine Add Katherine Lutz to the groups for content submission and edit steps of the CGIAR System collections "/> - + diff --git a/docs/2017-11/index.html b/docs/2017-11/index.html index 4f6bf17b8..61d2c8ba5 100644 --- a/docs/2017-11/index.html +++ b/docs/2017-11/index.html @@ -48,7 +48,7 @@ Generate list of authors on CGSpace for Peter to go through and correct: dspace=# \copy (select distinct text_value, count(*) as count from metadatavalue where metadata_field_id = (select metadata_field_id from metadatafieldregistry where element = 'contributor' and qualifier = 'author') AND resource_type_id = 2 group by text_value order by count desc) to /tmp/authors.csv with csv; COPY 54701 "/> - + diff --git a/docs/2017-12/index.html b/docs/2017-12/index.html index 9fac10b93..f7d64e0d6 100644 --- a/docs/2017-12/index.html +++ b/docs/2017-12/index.html @@ -30,7 +30,7 @@ The logs say “Timeout waiting for idle object” PostgreSQL activity says there are 115 connections currently The list of connections to XMLUI and REST API for today: "/> - + diff --git a/docs/2018-01/index.html b/docs/2018-01/index.html index 3f5d0e39a..0ff931d11 100644 --- a/docs/2018-01/index.html +++ b/docs/2018-01/index.html @@ -150,7 +150,7 @@ dspace.log.2018-01-02:34 Danny wrote to ask for help renewing the wildcard ilri.org certificate and I advised that we should probably use Let’s Encrypt if it’s just a handful of domains "/> - + diff --git a/docs/2018-02/index.html b/docs/2018-02/index.html index d0c73550f..d40f7f4e8 100644 --- a/docs/2018-02/index.html +++ b/docs/2018-02/index.html @@ -30,7 +30,7 @@ We don’t need to distinguish between internal and external works, so that Yesterday I figured out how to monitor DSpace sessions using JMX I copied the logic in the jmx_tomcat_dbpools provided by Ubuntu’s munin-plugins-java package and used the stuff I discovered about JMX in 2018-01 "/> - + diff --git a/docs/2018-03/index.html b/docs/2018-03/index.html index 430a0515b..e9b2453ec 100644 --- a/docs/2018-03/index.html +++ b/docs/2018-03/index.html @@ -24,7 +24,7 @@ Export a CSV of the IITA community metadata for Martin Mueller Export a CSV of the IITA community metadata for Martin Mueller "/> - + diff --git a/docs/2018-04/index.html b/docs/2018-04/index.html index 8aa2c53cc..70cf0d0c1 100644 --- a/docs/2018-04/index.html +++ b/docs/2018-04/index.html @@ -26,7 +26,7 @@ Catalina logs at least show some memory errors yesterday: I tried to test something on DSpace Test but noticed that it’s down since god knows when Catalina logs at least show some memory errors yesterday: "/> - + diff --git a/docs/2018-05/index.html b/docs/2018-05/index.html index ccaa9c231..17f8aef86 100644 --- a/docs/2018-05/index.html +++ b/docs/2018-05/index.html @@ -38,7 +38,7 @@ 
http://localhost:3000/solr/statistics/update?stream.body=%3Ccommit/%3E Then I reduced the JVM heap size from 6144 back to 5120m Also, I switched it to use OpenJDK instead of Oracle Java, as well as re-worked the Ansible infrastructure scripts to support hosts choosing which distribution they want to use "/> - + diff --git a/docs/2018-06/index.html b/docs/2018-06/index.html index f0505f51e..874535b48 100644 --- a/docs/2018-06/index.html +++ b/docs/2018-06/index.html @@ -58,7 +58,7 @@ real 74m42.646s user 8m5.056s sys 2m7.289s "/> - + diff --git a/docs/2018-07/index.html b/docs/2018-07/index.html index afd520010..9b61d92c2 100644 --- a/docs/2018-07/index.html +++ b/docs/2018-07/index.html @@ -36,7 +36,7 @@ During the mvn package stage on the 5.8 branch I kept getting issues with java r There is insufficient memory for the Java Runtime Environment to continue. "/> - + diff --git a/docs/2018-08/index.html b/docs/2018-08/index.html index a5ce701cc..69162fe38 100644 --- a/docs/2018-08/index.html +++ b/docs/2018-08/index.html @@ -46,7 +46,7 @@ Anyways, perhaps I should increase the JVM heap from 5120m to 6144m like we did The server only has 8GB of RAM so we’ll eventually need to upgrade to a larger one because we’ll start starving the OS, PostgreSQL, and command line batch processes I ran all system updates on DSpace Test and rebooted it "/> - + diff --git a/docs/2018-09/index.html b/docs/2018-09/index.html index 1b36e1f35..b8813eef2 100644 --- a/docs/2018-09/index.html +++ b/docs/2018-09/index.html @@ -30,7 +30,7 @@ I’ll update the DSpace role in our Ansible infrastructure playbooks and ru Also, I’ll re-run the postgresql tasks because the custom PostgreSQL variables are dynamic according to the system’s RAM, and we never re-ran them after migrating to larger Linodes last month I’m testing the new DSpace 5.8 branch in my Ubuntu 18.04 environment and I’m getting those autowire errors in Tomcat 8.5.30 again: "/> - + diff --git a/docs/2018-10/index.html b/docs/2018-10/index.html index af627439b..cfefe0297 100644 --- a/docs/2018-10/index.html +++ b/docs/2018-10/index.html @@ -26,7 +26,7 @@ I created a GitHub issue to track this #389, because I’m super busy in Nai Phil Thornton got an ORCID identifier so we need to add it to the list on CGSpace and tag his existing items I created a GitHub issue to track this #389, because I’m super busy in Nairobi right now "/> - + diff --git a/docs/2018-11/index.html b/docs/2018-11/index.html index 5a093d953..cadcea9d2 100644 --- a/docs/2018-11/index.html +++ b/docs/2018-11/index.html @@ -36,7 +36,7 @@ Send a note about my dspace-statistics-api to the dspace-tech mailing list Linode has been sending mails a few times a day recently that CGSpace (linode18) has had high CPU usage Today these are the top 10 IPs: "/> - + diff --git a/docs/2018-12/index.html b/docs/2018-12/index.html index 1fecf9191..ca3b7a925 100644 --- a/docs/2018-12/index.html +++ b/docs/2018-12/index.html @@ -36,7 +36,7 @@ Then I ran all system updates and restarted the server I noticed that there is another issue with PDF thumbnails on CGSpace, and I see there was another Ghostscript vulnerability last week "/> - + diff --git a/docs/2019-01/index.html b/docs/2019-01/index.html index a6bd8f7c7..4fdf185ea 100644 --- a/docs/2019-01/index.html +++ b/docs/2019-01/index.html @@ -50,7 +50,7 @@ I don’t see anything interesting in the web server logs around that time t 357 207.46.13.1 903 54.70.40.11 "/> - + diff --git a/docs/2019-02/index.html b/docs/2019-02/index.html index a92bec977..f318c02cb 100644 
--- a/docs/2019-02/index.html +++ b/docs/2019-02/index.html @@ -72,7 +72,7 @@ real 0m19.873s user 0m22.203s sys 0m1.979s "/> - + diff --git a/docs/2019-03/index.html b/docs/2019-03/index.html index 3b1206daf..bee9b3b32 100644 --- a/docs/2019-03/index.html +++ b/docs/2019-03/index.html @@ -46,7 +46,7 @@ Most worryingly, there are encoding errors in the abstracts for eleven items, fo I think I will need to ask Udana to re-copy and paste the abstracts with more care using Google Docs "/> - + diff --git a/docs/2019-04/index.html b/docs/2019-04/index.html index 862b67982..033117c8f 100644 --- a/docs/2019-04/index.html +++ b/docs/2019-04/index.html @@ -64,7 +64,7 @@ $ ./fix-metadata-values.py -i /tmp/2019-02-21-fix-4-regions.csv -db dspace -u ds $ ./delete-metadata-values.py -i /tmp/2019-02-21-delete-2-countries.csv -db dspace -u dspace -p 'fuuu' -m 228 -f cg.coverage.country -d $ ./delete-metadata-values.py -i /tmp/2019-02-21-delete-1-region.csv -db dspace -u dspace -p 'fuuu' -m 231 -f cg.coverage.region -d "/> - + diff --git a/docs/2019-05/index.html b/docs/2019-05/index.html index 8230ecc6a..181d1fc3c 100644 --- a/docs/2019-05/index.html +++ b/docs/2019-05/index.html @@ -48,7 +48,7 @@ DELETE 1 But after this I tried to delete the item from the XMLUI and it is still present… "/> - + diff --git a/docs/2019-06/index.html b/docs/2019-06/index.html index 0d6cd05f0..1da538b42 100644 --- a/docs/2019-06/index.html +++ b/docs/2019-06/index.html @@ -34,7 +34,7 @@ Run system updates on CGSpace (linode18) and reboot it Skype with Marie-Angélique and Abenet about CG Core v2 "/> - + diff --git a/docs/2019-07/index.html b/docs/2019-07/index.html index 5afc33178..6c937e314 100644 --- a/docs/2019-07/index.html +++ b/docs/2019-07/index.html @@ -38,7 +38,7 @@ CGSpace Abenet had another similar issue a few days ago when trying to find the stats for 2018 in the RTB community "/> - + diff --git a/docs/2019-08/index.html b/docs/2019-08/index.html index df5cb3080..b4caf98f1 100644 --- a/docs/2019-08/index.html +++ b/docs/2019-08/index.html @@ -46,7 +46,7 @@ After rebooting, all statistics cores were loaded… wow, that’s luck Run system updates on DSpace Test (linode19) and reboot it "/> - + diff --git a/docs/2019-09/index.html b/docs/2019-09/index.html index fb20a382c..72c1756e9 100644 --- a/docs/2019-09/index.html +++ b/docs/2019-09/index.html @@ -72,7 +72,7 @@ Here are the top ten IPs in the nginx XMLUI and REST/OAI logs this morning: 7249 2a01:7e00::f03c:91ff:fe18:7396 9124 45.5.186.2 "/> - + diff --git a/docs/2019-10/index.html b/docs/2019-10/index.html index 5b39430a3..500dc51c0 100644 --- a/docs/2019-10/index.html +++ b/docs/2019-10/index.html @@ -18,7 +18,7 @@ - + diff --git a/docs/2019-11/index.html b/docs/2019-11/index.html index d5225a28e..9f05f35e6 100644 --- a/docs/2019-11/index.html +++ b/docs/2019-11/index.html @@ -58,7 +58,7 @@ Let’s see how many of the REST API requests were for bitstreams (because t # zcat --force /var/log/nginx/rest.log.*.gz | grep -E "[0-9]{1,2}/Oct/2019" | grep -c -E "/rest/bitstreams" 106781 "/> - + diff --git a/docs/2019-12/index.html b/docs/2019-12/index.html index 34b27137d..a3240eaf7 100644 --- a/docs/2019-12/index.html +++ b/docs/2019-12/index.html @@ -46,7 +46,7 @@ Make sure all packages are up to date and the package manager is up to date, the # dpkg -C # reboot "/> - + diff --git a/docs/2020-01/index.html b/docs/2020-01/index.html index 86994e71b..27166a3f8 100644 --- a/docs/2020-01/index.html +++ b/docs/2020-01/index.html @@ -56,7 +56,7 @@ I tweeted the CGSpace 
repository link "/> - + diff --git a/docs/2020-02/index.html b/docs/2020-02/index.html index 36204c0f7..08b9b1442 100644 --- a/docs/2020-02/index.html +++ b/docs/2020-02/index.html @@ -38,7 +38,7 @@ The code finally builds and runs with a fresh install "/> - + diff --git a/docs/2020-03/index.html b/docs/2020-03/index.html index 155f621a3..556d051a9 100644 --- a/docs/2020-03/index.html +++ b/docs/2020-03/index.html @@ -42,7 +42,7 @@ You need to download this into the DSpace 6.x source and compile it "/> - + diff --git a/docs/2020-04/index.html b/docs/2020-04/index.html index c1a669c8f..a1cea4623 100644 --- a/docs/2020-04/index.html +++ b/docs/2020-04/index.html @@ -48,7 +48,7 @@ The third item now has a donut with score 1 since I tweeted it last week On the same note, the one item Abenet pointed out last week now has a donut with score of 104 after I tweeted it last week "/> - + diff --git a/docs/2020-05/index.html b/docs/2020-05/index.html index 496b15bb2..0f6233ef8 100644 --- a/docs/2020-05/index.html +++ b/docs/2020-05/index.html @@ -34,7 +34,7 @@ I see that CGSpace (linode18) is still using PostgreSQL JDBC driver version 42.2 "/> - + diff --git a/docs/2020-06/index.html b/docs/2020-06/index.html index 2e9d524bd..fa3758189 100644 --- a/docs/2020-06/index.html +++ b/docs/2020-06/index.html @@ -36,7 +36,7 @@ I sent Atmire the dspace.log from today and told them to log into the server to In other news, I checked the statistics API on DSpace 6 and it’s working I tried to build the OAI registry on the freshly migrated DSpace 6 on DSpace Test and I get an error: "/> - + diff --git a/docs/2020-07/index.html b/docs/2020-07/index.html index dcc6c0543..7b9373fe2 100644 --- a/docs/2020-07/index.html +++ b/docs/2020-07/index.html @@ -38,7 +38,7 @@ I restarted Tomcat and PostgreSQL and the issue was gone Since I was restarting Tomcat anyways I decided to redeploy the latest changes from the 5_x-prod branch and I added a note about COVID-19 items to the CGSpace frontpage at Peter’s request "/> - + diff --git a/docs/2020-08/index.html b/docs/2020-08/index.html index d1cde80e0..810d273b6 100644 --- a/docs/2020-08/index.html +++ b/docs/2020-08/index.html @@ -36,7 +36,7 @@ It is class based so I can easily add support for other vocabularies, and the te "/> - + diff --git a/docs/2020-09/index.html b/docs/2020-09/index.html index 6512df2b2..6909d32cb 100644 --- a/docs/2020-09/index.html +++ b/docs/2020-09/index.html @@ -48,7 +48,7 @@ I filed a bug on OpenRXV: https://github.com/ilri/OpenRXV/issues/39 I filed an issue on OpenRXV to make some minor edits to the admin UI: https://github.com/ilri/OpenRXV/issues/40 "/> - + diff --git a/docs/2020-10/index.html b/docs/2020-10/index.html index 5854e6383..25b8d9aad 100644 --- a/docs/2020-10/index.html +++ b/docs/2020-10/index.html @@ -44,7 +44,7 @@ During the FlywayDB migration I got an error: "/> - + diff --git a/docs/2020-11/index.html b/docs/2020-11/index.html index 40987bd42..b0a1e7429 100644 --- a/docs/2020-11/index.html +++ b/docs/2020-11/index.html @@ -32,7 +32,7 @@ So far we’ve spent at least fifty hours to process the statistics and stat "/> - + diff --git a/docs/2020-12/index.html b/docs/2020-12/index.html index c874b36ae..5b0a8ce82 100644 --- a/docs/2020-12/index.html +++ b/docs/2020-12/index.html @@ -36,7 +36,7 @@ I started processing those (about 411,000 records): "/> - + diff --git a/docs/2021-01/index.html b/docs/2021-01/index.html index 2f3b19f32..44151bfa6 100644 --- a/docs/2021-01/index.html +++ b/docs/2021-01/index.html @@ -50,7 +50,7 @@ For 
example, this item has 51 views on CGSpace, but 0 on AReS "/> - + diff --git a/docs/2021-02/index.html b/docs/2021-02/index.html index 30f59e3eb..0adb5a8af 100644 --- a/docs/2021-02/index.html +++ b/docs/2021-02/index.html @@ -60,7 +60,7 @@ $ curl -s 'http://localhost:9200/openrxv-items-temp/_count?q=*&pretty } } "/> - + diff --git a/docs/2021-03/index.html b/docs/2021-03/index.html index 5f4e9a1ff..cdd9a17cf 100644 --- a/docs/2021-03/index.html +++ b/docs/2021-03/index.html @@ -34,7 +34,7 @@ Also, we found some issues building and running OpenRXV currently due to ecosyst "/> - + diff --git a/docs/2021-04/index.html b/docs/2021-04/index.html index fe62644ed..e1c4735ba 100644 --- a/docs/2021-04/index.html +++ b/docs/2021-04/index.html @@ -44,7 +44,7 @@ Perhaps one of the containers crashed, I should have looked closer but I was in "/> - + diff --git a/docs/2021-05/index.html b/docs/2021-05/index.html index a7c8247b3..be37789d0 100644 --- a/docs/2021-05/index.html +++ b/docs/2021-05/index.html @@ -36,7 +36,7 @@ I looked at the top user agents and IPs in the Solr statistics for last month an I will add the RI/1.0 pattern to our DSpace agents overload and purge them from Solr (we had previously seen this agent with 9,000 hits or so in 2020-09), but I think I will leave the Microsoft Word one… as that’s an actual user… "/> - + diff --git a/docs/2021-06/index.html b/docs/2021-06/index.html index b423e0e9e..39360a58d 100644 --- a/docs/2021-06/index.html +++ b/docs/2021-06/index.html @@ -36,7 +36,7 @@ I simply started it and AReS was running again: "/> - + diff --git a/docs/2021-07/index.html b/docs/2021-07/index.html index b09909bf7..ededf4c2a 100644 --- a/docs/2021-07/index.html +++ b/docs/2021-07/index.html @@ -30,7 +30,7 @@ Export another list of ALL subjects on CGSpace, including AGROVOC and non-AGROVO localhost/dspace63= > \COPY (SELECT DISTINCT LOWER(text_value) AS subject, count(*) FROM metadatavalue WHERE dspace_object_id in (SELECT dspace_object_id FROM item) AND metadata_field_id IN (119, 120, 127, 122, 128, 125, 135, 203, 208, 210, 215, 123, 236, 242, 187) GROUP BY subject ORDER BY count DESC) to /tmp/2021-07-01-all-subjects.csv WITH CSV HEADER; COPY 20994 "/> - + diff --git a/docs/2021-08/index.html b/docs/2021-08/index.html index fb0cfa7a8..dccd98e42 100644 --- a/docs/2021-08/index.html +++ b/docs/2021-08/index.html @@ -32,7 +32,7 @@ Update Docker images on AReS server (linode20) and reboot the server: I decided to upgrade linode20 from Ubuntu 18.04 to 20.04 "/> - + diff --git a/docs/2021-09/index.html b/docs/2021-09/index.html index 534db9712..c6b067c12 100644 --- a/docs/2021-09/index.html +++ b/docs/2021-09/index.html @@ -48,7 +48,7 @@ The syntax Moayad showed me last month doesn’t seem to honor the search qu "/> - + diff --git a/docs/2021-10/index.html b/docs/2021-10/index.html index dee5bc389..0e0680572 100644 --- a/docs/2021-10/index.html +++ b/docs/2021-10/index.html @@ -46,7 +46,7 @@ $ wc -l /tmp/2021-10-01-affiliations.txt So we have 1879/7100 (26.46%) matching already "/> - + diff --git a/docs/2021-11/index.html b/docs/2021-11/index.html index c3f2b2e23..97507be04 100644 --- a/docs/2021-11/index.html +++ b/docs/2021-11/index.html @@ -32,7 +32,7 @@ First I exported all the 2019 stats from CGSpace: $ ./run.sh -s http://localhost:8081/solr/statistics -f 'time:2019-*' -a export -o statistics-2019.json -k uid $ zstd statistics-2019.json "/> - + diff --git a/docs/2021-12/index.html b/docs/2021-12/index.html index cc741f89d..0caa25712 100644 --- a/docs/2021-12/index.html +++ 
b/docs/2021-12/index.html @@ -40,7 +40,7 @@ Purging 455 hits from WhatsApp in statistics Total number of bot hits purged: 3679 "/> - + diff --git a/docs/2022-01/index.html b/docs/2022-01/index.html index 6f08ebfad..f658a2324 100644 --- a/docs/2022-01/index.html +++ b/docs/2022-01/index.html @@ -24,7 +24,7 @@ Start a full harvest on AReS Start a full harvest on AReS "/> - + diff --git a/docs/2022-02/index.html b/docs/2022-02/index.html index 3ac09ae86..a26bf379c 100644 --- a/docs/2022-02/index.html +++ b/docs/2022-02/index.html @@ -38,7 +38,7 @@ We agreed to try to do more alignment of affiliations/funders with ROR "/> - + diff --git a/docs/2022-03/index.html b/docs/2022-03/index.html index 1aa38c87f..ce8880931 100644 --- a/docs/2022-03/index.html +++ b/docs/2022-03/index.html @@ -34,7 +34,7 @@ $ ./ilri/check-duplicates.py -i /tmp/tac4.csv -db dspace -u dspace -p 'fuuu& $ csvcut -c id,filename ~/Downloads/2022-03-01-CGSpace-TAC-ICW-batch4-701-980.csv > /tmp/tac4-filenames.csv $ csvjoin -c id /tmp/2022-03-01-tac-batch4-701-980.csv /tmp/tac4-filenames.csv > /tmp/2022-03-01-tac-batch4-701-980-filenames.csv "/> - + diff --git a/docs/2022-04/index.html b/docs/2022-04/index.html index 9a1ecabbe..9dff992bd 100644 --- a/docs/2022-04/index.html +++ b/docs/2022-04/index.html @@ -18,7 +18,7 @@ - + diff --git a/docs/2022-05/index.html b/docs/2022-05/index.html index c84c54b4e..a9be893f1 100644 --- a/docs/2022-05/index.html +++ b/docs/2022-05/index.html @@ -66,7 +66,7 @@ If I query Solr for time:2022-04* AND dns:*msnbot* AND dns:*.msn.com. I see a ha I purged 93,974 hits from these IPs using my check-spider-ip-hits.sh script "/> - + diff --git a/docs/2022-06/index.html b/docs/2022-06/index.html index f5a108e9e..1ee9a7ff3 100644 --- a/docs/2022-06/index.html +++ b/docs/2022-06/index.html @@ -48,7 +48,7 @@ There seem to be many more of these: "/> - + diff --git a/docs/2022-07/index.html b/docs/2022-07/index.html index 34269854c..f97d26ad4 100644 --- a/docs/2022-07/index.html +++ b/docs/2022-07/index.html @@ -34,7 +34,7 @@ Also, the trgm functions I’ve used before are case insensitive, but Levens "/> - + diff --git a/docs/2022-08/index.html b/docs/2022-08/index.html index 5ca6d5b4c..69e8c7150 100644 --- a/docs/2022-08/index.html +++ b/docs/2022-08/index.html @@ -24,7 +24,7 @@ Our request to add CC-BY-3.0-IGO to SPDX was approved a few weeks ago Our request to add CC-BY-3.0-IGO to SPDX was approved a few weeks ago "/> - + diff --git a/docs/2022-09/index.html b/docs/2022-09/index.html index 39a00d6ce..969354364 100644 --- a/docs/2022-09/index.html +++ b/docs/2022-09/index.html @@ -46,7 +46,7 @@ I also fixed a few bugs and improved the region-matching logic "/> - + diff --git a/docs/2022-10/index.html b/docs/2022-10/index.html index dd2ff2b61..50c9d4d15 100644 --- a/docs/2022-10/index.html +++ b/docs/2022-10/index.html @@ -36,7 +36,7 @@ I filed an issue to ask about Java 11+ support "/> - + diff --git a/docs/2022-11/index.html b/docs/2022-11/index.html index 8dff2ac1e..bc79cca0d 100644 --- a/docs/2022-11/index.html +++ b/docs/2022-11/index.html @@ -44,7 +44,7 @@ I want to make sure they use groups instead of individuals where possible! 
I reverted the Cocoon autosave change because it was more of a nuissance that Peter can’t upload CSVs from the web interface and is a very low severity security issue "/> - + diff --git a/docs/2022-12/index.html b/docs/2022-12/index.html index e80bf10e5..a146ce315 100644 --- a/docs/2022-12/index.html +++ b/docs/2022-12/index.html @@ -36,7 +36,7 @@ I exported the CCAFS and IITA communities, extracted just the country and region Add a few more authors to my CSV with author names and ORCID identifiers and tag 283 items! Replace “East Asia” with “Eastern Asia” region on CGSpace (UN M.49 region) "/> - + diff --git a/docs/2023-01/index.html b/docs/2023-01/index.html index 3e76e15d3..4306a99a7 100644 --- a/docs/2023-01/index.html +++ b/docs/2023-01/index.html @@ -34,7 +34,7 @@ I see we have some new ones that aren’t in our list if I combine with this "/> - + diff --git a/docs/2023-02/index.html b/docs/2023-02/index.html index 3c20f5a97..2aa57b9a8 100644 --- a/docs/2023-02/index.html +++ b/docs/2023-02/index.html @@ -32,7 +32,7 @@ I want to try to expand my use of their data to journals, publishers, volumes, i "/> - + diff --git a/docs/2023-03/index.html b/docs/2023-03/index.html index 0ed46a59c..e856c0bbf 100644 --- a/docs/2023-03/index.html +++ b/docs/2023-03/index.html @@ -28,7 +28,7 @@ Remove cg.subject.wle and cg.identifier.wletheme from CGSpace input form after c iso-codes 4.13.0 was released, which incorporates my changes to the common names for Iran, Laos, and Syria I finally got through with porting the input form from DSpace 6 to DSpace 7 "/> - + diff --git a/docs/2023-04/index.html b/docs/2023-04/index.html index 9f77b6dce..ebc12ae91 100644 --- a/docs/2023-04/index.html +++ b/docs/2023-04/index.html @@ -36,7 +36,7 @@ I also did a check for missing country/region mappings with csv-metadata-quality Start a harvest on AReS "/> - + diff --git a/docs/2023-05/index.html b/docs/2023-05/index.html index e0df28e50..25047bdae 100644 --- a/docs/2023-05/index.html +++ b/docs/2023-05/index.html @@ -46,7 +46,7 @@ Also I found at least two spelling mistakes, for example “decison support Work on cleaning, proofing, and uploading twenty-seven records for IFPRI to CGSpace "/> - + diff --git a/docs/2023-06/index.html b/docs/2023-06/index.html index 6468ed2b9..0ddb93880 100644 --- a/docs/2023-06/index.html +++ b/docs/2023-06/index.html @@ -44,7 +44,7 @@ From what I can see we need to upgrade the MODS schema from 3.1 to 3.7 and then "/> - + diff --git a/docs/2023-07/index.html b/docs/2023-07/index.html index 17f440536..0629401d1 100644 --- a/docs/2023-07/index.html +++ b/docs/2023-07/index.html @@ -18,7 +18,7 @@ - + diff --git a/docs/2023-08/index.html b/docs/2023-08/index.html index 117ea5b1c..c301e3179 100644 --- a/docs/2023-08/index.html +++ b/docs/2023-08/index.html @@ -34,7 +34,7 @@ I did some minor cleanups myself and applied them to CGSpace Start working on some batch uploads for IFPRI "/> - + diff --git a/docs/2023-09/index.html b/docs/2023-09/index.html index 57b33cb20..049f2b822 100644 --- a/docs/2023-09/index.html +++ b/docs/2023-09/index.html @@ -26,7 +26,7 @@ Start a harvest on AReS Export CGSpace to check for missing Initiative collection mappings Start a harvest on AReS "/> - + diff --git a/docs/2023-10/index.html b/docs/2023-10/index.html index edf22575a..16bedd581 100644 --- a/docs/2023-10/index.html +++ b/docs/2023-10/index.html @@ -36,7 +36,7 @@ We can be on the safe side by using only abstracts for items that are licensed u "/> - + diff --git a/docs/2023-11/index.html 
b/docs/2023-11/index.html index b91b09cd2..208869b04 100644 --- a/docs/2023-11/index.html +++ b/docs/2023-11/index.html @@ -42,7 +42,7 @@ I improved the filtering and wrote some Python using pandas to merge my sources Export CGSpace to check missing Initiative collection mappings Start a harvest on AReS "/> - + diff --git a/docs/2023-12/index.html b/docs/2023-12/index.html index f5c3d8811..63b7046d0 100644 --- a/docs/2023-12/index.html +++ b/docs/2023-12/index.html @@ -18,7 +18,7 @@ - + diff --git a/docs/2024-01/index.html b/docs/2024-01/index.html index 56f13166f..c6dd61fba 100644 --- a/docs/2024-01/index.html +++ b/docs/2024-01/index.html @@ -6,41 +6,27 @@ - - + - - + + - - + - + @@ -48,11 +34,11 @@ Work on IFPRI ISNAR archive cleanup { "@context": "http://schema.org", "@type": "BlogPosting", - "headline": "January, 2024", + "headline": "February, 2024", "url": "https://alanorth.github.io/cgspace-notes/2024-01/", - "wordCount": "2215", - "datePublished": "2024-01-02T10:08:00+03:00", - "dateModified": "2024-02-05T11:09:40+03:00", + "wordCount": "422", + "datePublished": "2024-01-05T11:10:00+03:00", + "dateModified": "2024-02-20T22:55:09+03:00", "author": { "@type": "Person", "name": "Alan Orth" @@ -65,7 +51,7 @@ Work on IFPRI ISNAR archive cleanup - January, 2024 | CGSpace Notes + February, 2024 | CGSpace Notes @@ -117,466 +103,98 @@ Work on IFPRI ISNAR archive cleanup
-

January, 2024

+

February, 2024

-

2024-01-02

+

2024-02-05

+
dspace=# BEGIN;
+BEGIN
+dspace=*# UPDATE metadatavalue SET text_value=LOWER(text_value) WHERE dspace_object_id IN (SELECT uuid FROM item) AND metadata_field_id=187 AND text_value ~ '[[:upper:]]';
+UPDATE 180
+dspace=*# COMMIT;
+COMMIT
+

2024-02-06

- -
  • Continue testing and debugging the cgspace-java-helpers on DSpace 7
  • -
  • Work on IFPRI ISNAR archive cleanup
  • - -

    2024-01-03

    - -
    stream {
    -    upstream handle_tcp_9000 {
    -       server 188.34.177.10:9000;
    -    }
    -
    -    server {
    -        listen 9000;
    -        proxy_connect_timeout 1s;
    -        proxy_timeout 3s;
    -        proxy_pass handle_tcp_9000;
    -    }
    -}
    -
    -

    2024-01-04

    - -

    2024-01-05

    - -

    2024-01-06

    - -

    2024-01-07

    - -
    47.76.35.19 - - [07/Jan/2024:00:00:02 +0100] "HEAD /search/?f.accessRights=Open+Access%2Cequals&f.actionArea=Resilient+Agrifood+Systems%2Cequals&f.author=Burkart%2C+Stefan%2Cequals&f.country=Kenya%2Cequals&f.impactArea=Climate+adaptation+and+mitigation%2Cequals&f.itemtype=Brief%2Cequals&f.publisher=CGIAR+System+Organization%2Cequals&f.region=Asia%2Cequals&f.sdg=SDG+12+-+Responsible+consumption+and+production%2Cequals&f.sponsorship=CGIAR+Trust+Fund%2Cequals&f.subject=environmental+factors%2Cequals&spc.page=1 HTTP/1.1" 499 0 "-" "Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/74.0.2504.63 Safari/537.36"
    -
    -
    $ wget https://asn.ipinfo.app/api/text/list/AS16276 \
    -     https://asn.ipinfo.app/api/text/list/AS23576 \
    -     https://asn.ipinfo.app/api/text/list/AS24940 \
    -     https://asn.ipinfo.app/api/text/list/AS13238 \
    -     https://asn.ipinfo.app/api/text/list/AS14061 \
    -     https://asn.ipinfo.app/api/text/list/AS12876 \
    -     https://asn.ipinfo.app/api/text/list/AS55286 \
    -     https://asn.ipinfo.app/api/text/list/AS203020 \
    -     https://asn.ipinfo.app/api/text/list/AS204287 \
    -     https://asn.ipinfo.app/api/text/list/AS50245 \
    -     https://asn.ipinfo.app/api/text/list/AS6939 \
    -     https://asn.ipinfo.app/api/text/list/AS45102 \
    -     https://asn.ipinfo.app/api/text/list/AS21859
    -$ cat AS* | sort | uniq | wc -l
    -4897
    -$ cat AS* | ~/go/bin/mapcidr -a > /tmp/networks.txt
    -$ wc -l /tmp/networks.txt
    -2017 /tmp/networks.txt
    +
    $ dspace metadata-export -i 10568/16814 -f /tmp/iwmi.csv
    +$ csvcut -c 'cg.creator.identifier,cg.creator.identifier[en_US]' ~/Downloads/2024-02-06-iwmi.csv \
    +  | grep -oE '[A-Z0-9]{4}-[A-Z0-9]{4}-[A-Z0-9]{4}-[A-Z0-9]{4}' \
    +  | sort -u \
    +  | tee /tmp/iwmi-orcids.txt \
    +  | wc -l
    +353
    +$ ./ilri/resolve_orcids.py -i /tmp/iwmi-orcids.txt -o /tmp/iwmi-orcids-names.csv -d
     
      -
    • I’m surprised to see the number of networks reduced from my current ones… hmmm.
    • -
    • I will also update my list of Bing networks:
    • +
    • I noticed some similar looking names in our list so I clustered them in OpenRefine and manually checked a dozen or so to update our list
    -
    $ ./ilri/bing-networks-to-ips.sh
    -$ ~/go/bin/mapcidr -a < /tmp/bing-ips.txt > /tmp/bing-networks.txt
    -$ wc -l /tmp/bing-networks.txt
    -250 /tmp/bing-networks.txt
    -

    2024-01-08

    +

    2024-02-07

      -
    • Export list of publishers for Peter to select some amount to use as a controlled vocabulary:
    • +
    • Maria asked me about the “missing” item from last week again +
        +
      • I can see it when I used the Admin search, but not in her workflow
      • +
      • It was submitted by TIP so I checked that user’s workspace and found it there
      • +
      • After depositing, it went into the workflow so Maria should be able to see it now
      -
      localhost/dspace7= ☘ \COPY (SELECT DISTINCT text_value AS "dcterms.publisher", count(*) FROM metadatavalue WHERE dspace_object_id in (SELECT dspace_object_id FROM item) AND metadata_field_id = 178 GROUP BY "dcterms.publisher" ORDER BY count DESC) to /tmp/2024-01-publishers.csv WITH CSV HEADER;
      -COPY 4332
      +
    • +
    +

    2024-02-09

    +
      +
    • Minor edits to CGSpace submission form
    • +
    • Upload 55 ISNAR book chapters to CGSpace from Peter
    • +
    +

    2024-02-19

    + +

    2024-02-20

    +
      +
    • Minor work on OpenRXV to fix a bug in the ng-select drop downs
    • +
    • Minor work on the DSpace 7 nginx configuration to allow requesting robots.txt and sitemaps without hitting rate limits
    • +
    +
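    The intent of that nginx change is that crawlers can always fetch robots.txt and the sitemaps, even when they would otherwise be rate limited. A quick sanity check (my own sketch; the sitemap path is an assumption) is to request those paths repeatedly and confirm none of the responses are HTTP 429:

```console
$ for i in $(seq 1 30); do curl -s -o /dev/null -w '%{http_code}\n' https://cgspace.cgiar.org/robots.txt; done | sort | uniq -c
$ for i in $(seq 1 30); do curl -s -o /dev/null -w '%{http_code}\n' https://cgspace.cgiar.org/sitemap_index.xml; done | sort | uniq -c
```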

    2024-02-21

    +
      +
    • Minor updates on OpenRXV, including one bug fix for missing mapped collections +
        +
      • Salem had to re-work the harvester for DSpace 7 since the mapped collections and parent collection list are separate! (See the REST sketch below.)
      • +
      +
    • +
    +
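    For context, the split Salem had to handle is that an item's owning collection and its mapped collections are separate linked resources in the DSpace 7 REST API; a sketch of the two requests (the item UUID here is hypothetical):

```console
$ curl -s 'https://cgspace.cgiar.org/server/api/core/items/2a1b3c4d-0000-0000-0000-000000000000/owningCollection'
$ curl -s 'https://cgspace.cgiar.org/server/api/core/items/2a1b3c4d-0000-0000-0000-000000000000/mappedCollections'
```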

    2024-02-22

    +
      +
    • Discuss tagging of datasets and re-work the submission form to encourage use of DOI field for any item that has a DOI, and the normal URL field if not +
        +
      • The “cg.identifier.dataurl” field will be used for “related” datasets
      • +
      • I still have to check and move some metadata for existing datasets; a query like the one sketched below could find them
      • +
      +
    • +
    +
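    The check mentioned above could start with something like this; a sketch only, looking the field up by element and qualifier instead of assuming its id, and catching dataset URLs that are really DOIs:

```console
$ psql -d dspace -c "SELECT mv.dspace_object_id, mv.text_value \
  FROM metadatavalue mv \
  JOIN metadatafieldregistry mfr ON mv.metadata_field_id = mfr.metadata_field_id \
  WHERE mfr.element = 'identifier' AND mfr.qualifier = 'dataurl' \
  AND mv.text_value LIKE '%doi.org%';"
```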

    2024-02-23

    +
      +
    • This morning Tomcat died due to an OOM kill from the kernel:
    • +
    +
    kernel: Out of memory: Killed process 698 (java) total-vm:14151300kB, anon-rss:9665812kB, file-rss:320kB, shmem-rss:0kB, UID:997 pgtables:20436kB oom_score_adj:0
     
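    A sketch of where to look next after an OOM kill like this: the kernel's full report in the journal lists every process's memory use at the time of the kill, which helps show whether the JVM heap or something else had grown:

```console
# journalctl -k --since yesterday | grep -i -A 50 'invoked oom-killer'
$ free -h
```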
      -
    • Address some feedback on DSpace 7 from users, including filing some issues on GitHub +
    • I don’t see any abnormal pattern in my Grafana graphs, for JVM or system load… very weird
    • +
    • I updated the submission form on CGSpace to include the new changes to URLs for datasets -
    • -
    • The Alliance TIP team was having issues posting to one collection via the legacy DSpace 6 REST API -
        -
      • In the DSpace logs I see the same issue that they had last month:
      • -
      -
    • -
    -
    ERROR unknown unknown org.dspace.rest.Resource @ Something get wrong. Aborting context in finally statement.
    -

    2024-01-09

    -
      -
    • I restarted Tomcat to see if it helps the REST issue
    • -
    • After talking with Peter about publishers we decided to get a clean list of the top ~100 publishers and then make sure all CGIAR centers, Initiatives, and Impact Platforms are there as well -
        -
      • I exported a list from PostgreSQL and then filtered by count > 40 in OpenRefine and then extracted the metadata values:
      • -
      -
    • -
    -
    $ csvcut -c dcterms.publisher ~/Downloads/2024-01-09-publishers4.csv | sed -e 1d -e 's/"//g' > /tmp/top-publishers.txt
    -
      -
    • Export a list of ORCID identifiers from PostgreSQL to look them up on ORCID and update our controlled vocabulary:
    • -
    -
    localhost/dspace7= ☘ \COPY (SELECT DISTINCT(text_value) FROM metadatavalue WHERE dspace_object_id IN (SELECT uuid FROM item) AND metadata_field_id=247) to /tmp/2024-01-09-orcid-identifiers.txt;
    -localhost/dspace7= ☘ \q
    -$ cat ~/src/git/DSpace/dspace/config/controlled-vocabularies/cg-creator-identifier.xml /tmp/2024-01-09-orcid-identifiers.txt | grep -oE '[A-Z0-9]{4}-[A-Z0-9]{4}-[A-Z0-9]{4}-[A-Z0-9]{4}' | sort -u > /tmp/2024-01-09-orcids.txt
    -$ ./ilri/resolve_orcids.py -i /tmp/2024-01-09-orcids.txt -o /tmp/2024-01-09-orcids-names.txt -d
    -
      -
    • Then I updated existing ORCID identifiers in CGSpace:
    • -
    -
    $ ./ilri/update_orcids.py -i /tmp/2024-01-09-orcids-names.txt -db dspace -u dspace -p bahhhh
    -
      -
    • Bizu seems to be having issues due to belonging to too many groups -
        -
      • I see some messages from Solr in the DSpace log:
      • -
      -
    • -
    -
    2024-01-09 06:23:35,893 ERROR unknown unknown org.dspace.authorize.AuthorizeServiceImpl @ Failed getting getting community/collection admin status for bahhhhh@cgiar.org The search error is: Error from server at http://localhost:8983/solr/search: org.apache.solr.search.SyntaxError: Cannot parse 'search.resourcetype:Community AND (admin:eef481147-daf3-4fd2-bb8d-e18af8131d8c OR admin:g80199ef9-bcd6-4961-9512-501dea076607 OR admin:g4ac29263-cf0c-48d0-8be7-7f09317d50ec OR admin:g0e594148-a0f6-4f00-970d-6b7812f89540 OR admin:g0265b87a-2183-4357-a971-7a5b0c7add3a OR admin:g371ae807-f014-4305-b4ec-f2a8f6f0dcfa OR admin:gdc5cb27c-4a5a-45c2-b656-a399fded70de OR admin:ge36d0ece-7a52-4925-afeb-6641d6a348cc OR admin:g15dc1173-7ddf-43cf-a89a-77a7f81c4cfc OR admin:gc3a599d3-c758-46cd-9855-c98f6ab58ae4 OR admin:g3d648c3e-58c3-4342-b500-07cba10ba52d OR admin:g82bf5168-65c1-4627-8eb4-724fa0ea51a7 OR admin:ge751e973-697d-419c-b59b-5a5644702874 OR admin:g44dd0a80-c1e6-4274-9be4-9f342d74928c OR admin:g4842f9c2-73ed-476a-a81a-7167d8aa7946 OR admin:g5f279b3f-c2ce-4c75-b151-1de52c1a540e OR admin:ga6df8adc-2e1d-40f2-8f1e-f77796d0eecd OR admin:gfdfc1621-382e-437a-8674-c9007627565c OR admin:g15cd114a-0b89-442b-a1b4-1febb6959571 OR admin:g12aede99-d018-4c00-b4d4-a732541d0017 OR admin:gc59529d7-002a-4216-b2e1-d909afd2d4a9 OR admin:gd0806714-bc13-460d-bedd-121bdd5436a4 OR admin:gce70739a-8820-4d56-b19c-f191855479e4 OR admin:g7d3409eb-81e3-4156-afb1-7f02de22065f OR admin:g54bc009e-2954-4dad-8c30-be6a09dc5093 OR admin:gc5e1d6b7-4603-40d7-852f-6654c159dec9 OR admin:g0046214d-c85b-4f12-a5e6-2f57a2c3abb0 OR admin:g4c7b4fd0-938f-40e9-ab3e-447c317296c1 OR admin:gcfae9b69-d8dd-4cf3-9a4e-d6e31ff68731 OR ... admin:g20f366c0-96c0-4416-ad0b-46884010925f)': too many boolean clauses The search resourceType filter was: search.resourcetype:Community
    -
      -
    • There are 1,805 OR clauses in the full log! -
        -
      • We previously had this issue in 2020-01 and 2020-02 with DSpace 5 and DSpace 6
      • -
      • At the time the solution was to increase the maxBooleanClauses in Solr and to disable access rights awareness, but I don’t think we want to do the second one now
      • -
      • I saw many users of Solr in other applications increasing this to obscenely high numbers, so I think we should be OK to increase it from 1024 to 2048
      • -
      -
    • -
    • Re-visiting the DSpace user groomer to delete inactive users -
        -
      • In 2023-08 I noticed that this was now possible in DSpace 7
      • -
      • As a test I tried to delete all users who have been inactive since six years ago (January 9, 2018):
      • -
      -
    • -
    -
    $ dspace dsrun org.dspace.eperson.Groomer -a -b 01/09/2018 -d
    -
      -
    • I tested it on DSpace 7 Test and it worked… I am debating running it on CGSpace… -
        -
      • I see we have almost 9,000 users:
      • -
      -
    • -
    -
    $ dspace user -L > /tmp/users-before.txt
    -$ wc -l /tmp/users-before.txt
    -8943 /tmp/users-before.txt
    -
      -
    • I decided to do the same on CGSpace and it worked without errors
    • -
    • I finished working on the controlled vocabulary for publishers
    • -
    -

    2024-01-10

    -
      -
    • I spent some time deleting old groups on CGSpace
    • -
    • I looked into the use of the cg.identifier.ciatproject field and found there are only a handful of uses, with some even seeming to be a mistake:
    • -
    -
    localhost/dspace7= ☘ SELECT DISTINCT text_value AS "cg.identifier.ciatproject", count(*) FROM metadatavalue WHERE dspace_object_id in (SELECT dspace_object_id FROM item) AND metadata
    -_field_id = 232 GROUP BY "cg.identifier.ciatproject" ORDER BY count DESC;
    - cg.identifier.ciatproject │ count
    -───────────────────────────┼───────
    - D145                      │     4
    - LAM_LivestockPlus         │     2
    - A215                      │     1
    - A217                      │     1
    - A220                      │     1
    - A223                      │     1
    - A224                      │     1
    - A227                      │     1
    - A229                      │     1
    - A230                      │     1
    - CLIMATE CHANGE MITIGATION │     1
    - LIVESTOCK                 │     1
    -(12 rows)
    -
    -Time: 240.041 ms
    -
      -
    • I think we can move those to a new cg.identifier.project if we create one
    • -
    • The cg.identifier.cpwfproject field is similarly sparse, but the CCAFS ones are widely used
    • -
    -

    2024-01-12

    -
      -
    • Export a list of affiliations to do some cleanup:
    • -
    -
    localhost/dspace7= ☘ \COPY (SELECT DISTINCT text_value AS "cg.contributor.affiliation", count(*) FROM metadatavalue WHERE dspace_object_id in (SELECT dspace_object_id FROM item) AND metadata_field_id = 211 GROUP BY "cg.contributor.affiliation" ORDER BY count DESC) to /tmp/2024-01-affiliations.csv WITH CSV HEADER;
    -COPY 11719
    -
      -
    • I first did some clustering and editing in OpenRefine, then I’ll import those back into CGSpace and then do another export
    • -
    • Troubleshooting the statistics pages that aren’t working on DSpace 7 -
        -
      • On a hunch, I queried for Solr statistics documents that did not have an id matching the 36-character UUID pattern:
      • -
      -
    • -
    -
    $ curl 'http://localhost:8983/solr/statistics/select?q=-id%3A%2F.\{36\}%2F&rows=0'
    -{
    -  "responseHeader":{
    -    "status":0,
    -    "QTime":0,
    -    "params":{
    -      "q":"-id:/.{36}/",
    -      "rows":"0"}},
    -  "response":{"numFound":800167,"start":0,"numFoundExact":true,"docs":[]
    -  }}
    -
      -
    • They seem to come mostly from 2020, 2023, and 2024:
    • -
    -
    $ curl 'http://localhost:8983/solr/statistics/select?q=-id%3A%2F.\{36\}%2F&facet.range=time&facet=true&facet.range.start=2010-01-01T00:00:00Z&facet.range.end=NOW&facet.range.gap=%2B1YEAR&rows=0'
    -{
    -  "responseHeader":{
    -    "status":0,
    -    "QTime":13,
    -    "params":{
    -      "facet.range":"time",
    -      "q":"-id:/.{36}/",
    -      "facet.range.gap":"+1YEAR",
    -      "rows":"0",
    -      "facet":"true",
    -      "facet.range.start":"2010-01-01T00:00:00Z",
    -      "facet.range.end":"NOW"}},
    -  "response":{"numFound":800168,"start":0,"numFoundExact":true,"docs":[]
    -  },
    -  "facet_counts":{
    -    "facet_queries":{},
    -    "facet_fields":{},
    -    "facet_ranges":{
    -      "time":{
    -        "counts":[
    -          "2010-01-01T00:00:00Z",0,
    -          "2011-01-01T00:00:00Z",0,
    -          "2012-01-01T00:00:00Z",0,
    -          "2013-01-01T00:00:00Z",0,
    -          "2014-01-01T00:00:00Z",0,
    -          "2015-01-01T00:00:00Z",89,
    -          "2016-01-01T00:00:00Z",11,
    -          "2017-01-01T00:00:00Z",0,
    -          "2018-01-01T00:00:00Z",0,
    -          "2019-01-01T00:00:00Z",0,
    -          "2020-01-01T00:00:00Z",1339,
    -          "2021-01-01T00:00:00Z",0,
    -          "2022-01-01T00:00:00Z",0,
    -          "2023-01-01T00:00:00Z",653736,
    -          "2024-01-01T00:00:00Z",144993],
    -        "gap":"+1YEAR",
    -        "start":"2010-01-01T00:00:00Z",
    -        "end":"2025-01-01T00:00:00Z"}},
    -    "facet_intervals":{},
    -    "facet_heatmaps":{}}}
    -
      -
    • They seem to come from 2023-08 until now (so way before we migrated to DSpace 7):
    • -
    -
    $ curl 'http://localhost:8983/solr/statistics/select?q=-id%3A%2F.\{36\}%2F&facet.range=time&facet=true&facet.range.start=2023-01-01T00:00:00Z&facet.range.end=NOW&facet.range.gap=%2B1MONTH&rows=0'
    -{
    -  "responseHeader":{
    -    "status":0,
    -    "QTime":196,
    -    "params":{
    -      "facet.range":"time",
    -      "q":"-id:/.{36}/",
    -      "facet.range.gap":"+1MONTH",
    -      "rows":"0",
    -      "facet":"true",
    -      "facet.range.start":"2023-01-01T00:00:00Z",
    -      "facet.range.end":"NOW"}},
    -  "response":{"numFound":800168,"start":0,"numFoundExact":true,"docs":[]
    -  },
    -  "facet_counts":{
    -    "facet_queries":{},
    -    "facet_fields":{},
    -    "facet_ranges":{
    -      "time":{
    -        "counts":[
    -          "2023-01-01T00:00:00Z",1,
    -          "2023-02-01T00:00:00Z",0,
    -          "2023-03-01T00:00:00Z",0,
    -          "2023-04-01T00:00:00Z",0,
    -          "2023-05-01T00:00:00Z",0,
    -          "2023-06-01T00:00:00Z",0,
    -          "2023-07-01T00:00:00Z",0,
    -          "2023-08-01T00:00:00Z",27621,
    -          "2023-09-01T00:00:00Z",59165,
    -          "2023-10-01T00:00:00Z",115338,
    -          "2023-11-01T00:00:00Z",96147,
    -          "2023-12-01T00:00:00Z",355464,
    -          "2024-01-01T00:00:00Z",125429],
    -        "gap":"+1MONTH",
    -        "start":"2023-01-01T00:00:00Z",
    -        "end":"2024-02-01T00:00:00Z"}},
    -    "facet_intervals":{},
    -    "facet_heatmaps":{}}}
    -
      -
    • I see that we had 31,744 statistic events yesterday, and 799 have no id!
    • -
    • I asked about this on Slack and will file an issue on GitHub if someone else also finds such records -
        -
      • Several people said they have them, so it’s a bug of some sort in DSpace, not our configuration
      • -
      -
    • -
    -

    2024-01-13

    -
      -
    • Yesterday alone we had 37,000 unique IPs making requests to nginx -
        -
      • I looked up the ASNs and found 6,000 IPs from this network in Amazon Singapore: 47.128.0.0/14
      • -
      -
    • -
    -
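    A sketch of how to reproduce that count from the same list of unique IPs (grepcidr is my assumption here; any CIDR matcher would do):

```console
# zcat --force /var/log/nginx/*access.log* | awk '{print $1}' | sort -u > /tmp/ips.txt
$ grepcidr 47.128.0.0/14 /tmp/ips.txt | wc -l
```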

    2024-01-15

    -
      -
    • Investigating the CSS selector warning that I’ve seen in PM2 logs:
    • -
    -
    0|dspace-ui  | 1 rules skipped due to selector errors:
    -0|dspace-ui  |   .custom-file-input:lang(en)~.custom-file-label -> unmatched pseudo-class :lang
    -
    -
    # zcat -f /var/log/nginx/*access.log  /var/log/nginx/*access.log.1 /var/log/nginx/*access.log.2.gz /var/log/nginx/*access.log.3.gz /var/log/nginx/*access.log.4.gz /var/log/nginx/*access.log.5.gz /var/log/nginx/*access.log.6.gz | awk '{print $1}' | sort -u |
    -tee /tmp/ips.txt | wc -l
    -196493
    -
      -
    • Looking these IPs up I see there are 18,000 coming from Comcast, 10,000 from AT&T, 4110 from Charter, 3500 from Cox and dozens of other residential IPs -
        -
      • I highly doubt these are home users browsing CGSpace… seems super fishy
      • -
      • Also, over 1,000 IPs from SpaceX Starlink in the last week. RIGHT
      • -
      • I will temporarily add a few new datacenter ISP network blocks to our rate limit: -
          -
        • 16509 Amazon-02
        • -
        • 701 UUNET
        • -
        • 8075 Microsoft
        • -
        • 15169 Google
        • -
        • 14618 Amazon-AES
        • -
        • 396982 Google Cloud
        • -
        -
      • -
      • The load on the server immediately dropped
      • -
      -
    • -
    -
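    A sketch of how those blocks could be assembled, using the same asn.ipinfo.app service and mapcidr as the ASN lists earlier this month:

```console
$ wget https://asn.ipinfo.app/api/text/list/AS16509 \
       https://asn.ipinfo.app/api/text/list/AS701 \
       https://asn.ipinfo.app/api/text/list/AS8075 \
       https://asn.ipinfo.app/api/text/list/AS15169 \
       https://asn.ipinfo.app/api/text/list/AS14618 \
       https://asn.ipinfo.app/api/text/list/AS396982
$ cat AS* | sort -u | ~/go/bin/mapcidr -a > /tmp/datacenter-networks.txt
```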

    2024-01-17

    -
      -
    • It turns out AS701 (UUNET) is Verizon Business, which is used as an ISP for many staff at IFPRI -
        -
      • This was causing them to see HTTP 429 “too many requests” errors on CGSpace
      • -
      • I removed this ASN from the rate limiting
      • -
      -
    • -
    -

    2024-01-18

    -
      -
    • Start looking at Solr stats again -
        -
      • I found one statistics record that has 22,000 of the same collection in owningColl and 22,000 of the same community in owningComm
      • -
      • The record is from 2015 and I think it would be easier to delete it than fix it:
      • -
      -
    • -
    -
    $ curl http://localhost:8983/solr/statistics/update -H "Content-type: text/xml" --data-binary '<delete><query>uid:3b4eefba-a302-4172-a286-dcb25d70129e</query></delete>'
    -
      -
    • Looking again, there are at least 1,000 of these so I will need to come up with an actual solution to fix these
    • -
    • I’m noticing we have 1,800+ links to defunct resources on bioversityinternational.org in the cg.link.permalink field -
        -
      • I should ask Alliance if they have any plans to fix those, or upload them to CGSpace
      • -
      -
    • -
    -

    2024-01-22

    - -

    2024-01-23

    -
      -
    • Meeting with IWMI about ORCID integration and the DSpace API for use with WordPress
    • -
    • IFPRI sent me a list of their author ORCIDs to add to our controlled vocabulary -
        -
      • I joined them with our current list and resolved their names on ORCID and updated them in our database:
      • -
      -
    • -
    -
    $ cat ~/src/git/DSpace/dspace/config/controlled-vocabularies/cg-creator-identifier.xml ~/Downloads/IFPRI\ ORCiD\ All.csv | grep -oE '[A-Z0-9]{4}-[A-Z0-9]{4}-[A-Z0-9]{4}-[A-Z0-9]{4}' | sort -u > /tmp/2024-01-23-orcids.txt
    -$ ./ilri/resolve_orcids.py -i /tmp/2024-01-23-orcids.txt -o /tmp/2024-01-23-orcids-names.txt -d
    -$ ./ilri/update_orcids.py -i /tmp/2024-01-23-orcids-names.txt -db dspace -u dspace -p fuuu
    -
      -
    • This adds about 400 new identifiers to the controlled vocabulary
    • -
    • I consolidated our various project identifier fields for closed programs into one cg.identifier.project: -
        -
      • cg.identifier.ccafsproject
      • -
      • cg.identifier.ccafsprojectpii
      • -
      • cg.identifier.ciatproject
      • -
      • cg.identifier.cpwfproject
      • -
      -
    • -
    • I prefixed the existing 2,644 metadata values with “CCAFS”, “CIAT”, or “CPWF” so we can figure out where they came from if need be, and deleted the old fields from the metadata registry
    • -
    -
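
- One way to do a prefix-and-move migration like this directly in PostgreSQL is sketched below — the metadata_field_id values are made-up examples (the real IDs live in DSpace’s metadatafieldregistry table), and the actual work may have been done differently in practice:

```sql
-- Sketch only: prefix values from one of the old project identifier fields and
-- move them to the consolidated cg.identifier.project field.
BEGIN;

UPDATE metadatavalue
   SET text_value = 'CCAFS ' || text_value
 WHERE metadata_field_id = 235;          -- example id for cg.identifier.ccafsproject

UPDATE metadatavalue
   SET metadata_field_id = 247           -- example id for cg.identifier.project
 WHERE metadata_field_id = 235;

COMMIT;
```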

## 2024-01-26

- Minor work on dspace-angular to clean up component styles
- Add cg.identifier.publicationRank to CGSpace metadata registry and submission form

## 2024-01-29

- Rework the nginx bot and network limits slightly to remove some old patterns/networks and remove Google
  - The Google Scholar team contacted me to ask why their requests were timing out (well…)