Add notes for 2024-02-29

Alan Orth 2024-02-29 09:41:44 +03:00
parent 483a170f06
commit 1e87242956
Signed by: alanorth
GPG Key ID: 0FB860CC9C45B1B9
145 changed files with 288 additions and 642 deletions

@@ -107,4 +107,8 @@ $ xmllint --xpath '//node/isComposedBy/node()' dspace/config/controlled-vocabula
 $ cat /tmp/authors /tmp/ifpri-authors | sort -u > /tmp/new-authors
 ```

+## 2024-02-28
+
+- I figured out a way to add a new Angular component to handle all our relation fields
+
 <!-- vim: set sw=2 ts=2: -->
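Not part of the original notes, but a possible follow-up sketch: assuming `/tmp/authors` holds the names currently in the controlled vocabulary (as the xmllint extraction in the hunk context suggests) and `/tmp/new-authors` is the merged, sorted list from above, `comm(1)` can show which names would be new additions:

```console
$ sort -u /tmp/authors > /tmp/existing-authors            # comm needs sorted input
$ comm -13 /tmp/existing-authors /tmp/new-authors > /tmp/authors-to-add
$ wc -l /tmp/authors-to-add                               # names not yet in the vocabulary
```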

@@ -34,7 +34,7 @@ Last week I had increased the limit from 30 to 60, which seemed to help, but now
 $ psql -c 'SELECT * from pg_stat_activity;' | grep idle | grep -c cgspace
 78
 "/>
-<meta name="generator" content="Hugo 0.123.3">
+<meta name="generator" content="Hugo 0.123.6">

Each of the other regenerated HTML pages in this diff shows the same one-line change, bumping the generator meta tag from Hugo 0.123.3 to Hugo 0.123.6.

@@ -6,41 +6,27 @@

The regenerated page at https://alanorth.github.io/cgspace-notes/2024-01/ now carries the February 2024 post instead of the January 2024 one:

- og:title and twitter:title change from "January, 2024" to "February, 2024"
- og:description and twitter:description change from the January 2024 summary (preparing the new server for the DSpace 7 migration, running the dspace make-handle-config script and sending the Handle bundle to the Handle admins, testing the cgspace-java-helpers on DSpace 7, and the IFPRI ISNAR archive cleanup) to the February 2024 summary (deleting duplicate metadata per https://github.com/DSpace/DSpace/issues/8253 and lower-casing the AGROVOC subjects on CGSpace)
- article:published_time changes from 2024-01-02T10:08:00+03:00 to 2024-01-05T11:10:00+03:00
- article:modified_time changes from 2024-02-05T11:09:40+03:00 to 2024-02-27T17:18:35+03:00
- the generator meta tag is bumped from Hugo 0.123.3 to Hugo 0.123.6

@@ -48,11 +34,11 @@ Work on IFPRI ISNAR archive cleanup

In the JSON-LD BlogPosting block the headline changes from "January, 2024" to "February, 2024", wordCount from 2215 to 551, datePublished from 2024-01-02T10:08:00+03:00 to 2024-01-05T11:10:00+03:00, and dateModified from 2024-02-05T11:09:40+03:00 to 2024-02-27T17:18:35+03:00; the url (https://alanorth.github.io/cgspace-notes/2024-01/) and the author (Alan Orth) are unchanged.

@@ -65,7 +51,7 @@ Work on IFPRI ISNAR archive cleanup

The canonical link remains https://alanorth.github.io/cgspace-notes/2024-01/ while the page title changes from "January, 2024 | CGSpace Notes" to "February, 2024 | CGSpace Notes".

@@ -117,468 +103,124 @@ Work on IFPRI ISNAR archive cleanup

The post header changes from "January, 2024" (Tue Jan 02, 2024) to "February, 2024" (Fri Jan 05, 2024), still in the Notes category, and the rendered body is replaced by the February 2024 notes:

## 2024-02-05

- Delete duplicate metadata as described in my DSpace issue from last year: https://github.com/DSpace/DSpace/issues/8253
- Lower case all the AGROVOC subjects on CGSpace

```sql
dspace=# BEGIN;
BEGIN
dspace=*# UPDATE metadatavalue SET text_value=LOWER(text_value) WHERE dspace_object_id IN (SELECT uuid FROM item) AND metadata_field_id=187 AND text_value ~ '[[:upper:]]';
UPDATE 180
dspace=*# COMMIT;
COMMIT
```

## 2024-02-06

- Discuss IWMI using the CGSpace REST API for their new website
- Export the IWMI community to extract their ORCID identifiers:

```console
$ dspace metadata-export -i 10568/16814 -f /tmp/iwmi.csv
$ csvcut -c 'cg.creator.identifier,cg.creator.identifier[en_US]' ~/Downloads/2024-02-06-iwmi.csv \
```
<li>Work on IFPRI ISNAR archive cleanup</li> </span></span></span><span style="display:flex;"><span><span style="color:#ae81ff"></span> | grep -oE &#39;[A-Z0-9]{4}-[A-Z0-9]{4}-[A-Z0-9]{4}-[A-Z0-9]{4}&#39; \
</ul> </span></span><span style="display:flex;"><span> | sort -u \
<h2 id="2024-01-03">2024-01-03</h2> </span></span><span style="display:flex;"><span> | tee /tmp/iwmi-orcids.txt \
<ul> </span></span><span style="display:flex;"><span> | wc -l
<li>I haven&rsquo;t heard from the Handle admins so I&rsquo;m preparing a backup solution using nginx streams</li> </span></span><span style="display:flex;"><span>353
<li>This seems to work in my simple tests (this must be outside the <code>http {}</code> block):</li> </span></span><span style="display:flex;"><span>$ ./ilri/resolve_orcids.py -i /tmp/iwmi-orcids.txt -o /tmp/iwmi-orcids-names.csv -d
</ul>
<pre tabindex="0"><code>stream {
upstream handle_tcp_9000 {
server 188.34.177.10:9000;
}
server {
listen 9000;
proxy_connect_timeout 1s;
proxy_timeout 3s;
proxy_pass handle_tcp_9000;
}
}
</code></pre><ul>
<li>Here I forwarded a test TCP port 9000 from one server to another and was able to retrieve a test HTML page that was being served on the target
<ul>
<li>I will have to do TCP and UDP on port 2641, and TCP/HTTP on port 8000 (a quick connectivity check is sketched after this list).</li>
</ul>
</li>
<li>I did some more minor work on the IFPRI ISNAR archive
<ul>
<li>I got some PDFs from the UMN AgEcon search and fixed some metadata</li>
<li>Then I did some duplicate checking and found five items already on CGSpace</li>
</ul>
</li>
</ul>
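<ul>
<li>A quick connectivity check for the forwarded handle ports could look like this (just a sketch; the hostname is a placeholder, and it assumes OpenBSD netcat and curl are available):</li>
</ul>
<pre tabindex="0"><code>$ nc -zv handle-proxy.example.org 2641   # TCP handle port
$ nc -zvu handle-proxy.example.org 2641  # UDP handle port
$ curl -sI http://handle-proxy.example.org:8000/ | head -n 1
</code></pre>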
<h2 id="2024-01-04">2024-01-04</h2>
<ul>
<li>Upload 692 items for the ISNAR archive to CGSpace: <a href="https://cgspace.cgiar.org/handle/10568/136192">https://cgspace.cgiar.org/handle/10568/136192</a></li>
<li>Help Peter proof and upload 252 items from the 2023 Gender conference to CGSpace</li>
<li>Meeting with IFPRI to discuss their migration to CGSpace
<ul>
<li>We agreed to add two new fields, one for IFPRI project and one for IFPRI publication ranking</li>
<li>Most likely we will use <code>cg.identifier.project</code> as a general field and consolidate other project fields there</li>
<li>Not sure which field to use for the publication rank&hellip;</li>
</ul>
</li>
</ul>
<h2 id="2024-01-05">2024-01-05</h2>
<ul>
<li>Proof and upload 51 items in bulk for IFPRI</li>
<li>I did a big cleanup of user groups in anticipation of complaints about slow workflow tasks etc in DSpace 7
<ul>
<li>I removed ILRI editors from all the dozens of CCAFS community and collection groups, and I should do the same for other CRPs since they are closed for two years now</li>
</ul>
</li>
</ul>
<h2 id="2024-01-06">2024-01-06</h2>
<ul>
<li>Migrate CGSpace to DSpace 7</li>
</ul>
<h2 id="2024-01-07">2024-01-07</h2>
<ul>
<li>High load on the server and UptimeRobot saying the frontend is flapping
<ul>
<li>I noticed tons of logs from pm2 in the systemd journal, so I disabled those in the systemd unit because they are available from pm2&rsquo;s log directory anyway</li>
<li>I also noticed the same for Solr, so I disabled stdout for that systemd unit as well</li>
</ul>
</li>
<li>I spent a lot of time bringing back the nginx rate limits we used in DSpace 6 and it seems to have helped</li>
<li>I see some client doing weird HEAD requests to search pages:</li>
</ul>
<pre tabindex="0"><code>47.76.35.19 - - [07/Jan/2024:00:00:02 +0100] &#34;HEAD /search/?f.accessRights=Open+Access%2Cequals&amp;f.actionArea=Resilient+Agrifood+Systems%2Cequals&amp;f.author=Burkart%2C+Stefan%2Cequals&amp;f.country=Kenya%2Cequals&amp;f.impactArea=Climate+adaptation+and+mitigation%2Cequals&amp;f.itemtype=Brief%2Cequals&amp;f.publisher=CGIAR+System+Organization%2Cequals&amp;f.region=Asia%2Cequals&amp;f.sdg=SDG+12+-+Responsible+consumption+and+production%2Cequals&amp;f.sponsorship=CGIAR+Trust+Fund%2Cequals&amp;f.subject=environmental+factors%2Cequals&amp;spc.page=1 HTTP/1.1&#34; 499 0 &#34;-&#34; &#34;Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/74.0.2504.63 Safari/537.36&#34;
</code></pre><ul>
<li>I will add their network blocks (AS45102) and regenerate my list of bot networks:</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ wget https://asn.ipinfo.app/api/text/list/AS16276 <span style="color:#ae81ff">\
</span></span></span><span style="display:flex;"><span><span style="color:#ae81ff"></span> https://asn.ipinfo.app/api/text/list/AS23576 \
</span></span><span style="display:flex;"><span> https://asn.ipinfo.app/api/text/list/AS24940 \
</span></span><span style="display:flex;"><span> https://asn.ipinfo.app/api/text/list/AS13238 \
</span></span><span style="display:flex;"><span> https://asn.ipinfo.app/api/text/list/AS14061 \
</span></span><span style="display:flex;"><span> https://asn.ipinfo.app/api/text/list/AS12876 \
</span></span><span style="display:flex;"><span> https://asn.ipinfo.app/api/text/list/AS55286 \
</span></span><span style="display:flex;"><span> https://asn.ipinfo.app/api/text/list/AS203020 \
</span></span><span style="display:flex;"><span> https://asn.ipinfo.app/api/text/list/AS204287 \
</span></span><span style="display:flex;"><span> https://asn.ipinfo.app/api/text/list/AS50245 \
</span></span><span style="display:flex;"><span> https://asn.ipinfo.app/api/text/list/AS6939 \
</span></span><span style="display:flex;"><span> https://asn.ipinfo.app/api/text/list/AS45102 \
</span></span><span style="display:flex;"><span> https://asn.ipinfo.app/api/text/list/AS21859
</span></span><span style="display:flex;"><span>$ cat AS* | sort | uniq | wc -l
</span></span><span style="display:flex;"><span>4897
</span></span><span style="display:flex;"><span>$ cat AS* | ~/go/bin/mapcidr -a &gt; /tmp/networks.txt
</span></span><span style="display:flex;"><span>$ wc -l /tmp/networks.txt
</span></span><span style="display:flex;"><span>2017 /tmp/networks.txt
</span></span></code></pre></div><ul> </span></span></code></pre></div><ul>
<li>I&rsquo;m surprised to see the number of networks reduced from my current ones&hellip; hmmm.</li> <li>I noticed some similar looking names in our list so I clustered them in OpenRefine and manually checked a dozen or so to update our list</li>
<li>I will also update my list of Bing networks:</li>
</ul> </ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ ./ilri/bing-networks-to-ips.sh <h2 id="2024-02-07">2024-02-07</h2>
</span></span><span style="display:flex;"><span>$ ~/go/bin/mapcidr -a &lt; /tmp/bing-ips.txt &gt; /tmp/bing-networks.txt
</span></span><span style="display:flex;"><span>$ wc -l /tmp/bing-networks.txt
</span></span><span style="display:flex;"><span>250 /tmp/bing-networks.txt
</span></span></code></pre></div><h2 id="2024-01-08">2024-01-08</h2>
<ul> <ul>
<li>Export a list of publishers for Peter to select a subset to use as a controlled vocabulary:</li> <li>Maria asked me about the &ldquo;missing&rdquo; item from last week again
<ul>
<li>I can see it when I used the Admin search, but not in her workflow</li>
<li>It was submitted by TIP so I checked that user&rsquo;s workspace and found it there</li>
<li>After depositing, it went into the workflow so Maria should be able to see it now</li>
</ul> </ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>localhost/dspace7= ☘ \COPY (SELECT DISTINCT text_value AS &#34;dcterms.publisher&#34;, count(*) FROM metadatavalue WHERE dspace_object_id in (SELECT dspace_object_id FROM item) AND metadata_field_id = 178 GROUP BY &#34;dcterms.publisher&#34; ORDER BY count DESC) to /tmp/2024-01-publishers.csv WITH CSV HEADER; </li>
</span></span><span style="display:flex;"><span>COPY 4332 </ul>
<h2 id="2024-02-09">2024-02-09</h2>
<ul>
<li>Minor edits to CGSpace submission form</li>
<li>Upload 55 ISNAR book chapters to CGSpace from Peter</li>
</ul>
<h2 id="2024-02-19">2024-02-19</h2>
<ul>
<li>Looking into the collection mapping issue on CGSpace
<ul>
<li>It seems to be by design in DSpace 7: <a href="https://github.com/DSpace/dspace-angular/issues/1203">https://github.com/DSpace/dspace-angular/issues/1203</a></li>
<li>This is a massive setback for us&hellip;</li>
</ul>
</li>
</ul>
<h2 id="2024-02-20">2024-02-20</h2>
<ul>
<li>Minor work on OpenRXV to fix a bug in the ng-select drop downs</li>
<li>Minor work on the DSpace 7 nginx configuration to allow requesting robots.txt and sitemaps without hitting rate limits</li>
</ul>
<h2 id="2024-02-21">2024-02-21</h2>
<ul>
<li>Minor updates on OpenRXV, including one bug fix for missing mapped collections
<ul>
<li>Salem had to re-work the harvester for DSpace 7 since the mapped collections and parent collection list are separate!</li>
</ul>
</li>
</ul>
<h2 id="2024-02-22">2024-02-22</h2>
<ul>
<li>Discuss tagging of datasets and re-work the submission form to encourage use of the DOI field for any item that has a DOI, and the normal URL field if not
<ul>
<li>The &ldquo;cg.identifier.dataurl&rdquo; field will be used for &ldquo;related&rdquo; datasets</li>
<li>I still have to check and move some metadata for existing datasets</li>
</ul>
</li>
</ul>
<h2 id="2024-02-23">2024-02-23</h2>
<ul>
<li>This morning Tomcat died due to an OOM kill from the kernel:</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>kernel: Out of memory: Killed process 698 (java) total-vm:14151300kB, anon-rss:9665812kB, file-rss:320kB, shmem-rss:0kB, UID:997 pgtables:20436kB oom_score_adj:0
</span></span></code></pre></div><ul> </span></span></code></pre></div><ul>
<li>Address some feedback on DSpace 7 from users, including filing some issues on GitHub <li>I updated the submission form on CGSpace to include the new changes to URLs for datasets
<li>I updated the submission form on CGSpace to include the new changes to URLs for datasets
<ul> <ul>
<li><a href="https://github.com/DSpace/dspace-angular/issues/2730">https://github.com/DSpace/dspace-angular/issues/2730</a>: List of available metadata fields is truncated when adding new metadata in &ldquo;Edit Item&rdquo;</li> <li>I also updated about 80 datasets to move the URLs to the correct field</li>
</ul>
</li>
<li>The Alliance TIP team was having issues posting to one collection via the legacy DSpace 6 REST API
<ul>
<li>In the DSpace logs I see the same issue that they had last month:</li>
</ul> </ul>
</li> </li>
</ul> </ul>
<pre tabindex="0"><code>ERROR unknown unknown org.dspace.rest.Resource @ Something get wrong. Aborting context in finally statement. <h2 id="2024-02-25">2024-02-25</h2>
</code></pre><h2 id="2024-01-09">2024-01-09</h2>
<ul> <ul>
<li>I restarted Tomcat to see if it helps the REST issue</li> <li>This morning Tomcat died while I was doing a CSV export, with an OOM kill from the kernel:</li>
<li>After talking with Peter about publishers we decided to get a clean list of the top ~100 publishers and then make sure all CGIAR centers, Initiatives, and Impact Platforms are there as well
<ul>
<li>I exported a list from PostgreSQL, filtered by count &gt; 40 in OpenRefine, and then extracted the metadata values:</li>
</ul> </ul>
</li> <div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>kernel: Out of memory: Killed process 720768 (java) total-vm:14079976kB, anon-rss:9301684kB, file-rss:152kB, shmem-rss:0kB, UID:997 pgtables:19488kB oom_score_adj:0
</ul>
<pre tabindex="0"><code>$ csvcut -c dcterms.publisher ~/Downloads/2024-01-09-publishers4.csv | sed -e 1d -e &#39;s/&#34;//g&#39; &gt; /tmp/top-publishers.txt
</code></pre><ul>
<li>Export a list of ORCID identifiers from PostgreSQL to look them up on ORCID and update our controlled vocabulary:</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>localhost/dspace7= ☘ \COPY (SELECT DISTINCT(text_value) FROM metadatavalue WHERE dspace_object_id IN (SELECT uuid FROM item) AND metadata_field_id=247) to /tmp/2024-01-09-orcid-identifiers.txt;
</span></span><span style="display:flex;"><span>localhost/dspace7= ☘ \q
</span></span><span style="display:flex;"><span>$ cat ~/src/git/DSpace/dspace/config/controlled-vocabularies/cg-creator-identifier.xml /tmp/2024-01-09-orcid-identifiers.txt | grep -oE <span style="color:#e6db74">&#39;[A-Z0-9]{4}-[A-Z0-9]{4}-[A-Z0-9]{4}-[A-Z0-9]{4}&#39;</span> | sort -u &gt; /tmp/2024-01-09-orcids.txt
</span></span><span style="display:flex;"><span>$ ./ilri/resolve_orcids.py -i /tmp/2024-01-09-orcids.txt -o /tmp/2024-01-09-orcids-names.txt -d
</span></span></code></pre></div><ul> </span></span></code></pre></div><ul>
<li>Then I updated existing ORCID identifiers in CGSpace:</li> <li>I don&rsquo;t know why this is happening so often recently&hellip;</li>
</ul> </ul>
<pre tabindex="0"><code>$ ./ilri/update_orcids.py -i /tmp/2024-01-09-orcids-names.txt -db dspace -u dspace -p bahhhh <h2 id="2024-02-27">2024-02-27</h2>
</code></pre><ul>
<li>Bizu seems to be having issues due to belonging to too many groups
<ul> <ul>
<li>I see some messages from Solr in the DSpace log:</li> <li>IFPRI sent me a list of authors to add to our list for now, until we can find a better way of doing it
<ul>
<li>I extracted the existing authors from our controlled vocabulary and combined them with IFPRI&rsquo;s:</li>
</ul> </ul>
</li> </li>
</ul> </ul>
<pre tabindex="0"><code>2024-01-09 06:23:35,893 ERROR unknown unknown org.dspace.authorize.AuthorizeServiceImpl @ Failed getting getting community/collection admin status for bahhhhh@cgiar.org The search error is: Error from server at http://localhost:8983/solr/search: org.apache.solr.search.SyntaxError: Cannot parse &#39;search.resourcetype:Community AND (admin:eef481147-daf3-4fd2-bb8d-e18af8131d8c OR admin:g80199ef9-bcd6-4961-9512-501dea076607 OR admin:g4ac29263-cf0c-48d0-8be7-7f09317d50ec OR admin:g0e594148-a0f6-4f00-970d-6b7812f89540 OR admin:g0265b87a-2183-4357-a971-7a5b0c7add3a OR admin:g371ae807-f014-4305-b4ec-f2a8f6f0dcfa OR admin:gdc5cb27c-4a5a-45c2-b656-a399fded70de OR admin:ge36d0ece-7a52-4925-afeb-6641d6a348cc OR admin:g15dc1173-7ddf-43cf-a89a-77a7f81c4cfc OR admin:gc3a599d3-c758-46cd-9855-c98f6ab58ae4 OR admin:g3d648c3e-58c3-4342-b500-07cba10ba52d OR admin:g82bf5168-65c1-4627-8eb4-724fa0ea51a7 OR admin:ge751e973-697d-419c-b59b-5a5644702874 OR admin:g44dd0a80-c1e6-4274-9be4-9f342d74928c OR admin:g4842f9c2-73ed-476a-a81a-7167d8aa7946 OR admin:g5f279b3f-c2ce-4c75-b151-1de52c1a540e OR admin:ga6df8adc-2e1d-40f2-8f1e-f77796d0eecd OR admin:gfdfc1621-382e-437a-8674-c9007627565c OR admin:g15cd114a-0b89-442b-a1b4-1febb6959571 OR admin:g12aede99-d018-4c00-b4d4-a732541d0017 OR admin:gc59529d7-002a-4216-b2e1-d909afd2d4a9 OR admin:gd0806714-bc13-460d-bedd-121bdd5436a4 OR admin:gce70739a-8820-4d56-b19c-f191855479e4 OR admin:g7d3409eb-81e3-4156-afb1-7f02de22065f OR admin:g54bc009e-2954-4dad-8c30-be6a09dc5093 OR admin:gc5e1d6b7-4603-40d7-852f-6654c159dec9 OR admin:g0046214d-c85b-4f12-a5e6-2f57a2c3abb0 OR admin:g4c7b4fd0-938f-40e9-ab3e-447c317296c1 OR admin:gcfae9b69-d8dd-4cf3-9a4e-d6e31ff68731 OR ... admin:g20f366c0-96c0-4416-ad0b-46884010925f)&#39;: too many boolean clauses The search resourceType filter was: search.resourcetype:Community <div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ xmllint --xpath <span style="color:#e6db74">&#39;//node/isComposedBy/node()&#39;</span> dspace/config/controlled-vocabularies/dc-contributor-author.xml <span style="color:#ae81ff">\
</code></pre><ul> </span></span></span><span style="display:flex;"><span><span style="color:#ae81ff"></span> | grep -oE &#39;label=&#34;.*&#34;&#39; \
<li>There are 1,805 OR clauses in the full log! </span></span><span style="display:flex;"><span> | sed -e &#39;s/label=&#34;//&#39; -e &#39;s/&#34;$//&#39; &gt; /tmp/authors
</span></span><span style="display:flex;"><span>$ cat /tmp/authors /tmp/ifpri-authors | sort -u &gt; /tmp/new-authors
</span></span></code></pre></div><h2 id="2024-02-28">2024-02-28</h2>
<ul> <ul>
<li>We previously had this issue in 2020-01 and 2020-02 with DSpace 5 and DSpace 6</li>
<li>At the time the solution was to increase the <code>maxBooleanClauses</code> in Solr and to disable access rights awareness, but I don&rsquo;t think we want to do the second one now</li>
<li>I saw many users of Solr in other applications increasing this to obscenely high numbers, so I think we should be OK to increase it from 1024 to 2048</li>
</ul>
</li>
<li>Re-visiting the DSpace user groomer to delete inactive users
<ul>
<li>In 2023-08 I noticed that this was now <a href="https://github.com/DSpace/DSpace/pull/2928">possible in DSpace 7</a></li>
<li>As a test I tried to delete all users who have been inactive since six years ago (January 9, 2018):</li>
</ul>
</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ dspace dsrun org.dspace.eperson.Groomer -a -b 01/09/2018 -d
</span></span></code></pre></div><ul>
<li>I tested it on DSpace 7 Test and it worked&hellip; I am debating running it on CGSpace&hellip;
<ul>
<li>I see we have almost 9,000 users:</li>
</ul>
</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ dspace user -L &gt; /tmp/users-before.txt
</span></span><span style="display:flex;"><span>$ wc -l /tmp/users-before.txt
</span></span><span style="display:flex;"><span>8943 /tmp/users-before.txt
</span></span></code></pre></div><ul>
<li>I decided to do the same on CGSpace and it worked without errors</li>
<li>I finished working on the controlled vocabulary for publishers</li>
</ul>
<h2 id="2024-01-10">2024-01-10</h2>
<ul>
<li>I spent some time deleting old groups on CGSpace</li>
<li>I looked into the use of the <code>cg.identifier.ciatproject</code> field and found there are only a handful of uses, some of which even seem to be mistakes:</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>localhost/dspace7= ☘ SELECT DISTINCT text_value AS &#34;cg.identifier.ciatproject&#34;, count(*) FROM metadatavalue WHERE dspace_object_id in (SELECT dspace_object_id FROM item) AND metadata
</span></span><span style="display:flex;"><span>_field_id = 232 GROUP BY &#34;cg.identifier.ciatproject&#34; ORDER BY count DESC;
</span></span><span style="display:flex;"><span> cg.identifier.ciatproject │ count
</span></span><span style="display:flex;"><span>───────────────────────────┼───────
</span></span><span style="display:flex;"><span> D145 │ 4
</span></span><span style="display:flex;"><span> LAM_LivestockPlus │ 2
</span></span><span style="display:flex;"><span> A215 │ 1
</span></span><span style="display:flex;"><span> A217 │ 1
</span></span><span style="display:flex;"><span> A220 │ 1
</span></span><span style="display:flex;"><span> A223 │ 1
</span></span><span style="display:flex;"><span> A224 │ 1
</span></span><span style="display:flex;"><span> A227 │ 1
</span></span><span style="display:flex;"><span> A229 │ 1
</span></span><span style="display:flex;"><span> A230 │ 1
</span></span><span style="display:flex;"><span> CLIMATE CHANGE MITIGATION │ 1
</span></span><span style="display:flex;"><span> LIVESTOCK │ 1
</span></span><span style="display:flex;"><span>(12 rows)
</span></span><span style="display:flex;"><span><span style="color:#960050;background-color:#1e0010">
</span></span></span><span style="display:flex;"><span><span style="color:#960050;background-color:#1e0010"></span>Time: 240.041 ms
</span></span></code></pre></div><ul>
<li>I think we can move those to a new <code>cg.identifier.project</code> if we create one</li>
<li>The <code>cg.identifier.cpwfproject</code> field is similarly sparse, but the CCAFS ones are widely used</li>
</ul>
<h2 id="2024-01-12">2024-01-12</h2>
<ul>
<li>Export a list of affiliations to do some cleanup:</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>localhost/dspace7= ☘ \COPY (SELECT DISTINCT text_value AS &#34;cg.contributor.affiliation&#34;, count(*) FROM metadatavalue WHERE dspace_object_id in (SELECT dspace_object_id FROM item) AND metadata_field_id = 211 GROUP BY &#34;cg.contributor.affiliation&#34; ORDER BY count DESC) to /tmp/2024-01-affiliations.csv WITH CSV HEADER;
</span></span><span style="display:flex;"><span>COPY 11719
</span></span></code></pre></div><ul>
<li>I did some clustering and editing in OpenRefine first; then I&rsquo;ll import those back into CGSpace and do another export</li>
<li>Troubleshooting the statistics pages that aren&rsquo;t working on DSpace 7
<ul>
<li>On a hunch, I queried for Solr statistics documents that <strong>did not have an <code>id</code> matching the 36-character UUID pattern</strong>:</li>
</ul>
</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ curl <span style="color:#e6db74">&#39;http://localhost:8983/solr/statistics/select?q=-id%3A%2F.\{36\}%2F&amp;rows=0&#39;</span>
</span></span><span style="display:flex;"><span>{
</span></span><span style="display:flex;"><span> &#34;responseHeader&#34;:{
</span></span><span style="display:flex;"><span> &#34;status&#34;:0,
</span></span><span style="display:flex;"><span> &#34;QTime&#34;:0,
</span></span><span style="display:flex;"><span> &#34;params&#34;:{
</span></span><span style="display:flex;"><span> &#34;q&#34;:&#34;-id:/.{36}/&#34;,
</span></span><span style="display:flex;"><span> &#34;rows&#34;:&#34;0&#34;}},
</span></span><span style="display:flex;"><span> &#34;response&#34;:{&#34;numFound&#34;:800167,&#34;start&#34;:0,&#34;numFoundExact&#34;:true,&#34;docs&#34;:[]
</span></span><span style="display:flex;"><span> }}
</span></span></code></pre></div><ul>
<li>They seem to come mostly from 2020, 2023, and 2024:</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ curl <span style="color:#e6db74">&#39;http://localhost:8983/solr/statistics/select?q=-id%3A%2F.\{36\}%2F&amp;facet.range=time&amp;facet=true&amp;facet.range.start=2010-01-01T00:00:00Z&amp;facet.range.end=NOW&amp;facet.range.gap=%2B1YEAR&amp;rows=0&#39;</span>
</span></span><span style="display:flex;"><span>{
</span></span><span style="display:flex;"><span> &#34;responseHeader&#34;:{
</span></span><span style="display:flex;"><span> &#34;status&#34;:0,
</span></span><span style="display:flex;"><span> &#34;QTime&#34;:13,
</span></span><span style="display:flex;"><span> &#34;params&#34;:{
</span></span><span style="display:flex;"><span> &#34;facet.range&#34;:&#34;time&#34;,
</span></span><span style="display:flex;"><span> &#34;q&#34;:&#34;-id:/.{36}/&#34;,
</span></span><span style="display:flex;"><span> &#34;facet.range.gap&#34;:&#34;+1YEAR&#34;,
</span></span><span style="display:flex;"><span> &#34;rows&#34;:&#34;0&#34;,
</span></span><span style="display:flex;"><span> &#34;facet&#34;:&#34;true&#34;,
</span></span><span style="display:flex;"><span> &#34;facet.range.start&#34;:&#34;2010-01-01T00:00:00Z&#34;,
</span></span><span style="display:flex;"><span> &#34;facet.range.end&#34;:&#34;NOW&#34;}},
</span></span><span style="display:flex;"><span> &#34;response&#34;:{&#34;numFound&#34;:800168,&#34;start&#34;:0,&#34;numFoundExact&#34;:true,&#34;docs&#34;:[]
</span></span><span style="display:flex;"><span> },
</span></span><span style="display:flex;"><span> &#34;facet_counts&#34;:{
</span></span><span style="display:flex;"><span> &#34;facet_queries&#34;:{},
</span></span><span style="display:flex;"><span> &#34;facet_fields&#34;:{},
</span></span><span style="display:flex;"><span> &#34;facet_ranges&#34;:{
</span></span><span style="display:flex;"><span> &#34;time&#34;:{
</span></span><span style="display:flex;"><span> &#34;counts&#34;:[
</span></span><span style="display:flex;"><span> &#34;2010-01-01T00:00:00Z&#34;,0,
</span></span><span style="display:flex;"><span> &#34;2011-01-01T00:00:00Z&#34;,0,
</span></span><span style="display:flex;"><span> &#34;2012-01-01T00:00:00Z&#34;,0,
</span></span><span style="display:flex;"><span> &#34;2013-01-01T00:00:00Z&#34;,0,
</span></span><span style="display:flex;"><span> &#34;2014-01-01T00:00:00Z&#34;,0,
</span></span><span style="display:flex;"><span> &#34;2015-01-01T00:00:00Z&#34;,89,
</span></span><span style="display:flex;"><span> &#34;2016-01-01T00:00:00Z&#34;,11,
</span></span><span style="display:flex;"><span> &#34;2017-01-01T00:00:00Z&#34;,0,
</span></span><span style="display:flex;"><span> &#34;2018-01-01T00:00:00Z&#34;,0,
</span></span><span style="display:flex;"><span> &#34;2019-01-01T00:00:00Z&#34;,0,
</span></span><span style="display:flex;"><span> &#34;2020-01-01T00:00:00Z&#34;,1339,
</span></span><span style="display:flex;"><span> &#34;2021-01-01T00:00:00Z&#34;,0,
</span></span><span style="display:flex;"><span> &#34;2022-01-01T00:00:00Z&#34;,0,
</span></span><span style="display:flex;"><span> &#34;2023-01-01T00:00:00Z&#34;,653736,
</span></span><span style="display:flex;"><span> &#34;2024-01-01T00:00:00Z&#34;,144993],
</span></span><span style="display:flex;"><span> &#34;gap&#34;:&#34;+1YEAR&#34;,
</span></span><span style="display:flex;"><span> &#34;start&#34;:&#34;2010-01-01T00:00:00Z&#34;,
</span></span><span style="display:flex;"><span> &#34;end&#34;:&#34;2025-01-01T00:00:00Z&#34;}},
</span></span><span style="display:flex;"><span> &#34;facet_intervals&#34;:{},
</span></span><span style="display:flex;"><span> &#34;facet_heatmaps&#34;:{}}}
</span></span></code></pre></div><ul>
<li>They seem to come from 2023-08 until now (so way before we migrated to DSpace 7):</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ curl <span style="color:#e6db74">&#39;http://localhost:8983/solr/statistics/select?q=-id%3A%2F.\{36\}%2F&amp;facet.range=time&amp;facet=true&amp;facet.range.start=2023-01-01T00:00:00Z&amp;facet.range.end=NOW&amp;facet.range.gap=%2B1MONTH&amp;rows=0&#39;</span>
</span></span><span style="display:flex;"><span>{
</span></span><span style="display:flex;"><span> &#34;responseHeader&#34;:{
</span></span><span style="display:flex;"><span> &#34;status&#34;:0,
</span></span><span style="display:flex;"><span> &#34;QTime&#34;:196,
</span></span><span style="display:flex;"><span> &#34;params&#34;:{
</span></span><span style="display:flex;"><span> &#34;facet.range&#34;:&#34;time&#34;,
</span></span><span style="display:flex;"><span> &#34;q&#34;:&#34;-id:/.{36}/&#34;,
</span></span><span style="display:flex;"><span> &#34;facet.range.gap&#34;:&#34;+1MONTH&#34;,
</span></span><span style="display:flex;"><span> &#34;rows&#34;:&#34;0&#34;,
</span></span><span style="display:flex;"><span> &#34;facet&#34;:&#34;true&#34;,
</span></span><span style="display:flex;"><span> &#34;facet.range.start&#34;:&#34;2023-01-01T00:00:00Z&#34;,
</span></span><span style="display:flex;"><span> &#34;facet.range.end&#34;:&#34;NOW&#34;}},
</span></span><span style="display:flex;"><span> &#34;response&#34;:{&#34;numFound&#34;:800168,&#34;start&#34;:0,&#34;numFoundExact&#34;:true,&#34;docs&#34;:[]
</span></span><span style="display:flex;"><span> },
</span></span><span style="display:flex;"><span> &#34;facet_counts&#34;:{
</span></span><span style="display:flex;"><span> &#34;facet_queries&#34;:{},
</span></span><span style="display:flex;"><span> &#34;facet_fields&#34;:{},
</span></span><span style="display:flex;"><span> &#34;facet_ranges&#34;:{
</span></span><span style="display:flex;"><span> &#34;time&#34;:{
</span></span><span style="display:flex;"><span> &#34;counts&#34;:[
</span></span><span style="display:flex;"><span> &#34;2023-01-01T00:00:00Z&#34;,1,
</span></span><span style="display:flex;"><span> &#34;2023-02-01T00:00:00Z&#34;,0,
</span></span><span style="display:flex;"><span> &#34;2023-03-01T00:00:00Z&#34;,0,
</span></span><span style="display:flex;"><span> &#34;2023-04-01T00:00:00Z&#34;,0,
</span></span><span style="display:flex;"><span> &#34;2023-05-01T00:00:00Z&#34;,0,
</span></span><span style="display:flex;"><span> &#34;2023-06-01T00:00:00Z&#34;,0,
</span></span><span style="display:flex;"><span> &#34;2023-07-01T00:00:00Z&#34;,0,
</span></span><span style="display:flex;"><span> &#34;2023-08-01T00:00:00Z&#34;,27621,
</span></span><span style="display:flex;"><span> &#34;2023-09-01T00:00:00Z&#34;,59165,
</span></span><span style="display:flex;"><span> &#34;2023-10-01T00:00:00Z&#34;,115338,
</span></span><span style="display:flex;"><span> &#34;2023-11-01T00:00:00Z&#34;,96147,
</span></span><span style="display:flex;"><span> &#34;2023-12-01T00:00:00Z&#34;,355464,
</span></span><span style="display:flex;"><span> &#34;2024-01-01T00:00:00Z&#34;,125429],
</span></span><span style="display:flex;"><span> &#34;gap&#34;:&#34;+1MONTH&#34;,
</span></span><span style="display:flex;"><span> &#34;start&#34;:&#34;2023-01-01T00:00:00Z&#34;,
</span></span><span style="display:flex;"><span> &#34;end&#34;:&#34;2024-02-01T00:00:00Z&#34;}},
</span></span><span style="display:flex;"><span> &#34;facet_intervals&#34;:{},
</span></span><span style="display:flex;"><span> &#34;facet_heatmaps&#34;:{}}}
</span></span></code></pre></div><ul>
<li>I see that we had 31,744 statistics events yesterday, and 799 of them have no <code>id</code>! (A per-day spot check is sketched after this list.)</li>
<li>I asked about this on Slack and will file an issue on GitHub if someone else also finds such records
<ul>
<li>Several people said they have them, so it&rsquo;s a bug of some sort in DSpace, not our configuration</li>
</ul>
</li>
</ul>
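<ul>
<li>A per-day spot check for these broken records could use Solr date math on the <code>time</code> field, something like this sketch:</li>
</ul>
<pre tabindex="0"><code>$ curl -s &#39;http://localhost:8983/solr/statistics/select&#39; -G \
    --data-urlencode &#39;q=-id:/.{36}/&#39; \
    --data-urlencode &#39;fq=time:[NOW/DAY-1DAY TO NOW/DAY]&#39; \
    --data-urlencode &#39;rows=0&#39;
</code></pre>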
<h2 id="2024-01-13">2024-01-13</h2>
<ul>
<li>Yesterday alone we had 37,000 unique IPs making requests to nginx
<ul>
<li>I looked up the ASNs and found 6,000 IPs from a single Amazon Singapore network: 47.128.0.0/14 (a quick way to count them is sketched after this list)</li>
</ul>
</li>
</ul>
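<ul>
<li>Counting how many of a day&rsquo;s unique IPs fall in that block could be done with grepcidr, assuming it is installed and the IPs have already been extracted from the nginx logs to a file (a sketch; the filename is a placeholder):</li>
</ul>
<pre tabindex="0"><code>$ grepcidr 47.128.0.0/14 /tmp/ips.txt | wc -l
</code></pre>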
<h2 id="2024-01-15">2024-01-15</h2>
<ul>
<li>Investigating the CSS selector warning that I&rsquo;ve seen in PM2 logs:</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>0|dspace-ui | 1 rules skipped due to selector errors:
</span></span><span style="display:flex;"><span>0|dspace-ui | .custom-file-input:lang(en)~.custom-file-label -&gt; unmatched pseudo-class :lang
</span></span></code></pre></div><ul>
<li>It seems to be a bug in Angular, as this selector comes from Bootstrap 4.6.x and is actually valid
<ul>
<li>But that led me to a more interesting issue with <code>inlineCritical</code> optimization for styles in Angular SSR that might be causing high load in the frontend</li>
<li>See: <a href="https://github.com/angular/angular/issues/42098">https://github.com/angular/angular/issues/42098</a></li>
<li>See: <a href="https://github.com/angular/universal/issues/2106">https://github.com/angular/universal/issues/2106</a></li>
<li>See: <a href="https://github.com/GoogleChromeLabs/critters/issues/78">https://github.com/GoogleChromeLabs/critters/issues/78</a></li>
</ul>
</li>
<li>Since the production site was flapping a lot I decided to try disabling inlineCriticalCss</li>
<li>There have been on and off load issues with the Angular frontend today
<ul>
<li>I think I will just block all data center network blocks for now</li>
<li>In the last week I see almost 200,000 unique IPs:</li>
</ul>
</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span># zcat -f /var/log/nginx/*access.log /var/log/nginx/*access.log.1 /var/log/nginx/*access.log.2.gz /var/log/nginx/*access.log.3.gz /var/log/nginx/*access.log.4.gz /var/log/nginx/*access.log.5.gz /var/log/nginx/*access.log.6.gz | awk <span style="color:#e6db74">&#39;{print $1}&#39;</span> | sort -u |
</span></span><span style="display:flex;"><span>tee /tmp/ips.txt | wc -l
</span></span><span style="display:flex;"><span>196493
</span></span></code></pre></div><ul>
<li>Looking these IPs up I see there are 18,000 coming from Comcast, 10,000 from AT&amp;T, 4,110 from Charter, 3,500 from Cox, and dozens of other residential ISPs
<ul>
<li>I highly doubt these are home users browsing CGSpace&hellip; seems super fishy</li>
<li>Also, over 1,000 IPs from SpaceX Starlink in the last week. RIGHT</li>
<li>I will temporarily add a few new datacenter ISP network blocks to our rate limit (their prefix lists can be fetched the same way as the bot networks above; see the sketch after this list):
<ul>
<li>16509 Amazon-02</li>
<li>701 UUNET</li>
<li>8075 Microsoft</li>
<li>15169 Google</li>
<li>14618 Amazon-AES</li>
<li>396982 Google Cloud</li>
</ul>
</li>
<li>The load on the server <em>immediately</em> dropped</li>
</ul>
</li>
</ul>
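<ul>
<li>One way to grab and aggregate the prefixes for those ASNs, following the same pattern as the bot networks above (a sketch; the output filename is arbitrary):</li>
</ul>
<pre tabindex="0"><code>$ wget https://asn.ipinfo.app/api/text/list/AS16509 \
       https://asn.ipinfo.app/api/text/list/AS701 \
       https://asn.ipinfo.app/api/text/list/AS8075 \
       https://asn.ipinfo.app/api/text/list/AS15169 \
       https://asn.ipinfo.app/api/text/list/AS14618 \
       https://asn.ipinfo.app/api/text/list/AS396982
$ cat AS* | ~/go/bin/mapcidr -a &gt; /tmp/datacenter-networks.txt
</code></pre>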
<h2 id="2024-01-17">2024-01-17</h2>
<ul>
<li>It turns out AS701 (UUNET) is Verizon Business, which is used as an ISP for many staff at IFPRI
<ul>
<li>This was causing them to see HTTP 429 &ldquo;too many requests&rdquo; errors on CGSpace</li>
<li>I removed this ASN from the rate limiting</li>
</ul>
</li>
</ul>
<h2 id="2024-01-18">2024-01-18</h2>
<ul>
<li>Start looking at Solr stats again
<ul>
<li>I found one statistics record that has 22,000 of the same collection in <code>owningColl</code> and 22,000 of the same community in <code>owningComm</code></li>
<li>The record is from 2015 and I think it would be easier to delete it than fix it:</li>
</ul>
</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ curl http://localhost:8983/solr/statistics/update -H <span style="color:#e6db74">&#34;Content-type: text/xml&#34;</span> --data-binary <span style="color:#e6db74">&#39;&lt;delete&gt;&lt;query&gt;uid:3b4eefba-a302-4172-a286-dcb25d70129e&lt;/query&gt;&lt;/delete&gt;&#39;</span>
</span></span></code></pre></div><ul>
<li>Looking again, there are at least 1,000 of these so I will need to come up with an actual solution to fix these</li>
<li>I&rsquo;m noticing we have 1,800+ links to defunct resources on bioversityinternational.org in the <code>cg.link.permalink</code> field (a query to count them is sketched after this list)
<ul>
<li>I should ask Alliance if they have any plans to fix those, or upload them to CGSpace</li>
</ul>
</li>
</ul>
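<ul>
<li>A rough query to count those links, looking the field up by name instead of hard-coding its id (a sketch):</li>
</ul>
<pre tabindex="0"><code>localhost/dspace7= ☘ SELECT count(*) FROM metadatavalue m
  JOIN metadatafieldregistry f ON m.metadata_field_id = f.metadata_field_id
  JOIN metadataschemaregistry s ON f.metadata_schema_id = s.metadata_schema_id
  WHERE s.short_id = &#39;cg&#39; AND f.element = &#39;link&#39; AND f.qualifier = &#39;permalink&#39;
  AND m.text_value LIKE &#39;%bioversityinternational.org%&#39;;
</code></pre>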
<h2 id="2024-01-22">2024-01-22</h2>
<ul>
<li>Meeting with IWMI about ORCID integration on CGSpace now that we&rsquo;ve migrated to DSpace 7</li>
<li>File an issue for the inaccurate DSpace statistics: <a href="https://github.com/DSpace/DSpace/issues/9275">https://github.com/DSpace/DSpace/issues/9275</a></li>
</ul>
<h2 id="2024-01-23">2024-01-23</h2>
<ul>
<li>Meeting with IWMI about ORCID integration and the DSpace API for use with WordPress</li>
<li>IFPRI sent me a list of their author ORCIDs to add to our controlled vocabulary
<ul>
<li>I joined them with our current list and resolved their names on ORCID and updated them in our database:</li>
</ul>
</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ cat ~/src/git/DSpace/dspace/config/controlled-vocabularies/cg-creator-identifier.xml ~/Downloads/IFPRI<span style="color:#ae81ff">\ </span>ORCiD<span style="color:#ae81ff">\ </span>All.csv | grep -oE <span style="color:#e6db74">&#39;[A-Z0-9]{4}-[A-Z0-9]{4}-[A-Z0-9]{4}-[A-Z0-9]{4}&#39;</span> | sort -u &gt; /tmp/2024-01-23-orcids.txt
</span></span><span style="display:flex;"><span>$ ./ilri/resolve_orcids.py -i /tmp/2024-01-23-orcids.txt -o /tmp/2024-01-23-orcids-names.txt -d
</span></span><span style="display:flex;"><span>$ ./ilri/update_orcids.py -i /tmp/2024-01-23-orcids-names.txt -db dspace -u dspace -p fuuu
</span></span></code></pre></div><ul>
<li>This adds about 400 new identifiers to the controlled vocabulary</li>
<li>I consolidated our various project identifier fields for closed programs into one <code>cg.identifier.project</code>:
<ul>
<li><code>cg.identifier.ccafsproject</code></li>
<li><code>cg.identifier.ccafsprojectpii</code></li>
<li><code>cg.identifier.ciatproject</code></li>
<li><code>cg.identifier.cpwfproject</code></li>
</ul>
</li>
<li>I prefixed the existing 2,644 metadata values with &ldquo;CCAFS&rdquo;, &ldquo;CIAT&rdquo;, or &ldquo;CPWF&rdquo; so we can figure out where they came from if need be, and deleted the old fields from the metadata registry (the prefixing is roughly as sketched below for the CIAT field)</li>
</ul>
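<ul>
<li>The prefixing can be done with a simple UPDATE; a sketch for the CIAT field (field id 232 from the query above), repeated similarly for the CCAFS and CPWF fields, with the exact separator being an assumption:</li>
</ul>
<pre tabindex="0"><code>dspace=# BEGIN;
dspace=*# UPDATE metadatavalue SET text_value = &#39;CIAT &#39; || text_value WHERE dspace_object_id IN (SELECT uuid FROM item) AND metadata_field_id=232;
dspace=*# COMMIT;
</code></pre>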
<h2 id="2024-01-26">2024-01-26</h2>
<ul>
<li>Minor work on dspace-angular to clean up component styles</li>
<li>Add <code>cg.identifier.publicationRank</code> to CGSpace metadata registry and submission form</li>
</ul>
<h2 id="2024-01-29">2024-01-29</h2>
<ul>
<li>Rework the nginx bot and network limits slightly to remove some old patterns/networks and remove Google
<ul>
<li>The Google Scholar team contacted me to ask why their requests were timing out (well&hellip;)</li>
</ul>
</li>
</ul> </ul>
<!-- raw HTML omitted --> <!-- raw HTML omitted -->