diff --git a/content/posts/2018-10.md b/content/posts/2018-10.md
index 2c82bab4d..2090d00cd 100644
--- a/content/posts/2018-10.md
+++ b/content/posts/2018-10.md
@@ -17,8 +17,7 @@ categories: ["Notes"]
- I see Moayad was busy collecting item views and downloads from CGSpace yesterday:
```
-# zcat --force /var/log/nginx/*.log /var/log/nginx/*.log.1 | grep -E "02/Oct/2018" | awk '{print $1}
-' | sort | uniq -c | sort -n | tail -n 10
+# zcat --force /var/log/nginx/*.log /var/log/nginx/*.log.1 | grep -E "02/Oct/2018" | awk '{print $1}' | sort | uniq -c | sort -n | tail -n 10
933 40.77.167.90
971 95.108.181.88
1043 41.204.190.40
diff --git a/content/posts/2020-02.md b/content/posts/2020-02.md
index ef849dad6..4c9561a86 100644
--- a/content/posts/2020-02.md
+++ b/content/posts/2020-02.md
@@ -486,10 +486,182 @@ Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko)
- Another IP address (31.6.77.23) in the UK making a few hundred requests without a user agent
- I will add the IP addresses to the nginx badbots list
+- 31.6.77.23 is in the UK and judging by its DNS it belongs to a [web marketing company called Bronco](https://www.bronco.co.uk/)
+  - I looked for its DNS entry in Solr statistics and found a few hundred thousand hits over the years:
+
+```
+$ curl -s "http://localhost:8081/solr/statistics/select" -d "q=dns:/squeeze3.bronco.co.uk./&rows=0"
+<?xml version="1.0" encoding="UTF-8"?>
+<response>
+<lst name="responseHeader"><int name="status">0</int><int name="QTime">4</int><lst name="params"><str name="q">dns:/squeeze3.bronco.co.uk./</str><str name="rows">0</str></lst></lst><result name="response" numFound="86044" start="0"></result>
+</response>
+```
diff --git a/docs/2018-11/index.html b/docs/2018-11/index.html
index 74a6896b6..65bc3448c 100644
--- a/docs/2018-11/index.html
+++ b/docs/2018-11/index.html
@@ -33,7 +33,7 @@ Send a note about my dspace-statistics-api to the dspace-tech mailing list
Linode has been sending mails a few times a day recently that CGSpace (linode18) has had high CPU usage
Today these are the top 10 IPs:
"/>
-
+
diff --git a/docs/2018-12/index.html b/docs/2018-12/index.html
index e6612672f..05a91b036 100644
--- a/docs/2018-12/index.html
+++ b/docs/2018-12/index.html
@@ -33,7 +33,7 @@ Then I ran all system updates and restarted the server
I noticed that there is another issue with PDF thumbnails on CGSpace, and I see there was another Ghostscript vulnerability last week
"/>
-
+
diff --git a/docs/2019-01/index.html b/docs/2019-01/index.html
index 4d092664c..15d6b9a06 100644
--- a/docs/2019-01/index.html
+++ b/docs/2019-01/index.html
@@ -47,7 +47,7 @@ I don’t see anything interesting in the web server logs around that time t
357 207.46.13.1
903 54.70.40.11
"/>
-
+
diff --git a/docs/2019-02/index.html b/docs/2019-02/index.html
index ce5c482f1..f1c270350 100644
--- a/docs/2019-02/index.html
+++ b/docs/2019-02/index.html
@@ -69,7 +69,7 @@ real 0m19.873s
user 0m22.203s
sys 0m1.979s
"/>
-
+
diff --git a/docs/2019-03/index.html b/docs/2019-03/index.html
index f5687d980..0afb743ee 100644
--- a/docs/2019-03/index.html
+++ b/docs/2019-03/index.html
@@ -43,7 +43,7 @@ Most worryingly, there are encoding errors in the abstracts for eleven items, fo
I think I will need to ask Udana to re-copy and paste the abstracts with more care using Google Docs
"/>
-
+
diff --git a/docs/2019-04/index.html b/docs/2019-04/index.html
index 2e3a7fe0e..8b5f92f6e 100644
--- a/docs/2019-04/index.html
+++ b/docs/2019-04/index.html
@@ -61,7 +61,7 @@ $ ./fix-metadata-values.py -i /tmp/2019-02-21-fix-4-regions.csv -db dspace -u ds
$ ./delete-metadata-values.py -i /tmp/2019-02-21-delete-2-countries.csv -db dspace -u dspace -p 'fuuu' -m 228 -f cg.coverage.country -d
$ ./delete-metadata-values.py -i /tmp/2019-02-21-delete-1-region.csv -db dspace -u dspace -p 'fuuu' -m 231 -f cg.coverage.region -d
"/>
-
+
diff --git a/docs/2019-05/index.html b/docs/2019-05/index.html
index d689d8977..bd260ecb7 100644
--- a/docs/2019-05/index.html
+++ b/docs/2019-05/index.html
@@ -45,7 +45,7 @@ DELETE 1
But after this I tried to delete the item from the XMLUI and it is still present…
"/>
-
+
diff --git a/docs/2019-06/index.html b/docs/2019-06/index.html
index 493cf201b..b6c16e857 100644
--- a/docs/2019-06/index.html
+++ b/docs/2019-06/index.html
@@ -31,7 +31,7 @@ Run system updates on CGSpace (linode18) and reboot it
Skype with Marie-Angélique and Abenet about CG Core v2
"/>
-
+
diff --git a/docs/2019-07/index.html b/docs/2019-07/index.html
index 3922f80df..b71c49e68 100644
--- a/docs/2019-07/index.html
+++ b/docs/2019-07/index.html
@@ -35,7 +35,7 @@ CGSpace
Abenet had another similar issue a few days ago when trying to find the stats for 2018 in the RTB community
"/>
-
+
diff --git a/docs/2019-08/index.html b/docs/2019-08/index.html
index 1b7bf62fd..92a203d94 100644
--- a/docs/2019-08/index.html
+++ b/docs/2019-08/index.html
@@ -43,7 +43,7 @@ After rebooting, all statistics cores were loaded… wow, that’s luck
Run system updates on DSpace Test (linode19) and reboot it
"/>
-
+
diff --git a/docs/2019-09/index.html b/docs/2019-09/index.html
index b3c8d6338..48b607971 100644
--- a/docs/2019-09/index.html
+++ b/docs/2019-09/index.html
@@ -69,7 +69,7 @@ Here are the top ten IPs in the nginx XMLUI and REST/OAI logs this morning:
7249 2a01:7e00::f03c:91ff:fe18:7396
9124 45.5.186.2
"/>
-
+
diff --git a/docs/2019-10/index.html b/docs/2019-10/index.html
index fc8794257..943c7de21 100644
--- a/docs/2019-10/index.html
+++ b/docs/2019-10/index.html
@@ -15,7 +15,7 @@
-
+
diff --git a/docs/2019-11/index.html b/docs/2019-11/index.html
index 302314234..76883f3ad 100644
--- a/docs/2019-11/index.html
+++ b/docs/2019-11/index.html
@@ -55,7 +55,7 @@ Let’s see how many of the REST API requests were for bitstreams (because t
# zcat --force /var/log/nginx/rest.log.*.gz | grep -E "[0-9]{1,2}/Oct/2019" | grep -c -E "/rest/bitstreams"
106781
"/>
-
+
diff --git a/docs/2019-12/index.html b/docs/2019-12/index.html
index e6335fca0..3a265c7ed 100644
--- a/docs/2019-12/index.html
+++ b/docs/2019-12/index.html
@@ -43,7 +43,7 @@ Make sure all packages are up to date and the package manager is up to date, the
# dpkg -C
# reboot
"/>
-
+
diff --git a/docs/2020-01/index.html b/docs/2020-01/index.html
index b53e14dc9..e5b1372f0 100644
--- a/docs/2020-01/index.html
+++ b/docs/2020-01/index.html
@@ -53,7 +53,7 @@ I tweeted the CGSpace repository link
"/>
-
+
diff --git a/docs/2020-02/index.html b/docs/2020-02/index.html
index e8f46d1d3..e0bf0385b 100644
--- a/docs/2020-02/index.html
+++ b/docs/2020-02/index.html
@@ -20,7 +20,7 @@ The code finally builds and runs with a fresh install
-
+
@@ -35,7 +35,7 @@ The code finally builds and runs with a fresh install
"/>
-
+
@@ -45,9 +45,9 @@ The code finally builds and runs with a fresh install
"@type": "BlogPosting",
"headline": "February, 2020",
"url": "https:\/\/alanorth.github.io\/cgspace-notes\/2020-02\/",
- "wordCount": "3245",
+ "wordCount": "4210",
"datePublished": "2020-02-02T11:56:30+02:00",
- "dateModified": "2020-02-19T15:17:32+02:00",
+ "dateModified": "2020-02-23T09:16:50+02:00",
"author": {
"@type": "Person",
"name": "Alan Orth"
@@ -613,6 +613,34 @@ UPDATE 26
- Another IP address (31.6.77.23) in the UK making a few hundred requests without a user agent
- I will add the IP addresses to the nginx badbots list
+- 31.6.77.23 is in the UK and judging by its DNS it belongs to a web marketing company called Bronco
+
+- I looked for its DNS entry in Solr statistics and found a few hundred thousand hits over the years:
+
+
+
+$ curl -s "http://localhost:8081/solr/statistics/select" -d "q=dns:/squeeze3.bronco.co.uk./&rows=0"
+<?xml version="1.0" encoding="UTF-8"?>
+<response>
+<lst name="responseHeader"><int name="status">0</int><int name="QTime">4</int><lst name="params"><str name="q">dns:/squeeze3.bronco.co.uk./</str><str name="rows">0</str></lst></lst><result name="response" numFound="86044" start="0"></result>
+</response>
+
+- The totals in each core are:
+
+- statistics: 86044
+- statistics-2018: 65144
+- statistics-2017: 79405
+- statistics-2016: 121316
+- statistics-2015: 30720
+- statistics-2014: 4524
+- … so about 387,000 hits!
+
+
+- I will purge them from each core one by one, i.e.:
+
+$ curl -s "http://localhost:8081/solr/statistics-2015/update?softCommit=true" -H "Content-Type: text/xml" --data-binary "<delete><query>dns:squeeze3.bronco.co.uk.</query></delete>"
+$ curl -s "http://localhost:8081/solr/statistics-2014/update?softCommit=true" -H "Content-Type: text/xml" --data-binary "<delete><query>dns:squeeze3.bronco.co.uk.</query></delete>"
+
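+- Something like this would purge them from all of the cores in one go (assuming the same port and core names as above):
+
+$ for core in statistics statistics-2018 statistics-2017 statistics-2016 statistics-2015 statistics-2014; do
+    # delete every document whose DNS matches the Bronco reverse DNS, with a soft commit
+    curl -s "http://localhost:8081/solr/$core/update?softCommit=true" -H "Content-Type: text/xml" --data-binary "<delete><query>dns:squeeze3.bronco.co.uk.</query></delete>"
+  done
+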
- Deploy latest Tomcat and PostgreSQL JDBC driver changes on CGSpace (linode18)
+- Deploy latest 5_x-prod branch on CGSpace (linode18)
- Run all system updates on CGSpace (linode18) server and reboot it
@@ -621,6 +649,145 @@ UPDATE 26
- Luckily after restarting Tomcat once more they all came back up
+- I ran the dspace cleanup -v process on CGSpace and got an error:
+
+Error: ERROR: update or delete on table "bitstream" violates foreign key constraint "bundle_primary_bitstream_id_fkey" on table "bundle"
+ Detail: Key (bitstream_id)=(183996) is still referenced from table "bundle".
+
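+- To see which bundle row is actually holding the reference, a query along these lines should work (same table and columns as the fix below):
+
+$ psql dspace -c 'select bundle_id, primary_bitstream_id from bundle where primary_bitstream_id = 183996;'
+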
+- The solution is, as always:
+
+# su - postgres
+$ psql dspace -c 'update bundle set primary_bitstream_id=NULL where primary_bitstream_id in (183996);'
+UPDATE 1
+
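+- After that the cleanup should run through without the foreign key error:
+
+$ dspace cleanup -v
+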
+- Add one more new Bioversity ORCID iD to the controlled vocabulary on CGSpace
+- Felix Shaw from Earlham emailed me to ask about his admin account on DSpace Test
+
+- His old one got lost when I re-sync’d DSpace Test with CGSpace a few weeks ago
+- I added a new account for him and added it to the Administrators group:
+
+
+
+$ dspace user -a -m wow@me.com -g Felix -s Shaw -p 'fuananaaa'
+
+- For some reason the Atmire Content and Usage Analysis (CUA) module’s Usage Statistics is drawing blank graphs
+
+- I looked in the dspace.log and see:
+
+
+
+2020-02-23 11:28:13,696 ERROR org.dspace.app.xmlui.cocoon.DSpaceCocoonServletFilter @ Serious Error Occurred Processing Request!
+org.springframework.web.util.NestedServletException: Handler processing failed; nested exception is java.lang.NoClassDefFoundError: Could not
+ initialize class org.jfree.chart.JFreeChart
+
+- The same error happens on DSpace Test, but graphs are working on my local instance
+
+- The only thing I’ve changed recently is the Tomcat version, but it’s working locally…
+- I see the following file on my local instance, CGSpace, and DSpace Test: dspace/webapps/xmlui/WEB-INF/lib/jfreechart-1.0.5.jar
+- I deployed Tomcat 7.0.99 on DSpace Test but the JFreeChart class still can’t be found…
+- So it must be something with the library search path…
+- Strange that it works with Tomcat 7.0.100 on my local machine (see the quick check below)
+
+
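+- A quick way to sanity check that the class is really inside that jar on each host (using the path above):
+
+$ unzip -l dspace/webapps/xmlui/WEB-INF/lib/jfreechart-1.0.5.jar | grep 'org/jfree/chart/JFreeChart.class'
+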
+- I copied the jfreechart-1.0.5.jar file to the Tomcat lib folder and then there was a different error when I loaded Atmire CUA:
+
+2020-02-23 16:25:10,841 ERROR org.dspace.app.xmlui.cocoon.DSpaceCocoonServletFilter @ Serious Error Occurred Processing Request! org.springframework.web.util.NestedServletException: Handler processing failed; nested exception is java.awt.AWTError: Assistive Technology not found: org.GNOME.Accessibility.AtkWrapper
+
+- Some search results suggested commenting out the following line in /etc/java-8-openjdk/accessibility.properties:
+
+assistive_technologies=org.GNOME.Accessibility.AtkWrapper
+
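+- Something like this should comment it out in place:
+
+$ sudo sed -i 's/^assistive_technologies=/#&/' /etc/java-8-openjdk/accessibility.properties
+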
+- After commenting out that line, removing the extra jfreechart library, and restarting Tomcat, I was able to load the usage statistics graph on DSpace Test…
+
+- Hmm, actually I think this is a Java bug, perhaps introduced or at least present in Ubuntu 18.04, with lots of references to it happening in other configurations like Debian 9 with Jenkins, etc…
+- Apparently if you use the non-headless version of openjdk this doesn’t happen… but that pulls in X11 stuff so no thanks
+- Also, I see dozens of occurrences of this going back over one month (we have logs for about that period):
+
+
+
+# grep -c 'initialize class org.jfree.chart.JFreeChart' dspace.log.2020-0*
+dspace.log.2020-01-12:4
+dspace.log.2020-01-13:66
+dspace.log.2020-01-14:4
+dspace.log.2020-01-15:36
+dspace.log.2020-01-16:88
+dspace.log.2020-01-17:4
+dspace.log.2020-01-18:4
+dspace.log.2020-01-19:4
+dspace.log.2020-01-20:4
+dspace.log.2020-01-21:4
+...
+
+- I deployed the fix on CGSpace (linode18) and I was able to see the graphs in the Atmire CUA Usage Statistics…
+- On an unrelated note, there is something weird going on: I see millions of hits from IP 34.218.226.147 in Solr statistics, but if I remember correctly that IP belongs to CodeObia’s AReS explorer, which should only be using REST and therefore should not be generating Solr statistics…?
+
+$ curl -s "http://localhost:8081/solr/statistics-2018/select" -d "q=ip:34.218.226.147&rows=0"
+<?xml version="1.0" encoding="UTF-8"?>
+<response>
+<lst name="responseHeader"><int name="status">0</int><int name="QTime">811</int><lst name="params"><str name="q">ip:34.218.226.147</str><str name="rows">0</str></lst></lst><result name="response" numFound="5536097" start="0"></result>
+</response>
+
+- And there are apparently two million from last month (2020-01):
+
+$ curl -s "http://localhost:8081/solr/statistics/select" -d "q=ip:34.218.226.147&fq=dateYearMonth:2020-01&rows=0"
+<?xml version="1.0" encoding="UTF-8"?>
+<response>
+<lst name="responseHeader"><int name="status">0</int><int name="QTime">248</int><lst name="params"><str name="q">ip:34.218.226.147</str><str name="fq">dateYearMonth:2020-01</str><str name="rows">0</str></lst></lst><result name="response" numFound="2173455" start="0"></result>
+</response>
+
+- But when I look at the nginx access logs for the past month or so I only see 84,000, all of which are on /rest and none of which are to XMLUI:
+
+# zcat /var/log/nginx/*.log.*.gz | grep -c 34.218.226.147
+84322
+# zcat /var/log/nginx/*.log.*.gz | grep 34.218.226.147 | grep -c '/rest'
+84322
+
+- Either the requests didn’t get logged, or there is some mixup with the Solr documents (fuck!)
+
+- On second inspection, I do see lots of notes here about 34.218.226.147, including 150,000 on one day in October, 2018 alone…
+
+
+- To make matters worse, I see hits from REST in the regular nginx access log!
+
+- I did a few tests and I can’t figure it out, but it seems that hits appear in one or the other (not both)
+- Also, I see zero hits to /rest in the access.log on DSpace Test (linode19)
+
+
+- Anyways, I faceted by IP in 2020-01 and see:
+
+$ curl -s 'http://localhost:8081/solr/statistics/select?q=*:*&fq=dateYearMonth:2020-01&rows=0&wt=json&indent=true&facet=true&facet.field=ip'
+...
+ "172.104.229.92",2686876,
+ "34.218.226.147",2173455,
+ "163.172.70.248",80945,
+ "163.172.71.24",55211,
+ "163.172.68.99",38427,
+
+- Surprise surprise, the top two IPs are from AReS servers… wtf.
+- The next three are from Online in France and they are all using this weird user agent and making tens of thousands of requests to Discovery:
+
+Mozilla/5.0 ((Windows; U; Windows NT 6.1; fr; rv:1.9.2) Gecko/20100115 Firefox/3.6)
+
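+- Counting those requests in the nginx logs should just be a grep on that user agent (same zcat pattern as above):
+
+# zcat /var/log/nginx/*.log.*.gz | grep -c 'Gecko/20100115 Firefox/3.6'
+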
+- And those same three IPs are already inflating the statistics for 2020-02… hmmm.
+- I need to see why AReS harvesting is inflating the stats, as it should only be making REST requests…
+- Shiiiiit, I see 84,000 requests from the AReS IP today alone:
+
+$ curl -s 'http://localhost:8081/solr/statistics/select?q=time:2020-02-22*+AND+ip:172.104.229.92&rows=0&wt=json&indent=true'
+...
+ "response":{"numFound":84594,"start":0,"docs":[]
+
+- Fuck! And of course the ILRI websites doing their daily REST harvesting are causing issues too, from today alone:
+
+ "2a01:7e00::f03c:91ff:fe9a:3a37",35512,
+ "2a01:7e00::f03c:91ff:fe18:7396",26155,
+
+- I need to try to make some requests for these URLs and observe if they make a statistics hit (a quick test is sketched after this list):
+
+/rest/items?expand=metadata,bitstreams,parentCommunityList&limit=50&offset=82450
+/rest/handle/10568/28702?expand=all
+
+
+- Those are the requests AReS and ILRI servers are making… nearly 150,000 per day!
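+- A rough way to test: make one of those requests against the local Tomcat and then check whether a new document shows up for today in the statistics core (the Tomcat port and today’s date are assumptions here):
+
+$ # hit one of the REST URLs that AReS/ILRI request
+$ curl -s -o /dev/null 'http://localhost:8080/rest/handle/10568/28702?expand=all'
+$ # then see how many statistics documents exist for today
+$ curl -s 'http://localhost:8081/solr/statistics/select?q=time:2020-02-23*&rows=0&wt=json&indent=true'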
diff --git a/docs/sitemap.xml b/docs/sitemap.xml
index a1a3dd284..773560b7f 100644
--- a/docs/sitemap.xml
+++ b/docs/sitemap.xml
@@ -4,27 +4,27 @@
https://alanorth.github.io/cgspace-notes/categories/
- 2020-02-19T15:17:32+02:00
+ 2020-02-23T09:16:50+02:00
https://alanorth.github.io/cgspace-notes/
- 2020-02-19T15:17:32+02:00
+ 2020-02-23T09:16:50+02:00
https://alanorth.github.io/cgspace-notes/2020-02/
- 2020-02-19T15:17:32+02:00
+ 2020-02-23T09:16:50+02:00
https://alanorth.github.io/cgspace-notes/categories/notes/
- 2020-02-19T15:17:32+02:00
+ 2020-02-23T09:16:50+02:00
https://alanorth.github.io/cgspace-notes/posts/
- 2020-02-19T15:17:32+02:00
+ 2020-02-23T09:16:50+02:00