<!DOCTYPE html>
<html lang="en">

<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no">

<meta property="og:title" content="November, 2018" />
<meta property="og:description" content="2018-11-01

Finalize AReS Phase I and Phase II ToRs
Send a note about my dspace-statistics-api to the dspace-tech mailing list

2018-11-03

Linode has been sending mails a few times a day recently that CGSpace (linode18) has had high CPU usage
Today these are the top 10 IPs:
" />
<meta property="og:type" content="article" />
<meta property="og:url" content="https://alanorth.github.io/cgspace-notes/2018-11/" /><meta property="article:published_time" content="2018-11-01T16:41:30+02:00"/>
<meta property="article:modified_time" content="2018-11-22T09:39:09+02:00"/>

<meta name="twitter:card" content="summary"/>
<meta name="twitter:title" content="November, 2018"/>
<meta name="twitter:description" content="2018-11-01

Finalize AReS Phase I and Phase II ToRs
Send a note about my dspace-statistics-api to the dspace-tech mailing list

2018-11-03

Linode has been sending mails a few times a day recently that CGSpace (linode18) has had high CPU usage
Today these are the top 10 IPs:
"/>
<meta name="generator" content="Hugo 0.51" />
<script type="application/ld+json">
{
"@context": "http://schema.org",
"@type": "BlogPosting",
"headline": "November, 2018",
"url": "https://alanorth.github.io/cgspace-notes/2018-11/",
"wordCount": "2358",
"datePublished": "2018-11-01T16:41:30+02:00",
"dateModified": "2018-11-22T09:39:09+02:00",
"author": {
"@type": "Person",
"name": "Alan Orth"
},
"keywords": "Notes"
}
</script>

<link rel="canonical" href="https://alanorth.github.io/cgspace-notes/2018-11/">

<title>November, 2018 | CGSpace Notes</title>

<!-- combined, minified CSS -->
<link href="https://alanorth.github.io/cgspace-notes/css/style.css" rel="stylesheet" integrity="sha384-Upm5uY/SXdvbjuIGH6fBjF5vOYUr9DguqBskM+EQpLBzO9U+9fMVmWEt+TTlGrWQ" crossorigin="anonymous">
</head>
<body>

<div class="blog-masthead">
<div class="container">
<nav class="nav blog-nav">
<a class="nav-link " href="https://alanorth.github.io/cgspace-notes/">Home</a>
</nav>
</div>
</div>

<header class="blog-header">
<div class="container">
<h1 class="blog-title"><a href="https://alanorth.github.io/cgspace-notes/" rel="home">CGSpace Notes</a></h1>
<p class="lead blog-description">Documenting day-to-day work on the <a href="https://cgspace.cgiar.org">CGSpace</a> repository.</p>
</div>
</header>

<div class="container">
<div class="row">
<div class="col-sm-8 blog-main">

<article class="blog-post">
<header>
<h2 class="blog-post-title"><a href="https://alanorth.github.io/cgspace-notes/2018-11/">November, 2018</a></h2>
<p class="blog-post-meta"><time datetime="2018-11-01T16:41:30+02:00">Thu Nov 01, 2018</time> by Alan Orth in
<i class="fa fa-tag" aria-hidden="true"></i> <a href="/cgspace-notes/tags/notes" rel="tag">Notes</a>
</p>
</header>

<h2 id="2018-11-01">2018-11-01</h2>

<ul>
<li>Finalize AReS Phase I and Phase II ToRs</li>
<li>Send a note about my <a href="https://github.com/ilri/dspace-statistics-api">dspace-statistics-api</a> to the dspace-tech mailing list</li>
</ul>

<h2 id="2018-11-03">2018-11-03</h2>

<ul>
<li>Linode has been sending mails a few times a day recently that CGSpace (linode18) has had high CPU usage</li>
<li>Today these are the top 10 IPs:</li>
</ul>

<pre><code># zcat --force /var/log/nginx/*.log /var/log/nginx/*.log.1 | grep -E "03/Nov/2018" | awk '{print $1}' | sort | uniq -c | sort -n | tail -n 10
1300 66.249.64.63
1384 35.237.175.180
1430 138.201.52.218
1455 207.46.13.156
1500 40.77.167.175
1979 50.116.102.77
2790 66.249.64.61
3367 84.38.130.177
4537 70.32.83.92
22508 66.249.64.59
</code></pre>

<ul>
<li>The <code>66.249.64.x</code> are definitely Google</li>
<li><code>70.32.83.92</code> is well known, probably CCAFS or something, as it’s only a few thousand requests and always to the REST API</li>
<li><code>84.38.130.177</code> is some new IP in Latvia that is only hitting the XMLUI, using the following user agent:</li>
</ul>

<pre><code>Mozilla/5.0 (Windows NT 5.1) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/14.0.792.0 Safari/535.1
</code></pre>

<ul>
<li>They at least seem to be re-using their Tomcat sessions:</li>
</ul>

<pre><code>$ grep -c -E 'session_id=[A-Z0-9]{32}:ip_addr=84.38.130.177' dspace.log.2018-11-03 | sort | uniq
342
</code></pre>

<ul>
<li><code>50.116.102.77</code> is also a regular REST API user</li>
<li><code>40.77.167.175</code> and <code>207.46.13.156</code> seem to be Bing</li>
<li><code>138.201.52.218</code> seems to be on Hetzner in Germany, but is using this user agent:</li>
</ul>

<pre><code>Mozilla/5.0 (Macintosh; Intel Mac OS X 10.12; rv:62.0) Gecko/20100101 Firefox/62.0
</code></pre>

<ul>
<li>And it doesn’t seem they are re-using their Tomcat sessions:</li>
</ul>

<pre><code>$ grep -c -E 'session_id=[A-Z0-9]{32}:ip_addr=138.201.52.218' dspace.log.2018-11-03 | sort | uniq
1243
</code></pre>

<ul>
<li>Ah, we’ve apparently seen this server exactly a year ago in 2017-11, making 40,000 requests in one day…</li>
<li>I wonder if it’s worth adding them to the list of bots in the nginx config?</li>
<li>Linode sent a mail that CGSpace (linode18) is using high outgoing bandwidth</li>
<li>Looking at the nginx logs again I see the following top ten IPs:</li>
</ul>

<pre><code># zcat --force /var/log/nginx/*.log /var/log/nginx/*.log.1 | grep -E "03/Nov/2018" | awk '{print $1}' | sort | uniq -c | sort -n | tail -n 10
1979 50.116.102.77
1980 35.237.175.180
2186 207.46.13.156
2208 40.77.167.175
2843 66.249.64.63
4220 84.38.130.177
4537 70.32.83.92
5593 66.249.64.61
12557 78.46.89.18
32152 66.249.64.59
</code></pre>

<ul>
<li><code>78.46.89.18</code> is new since I last checked a few hours ago, and it’s from Hetzner with the following user agent:</li>
</ul>

<pre><code>Mozilla/5.0 (Macintosh; Intel Mac OS X 10.12; rv:62.0) Gecko/20100101 Firefox/62.0
</code></pre>

<ul>
<li>It’s making lots of requests and using quite a number of Tomcat sessions:</li>
</ul>

<pre><code>$ grep -c -E 'session_id=[A-Z0-9]{32}:ip_addr=78.46.89.18' /home/cgspace.cgiar.org/log/dspace.log.2018-11-03 | sort | uniq
8449
</code></pre>

<ul>
<li>I could add this IP to the list of bot IPs in nginx, but it seems like a futile effort when some new IP could come along and do the same thing</li>
<li>Perhaps I should think about adding rate limits to dynamic pages like <code>/discover</code> and <code>/browse</code></li>
<li>I think it’s reasonable for a human to click one of those links five or ten times a minute…</li>
<li>To contrast, <code>78.46.89.18</code> made about 300 requests per minute for a few hours today:</li>
</ul>

<pre><code># grep 78.46.89.18 /var/log/nginx/access.log | grep -o -E '03/Nov/2018:[0-9][0-9]:[0-9][0-9]' | sort | uniq -c | sort -n | tail -n 20
286 03/Nov/2018:18:02
287 03/Nov/2018:18:21
289 03/Nov/2018:18:23
291 03/Nov/2018:18:27
293 03/Nov/2018:18:34
300 03/Nov/2018:17:58
300 03/Nov/2018:18:22
300 03/Nov/2018:18:32
304 03/Nov/2018:18:12
305 03/Nov/2018:18:13
305 03/Nov/2018:18:24
312 03/Nov/2018:18:39
322 03/Nov/2018:18:17
326 03/Nov/2018:18:38
327 03/Nov/2018:18:16
330 03/Nov/2018:17:57
332 03/Nov/2018:18:19
336 03/Nov/2018:17:56
340 03/Nov/2018:18:14
341 03/Nov/2018:18:18
</code></pre>

<ul>
<li>If they want to download all our metadata and PDFs they should use an API rather than scraping the XMLUI</li>
<li>I will add them to the list of bot IPs in nginx for now and think about enforcing rate limits in XMLUI later</li>
<li>Also, this is the third (?) time a mysterious IP on Hetzner has done this… who is this?</li>
</ul>
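
<ul>
<li>For reference, one way to keep a list of bot IPs in nginx is a <code>geo</code> block, whose flag can then be used in a <code>map</code> to override the user agent sent to Tomcat (or to apply different rate limits); this is only a sketch with hypothetical variable and upstream names, not the exact CGSpace configuration:</li>
</ul>

<pre><code># http context: flag known misbehaving IPs (sketch only)
geo $remote_addr $ip_is_bot {
    default        0;
    78.46.89.18    1;
    138.201.52.218 1;
}

map $ip_is_bot $ua_for_tomcat {
    0 $http_user_agent;
    1 "Mozilla/5.0 (compatible; misbehaving-ip-override)";
}

# in the location that proxies to Tomcat
location / {
    proxy_set_header User-Agent $ua_for_tomcat;
    proxy_pass http://tomcat_http;
}
</code></pre>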

<h2 id="2018-11-04">2018-11-04</h2>

<ul>
<li>Forward Peter’s information about CGSpace financials to Modi from ICRISAT</li>
<li>Linode emailed about the CPU load and outgoing bandwidth on CGSpace (linode18) again</li>
<li>Here are the top ten IPs active so far this morning:</li>
</ul>

<pre><code># zcat --force /var/log/nginx/*.log /var/log/nginx/*.log.1 | grep -E "04/Nov/2018" | awk '{print $1}' | sort | uniq -c | sort -n | tail -n 10
1083 2a03:2880:11ff:2::face:b00c
1105 2a03:2880:11ff:d::face:b00c
1111 2a03:2880:11ff:f::face:b00c
1134 84.38.130.177
1893 50.116.102.77
2040 66.249.64.63
4210 66.249.64.61
4534 70.32.83.92
13036 78.46.89.18
20407 66.249.64.59
</code></pre>

<ul>
<li><code>78.46.89.18</code> is back… and still making tons of Tomcat sessions:</li>
</ul>

<pre><code>$ grep -c -E 'session_id=[A-Z0-9]{32}:ip_addr=78.46.89.18' dspace.log.2018-11-04 | sort | uniq
8765
</code></pre>

<ul>
<li>Also, now we have a ton of Facebook crawlers:</li>
</ul>

<pre><code># zcat --force /var/log/nginx/*.log /var/log/nginx/*.log.1 | grep -E "04/Nov/2018" | grep "2a03:2880:11ff:" | awk '{print $1}' | sort | uniq -c | sort -n
905 2a03:2880:11ff:b::face:b00c
955 2a03:2880:11ff:5::face:b00c
965 2a03:2880:11ff:e::face:b00c
984 2a03:2880:11ff:8::face:b00c
993 2a03:2880:11ff:3::face:b00c
994 2a03:2880:11ff:7::face:b00c
1006 2a03:2880:11ff:10::face:b00c
1011 2a03:2880:11ff:4::face:b00c
1023 2a03:2880:11ff:6::face:b00c
1026 2a03:2880:11ff:9::face:b00c
1039 2a03:2880:11ff:1::face:b00c
1043 2a03:2880:11ff:c::face:b00c
1070 2a03:2880:11ff::face:b00c
1075 2a03:2880:11ff:a::face:b00c
1093 2a03:2880:11ff:2::face:b00c
1107 2a03:2880:11ff:d::face:b00c
1116 2a03:2880:11ff:f::face:b00c
</code></pre>

<ul>
<li>They are really making shit tons of Tomcat sessions:</li>
</ul>

<pre><code>$ grep -c -E 'session_id=[A-Z0-9]{32}:ip_addr=2a03:2880:11ff' dspace.log.2018-11-04 | sort | uniq
14368
</code></pre>

<ul>
<li>Their user agent is:</li>
</ul>

<pre><code>facebookexternalhit/1.1 (+http://www.facebook.com/externalhit_uatext.php)
</code></pre>
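
<ul>
<li>That agent can be matched by Tomcat’s Crawler Session Manager valve by extending its <code>crawlerUserAgents</code> regex in <code>server.xml</code>; a sketch of what that could look like (the regex shown is just the stock default plus facebookexternalhit, not necessarily the exact CGSpace configuration):</li>
</ul>

<pre><code>&lt;Valve className="org.apache.catalina.valves.CrawlerSessionManagerValve"
       crawlerUserAgents=".*[bB]ot.*|.*Yahoo! Slurp.*|.*Feedfetcher-Google.*|.*facebookexternalhit.*"
       sessionInactiveInterval="60" /&gt;
</code></pre>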

<ul>
<li>I will add it to the Tomcat Crawler Session Manager valve</li>
<li>Later in the evening… ok, this Facebook bot is getting super annoying:</li>
</ul>

<pre><code># zcat --force /var/log/nginx/*.log /var/log/nginx/*.log.1 | grep -E "04/Nov/2018" | grep "2a03:2880:11ff:" | awk '{print $1}' | sort | uniq -c | sort -n
1871 2a03:2880:11ff:3::face:b00c
1885 2a03:2880:11ff:b::face:b00c
1941 2a03:2880:11ff:8::face:b00c
1942 2a03:2880:11ff:e::face:b00c
1987 2a03:2880:11ff:1::face:b00c
2023 2a03:2880:11ff:2::face:b00c
2027 2a03:2880:11ff:4::face:b00c
2032 2a03:2880:11ff:9::face:b00c
2034 2a03:2880:11ff:10::face:b00c
2050 2a03:2880:11ff:5::face:b00c
2061 2a03:2880:11ff:c::face:b00c
2076 2a03:2880:11ff:6::face:b00c
2093 2a03:2880:11ff:7::face:b00c
2107 2a03:2880:11ff::face:b00c
2118 2a03:2880:11ff:d::face:b00c
2164 2a03:2880:11ff:a::face:b00c
2178 2a03:2880:11ff:f::face:b00c
</code></pre>

<ul>
<li>And still making shit tons of Tomcat sessions:</li>
</ul>

<pre><code>$ grep -c -E 'session_id=[A-Z0-9]{32}:ip_addr=2a03:2880:11ff' dspace.log.2018-11-04 | sort | uniq
28470
</code></pre>

<ul>
<li>And that’s even using the Tomcat Crawler Session Manager valve!</li>
<li>Maybe we need to limit more dynamic pages, like the “most popular” country, item, and author pages</li>
<li>It seems these are popular too, and there is no fucking way Facebook needs that information, yet they are requesting thousands of them!</li>
</ul>

<pre><code># grep 'face:b00c' /var/log/nginx/access.log /var/log/nginx/access.log.1 | grep -c 'most-popular/'
7033
</code></pre>

<ul>
<li>I added the “most-popular” pages to the list that return <code>X-Robots-Tag: none</code> to try to inform bots not to index or follow those pages</li>
<li>Also, I implemented an nginx rate limit of twelve requests per minute on all dynamic pages… I figure a human user might legitimately request one every five seconds</li>
</ul>
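
<ul>
<li>Roughly what those two changes look like in nginx (zone name, paths, and burst value here are illustrative, not the exact production config):</li>
</ul>

<pre><code># http context: twelve requests per minute per client for dynamic pages
limit_req_zone $binary_remote_addr zone=dynamicpages:16m rate=12r/m;

# server context: rate limit Discovery/browse and mark "most popular" pages as no-index
location ~ ^/(discover|search-filter|browse) {
    limit_req zone=dynamicpages burst=5;
    proxy_pass http://tomcat_http;
}

location ~ /most-popular/ {
    add_header X-Robots-Tag "none";
    proxy_pass http://tomcat_http;
}
</code></pre>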

<h2 id="2018-11-05">2018-11-05</h2>

<ul>
<li>I wrote a small Python script <a href="https://gist.github.com/alanorth/4ff81d5f65613814a66cb6f84fdf1fc5">add-dc-rights.py</a> to add usage rights (<code>dc.rights</code>) to CGSpace items based on the CSV Hector gave me from MARLO:</li>
</ul>

<pre><code>$ ./add-dc-rights.py -i /tmp/marlo.csv -db dspace -u dspace -p 'fuuu'
</code></pre>
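
<ul>
<li>The actual script is in the gist linked above; conceptually it just looks up each item by its handle and inserts a <code>dc.rights</code> metadata value, roughly like this sketch (the CSV column names and the metadata field ID are assumptions, not the real code):</li>
</ul>

<pre><code>#!/usr/bin/env python3
# Sketch only: add dc.rights to items listed in a CSV of handle,rights rows
import csv

import psycopg2

DC_RIGHTS_FIELD_ID = 53  # assumption: check metadatafieldregistry for the real ID

conn = psycopg2.connect('dbname=dspace user=dspace password=fuuu host=localhost')

with conn, conn.cursor() as cursor, open('/tmp/marlo.csv') as csvfile:
    for row in csv.DictReader(csvfile):
        # resolve the handle to the item's internal resource ID
        cursor.execute(
            'SELECT resource_id FROM handle WHERE handle=%s AND resource_type_id=2',
            (row['handle'],),
        )
        item = cursor.fetchone()
        if item is None:
            continue
        # add the usage rights as a new metadata value on the item
        cursor.execute(
            'INSERT INTO metadatavalue (metadata_value_id, resource_id, resource_type_id, metadata_field_id, text_value, place) '
            "VALUES (nextval('metadatavalue_seq'), %s, 2, %s, %s, 1)",
            (item[0], DC_RIGHTS_FIELD_ID, row['rights']),
        )

conn.close()
</code></pre>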

<ul>
<li>The file <code>marlo.csv</code> was cleaned up and formatted in Open Refine</li>
<li>165 of the items in their 2017 data are from CGSpace!</li>
<li>I will add the data to CGSpace this week (done!)</li>
<li>Jesus, is Facebook <em>trying</em> to be annoying?</li>
</ul>

<pre><code># zcat --force /var/log/nginx/*.log /var/log/nginx/*.log.1 | grep -E "05/Nov/2018" | grep -c "2a03:2880:11ff:"
29889
# grep -c -E 'session_id=[A-Z0-9]{32}:ip_addr=2a03:2880:11ff' dspace.log.2018-11-05 | sort | uniq
29156
# zcat --force /var/log/nginx/*.log /var/log/nginx/*.log.1 | grep -E "05/Nov/2018" | grep "2a03:2880:11ff:" | grep -c -E "(handle|bitstream)"
29896
</code></pre>

<ul>
<li>29,000 requests from Facebook, 29,000 Tomcat sessions, and none of the requests are to the dynamic pages I rate limited yesterday!</li>
</ul>

<h2 id="2018-11-06">2018-11-06</h2>

<ul>
<li>I updated all the <a href="https://github.com/ilri/DSpace/wiki/Scripts">DSpace helper Python scripts</a> to validate against PEP 8 using Flake8</li>
<li>While I was updating the <a href="https://gist.github.com/alanorth/ddd7f555f0e487fe0e9d3eb4ff26ce50">rest-find-collections.py</a> script I noticed it was using <code>expand=all</code> to get the collection and community IDs</li>
<li>I realized I actually only need <code>expand=collections,subCommunities</code>, and I wanted to see how much overhead the extra expands created so I did three runs of each:</li>
</ul>

<pre><code>$ time ./rest-find-collections.py 10568/27629 --rest-url https://dspacetest.cgiar.org/rest
</code></pre>

<ul>
<li>Average time with all expands was 14.3 seconds, and 12.8 seconds with <code>collections,subCommunities</code>, so <strong>1.5 seconds difference</strong>!</li>
</ul>

<h2 id="2018-11-07">2018-11-07</h2>

<ul>
<li>Update my <a href="https://github.com/ilri/dspace-statistics-api">dspace-statistics-api</a> to use a database management class with Python contexts so that connections and cursors are automatically opened and closed</li>
<li>Tag version 0.7.0 of the dspace-statistics-api</li>
</ul>
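
<ul>
<li>The idea is a small class used with <code>with</code> blocks so the connection (and its cursors) are always cleaned up; a simplified sketch of the pattern, not the actual module:</li>
</ul>

<pre><code>import psycopg2

class DatabaseManager():
    '''Manage a PostgreSQL connection via Python context managers.'''

    def __init__(self):
        # connection parameters are an assumption for this sketch
        self._connection_uri = 'dbname=dspacestatistics user=dspacestatistics host=localhost'

    def __enter__(self):
        self._connection = psycopg2.connect(self._connection_uri)
        return self._connection

    def __exit__(self, exc_type, exc_value, exc_traceback):
        self._connection.close()


# usage: both the connection and the cursor are closed automatically
with DatabaseManager() as connection:
    with connection.cursor() as cursor:
        cursor.execute('SELECT COUNT(*) FROM items')
        print(cursor.fetchone()[0])
</code></pre>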

<h2 id="2018-11-08">2018-11-08</h2>

<ul>
<li>I deployed version 0.7.0 of the dspace-statistics-api on DSpace Test (linode19) so I can test it for a few days (and check the Munin stats to see the change in database connections) before deploying on CGSpace</li>
<li>I also enabled systemd’s persistent journal by setting <a href="https://www.freedesktop.org/software/systemd/man/journald.conf.html"><code>Storage=persistent</code> in <em>journald.conf</em></a></li>
<li>Apparently <a href="https://www.freedesktop.org/software/systemd/man/journald.conf.html">Ubuntu 16.04 defaulted to using rsyslog for boot records until early 2018</a>, so I removed <code>rsyslog</code> too</li>
<li>Proof 277 IITA records on DSpace Test: <a href="https://dspacetest.cgiar.org/handle/10568/107871">IITA_ ALIZZY1802-csv_oct23</a>

<ul>
<li>There were a few issues with countries, a few language errors, a few whitespace errors, and then a handful of ISSNs in the ISBN field</li>
</ul></li>
</ul>

<h2 id="2018-11-11">2018-11-11</h2>

<ul>
<li>I added tests to the <a href="https://github.com/ilri/dspace-statistics-api">dspace-statistics-api</a>!</li>
<li>It runs with Python 3.5, 3.6, and 3.7 using pytest, including automatically on Travis CI!</li>
</ul>
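
<ul>
<li>The Travis CI part is just a small build matrix that installs the requirements and runs pytest, roughly like this sketch of a <code>.travis.yml</code> (not necessarily the exact one in the repository):</li>
</ul>

<pre><code>dist: xenial
language: python
python:
  - "3.5"
  - "3.6"
  - "3.7"
install:
  - pip install -r requirements.txt
  - pip install pytest
script: pytest
</code></pre>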

<h2 id="2018-11-13">2018-11-13</h2>

<ul>
<li>Help troubleshoot an issue with Judy Kimani submitting to the <em>ILRI project reports, papers and documents</em> collection on CGSpace</li>
<li>For some reason there is an existing group for the “Accept/Reject” workflow step, but it’s empty</li>
<li>I added Judy to the group and told her to try again</li>
<li>Sisay changed his leave to be full days until December so I need to finish the IITA records that he was working on (<a href="https://dspacetest.cgiar.org/handle/10568/107871">IITA_ ALIZZY1802-csv_oct23</a>)</li>
<li>Sisay had said there were a few PDFs missing and Bosede sent them this week, so I had to find those items on DSpace Test and add the bitstreams to the items manually</li>
<li>As for the collection mappings I think I need to export the CSV from DSpace Test, add mappings for each type (ie Books go to IITA books collection, etc), then re-import to DSpace Test, then export from DSpace command line in “migrate” mode…</li>
<li>From there I should be able to script the removal of the old DSpace Test collection so they just go to the correct IITA collections on import into CGSpace</li>
</ul>

<h2 id="2018-11-14">2018-11-14</h2>

<ul>
<li>Finally import the 277 IITA (ALIZZY1802) records to CGSpace</li>
<li>I had to export them from DSpace Test and import them into a temporary collection on CGSpace first, then export the collection as CSV to map them to new owning collections (IITA books, IITA posters, etc) with OpenRefine because DSpace’s <code>dspace export</code> command doesn’t include the collections for the items!</li>
<li>Delete all old IITA collections on DSpace Test and run <code>dspace cleanup</code> to get rid of all the bitstreams</li>
</ul>
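
<ul>
<li>For reference, the mapping itself happens in the <code>collection</code> column of the batch metadata editing CSV: the first handle is the owning collection and any additional handles become mappings, and the file is then imported with <code>dspace metadata-import</code>. A sketch with made-up item IDs, collection handles, and filename:</li>
</ul>

<pre><code>id,collection,dc.title[en_US]
91210,10568/93560||10568/68616,"An example IITA book chapter"

$ dspace metadata-import -f /tmp/2018-11-14-iita-mappings.csv
</code></pre>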

<h2 id="2018-11-15">2018-11-15</h2>

<ul>
<li>Deploy version 0.8.1 of the <a href="https://github.com/ilri/dspace-statistics-api">dspace-statistics-api</a> to CGSpace (linode18)</li>
</ul>

<h2 id="2018-11-18">2018-11-18</h2>

<ul>
<li>Request invoice from Wild Jordan for their meeting venue in January</li>
</ul>

<h2 id="2018-11-19">2018-11-19</h2>

<ul>
<li>Testing corrections and deletions for AGROVOC (<code>dc.subject</code>) that Sisay and Peter were working on earlier this month:</li>
</ul>

<pre><code>$ ./fix-metadata-values.py -i 2018-11-19-correct-agrovoc.csv -f dc.subject -t correct -m 57 -db dspace -u dspace -p 'fuu' -d
$ ./delete-metadata-values.py -i 2018-11-19-delete-agrovoc.csv -f dc.subject -m 57 -db dspace -u dspace -p 'fuu' -d
</code></pre>

<ul>
<li>Then I ran them on both CGSpace and DSpace Test, and started a full Discovery re-index on CGSpace:</li>
</ul>

<pre><code>$ time schedtool -D -e ionice -c2 -n7 nice -n19 dspace index-discovery -b
</code></pre>

<ul>
<li>Generate a new list of the top 1500 AGROVOC subjects on CGSpace to send to Peter and Sisay:</li>
</ul>

<pre><code>dspace=# \COPY (SELECT DISTINCT text_value, count(*) FROM metadatavalue WHERE metadata_field_id = 57 AND resource_type_id = 2 GROUP BY text_value ORDER BY count DESC LIMIT 1500) to /tmp/2018-11-19-top-1500-subject.csv WITH CSV HEADER;
</code></pre>

<h2 id="2018-11-20">2018-11-20</h2>

<ul>
<li>The Discovery re-indexing on CGSpace never finished yesterday… the command died after six minutes</li>
<li>The <code>dspace.log.2018-11-19</code> shows this at the time:</li>
</ul>

<pre><code>2018-11-19 15:23:04,221 ERROR com.atmire.dspace.discovery.AtmireSolrService @ DSpace kernel cannot be null
java.lang.IllegalStateException: DSpace kernel cannot be null
at org.dspace.utils.DSpace.getServiceManager(DSpace.java:63)
at org.dspace.utils.DSpace.getSingletonService(DSpace.java:87)
at com.atmire.dspace.discovery.AtmireSolrService.buildDocument(AtmireSolrService.java:102)
at com.atmire.dspace.discovery.AtmireSolrService.indexContent(AtmireSolrService.java:815)
at com.atmire.dspace.discovery.AtmireSolrService.updateIndex(AtmireSolrService.java:884)
at org.dspace.discovery.SolrServiceImpl.createIndex(SolrServiceImpl.java:370)
at org.dspace.discovery.IndexClient.main(IndexClient.java:117)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.dspace.app.launcher.ScriptLauncher.runOneCommand(ScriptLauncher.java:226)
at org.dspace.app.launcher.ScriptLauncher.main(ScriptLauncher.java:78)
2018-11-19 15:23:04,223 INFO com.atmire.dspace.discovery.AtmireSolrService @ Processing (4629 of 76007): 72731
</code></pre>

<ul>
<li>I looked in the Solr log around that time and I don’t see anything…</li>
<li>Working on Udana’s WLE records from last month, first the sixteen records in <a href="https://dspacetest.cgiar.org/handle/10568/108254">2018-11-20 RDL Temp</a>

<ul>
<li>these items will go to the <a href="https://dspacetest.cgiar.org/handle/10568/81592">Restoring Degraded Landscapes collection</a></li>
<li>a few items missing DOIs, but they are easily available on the publication page</li>
<li>clean up DOIs to use “<a href="https://doi.org">https://doi.org</a>” format (see the GREL sketch after this list)</li>
<li>clean up some cg.identifier.url to remove unnecessary query strings</li>
<li>remove columns with no metadata (river basin, place, target audience, isbn, uri, publisher, ispartofseries, subject)</li>
<li>fix column with invalid spaces in metadata field name (cg. subject. wle)</li>
<li>trim and collapse whitespace in all fields</li>
<li>remove some weird Unicode characters (0xfffd) from abstracts, citations, and titles using Open Refine: <code>value.replace('�','')</code></li>
<li>add dc.rights to some fields that I noticed while checking DOIs</li>
</ul></li>
<li>Then the 24 records in <a href="https://dspacetest.cgiar.org/handle/10568/108271">2018-11-20 VRC Temp</a>

<ul>
<li>these items will go to the <a href="https://dspacetest.cgiar.org/handle/10568/81589">Variability, Risks and Competing Uses collection</a></li>
<li>trim and collapse whitespace in all fields (lots in WLE subject!)</li>
<li>clean up some cg.identifier.url fields that had unnecessary anchors in their links</li>
<li>clean up DOIs to use “<a href="https://doi.org">https://doi.org</a>” format</li>
<li>fix column with invalid spaces in metadata field name (cg. subject. wle)</li>
<li>remove columns with no metadata (place, target audience, isbn, uri, publisher, ispartofseries, subject)</li>
<li>remove some weird Unicode characters (0xfffd) from abstracts, citations, and titles using Open Refine: <code>value.replace('�','')</code></li>
<li>I notice a few items using DOIs pointing at ICARDA’s DSpace like: <a href="https://doi.org/20.500.11766/8178">https://doi.org/20.500.11766/8178</a>, which then points at the “real” DOI on the publisher’s site… these should be using the real DOI instead of ICARDA’s “fake” Handle DOI</li>
<li>Some items missing DOIs, but they clearly have them if you look at the publisher’s site</li>
</ul></li>
</ul>
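
<ul>
<li>For the DOI cleanup in both batches above, the OpenRefine transform on the DOI column is roughly this kind of GREL (a sketch, assuming the old values use http or dx.doi.org prefixes):</li>
</ul>

<pre><code>value.replace('http://dx.doi.org/', 'https://doi.org/').replace('https://dx.doi.org/', 'https://doi.org/').replace('http://doi.org/', 'https://doi.org/')
</code></pre>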

<h2 id="2018-11-22">2018-11-22</h2>

<ul>
<li>Tezira is having problems submitting to the <a href="https://cgspace.cgiar.org/handle/10568/24452">ILRI brochures</a> collection for some reason

<ul>
<li>Judy Kimani was having issues resuming submissions in another ILRI collection recently, and the issue there was due to an empty group defined for the “accept/reject” step (aka workflow step 1)</li>
<li>The error then was “authorization denied for workflow step 1” where “workflow step 1” was the “accept/reject” step, which had a group defined, but was empty</li>
<li>Adding her to this group solved her issues</li>
<li>Tezira says she’s also getting the same “authorization denied” error for workflow step 1 when resuming submissions, so I told Abenet to delete the empty group</li>
</ul></li>
</ul>

<h2 id="2018-11-26">2018-11-26</h2>

<ul>
<li><a href="https://cgspace.cgiar.org/handle/10568/97709">This WLE item</a> is issued on 2018-10 and accessioned on 2018-10-22 but does not show up in the <a href="https://cgspace.cgiar.org/handle/10568/41888">WLE R4D Learning Series</a> collection on CGSpace for some reason, and therefore does not show up on the WLE publication website</li>
<li>I tried to remove that collection from Discovery and do a simple re-index:</li>
</ul>

<pre><code>$ dspace index-discovery -r 10568/41888
$ time schedtool -D -e ionice -c2 -n7 nice -n19 dspace index-discovery
</code></pre>

<ul>
<li>… but the item still doesn’t appear in the collection</li>
<li>Now I will try a full Discovery re-index:</li>
</ul>

<pre><code>$ time schedtool -D -e ionice -c2 -n7 nice -n19 dspace index-discovery -b
</code></pre>

<ul>
<li>Ah, Marianne had set the item as private when she uploaded it, so it was still private</li>
<li>I made it public and now it shows up in the collection list</li>
</ul>
<!-- vim: set sw=2 ts=2: -->

</article>

</div> <!-- /.blog-main -->

<aside class="col-sm-3 ml-auto blog-sidebar">

<section class="sidebar-module">
<h4>Recent Posts</h4>
<ol class="list-unstyled">
<li><a href="/cgspace-notes/2018-11/">November, 2018</a></li>
<li><a href="/cgspace-notes/2018-10/">October, 2018</a></li>
<li><a href="/cgspace-notes/2018-09/">September, 2018</a></li>
<li><a href="/cgspace-notes/2018-08/">August, 2018</a></li>
<li><a href="/cgspace-notes/2018-07/">July, 2018</a></li>
</ol>
</section>

<section class="sidebar-module">
<h4>Links</h4>
<ol class="list-unstyled">
<li><a href="https://cgspace.cgiar.org">CGSpace</a></li>
<li><a href="https://dspacetest.cgiar.org">DSpace Test</a></li>
<li><a href="https://github.com/ilri/DSpace">CGSpace @ GitHub</a></li>
</ol>
</section>

</aside>

</div> <!-- /.row -->
</div> <!-- /.container -->

<footer class="blog-footer">
<p>
Blog template created by <a href="https://twitter.com/mdo">@mdo</a>, ported to Hugo by <a href='https://twitter.com/mralanorth'>@mralanorth</a>.
</p>
<p>
<a href="#">Back to top</a>
</p>
</footer>

</body>

</html>