Update notes for 2019-03-28
commit f80b97238c (parent a44d7daa44)
@@ -936,5 +936,42 @@ $ grep -I -c 45.5.184.72 dspace.log.2019-03-26

## 2019-03-28

- Run the corrections and deletions to AGROVOC (dc.subject) on DSpace Test and CGSpace, and then start a full re-index of Discovery
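
For reference, a full Discovery re-index is done with DSpace's `index-discovery` launcher; `-b` rebuilds the index from scratch. A minimal sketch (the `time`/`nice`/`ionice` wrapping is an assumption about how these long maintenance jobs are typically run, not the exact invocation from these notes):

```
$ time nice -n19 ionice -c2 -n7 dspace index-discovery -b
```
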
- What the hell is going on with this CTA publication?

```
# grep Spore-192-EN-web.pdf /var/log/nginx/access.log | awk '{print $1}' | sort | uniq -c | sort -n
      1 37.48.65.147
      1 80.113.172.162
      2 108.174.5.117
      2 83.110.14.208
      4 18.196.8.188
     84 18.195.78.144
    644 18.194.46.84
   1144 18.196.196.108
```

- None of these 18.x.x.x IPs specify a user agent and they are all on Amazon!
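
A quick way to check both claims against the log itself (a sketch, not from the original notes; it assumes nginx's default combined log format, where the user agent is the sixth double-quote-delimited field, and that the `host` DNS utility is installed):

```
# grep 18.196.196.108 /var/log/nginx/access.log | awk -F'"' '{print $6}' | sort | uniq -c
# for ip in 18.196.8.188 18.195.78.144 18.194.46.84 18.196.196.108; do host $ip; done
```

Amazon EC2 addresses reverse-resolve to hostnames under `compute.amazonaws.com`, which is how the provenance shows up.
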
- Shortly after I started the re-indexing, UptimeRobot began to complain that CGSpace was down, then up, then down, then up...
- I see the load on the server is about 10.0 again, though I don't know WHAT is causing it
  - It could be the CPU steal metric, as if Linode has oversold the CPU resources on this VM host...
- Here are the Munin graphs of CPU usage for the last day, week, and year:

![CPU day](/cgspace-notes/2019/03/cpu-day-fs8.png)

![CPU week](/cgspace-notes/2019/03/cpu-week-fs8.png)

![CPU year](/cgspace-notes/2019/03/cpu-year-fs8.png)

- What's clear from this is that some other VM on our host has heavy usage for about four hours at 6AM and 6PM, and during that time the load on our server spikes
  - CPU steal has drastically increased since March 25th (a quick way to confirm from inside the VM is sketched below)
  - It might be time to move to dedicated CPU VM instances, or even real servers
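
A quick way to confirm steal from inside the VM, assuming the standard procps `vmstat` is available (the `st` column on the far right is the percentage of CPU time stolen by the hypervisor):

```
$ vmstat 5 3
```
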
- In other news, I see that it's not even the end of the month yet and we have 3.6 million hits already:

```
# zcat --force /var/log/nginx/* | grep -cE "[0-9]{1,2}/Mar/2019"
3654911
```
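
As an aside, dropping `-c` and counting the matched dates gives a per-day breakdown of the same logs (a sketch under the same log layout):

```
# zcat --force /var/log/nginx/* | grep -oE "[0-9]{1,2}/Mar/2019" | sort -n | uniq -c
```
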
- In other other news, I see that DSpace currently has no statistics for years before 2019, yet when I connect to Solr I see all the cores up
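
A sketch for checking the cores from the command line via Solr's CoreAdmin STATUS API (the port is an assumption for this particular Tomcat/Solr setup):

```
$ curl -s 'http://localhost:8081/solr/admin/cores?action=STATUS&wt=json' | python -m json.tool
```
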
<!-- vim: set sw=2 ts=2: -->
@@ -25,7 +25,7 @@ I think I will need to ask Udana to re-copy and paste the abstracts with more ca
 <meta property="og:type" content="article" />
 <meta property="og:url" content="https://alanorth.github.io/cgspace-notes/2019-03/" />
 <meta property="article:published_time" content="2019-03-01T12:16:30+01:00"/>
-<meta property="article:modified_time" content="2019-03-27T09:51:30+02:00"/>
+<meta property="article:modified_time" content="2019-03-28T19:10:06+02:00"/>

 <meta name="twitter:card" content="summary"/>
 <meta name="twitter:title" content="March, 2019"/>

@@ -55,9 +55,9 @@ I think I will need to ask Udana to re-copy and paste the abstracts with more ca
 "@type": "BlogPosting",
 "headline": "March, 2019",
 "url": "https://alanorth.github.io/cgspace-notes/2019-03/",
-"wordCount": "5900",
+"wordCount": "6151",
 "datePublished": "2019-03-01T12:16:30+01:00",
-"dateModified": "2019-03-27T09:51:30+02:00",
+"dateModified": "2019-03-28T19:10:06+02:00",
 "author": {
 "@type": "Person",
 "name": "Alan Orth"

@@ -1222,6 +1222,53 @@ $ ./delete-metadata-values.py -i /tmp/2019-03-26-AGROVOC-79-deletions.csv -db ds

<ul>
<li>Run the corrections and deletions to AGROVOC (dc.subject) on DSpace Test and CGSpace, and then start a full re-index of Discovery</li>
<li>What the hell is going on with this CTA publication?</li>
</ul>

<pre><code># grep Spore-192-EN-web.pdf /var/log/nginx/access.log | awk '{print $1}' | sort | uniq -c | sort -n
      1 37.48.65.147
      1 80.113.172.162
      2 108.174.5.117
      2 83.110.14.208
      4 18.196.8.188
     84 18.195.78.144
    644 18.194.46.84
   1144 18.196.196.108
</code></pre>

<ul>
<li>None of these 18.x.x.x IPs specify a user agent and they are all on Amazon!</li>
<li>Shortly after I started the re-indexing, UptimeRobot began to complain that CGSpace was down, then up, then down, then up…</li>
<li>I see the load on the server is about 10.0 again, though I don’t know WHAT is causing it

<ul>
<li>It could be the CPU steal metric, as if Linode has oversold the CPU resources on this VM host…</li>
</ul></li>
<li>Here are the Munin graphs of CPU usage for the last day, week, and year:</li>
</ul>

<p><img src="/cgspace-notes/2019/03/cpu-day-fs8.png" alt="CPU day" /></p>

<p><img src="/cgspace-notes/2019/03/cpu-week-fs8.png" alt="CPU week" /></p>

<p><img src="/cgspace-notes/2019/03/cpu-year-fs8.png" alt="CPU year" /></p>

<ul>
<li>What’s clear from this is that some other VM on our host has heavy usage for about four hours at 6AM and 6PM, and during that time the load on our server spikes

<ul>
<li>CPU steal has drastically increased since March 25th</li>
<li>It might be time to move to dedicated CPU VM instances, or even real servers</li>
</ul></li>
<li>In other news, I see that it’s not even the end of the month yet and we have 3.6 million hits already:</li>
</ul>

<pre><code># zcat --force /var/log/nginx/* | grep -cE "[0-9]{1,2}/Mar/2019"
3654911
</code></pre>

<ul>
<li>In other other news, I see that DSpace currently has no statistics for years before 2019, yet when I connect to Solr I see all the cores up</li>
</ul>

<!-- vim: set sw=2 ts=2: -->

BIN  docs/2019/03/cpu-day-fs8.png (new file, 17 KiB; binary file not shown)
BIN  docs/2019/03/cpu-week-fs8.png (new file, 21 KiB; binary file not shown)
BIN  docs/2019/03/cpu-year-fs8.png (new file, 13 KiB; binary file not shown)
@@ -4,7 +4,7 @@

 <url>
 <loc>https://alanorth.github.io/cgspace-notes/2019-03/</loc>
-<lastmod>2019-03-27T09:51:30+02:00</lastmod>
+<lastmod>2019-03-28T19:10:06+02:00</lastmod>
 </url>

 <url>

@@ -214,7 +214,7 @@

 <url>
 <loc>https://alanorth.github.io/cgspace-notes/</loc>
-<lastmod>2019-03-27T09:51:30+02:00</lastmod>
+<lastmod>2019-03-28T19:10:06+02:00</lastmod>
 <priority>0</priority>
 </url>

@@ -225,7 +225,7 @@

 <url>
 <loc>https://alanorth.github.io/cgspace-notes/tags/notes/</loc>
-<lastmod>2019-03-27T09:51:30+02:00</lastmod>
+<lastmod>2019-03-28T19:10:06+02:00</lastmod>
 <priority>0</priority>
 </url>

@@ -237,13 +237,13 @@

 <url>
 <loc>https://alanorth.github.io/cgspace-notes/posts/</loc>
-<lastmod>2019-03-27T09:51:30+02:00</lastmod>
+<lastmod>2019-03-28T19:10:06+02:00</lastmod>
 <priority>0</priority>
 </url>

 <url>
 <loc>https://alanorth.github.io/cgspace-notes/tags/</loc>
-<lastmod>2019-03-27T09:51:30+02:00</lastmod>
+<lastmod>2019-03-28T19:10:06+02:00</lastmod>
 <priority>0</priority>
 </url>

BIN  static/2019/03/cpu-day-fs8.png (new file, 17 KiB; binary file not shown)
BIN  static/2019/03/cpu-week-fs8.png (new file, 21 KiB; binary file not shown)
BIN  static/2019/03/cpu-year-fs8.png (new file, 13 KiB; binary file not shown)