---
title: "December, 2023"
date: 2023-12-01T08:48:36+03:00
author: "Alan Orth"
categories: ["Notes"]
---

## 2023-12-01

- There is still high load on CGSpace and I don't know why
- I don't see a high number of sessions compared to previous days in the last few weeks

<!-- more -->

```console
$ for file in dspace.log.2023-11-[23]*; do echo "$file"; grep -a -oE 'session_id=[A-Z0-9]{32}' "$file" | sort | uniq | wc -l; done
dspace.log.2023-11-20
22865
dspace.log.2023-11-21
20296
dspace.log.2023-11-22
19688
dspace.log.2023-11-23
17906
dspace.log.2023-11-24
18453
dspace.log.2023-11-25
17513
dspace.log.2023-11-26
19037
dspace.log.2023-11-27
21103
dspace.log.2023-11-28
23023
dspace.log.2023-11-29
23545
dspace.log.2023-11-30
21298
```
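
The same per-file unique-session count can be sketched in Python, assuming the log lines embed tokens matching the same `session_id=[A-Z0-9]{32}` pattern the `grep` above looks for:

```python
import re
from pathlib import Path

SESSION_RE = re.compile(r"session_id=([A-Z0-9]{32})")

def unique_sessions(log_path: Path) -> int:
    """Count distinct session IDs in one dspace.log file,
    mirroring grep -oE ... | sort | uniq | wc -l."""
    ids = set()
    with log_path.open(errors="replace") as f:
        for line in f:
            ids.update(SESSION_RE.findall(line))
    return len(ids)
```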

- Even the number of unique IPs is not very high compared to the last week or so:

```console
# awk '{print $1}' /var/log/nginx/{access,library-access,oai,rest}.log.1 | sort | uniq | wc -l
17023
# zcat --force /var/log/nginx/{access,library-access,oai,rest}.log.2.gz | awk '{print $1}' | sort | uniq | wc -l
17294
# zcat --force /var/log/nginx/{access,library-access,oai,rest}.log.3.gz | awk '{print $1}' | sort | uniq | wc -l
22057
# zcat --force /var/log/nginx/{access,library-access,oai,rest}.log.4.gz | awk '{print $1}' | sort | uniq | wc -l
32956
# zcat --force /var/log/nginx/{access,library-access,oai,rest}.log.5.gz | awk '{print $1}' | sort | uniq | wc -l
11415
# zcat --force /var/log/nginx/{access,library-access,oai,rest}.log.6.gz | awk '{print $1}' | sort | uniq | wc -l
15444
# zcat --force /var/log/nginx/{access,library-access,oai,rest}.log.7.gz | awk '{print $1}' | sort | uniq | wc -l
12648
```
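
For reference, a Python sketch of the same unique-IP count that handles the rotated `.gz` files transparently (the client IP is assumed to be the first whitespace-separated field of each nginx log line):

```python
import gzip
from pathlib import Path

def unique_ips(log_paths: list[Path]) -> int:
    """Count distinct client IPs across several nginx logs,
    decompressing rotated .gz files on the fly."""
    ips = set()
    for path in log_paths:
        opener = gzip.open if path.suffix == ".gz" else open
        with opener(path, "rt", errors="replace") as f:
            for line in f:
                fields = line.split(None, 1)
                if fields:
                    ips.add(fields[0])
    return len(ips)
```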

- It doesn't make any sense, so I think I'm going to restart the server...
- After restarting the server the load went down to normal levels... who knows...
- I started trying to see how I'm going to generate the fake statistics for the Alliance bitstream that was replaced
- I exported all the statistics for the owningItem now:

|
```console
$ chrt -b 0 ./run.sh -s http://localhost:8081/solr/statistics -a export -o /tmp/stats-export.json -f 'owningItem:b5862bfa-9799-4167-b1cf-76f0f4ea1e18' -k uid
```

- Importing them into DSpace Test didn't show the statistics in the Atmire module, but I see them in Solr...

## 2023-12-02

- Export CGSpace to check for missing Initiative collection mappings
- Start a harvest on AReS

## 2023-12-04

- Send a message to Altmetric support because the item IWMI highlighted last month still doesn't show the attention score for the Handle, even after I tweeted it several times weeks ago
- Spent some time writing a Python script to fix the literal MaxMind City JSON objects in our Solr statistics
- There are about 1.6 million of these, so I exported them using solr-import-export-json with the query `city:com*`, but ended up finding many that have missing bundles, container bitstreams, etc:

```
city:com* AND -bundleName:[* TO *] AND -containerBitstream:[* TO *] AND -file_id:[* TO *] AND -owningItem:[* TO *] AND -version_id:[* TO *]
```

- (Note the negation to find fields that are missing)
- I don't know what I want to do with these yet

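
The core of the fix can be sketched as follows, assuming the bad `city` values are a Java class name followed by the record's JSON, e.g. `com.maxmind.geoip2.record.City [ {"names":{"en":"Nairobi"}} ]` — the exact shape and field names here are assumptions for illustration, not taken from the real data:

```python
import json
import re

# Assumed shape of a bad value: 'com.maxmind.geoip2.record.City [ {...json...} ]'
LITERAL_CITY_RE = re.compile(r"^com\.maxmind\.\S+\s*\[\s*(\{.*\})\s*\]$")

def fix_city(value: str) -> str:
    """Replace a literal MaxMind City object dump with its English name,
    leaving already-clean values untouched."""
    match = LITERAL_CITY_RE.match(value)
    if not match:
        return value
    try:
        record = json.loads(match.group(1))
    except json.JSONDecodeError:
        return value
    return record.get("names", {}).get("en", value)
```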
## 2023-12-05

- I finished the `fix_maxmind_stats.py` script, fixed 1.6 million records, and imported them on CGSpace after testing on DSpace 7 Test
- Altmetric said there was a glitch regarding the Handle and DOI linking, and they successfully re-scraped the item page and linked them
- They sent me a list of current production IPs and I notice that some of them are in our nginx bot network list:

```console
|
|
$ for network in $(csvcut -c network /tmp/ips.csv | sed 1d | sort -u); do grepcidr $network ~/src/git/rmg-ansible-public/roles/dspace/files/nginx/bot-networks.conf; done
|
|
108.128.0.0/13 'bot';
|
|
46.137.0.0/16 'bot';
|
|
52.208.0.0/13 'bot';
|
|
52.48.0.0/13 'bot';
|
|
54.194.0.0/15 'bot';
|
|
54.216.0.0/14 'bot';
|
|
54.220.0.0/15 'bot';
|
|
54.228.0.0/15 'bot';
|
|
63.32.242.35/32 'bot';
|
|
63.32.0.0/14 'bot';
|
|
99.80.0.0/15 'bot'
|
|
```
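
The same check can be done without `grepcidr` using Python's standard `ipaddress` module; a sketch (the network lists in the test are illustrative, not the real config):

```python
import ipaddress

def matching_bot_networks(candidates: list[str], bot_networks: list[str]) -> list[str]:
    """Return the bot networks that overlap any candidate network,
    roughly what the grepcidr loop above reports."""
    bots = [ipaddress.ip_network(n) for n in bot_networks]
    hits = set()
    for cand in candidates:
        cand_net = ipaddress.ip_network(cand)
        for bot in bots:
            if bot.overlaps(cand_net):
                hits.add(str(bot))
    return sorted(hits)
```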

- I will remove those for now so that Altmetric doesn't have any unexpected issues harvesting

## 2023-12-08

- Finalized the script to generate Solr statistics for Alliance researcher Mirjam
- The script is `ilri/generate_solr_statistics.py`
- I generated ~3,200 statistics based on her records of the download statistics of [that item](https://hdl.handle.net/10568/131997) and imported them on CGSpace
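
The core idea can be sketched as fabricating backdated Solr documents for each download and importing them with solr-import-export-json; the field names and values below are assumptions for illustration, not the script's actual output:

```python
import random
import uuid
from datetime import datetime, timedelta

def generate_download_docs(owning_item: str, bitstream_id: str, count: int,
                           start: datetime, end: datetime) -> list[dict]:
    """Fabricate download statistics documents for one bitstream,
    spread randomly across a date range."""
    span = (end - start).total_seconds()
    docs = []
    for _ in range(count):
        ts = start + timedelta(seconds=random.uniform(0, span))
        docs.append({
            "uid": str(uuid.uuid4()),   # unique key, as in the -k uid export above
            "id": bitstream_id,
            "owningItem": owning_item,
            "type": 0,                  # assumed: DSpace's constant for bitstreams
            "time": ts.strftime("%Y-%m-%dT%H:%M:%S.000Z"),
            "statistics_type": "view",
            "isBot": False,
        })
    return docs
```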

- Peter asked for lists of affiliations, investors, and publishers to do some cleanups
- I generated a list from a CSV export instead of doing it based on a SQL dump...

```console
$ csvcut -c 'cg.contributor.affiliation[en_US]' /tmp/initiatives.csv \
  | sed -e 1d -e 's/^"//' -e 's/"$//' -e 's/||/\n/g' -e '/^$/d' \
  | sort | uniq -c | sort -hr \
  | awk 'BEGIN { FS = "^[[:space:]]+[[:digit:]]+[[:space:]]+" } {print $2}' \
  | sed -e '1i cg.contributor.affiliation' -e 's/^\(.*\)$/"\1"/' \
  > /tmp/2023-12-08-initiatives-affiliations.csv
```

<!-- vim: set sw=2 ts=2: -->