Mirror of https://github.com/ilri/dspace-statistics-api.git (synced 2025-05-10 23:26:02 +02:00)
Compare commits
80 Commits
- 7499b89d99
- 2c1e4952b1
- 379f202c3f
- 560fa6056d
- 385a34e5d0
- d0ea62d2bd
- 366ae25b8e
- 0f3054ae03
- 6bf34235d4
- e604d8ca81
- fc35b816f3
- 9e6a2f7559
- 46cfc3ffbc
- 2850035a4c
- c0b550109a
- bfceffd84d
- d0552f5047
- c3a0bf7f44
- 6e47e9c9ee
- cd90d618d6
- 280d211d56
- 806d63137f
- f7c7390e4f
- 702724e8a4
- 36818d03ef
- 4cf8656b35
- f30a464cd1
- 93ae12e313
- dc978e9333
- 295436fea0
- 46a1476ab0
- 87dbb6c4df
- 3160c44566
- 4b72f626d9
- 2d3b7620e3
- 6e4bc630f7
- 44884140e5
- 74ff86ee3b
- 3327884f21
- 8f7450f67a
- 28d61fb041
- cbc98991b4
- 6c28be0463
- 42e8f17305
- 19a45f3f6f
- 505ef31101
- 1543cacc54
- 2cab456f16
- 53615dea2d
- 2d8d1e6833
- e26e595ea1
- a9151b5bbf
- 76833d6f5f
- a51422273c
- 89621af85d
- c554404d7f
- 90d7a452bd
- 431a1c9d64
- e1b9d1284f
- bac764a0a4
- 1a650e57c0
- 2db5e02be9
- 9e942736b1
- ea85393b13
- cbeb7c89a7
- b0d81a543c
- 84801a4ab5
- 4e8621e3d9
- 2c8430171d
- fb60133713
- 9e01a80011
- a263996582
- ed9d25294e
- 5e165d2e88
- 8e29fd8a43
- 24af83b03f
- a87aaba812
- 57faec59c8
- 06ab254017
- 5b5cab8b34
.travis.yml (new file, 9 lines)

```yaml
language: python
python:
- "3.5"
- "3.6"
- "3.7"
install:
- pip install -r requirements.txt

# vim: ts=2 sw=2 et
```
CHANGELOG.md (67 lines added)

```diff
@@ -4,6 +4,73 @@ All notable changes to this project will be documented in this file.
 The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
 and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
 
+## [0.4.1] - 2018-09-26
+### Changed
+- Use execute_values() to batch insert records to PostgreSQL
+
+## [0.4.0] - 2018-09-25
+### Fixed
+- Invalid OnCalendar syntax in dspace-statistics-indexer.timer
+- Major logic error in indexer.py
+
+## [0.3.2] - 2018-09-25
+### Changed
+- /item/id route now returns HTTP 404 if an item is not found
+
+## [0.3.1] - 2018-09-25
+### Changed
+- Force SolrClient's kazoo dependency to version 2.5.0 to work with Python 3.7
+- Add Python 3.7 to Travis CI configuration
+
+## [0.3.0] - 2018-09-25
+### Added
+- requirements.txt for pip
+- Travis CI build configuration for Python 3.5 and 3.6
+- Documentation on using the API
+
+### Changed
+- The "all items" route from / to /items
+
+## [0.2.1] - 2018-09-24
+### Changed
+- Environment settings in example systemd unit files
+- Use psycopg2.extras.DictCursor for PostgreSQL connection
+
+## [0.2.0] - 2018-09-24
+### Changed
+- Use PostgreSQL instead of SQLite because UPSERT support needs a very new libsqlite3 whereas it's already in PostgreSQL 9.5+
+
+## [0.1.0] - 2018-09-24
+### Changed
+- Rename project to "DSpace Statistics API"
+- Use read-only database connection in API
+- Update systemd units for CGSpace→DSpace rename
+- Use UPSERT to simplify database schema and Python logic
+
+### Added
+- Example systemd service and timer unit for indexer service
+- Add top-level route to expose all item statistics
+
+### Removed
+- Ability to customize SOLR_CORE variable
+
+## [0.0.4] - 2018-09-23
+### Added
+- Added example systemd unit file for API
+- Added indexer.py to ingest views and downloads from Solr to a SQLite database
+
+### Changed
+- Refactor Solr configuration and connection
+- /item route now expects id as part of the URI instead of a query parameter: /item/id
+- View and download stats are now fetched from a SQLite database
+
+## [0.0.3] - 2018-09-20
+### Changed
+- Refactor environment variables into config module
+- Simplify Solr query for "downloads"
+- Optimize Solr query by using rows=0
+- Fix Solr queries for item views
+
+## [0.0.2] - 2018-09-18
+### Added
+- Ability to get Solr parameters from environment (`SOLR_SERVER` and `SOLR_CORE`)
```
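The 0.4.1 and 0.1.0 entries above lean on two pieces of machinery: PostgreSQL's `INSERT ... ON CONFLICT` (UPSERT, available since 9.5) and psycopg2's `execute_values()` for batched inserts. A minimal sketch of the combined pattern, with hypothetical connection parameters and values (the real query lives in indexer.py below):

```python
import psycopg2
import psycopg2.extras

# hypothetical connection string; the project reads these settings from config.py
db = psycopg2.connect('dbname=dspacestatistics user=dspacestatistics host=localhost')
cursor = db.cursor()

# one round trip inserts the whole batch; rows whose id already exists are updated in place
sql = 'INSERT INTO items(id, views) VALUES %s ON CONFLICT(id) DO UPDATE SET views=excluded.views'
psycopg2.extras.execute_values(cursor, sql, [(1, 10), (2, 5)], template='(%s, %s)')
db.commit()
```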
README.md (21 lines changed)

```diff
@@ -1,19 +1,30 @@
-# CGSpace Statistics API
+# DSpace Statistics API
 A quick and dirty REST API to expose Solr view and download statistics for items in a DSpace repository.
 
-Written and tested in Python 3.6. SolrClient (0.2.1) does not currently run in Python 3.7.0.
+Written and tested in Python 3.5, 3.6, and 3.7. Requires PostgreSQL version 9.5 or greater for [`UPSERT` support](https://wiki.postgresql.org/wiki/UPSERT).
 
 ## Installation
 Create a virtual environment and run it:
 
-    $ virtualenv -p /usr/bin/python3.6 venv
+    $ python -m venv venv
     $ . venv/bin/activate
-    $ pip install falcon gunicorn SolrClient
+    $ pip install -r requirements.txt
     $ gunicorn app:api
 
+## Using the API
+The API exposes the following endpoints:
+
+- GET `/items` — return views and downloads for all items that Solr knows about¹. Accepts `limit` and `page` query parameters for pagination of results.
+- GET `/item/id` — return views and downloads for a single item (*id* must be a positive integer). Returns HTTP 404 if an item id is not found.
+
+¹ We are querying the Solr statistics core, which technically only knows about items that have either views or downloads.
+
 ## Todo
 
 - Take a list of items (POST in JSON?)
 - Add API documentation
 - Close up DB connection when gunicorn shuts down gracefully
 - Better logging
 - Tests
 
 ## License
 This work is licensed under the [GPLv3](https://www.gnu.org/licenses/gpl-3.0.en.html).
```
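For illustration, the endpoints described in "Using the API" can be exercised with the `requests` module that is already pinned in requirements.txt; the host, port (matching the example systemd unit below), and item id are hypothetical:

```python
import requests

# paginated list of all item statistics
r = requests.get('http://127.0.0.1:5000/items', params={'limit': 10, 'page': 0})
print(r.json())

# statistics for a single item; an unknown id returns HTTP 404
r = requests.get('http://127.0.0.1:5000/item/123')
if r.status_code == requests.codes.ok:
    print(r.json())
```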
app.py (86 lines changed)

```diff
@@ -1,46 +1,72 @@
-# Tested with Python 3.6
-# See DSpace Solr docs for tips about parameters
-# https://wiki.duraspace.org/display/DSPACE/Solr
-
+from database import database_connection
 import falcon
-import os
-from SolrClient import SolrClient
+from solr import solr_connection
 
-# Check if Solr connection information was provided in the environment
-solr_server = os.environ.get('SOLR_SERVER', 'http://localhost:8080/solr')
-solr_core = os.environ.get('SOLR_CORE', 'statistics')
+db = database_connection()
+db.set_session(readonly=True)
+solr = solr_connection()
 
-class ItemResource:
+class AllItemsResource:
     def on_get(self, req, resp):
         """Handles GET requests"""
-        # Return HTTPBadRequest if id parameter is not present and valid
-        item_id = req.get_param_as_int("id", required=True, min=0)
+        limit = req.get_param_as_int("limit", min=0, max=100) or 100
+        page = req.get_param_as_int("page", min=0) or 0
+        offset = limit * page
 
-        solr = SolrClient(solr_server)
+        cursor = db.cursor()
 
-        # Get views
-        res = solr.query(solr_core, {
-            'q':'type:0',
-            'fq':'owningItem:{0} AND isBot:false AND statistics_type:view AND -bundleName:ORIGINAL'.format(item_id)
-        })
+        # get total number of items so we can estimate the pages
+        cursor.execute('SELECT COUNT(id) FROM items')
+        pages = round(cursor.fetchone()[0] / limit)
 
-        views = res.get_num_found()
+        # get statistics, ordered by id, and use limit and offset to page through results
+        cursor.execute('SELECT id, views, downloads FROM items ORDER BY id ASC LIMIT {} OFFSET {}'.format(limit, offset))
+        results = cursor.fetchmany(limit)
+        cursor.close()
 
-        # Get downloads
-        res = solr.query(solr_core, {
-            'q':'type:0',
-            'fq':'owningItem:{0} AND isBot:false AND statistics_type:view AND -(bundleName:[* TO *] -bundleName:ORIGINAL)'.format(item_id)
-        })
+        # create a list to hold dicts of item stats
+        statistics = list()
 
-        downloads = res.get_num_found()
+        # iterate over results and build statistics object
+        for item in results:
+            statistics.append({ 'id': item['id'], 'views': item['views'], 'downloads': item['downloads'] })
 
-        statistics = {
-            'id': item_id,
-            'views': views,
-            'downloads': downloads
+        message = {
+            'currentPage': page,
+            'totalPages': pages,
+            'limit': limit,
+            'statistics': statistics
         }
 
-        resp.media = statistics
+        resp.media = message
+
+class ItemResource:
+    def on_get(self, req, resp, item_id):
+        """Handles GET requests"""
+
+        cursor = db.cursor()
+        cursor.execute('SELECT views, downloads FROM items WHERE id={}'.format(item_id))
+        if cursor.rowcount == 0:
+            raise falcon.HTTPNotFound(
+                title='Item not found',
+                description='The item with id "{}" was not found.'.format(item_id)
+            )
+        else:
+            results = cursor.fetchone()
+
+            statistics = {
+                'id': item_id,
+                'views': results['views'],
+                'downloads': results['downloads']
+            }
+
+            resp.media = statistics
+
+        cursor.close()
 
 api = falcon.API()
-api.add_route('/item', ItemResource())
+api.add_route('/items', AllItemsResource())
+api.add_route('/item/{item_id:int}', ItemResource())
 
 # vim: set sw=4 ts=4 expandtab:
```
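Put together, a request like `GET /items?limit=2&page=0` against the new AllItemsResource returns a body shaped like this (all values hypothetical):

```json
{
  "currentPage": 0,
  "totalPages": 500,
  "limit": 2,
  "statistics": [
    { "id": 0, "views": 12, "downloads": 4 },
    { "id": 1, "views": 3, "downloads": 0 }
  ]
}
```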
config.py (new file, 11 lines)

```python
import os

# Check if Solr connection information was provided in the environment
SOLR_SERVER = os.environ.get('SOLR_SERVER', 'http://localhost:8080/solr')

DATABASE_NAME = os.environ.get('DATABASE_NAME', 'dspacestatistics')
DATABASE_USER = os.environ.get('DATABASE_USER', 'dspacestatistics')
DATABASE_PASS = os.environ.get('DATABASE_PASS', 'dspacestatistics')
DATABASE_HOST = os.environ.get('DATABASE_HOST', 'localhost')

# vim: set sw=4 ts=4 expandtab:
```
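Each setting falls back to the default shown when the corresponding variable is absent from the environment, so deployments only need to name what differs. A small sketch (the Solr port here is hypothetical):

```python
import os

# set before importing config, since config.py reads the environment at import time
os.environ['SOLR_SERVER'] = 'http://localhost:8081/solr'

import config
print(config.SOLR_SERVER)  # -> http://localhost:8081/solr
```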
contrib/dspace-statistics-api.service (new file, 20 lines)

```ini
[Unit]
Description=DSpace Statistics API
After=network.target

[Service]
Environment=DATABASE_NAME=dspacestatistics
Environment=DATABASE_USER=dspacestatistics
Environment=DATABASE_PASS=dspacestatistics
Environment=DATABASE_HOST=localhost
User=nobody
Group=nogroup
WorkingDirectory=/opt/ilri/dspace-statistics-api
ExecStart=/opt/ilri/dspace-statistics-api/venv/bin/gunicorn \
    --bind 127.0.0.1:5000 \
    app:api
ExecReload=/bin/kill -s HUP $MAINPID
ExecStop=/bin/kill -s TERM $MAINPID

[Install]
WantedBy=multi-user.target
```
contrib/dspace-statistics-indexer.service (new file, 17 lines)

```ini
[Unit]
Description=DSpace Statistics Indexer
After=tomcat7.target

[Service]
Environment=SOLR_SERVER=http://localhost:8081/solr
Environment=DATABASE_NAME=dspacestatistics
Environment=DATABASE_USER=dspacestatistics
Environment=DATABASE_PASS=dspacestatistics
Environment=DATABASE_HOST=localhost
User=nobody
Group=nogroup
WorkingDirectory=/opt/ilri/dspace-statistics-api
ExecStart=/opt/ilri/dspace-statistics-api/venv/bin/python indexer.py

[Install]
WantedBy=multi-user.target
```
contrib/dspace-statistics-indexer.timer (new file, 12 lines)

```ini
[Unit]
Description=DSpace Statistics Indexer

[Timer]
# twice a day, at 6AM and 6PM
OnCalendar=*-*-* 06,18:00:00
# Add a random delay of 0–3600 seconds
RandomizedDelaySec=3600
Persistent=true

[Install]
WantedBy=timers.target
```
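Note: the 0.4.0 changelog entry above fixed this `OnCalendar` expression; on systemd versions that support it, `systemd-analyze calendar '*-*-* 06,18:00:00'` will confirm that an expression parses and show when it next elapses.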
database.py (new file, 12 lines)

```python
from config import DATABASE_NAME
from config import DATABASE_USER
from config import DATABASE_PASS
from config import DATABASE_HOST
import psycopg2, psycopg2.extras

def database_connection():
    connection = psycopg2.connect("dbname={} user={} password={} host='{}'".format(DATABASE_NAME, DATABASE_USER, DATABASE_PASS, DATABASE_HOST), cursor_factory=psycopg2.extras.DictCursor)

    return connection

# vim: set sw=4 ts=4 expandtab:
```
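Because the connection is created with `cursor_factory=psycopg2.extras.DictCursor`, rows can be addressed by column name as well as by position, which is what app.py relies on. A small usage sketch (the item id is hypothetical):

```python
from database import database_connection

db = database_connection()
cursor = db.cursor()
cursor.execute('SELECT views, downloads FROM items WHERE id=%s', (42,))  # hypothetical id
row = cursor.fetchone()
if row is not None:
    # DictCursor rows support both row['views'] and row[0]
    print(row['views'], row['downloads'])
cursor.close()
db.close()
```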
indexer.py (new executable file, 173 lines)

```python
#!/usr/bin/env python
#
# indexer.py
#
# Copyright 2018 Alan Orth.
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program.  If not, see <http://www.gnu.org/licenses/>.
#
# ---
#
# Connects to a DSpace Solr statistics core and ingests item views and downloads
# into a PostgreSQL database for use by other applications (like an API).
#
# This script is written for Python 3.5+ and requires several modules that you
# can install with pip (I recommend using a Python virtual environment):
#
#   $ pip install SolrClient psycopg2-binary
#
# See: https://solrclient.readthedocs.io/en/latest/SolrClient.html
# See: https://wiki.duraspace.org/display/DSPACE/Solr

from database import database_connection
import json
import psycopg2.extras
from solr import solr_connection

def index_views():
    # get total number of distinct facets for items with a minimum of 1 view,
    # otherwise Solr returns all kinds of weird ids that are actually not in
    # the database. Also, stats are expensive, but we need stats.calcdistinct
    # so we can get the countDistinct summary.
    #
    # see: https://lucene.apache.org/solr/guide/6_6/the-stats-component.html
    res = solr.query('statistics', {
        'q':'type:2',
        'fq':'isBot:false AND statistics_type:view',
        'facet':True,
        'facet.field':'id',
        'facet.mincount':1,
        'facet.limit':1,
        'facet.offset':0,
        'stats':True,
        'stats.field':'id',
        'stats.calcdistinct':True
    }, rows=0)

    # get total number of distinct facets (countDistinct)
    results_totalNumFacets = json.loads(res.get_json())['stats']['stats_fields']['id']['countDistinct']

    # divide results into "pages" (cast to int to effectively round down)
    results_per_page = 100
    results_num_pages = int(results_totalNumFacets / results_per_page)
    results_current_page = 0

    cursor = db.cursor()

    # create an empty list to store values for batch insertion
    data = []

    while results_current_page <= results_num_pages:
        print('Indexing item views (page {} of {})'.format(results_current_page, results_num_pages))

        res = solr.query('statistics', {
            'q':'type:2',
            'fq':'isBot:false AND statistics_type:view',
            'facet':True,
            'facet.field':'id',
            'facet.mincount':1,
            'facet.limit':results_per_page,
            'facet.offset':results_current_page * results_per_page
        }, rows=0)

        # SolrClient's get_facets() returns a dict of dicts
        views = res.get_facets()
        # in this case iterate over the 'id' dict and get the item ids and views
        for item_id, item_views in views['id'].items():
            data.append((item_id, item_views))

        # do a batch insert of values from the current "page" of results
        sql = 'INSERT INTO items(id, views) VALUES %s ON CONFLICT(id) DO UPDATE SET views=excluded.views'
        psycopg2.extras.execute_values(cursor, sql, data, template='(%s, %s)')
        db.commit()

        # clear all items from the list so we can populate it with the next batch
        data.clear()

        results_current_page += 1

    cursor.close()

def index_downloads():
    # get the total number of distinct facets for items with at least 1 download
    res = solr.query('statistics', {
        'q':'type:0',
        'fq':'isBot:false AND statistics_type:view AND bundleName:ORIGINAL',
        'facet':True,
        'facet.field':'owningItem',
        'facet.mincount':1,
        'facet.limit':1,
        'facet.offset':0,
        'stats':True,
        'stats.field':'owningItem',
        'stats.calcdistinct':True
    }, rows=0)

    # get total number of distinct facets (countDistinct)
    results_totalNumFacets = json.loads(res.get_json())['stats']['stats_fields']['owningItem']['countDistinct']

    # divide results into "pages" (cast to int to effectively round down)
    results_per_page = 100
    results_num_pages = int(results_totalNumFacets / results_per_page)
    results_current_page = 0

    cursor = db.cursor()

    # create an empty list to store values for batch insertion
    data = []

    while results_current_page <= results_num_pages:
        print('Indexing item downloads (page {} of {})'.format(results_current_page, results_num_pages))

        res = solr.query('statistics', {
            'q':'type:0',
            'fq':'isBot:false AND statistics_type:view AND bundleName:ORIGINAL',
            'facet':True,
            'facet.field':'owningItem',
            'facet.mincount':1,
            'facet.limit':results_per_page,
            'facet.offset':results_current_page * results_per_page
        }, rows=0)

        # SolrClient's get_facets() returns a dict of dicts
        downloads = res.get_facets()
        # in this case iterate over the 'owningItem' dict and get the item ids and downloads
        for item_id, item_downloads in downloads['owningItem'].items():
            data.append((item_id, item_downloads))

        # do a batch insert of values from the current "page" of results
        sql = 'INSERT INTO items(id, downloads) VALUES %s ON CONFLICT(id) DO UPDATE SET downloads=excluded.downloads'
        psycopg2.extras.execute_values(cursor, sql, data, template='(%s, %s)')
        db.commit()

        # clear all items from the list so we can populate it with the next batch
        data.clear()

        results_current_page += 1

    cursor.close()

db = database_connection()
solr = solr_connection()

# create table to store item views and downloads
cursor = db.cursor()
cursor.execute('''CREATE TABLE IF NOT EXISTS items
                  (id INT PRIMARY KEY, views INT DEFAULT 0, downloads INT DEFAULT 0)''')
index_views()
index_downloads()

db.close()

# vim: set sw=4 ts=4 expandtab:
```
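For reference, SolrClient's `get_facets()` returns the facet counts as a dict of dicts keyed by facet field, so the loops above iterate over pairs shaped like these (ids and counts hypothetical):

```python
# approximate shape of res.get_facets() for the views query above
facets = {'id': {'101': 24, '102': 3}}

data = []
for item_id, item_views in facets['id'].items():
    data.append((item_id, item_views))
# data is now [('101', 24), ('102', 3)], ready for execute_values()
```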
requirements.txt (new file, 12 lines)

```
certifi==2018.8.24
chardet==3.0.4
falcon==1.4.1
gunicorn==19.9.0
idna==2.7
kazoo==2.5.0
psycopg2-binary==2.7.5
python-mimeparse==1.6.0
requests==2.19.1
six==1.11.0
SolrClient==0.2.1
urllib3==1.23
```